Archive for July, 2010
Recently a news report garnered some attention because it found that today’s children are becoming less creative for the first time in recent history. There are many potential villains — television is the standard one to blame, but if it were the cause, the lack of creativity would have come about long ago. Game consoles with structured games are listed as a problem, as is ‘No Child Left Behind,’ with its focus on teaching to the test rather than teaching to think. In general, I think two things push children toward being less creative: a) too much structure instead of free play, and b) adults so worried that their children ‘get ahead’ and ‘succeed’ that they stifle play. Now companies are selling products to teach babies to start reading at six months. Parents think this will help their child become especially intelligent and successful, so they buy the product. But really — who needs to read at six months?
One thing that is good for children is attending a quality child care center that emphasizes play (not some kind of pre-school boot camp). Now, let me preface my argument with a caveat — if parents can afford to have one stay home, and if there are numerous other kids available to play with and a wide range of activities and learning, then that’s just as good as day care, if not better. However, if it’s just the parent and child/children, without play and consistent interaction with other kids, then I suspect the child would be better off at a good day care center.
This flies in the face of a lot of what our society has been used to. Moms used to stay home with kids, and it was with reluctance and a sense of guilt that mothers would give their children to someone else for care. Now mothers and fathers tend to share household duties, including raising kids, and often both work. I know in our household gender is irrelevant to who cleans, cares for the kids, or does the chores. Our boys do not think of any kind of job as more appropriate to women than to men.
But today’s American family structure is a weird modern non-traditional arrangement. In the past families were clan-like, with numerous brothers, sisters, cousins, aunts, uncles and other relatives in abundance. Children would congregate and play, probably watched by a team of women from the village while others worked. Traditionally women had a lot of work too, so it’s likely that in much of the world a small group of women specialized in watching the kids at play. In other words, traditional life had the functional equivalent of day care centers.
Kids would learn to interact with other kids and develop a sense of social connectedness. They were not part of one nuclear family isolated in a large house behind a white picket fence. Play did not depend upon complicated negotiations for play dates or walks to the local playground. Play was ongoing, connected with learning. Of course, traditional societies defined learning differently — you learned the customs of the people and various skills. That kind of learning is easy through play.
Nowadays, too many of us are isolated from family and even close friends. To compensate, the tendency is to structure activities for children. Lessons, organizations like the Boy Scouts, sports, and other events become part of a structured routine where kids go from one place to another, often lacking time for real creative play. No wonder at home it’s relaxing for them to play video games or surf the web.
As kids get older, we need to recognize the danger of stress. Stress is something adults deal with on a daily basis — stress at work, in relationships, with children, etc. It’s well known that stress causes weight gain and poor health, and contributes to psychological ills like depression and anxiety. In adults, it’s a killer. In children it damages development. When adults over-schedule children, pressure them to “be the best,” and try to shape them through too many lessons, activities and demands, stress increases and children are pushed into a “follow the routine” mode. That not only stifles creativity, but takes away time for the fun, creative free play children need.
In schools now, thanks to programs like “No Child Left Behind,” stress increases to study for the test and learn quickly. Parents often feed that by wanting their children at the top. This stresses out teachers (that’s been widely documented — No Child Left Behind has been dubbed ‘no teacher left standing’), and that isn’t good for their pupils. Moreover, learning becomes less about creative problem solving and truly learning to navigate the world, and more about rote memorization and preparation for questions expected on exams. This might yield students with a bit more book knowledge, but less creativity and problem solving capacity.
Don’t get me wrong — as a college teacher I see students come into a World Politics class with no understanding of the world and no foreign language background. That should be unacceptable in the current era, and I often feel like I need to teach a lot of high school level world history just to get them to understand the concepts of modern international relations study. I’m all for improving quality of writing and knowledge, and holding teachers accountable. But that should not be done on the backs of pupils.
Still, I can accept a need for a bit of that in high schools. But even third and fourth graders find themselves being pushed to fulfill particular academic expectations. That isn’t right. Yes, the schools and teachers should be held accountable, but the stress and pressure should not be felt by the pupils. Learning through play and creative fun is possible — when my seven year old tears through “Diary of a Wimpy Kid” in one day, I know he’s not being pushed. Math games can be fun. There are ways to assess without big, standardized, pressure-filled tests.
So when people note that children are showing less creativity for the first time, I say it’s because we’ve pushed one of the greatest ills of our modern age — stress and all its consequences — onto children, harming their development and stifling creativity. Especially until age 12, play is more important than anything else — and learning through play is the best way, since it enhances both knowledge and creativity. If learning isn’t fun, who would want to be a lifelong learner?
To hear everyone from President Obama to GOP Senators talk, the leaking of documents about US (as well as Pakistani and Afghan) activity during the ongoing Afghan war is horrible. But while they complain that individuals or operations may be put in jeopardy, that’s not really the cause of their ire. The real reason the US military is “disgusted” with the leaks is that they show the truth of US operations in Afghanistan, and the truth is not pretty.
For my part, I applaud the leaks and the leakers, and believe that secrecy about what is done in our name is the most dangerous thing for a democracy. Short-term secrecy is necessary at times, but clearly the documents detail aspects of the war that have been ongoing, and which we should know about. When a government fears the truth, then it’s more important than ever to get the truth out. Video of slaughters of civilians by US soldiers and documents about civilian deaths and cover-ups will no doubt be a source of information and discussion in coming months, both here and abroad. Arguably this hurts the US military and embarrasses policy makers (even if most leaks involve information from the last Administration). Nonetheless, what’s good for the government isn’t necessarily what’s good for the country.
Americans often believe that what our military is doing overseas is always good and noble. That’s because most military personnel are good and noble. Yet war changes people, as recent statistics about high rates of mental illness in war vets, broken families, and economic distress indicate. Civilians in both countries have suffered the most deaths, and their whole infrastructure and way of life has been altered. In Afghanistan this started with the Soviet invasion of 1979 and hasn’t really ever let up.
While it is easy to condemn individual soldiers for civilian murders, slaughters, and other incidents, that is misplaced blame. Put a young man or woman under that kind of stress — often with multiple deployments, buddies killed, surrounded by death constantly, afraid of what could come next — and the ability of any human to stay sane is threatened. Most manage to handle it, but everyone has a limit. Some, due to the intensity of their experiences or their own personality, hit that limit earlier than others. Some process the experience effectively. But you can’t have a war like this without war crimes.
In our society we like to think in terms of individual responsibility. That comes in handy for the government or military leaders, who can prosecute soldiers for violating stated orders — orders that clearly prohibit the kinds of acts being reported. A young, eager 19-year-old ready to sacrifice for his country perhaps ends up court-martialed for some act against civilians. Focusing on what was done to him by placing him in such circumstances and stressing his young mind is not seen as a legitimate defense. We like to think of our soldiers as heroic by nature; those who violate the rules are ‘bad apples,’ a disgrace to our otherwise gallant fighters.
Yet that is a very convenient excuse for Presidents and Generals. They don’t have to endure the trauma, and Presidents like Obama and Bush don’t really know what life in war is like. Indeed, civilian politicians are more likely to choose military action than former military personnel — to civilians like me, war is an abstraction. Yet by reading people like Chris Hedges, or other accounts of what war is like, it is possible to get a sense of what this does to people. The statistics about broken families and mental illness tell a story as well. Moreover, we know from past experience that despite all the heroes’ welcomes and flowery rhetoric, it’s very likely that today’s war vets will be forgotten, experiencing higher unemployment, homelessness and poverty than others.
The damage done by our government to those sworn to protect our way of life is tremendous. As well intentioned as overthrowing the Taliban may have been, and as understandable as the rage at Bin Laden and the desire to strike back certainly were, we’ve now been through nearly nine years of war that has veered so far from that initial anti-terrorist strike that it’s hard to even explain what we’re fighting for. It’s not to get Bin Laden. It’s not even to stop terrorism. It’s to have a way to leave while saving face, something Nixon called ‘peace with honor.’ Yet that honor is abstract; it’s simply a desire to avoid too much embarrassment. Is that really worth destroying lives?
I’m not trying to downplay the civilian suffering over there by focusing on what’s done to our soldiers; rather, I’m arguing that the only way to really think clearly about what we should do in Afghanistan and Iraq is to have the public know the reality of what the war is like, the actions being done in our name, and the impact this is having on those we send over there to fight. Only by having the “secrets” of the war revealed can we truly understand the nature of the acts being undertaken in our name. Only by deflating the myth of the ‘heroic American honorably defending democracy’ can we see how politicians use that myth to hide their true motives.
President Obama, pragmatist that he is, won’t do what I think he should do. He won’t welcome the release and call for a national conversation on the reality of the war. He won’t talk to the country about the details of the material, openly discussing issues that embarrass him or the country. That’s OK — his pragmatism probably has allowed him to accomplish more than people thought possible. But the WikiLeaks documents may put enough pressure on the White House and government to shift the terrain a little, and make it a pragmatic necessity to fundamentally rethink US policy. And perhaps Obama will find the courage to point the blame not at the soldiers who crack, but at the policy makers who put young people in such horrific situations. Perhaps we might even rethink the militarism of our policies: do our actions in Afghanistan and Iraq really reflect who we are as a nation?
Perhaps the most profound effect of the information revolution is the fact that information is now available at an instant, almost anywhere. Often it is trivial, but consider one example. We saw the musical Chicago at the Maine State Music Theater this weekend and read that the woman who played Velma had appeared on Desperate Housewives. There was no mention of who she played or when. After less than a minute of searching on the Droid (mobile phone with internet access) my wife found out it was season one, episode twenty, and she played a salesperson. We were in the car, having just left the theater.
Almost always, if a question arises, an answer can be found fast. Information that at one point would require trips to the library, inquiries at town hall, or even travel abroad can now be found quickly. I had a research grant to live a year in Germany in 1991-92 to study party and media archives in Bonn for my dissertation. Now I could do that kind of research from my office, printing and downloading files that twenty years ago I had to travel to Europe to find on microfiche and then copy.
What does this all mean? What will be the cultural, political and social impact of being able to find whatever information we want instantaneously?
There are a number of positive implications that are pretty obvious. I’ve always believed that the more knowledge you have about something (an event, a country to be visited, a play, etc.), the more valuable the experience. We can learn about the context of just about everything we do, and thereby get more out of what we do. If I go see a play I’ve never heard of, I routinely research it and afterwards learn more about the actors, writer, and meaning of the play. When traveling it’s easier to learn the historical and cultural context of the places visited. Used properly, instant information is a way to instantly enrich one’s experience of life.
Problem solving becomes easier. No matter what realm the problem is in, you can find a solution online, often in the form of others posing that problem and getting answers online. If you want to buy a product, there are online reviews and ways to compare. Understanding world events is easier: you do not have to rely on Fox or CNN; you can read news sources from around the world and from a variety of perspectives.
Looked at from the perspective of history, the rise of the internet and instant information is as profoundly significant as the development of the printing press in 1439. The printing press changed how people thought and communicated, and ultimately helped usher in a revolution in politics and society that allowed the Europeans to conquer the world. Sure, the steam engine, science, and Enlightenment thought were more directly causal, but without the printing press ideas could not have spread so fast, changing the entire face of the planet.
There is no reason to doubt that our current information revolution will have a similar impact. Everything about our world is set to change. When the generation growing up now reaches retirement age, their world will likely be organized much differently. Even the sovereign state could disappear, as well as the very way of life we associate with modern America. Because humans avoid cognitive dissonance and tend to expect things to continue as they have been, the inevitability of change is often unrecognized, and even denied as it occurs. This means that the way people think about the world changes only in response to events, usually involving generational change. A new generation raised in different circumstances embraces what the older generation rejected. However, since the older generation has political power, they often react to changes in a way that risks violence and instability — they try to force an old way of thinking on new circumstances. Will the same thing happen here?
Perhaps not. My generation, now the ‘older in power’ generation (Obama is my age, after all), is used to change. We’ve lived from pre-cable TV and vinyl record albums to iPods and hulu.com. We’ve lived the 1980 Rush lyric, “changes aren’t permanent, but change is.” Hopefully, we’ll prove adept at navigating this new era — given the debt we’re leaving the next generation, we owe them at least that! But what challenges might arise from “instant information”?
1) The demise of the sovereign state. As interdependence grows — as the US economy needs oil from the Mideast and cooperation from China and other states to whom we are in debt, and as cultures clash across borders — sovereignty as we’ve known it will be challenged. We’ll still have legal entities called states, but the idea of complete independence and ‘splendid isolation’ is already gone. That guarantees a nationalist backlash and a bout of xenophobia and fear, whether over multilateral treaties, immigration, or relations with others. We have to navigate this change safely.
2) Terrorism and WMD. The good news is that al Qaeda and Islamic extremists are being rejected by most of the Muslim world. They may not like the US, but they don’t want the anti-modern spartan life demanded by the extremists. They don’t want war and death; they want their own culture to undergo peaceful change. The bad news is that terrorism and WMD level the playing field, rendering a country like the US vulnerable in a way not previously experienced (before 9-11-01). The next uprising may be based on economic rebellion from Africa, or perhaps an anti-Yankee wave from Latin America. We have to avoid over-reacting and deal pro-actively with the roots of such potential uprisings, not wait until it’s too late. Terror networks know how to use ‘instant information’ to design weapons, plan attacks and communicate. They cannot be defeated by military means alone, but by making their message unpopular in their own cultures.
3) Lack of Communication. This seems odd, but think of it as the equivalent of yellow journalism with the printing press. Blogs, diverse news sources, and a lack of standards mean people tend to be drawn to messages that fit what they already believe, and which often create a sense of righteousness against the “other side.” Rather than people with different views communicating, they might simply fight — like the tea party vs. the left. If that stymies political adaptation to change, it could create paralysis when we need action. We need to talk about big issues across party and ideological lines if we are to adapt to these transformations.
We live in very interesting times, and the generation of young people now faces unique and difficult challenges in the years ahead. It’s a bit scary, but also very exciting.
A friend of mine posted the following link to a Yahoo Finance story on Facebook:
Here’s some startling information from that website. This is worse than the last time I looked into these issues, about ten years ago:
• 83 percent of all U.S. stocks are in the hands of 1 percent of the people.
• 61 percent of Americans “always or usually” live paycheck to paycheck, which was up from 49 percent in 2008 and 43 percent in 2007.
• 66 percent of the income growth between 2001 and 2007 went to the top 1% of all Americans.
• 36 percent of Americans say that they don’t contribute anything to retirement savings.
• A staggering 43 percent of Americans have less than $10,000 saved up for retirement.
• 24 percent of American workers say that they have postponed their planned retirement age in the past year.
• Over 1.4 million Americans filed for personal bankruptcy in 2009, which represented a 32 percent increase over 2008.
• Only the top 5 percent of U.S. households have earned enough additional income to match the rise in housing costs since 1975.
• For the first time in U.S. history, banks own a greater share of residential housing net worth in the United States than all individual Americans put together.
• In 1950, the ratio of the average executive’s paycheck to the average worker’s paycheck was about 30 to 1. Since the year 2000, that ratio has exploded to between 300 to 1 and 500 to 1.
• As of 2007, the bottom 80 percent of American households held about 7 percent of the liquid financial assets.
• The bottom 50 percent of income earners in the United States now collectively own less than 1 percent of the nation’s wealth.
• Average Wall Street bonuses for 2009 were up 17 percent when compared with 2008.
• In the United States, the average federal worker now earns 60% MORE than the average worker in the private sector.
• The top 1 percent of U.S. households own nearly twice as much of America’s corporate wealth as they did just 15 years ago.
• In America today, the average time needed to find a job has risen to a record 35.2 weeks.
• More than 40 percent of Americans who actually are employed are now working in service jobs, which are often very low paying.
• For the first time in U.S. history, more than 40 million Americans are on food stamps, and the U.S. Department of Agriculture projects that number will go up to 43 million Americans in 2011.
• This is what American workers now must compete against: in China a garment worker makes approximately 86 cents an hour and in Cambodia a garment worker makes approximately 22 cents an hour.
• Approximately 21 percent of all children in the United States are living below the poverty line in 2010 – the highest rate in 20 years.
• Despite the financial crisis, the number of millionaires in the United States rose a whopping 16 percent to 7.8 million in 2009.
• The top 10 percent of Americans now earn around 50 percent of our national income.
These are almost third-world levels. Due to horrible policy choices starting at the time of the 1980-83 recession, we’ve been living on decreasing production and higher debt. The very wealthy have managed to benefit, but the middle class is being decimated. When people talk about not letting the Bush tax cuts expire because the “wealthy create jobs,” that’s simply not true. They haven’t been creating many jobs, and those they create usually pay very poorly. Moreover, if you want to stimulate the economy, you do it through direct spending, not tax policy. Much of the money now to be taxed (only that of the very wealthy) probably doesn’t go into stimulating the economy, but rather to consumption of foreign-made goods.
Finally, the Obama tax cuts for 95% of the population certainly were needed — but ultimately, if the decline of the middle class isn’t halted, the days of American prosperity and the American dream will be over. We will devolve into an increasingly bifurcated society with ongoing economic malaise. This is perhaps the most dangerous issue facing the country. It creates a far greater risk for the US than any foreign foe or terror network.
Samuel Huntington’s Clash of Civilizations was panned in the world of academic political correctness as being full of stereotypes and ethnocentric ideals. Some thought his book was a rationalization for the expansion of American power; others felt his taxonomies were simplistic and insensitive to internal dynamics, class conflict, and the impact of imperialism. Yet Huntington, who died on Christmas Eve in 2008 at the age of 81, may have been one of the most insightful political scientists of the last century.
To be sure, his Harvard style of grand theorizing and limited methodological rigor was a turn-off for the new generation of social scientists, who became more comfortable with statistical analyses than big ideas. To them Huntington was engaged in speculative theory, gathering information and organizing it around his own subjective reflection on the world. That was old fashioned; the new generation thought social science should be truly scientific, dividing problems into smaller themes to investigate with clear methodology. Not only that, but with the advent of computer technology such “rigor” became easy, especially if you worked at an institution with the resources to buy databases and sophisticated software.
Yet for all the journal articles published and sophisticated models constructed, the twin problems of complexity and perspective remain vexing. The problem of complexity means that in a multi-causal reality, identifying and measuring the impact of single factors is almost impossible in most cases. The problem of perspective means that all aspects of social reality can be interpreted in different ways, depending on the perspective taken. Moreover, with the world so complex, the choice of what to measure and how to interpret it becomes subject to significant bias from the researcher. The quantitative analysis that yields a statistical result for a test of a hypothesis may be riddled with bias and error, even if it appears rigorous and sound.
Huntington had two ideas that were profound. Others shared these ideas, but Huntington made them mainstream. In his 1968 book Political Order in Changing Societies he argued that the driving theory of US foreign policy in the third world, modernization theory, was wrong. That theory, attributed primarily to Walt Rostow, held that the problem in the post-colonial world was that societies were still traditional, not having gone through the process of modernization. Modernization entails having rational thought replace tradition, secular ideas replace religion, and materialism trump symbolic ritual. Modernism is progressive, while traditionalism is conservative. Huntington pointed out what was becoming obvious: unless modernization is accompanied by the growth of effective political institutions and the rule of law, the result will be corruption, disorder and perhaps violence. The market alone will not bring stability or prosperity. Few doubt that now.
In 1993 he published a provocative article in Foreign Affairs called “The Clash of Civilizations?” He turned this into a book published in 1996. During that time period, the debate in the US was between those who thought the US should expand and try to shape the post-Cold War world into one conducive to US values against those who thought western imperialism entailed big corporations trying to exploit the third world and dominate. The former dismissed cultural differences as arbitrary — everyone wants to be free, the West simply discovered the “right” path first. The latter dismissed cultural differences as secondary to economic exploitation. Meanwhile the “scientific” study of political science shied away from cultural factors as impossible to measure and compare. Culture for them seemed a mushy kind of explanation people give when they don’t understand why differences occur. Their emphasis remained on measurable factors such as economic output, public opinion, and various political variables.
Huntington argued that the great ideological conflict of the 20th century was ending, and ending at a time when a new process, globalization, was progressing. Similar to an argument put forth in 1994 by Benjamin Barber in Jihad vs. McWorld, Huntington noted that this expansion of the world economy would lead to increasing contacts and interactions across borders that could threaten cultural values and identities in a way not experienced since perhaps the Roman Empire.
Barber saw this as a battle between forces defending local custom (jihad) and efforts to centralize and standardize the world economy (McWorld). As globalization forced standardization — Barber was one of the first to really investigate the impact of Hollywood movies on other cultures — people would rebel to try to reclaim and protect their traditions and identity. Balancing a respect for culture and diversity with the demands of an increasingly global economy would be key to avoiding future violence and terrorism.
Huntington took this a step farther and divided the world into different civilizations. Here is where he got into some trouble. Dividing the world into clear civilizations is tricky. Huntington’s civilizations included: Western, Latin American, Islamic, Sinic (Chinese), Hindu, Orthodox, Japanese, and African. Africa was a later addition; at first Huntington left it out, not being able to point to a clear cultural history, as colonialism had wiped out so much of what normally defines a culture. Yet saying Africa was without a civilization appeared to many as racist and a sign of Huntington’s ethnocentrism. On top of that, internal differences (e.g., the Arab world vs. Indonesia in the Islamic civilization) and cross-civilizational links (Latin America with Spain and aspects of what is considered the ‘West’) made it easy to take pot shots at Huntington. He seemed engaged in stereotyping and, worse, in ignoring the leveling effect of globalization (either for good if you’re Thomas Friedman, or bad if you’re Noam Chomsky) on all cultures.
Yet, while acknowledging the imperfection of such a classification, Huntington persuasively argued that there are distinct cultural values and ideals crafted through history. It’s an arrogance of the West to assume that we have the “right” culture and that others will naturally drift our way. That assumption led to failure in Iraq, as the Bush Administration ignored culture in favor of an ideological belief in the universality of American values. It’s an arrogance of the Left to label acknowledgment of real cultural differences and distinct civilizations as mere stereotyping and prejudice. Culture is real, culture matters, and in an era of new media and instantaneous global communication, cultural clashes are inevitable. That is a key factor which motivated the terrorists on 9-11! Quibble with Huntington’s taxonomy, but his core ideas appear accurate.
So if, as Barber claims, we need to balance the tendencies of ‘McWorld’ vs. ‘jihad,’ and if future conflict will be as much about cultural values colliding as about money, power or ideology, how should the US react? I think this requires a complete rethinking of our foreign policy and attitude about global affairs. I also think we need to be ready for the impact of the information revolution on our own society and culture — we’re not immune from dramatic change. I’ll write more on all this in future posts!
In various guises and in various forums I’ve had numerous discussions about political and human rights over the years. It is one of those inherently contestable concepts: are there fundamental transcendent human rights that all people have, or are rights there because we assert them, either as an individual or through a government?
I would argue that only one natural human right exists: an individual has a right to do whatever he or she wants, constrained only by one’s capabilities, conditions, and the consequences of acting.
There is no way to prove that any rights exist beyond that one. What we have beyond that are ought statements.
Those asserting the existence of rights (e.g., the right to be free and not the property of another human) are not positing actual transcendent rights in nature. Obviously, throughout history, humans have been treated as property. There is nothing in our nature which prevents that; slavery is probably as old as human history. Instead, we believe an ought statement: humans should not deny the freedom of others by holding them as property.
This right was created by humans, overturning millennia of what was seen as normal human behavior. The same is true about equal rights for women and notions of private property.
Recognizing these rights as social constructs — things we humans have created because we have developed ethical/moral belief systems requiring such rights — does not denigrate them. One gets the impression some feel that if rights are constructed, they don’t really exist. The reality is that if rights are not constructed, then they only exist if someone asserts them. Often people aren’t strong enough to assert such a right, so the slave remains a slave. When we have constructed a strong set of rights, we should be proud of the accomplishment. These are our rights; we built them!
People who assert one true set of “natural rights” can be viewed as trying to stack the deck in their favor. You see, if you assert some rights as the only “true” rights, then you do not have to delve into the complex themes, unclear causality and different world views that exist. You simply proclaim yourself right, defend that proclamation, and see those who hold different views as wrong. Instead of conversation about what ought statements are worthy of codifying as a right, it becomes akin to holy war. Yet I defy anyone to prove any right exists beyond the one natural right I list above (and I invite anyone to disprove the existence of that one right). Rights don’t exist without being constructed or asserted.
OK, but what about ought statements? Is there a way to determine the truth of one ought statement vs. another? Yes, but no definitive way. Depending on core assumptions and philosophical values, you may end up saying universal health care ought to be a right, or you may end up saying taxation of any sort violates human rights and ought not exist.
Therein lies the beauty of our capacity to construct rights. We have the ability to reflect on history, explore different arguments and ideologies, and even test them out. If there somehow is one correct list of ought statements defining what human rights should be respected, we’ll no doubt learn that through the course of history — acting, learning our capabilities, understanding the consequences of action. But it could be that there are so many varied ways of acting or experiencing consequences of the same actions that there is no one clear set of proper ought statements.
Historically, humans have seemed to need a veneer of authority to put forth moral ideas. It had to come from God, or the King. The idea that we are free to construct moral codes probably scared people. Well, it is scary. That freedom to construct moral codes is one reason we've had so much bloodshed, warfare, holocausts and horror in our history. One can understand someone wanting an authority to just banish all that bad stuff with a set of rights or moral laws that we all follow. But clearly, that hasn't worked. Why not take responsibility for the world we create, and for the moral codes and systems of rights we assert and construct? Isn't that taking on more freedom and leaving less to authoritative fiat?
We're human. We are in nature free to act as we want, limited only by our capabilities, conditions, and the consequences of our actions. So let's understand our capacities and limits, reflect on our conditions, and study the consequences of action. Only then can we make a more informed choice in how to construct rights and moral codes. Also, let's recognize that empathy is important. An ideology might cause a pro-free market person to rationalize the sweatshop lifestyle of a third world worker, or might cause a committed communist to rationalize sending people to the Gulag. But if you really understand the life conditions of that worker, or life in the Gulag, then empathy about the conditions in which people can find themselves might help us construct our world.
Lastly, I think focusing on a small number of broadly defined rights (life, liberty and property, for example) can be misleading. Those things are so broad, and their nature changes in context. Does the Nazi well owner have the right to refuse water to a Jewish man dying of thirst? Perhaps ideas of rights need to be connected more with context. We do it already: killing in war would be murder in a different context. The context changes the act, and the context changes what is regarded as a right.
So arguments about "is there a right to health care" or "is there a right to own guns" are literally nonsensical if one thinks about broad transcendent rights. They only make sense as ought statements (should we construct a right to health care or not — what are the conditions, and what will be the consequences of such a construction?) or as statements reflecting already constructed systems of rights (does the Second Amendment guarantee the right to have any sort of firearm one wants?).
People want to look to God, ideology, or some philosophical system for a way to assert their perspective with a veneer of objective authority. I think that's an illusion. We're free to construct worlds, and we can't do it alone. That's why individuals can only exist as part of a society — even hermits act and think on the basis of how they were socialized; individuals in pure isolation in their formative years die. Individuals and societies co-exist in a symbiotic relationship, and constructing moral codes and systems of rights defines how that is done. Because of our one natural right, we are world builders.
While we were in South Dakota, we had a cat sitter. This person made sure the cat was cared for, and over the course of the ten days managed to coax out our skittish Ruby, a one-year-old kalikow cat (OK, it's calico, but our University President, Theo Kalikow, pronounces her name the same way) who tends to avoid human contact, and to whom I'd grown closer in the month and a half since we got her from the shelter. She still wouldn't come up to be petted, but when she saw me she'd rush to her petting perch, knowing I'd come and stroke her.
As our sitter was packing the car, Ruby snuck out unbeknownst to her. She thought Ruby was inside, and when we got home late at night, we went looking for her in the basement. We gave up, figuring Ruby was hiding, a bit scared. Alas, the next day it was clear that Ruby had escaped.
I'm probably the most upset — we finally had a cat I wasn't allergic to, and I'd formed a bond with her. I also liked that she wasn't overly aggressive in demanding affection. I got used to having her around, and genuinely was looking forward to seeing her again. On the 11th I saw Ruby, about 100 meters down from our house on the side of the dirt road. Alas, I was driving, and when I slowed down to call to her, she zipped into the woods. This Sunday night I saw her again. I got out of the car and called her name. She was running away but stopped; I could tell she recognized my voice. I called again, but the car headlights probably made me look scary and she took off. I searched and called, but to no avail. Still, I know she's alive.
The cat sitter is contrite and apologetic. In fact, she’s shocked this happened, and offered to repay all the money we paid her, plus the cost of the cat. I told her no. I think she is really good at pet sitting, but humans make mistakes. Given how skittish Ruby is — and how she usually didn’t come out during the day — she just figured it was no big deal to leave the door open while loading the car. I appreciated her apology and willingness to return the money, but it was an honest mistake.
We did leave food out for Ruby, but aren't sure that's a good idea. The food got eaten, but in a way that didn't seem like Ruby. She scatters some food and leaves some — once it was like that, but other times the food was finished off and the bowl overturned. We have skunks, raccoons and other cats roaming these woods (as well as deer, moose, and, it's been reported, a bear or two). I don't want to draw skunks!
She was "rescued from the wild," injured, when the animal shelter got her. I'm sure she knows how to survive out there, and there are a lot of mice in these parts to keep her happy. It's the height of summer, we live in the woods, and things are lush, green and alive. Alas, she doesn't have a flea/tick collar, since we fancied her an indoor cat. So if she returns when the weather gets cold, we may need to make sure she's cleaned up! For now, she's in the wild, enjoying life, and I guess that's OK. I saw her Sunday night, so I know she's alive. But after our last cat, Whiskers, died a year ago (just two months after we got him from the shelter), it's frustrating to have Ruby take off like this. But she's a cat, and she made her choice. Hopefully, she'll choose to return sometime.