Archive for June, 2011
Social Welfare Programs Should Liberate
Modern social welfare programs began under Bismarck’s conservative German government as a means of undercutting the growing socialist movement. If workers saw that the state would help alleviate poverty and provide needed services, then the workers would not support Marxism. Moreover, conservatives tend to view the state as an “organic entity,” a collective bound together as a community. That means it is in the interest of the state to make sure that people aren’t suffering or being exploited.
Other reasons for social welfare programs include ending poverty and suffering as ends in themselves. This was behind Johnson’s “Great Society” programs, most of which were actually implemented in the Nixon and Ford administrations. In Europe, the left and right reached the great compromise, whereby the left accepted market capitalism in exchange for the right accepting a state guarantee of health care, pensions and a safety net. This led to unprecedented peace and prosperity for Europe, settling past ideological battles between left and right.
However, as demographic change makes most of these systems unviable in the long run, and high debt forces reconsideration of how governments spend money, it is time to rethink the purpose behind social welfare spending.
Bismarck’s goal of stability remains. Societies with vast gaps between rich and poor tend toward either authoritarianism (as the rich seek to protect their share) or revolt (as the poor grow angry about class differences). The “great compromise” was a brilliant solution; putting it at risk would threaten the core stability of western civilization.
Goals of ending poverty or equalizing wealth are suspect, in part because they are too vaguely defined. You could end poverty by simply transferring wealth to the poor, but what good does that actually do for the people themselves? It gives them more money, and may help them feed their families, but the goal is pitched at too high a level of analysis. We should focus on social welfare programs for the sake of the people who are on them.
In the industrialized world people are generally responsible for their success in life. It’s a lesson I try to teach my children and my students: don’t blame others for the world you create for yourself; take responsibility. You can’t choose your circumstances, but you can take action and make choices to change them. Whining about injustice only increases the total whine volume. Claim your life! It’s yours to make; if you’re in college you have every opportunity to succeed, so take control! That is a liberating experience: freeing oneself from the shackles of low expectations and low self-esteem.
But what happens when we just give money to people? I’m reminded of the scene in Syriana where the oil tycoon talks about the money he’s made and how it’ll “probably ruin my children.” When you look at the children of the very wealthy, they have as many problems as anyone else, if not more, despite the wealth. When young athletes or film stars suddenly get large amounts of money, it often creates more problems for them than it solves. Some can handle it; many can’t. The reason: money itself does not help a person understand how to live life.
Many poor people are stuck in a situation where they do not believe they can take control of their lives; they don’t see opportunities, and they haven’t had the chance to take the risks through which we build self-esteem. If you just give them money, there is a real danger they’ll become addicts. Not drug addicts, but rather addicted to ‘free money.’ That will feed a sense of victimization and entitlement. Rather than taking control of their lives, they’ll lose control of their lives and teach those lessons to their children, who will start out psychologically unprepared for the demands of the real world.
So I would restate the goal of social welfare programs as being one of liberation. I do not mean this in a Marxian sense of ending exploitation. Rather, a person should be able to develop the confidence to grab opportunities and take control of his or her life. It should liberate a person to rise out of their circumstances, to provide positive role models to their children and community, and ultimately create a sustainable growing economy in communities once suffering economic stagnation.
Unlike some on the right, who take the approach that “if you cut the money they’ll be forced to pull themselves up by their bootstraps,” I do not believe just ending social welfare programs can work, nor do I think private donations would adequately do the trick. So I reject the dichotomy that says “either you give away money to the poor or you don’t.” Rather, we have to figure out ways to design a system that creates opportunities, works with communities, and helps people empower themselves.
Education must be part of this (and I think access to affordable health care is necessary too). People don’t automatically have confidence and self-esteem. Self-esteem cannot be gained just by being praised — it comes from learning one has the capacity to overcome obstacles. In fact, I’d say you can’t really gain self-esteem unless you risk failure and even have to overcome failure. In that sense, education has to be combined with opportunity.
But this needs to be more than job training or even workfare. To really function and become sustainable, opportunity has to connect with community. In that sense one of the most important roles is that of a community organizer, someone who can come in and bring a community together around opportunities for growth. Receiving any social welfare help should be linked to participation in some kind of community venture.
As communities arise, they will provide the opportunities and feedback for people to build confidence, hold higher expectations of themselves and others, and develop real self-esteem. They will take pride in what they build, and ultimately that will lead them to take control of their own individual lives and recognize that they have the power to make choices, making it much less likely they’ll need assistance. In a recession no one is immune from some hard times, but ultimately the key not only to cutting social welfare spending but also to regaining economic momentum and growth is having people in society make good choices and want to be productive.
Community and opportunity based social welfare programs could succeed where bureaucratic programs fail. A community organizer in the field working with people is far more likely to help than a welfare caseworker sitting in an office asking questions and making sure the proper forms are filled out. And given the economic and budgetary crunch, now is the time to reassess our approach.
Is War Natural?
Posted by Scott Erb in Children, Children and war, Education, Political Science, War, World Affairs on June 29, 2011
My second online course is “War and Peace,” looking at theories of why people go to war, and how peace can be built. I am on principle opposed to military action and war in most cases. The costs of war in human lives, social stability, and the psychological state of both soldiers and the populations involved are immense. Most of the time wars could be avoided through better communication, diplomacy and clear signals of intent. I’m not foolish enough to think humanity is at a point where war can simply be abolished — but I also don’t think war is natural.
My eight-year-old son is right now fascinated by war. He draws detailed pictures of various weapons and scenes, including a soldier with some kind of missile launcher destroying the Eiffel Tower saying “USA Rocks!” While drawing it he asked me what the German word for their army was, so I told him “Bundeswehr.” He wrote that in front of the guys defending the Eiffel Tower. (The Eurocorps, perhaps?) He later drew the same kind of scene with Big Ben, with the clock falling on the defending forces below.
I have friends who would be shocked if their children drew those kinds of pictures, but he’s eight — and he does know the difference between imagination and reality. One time when Ryan showed me a picture of some dead soldiers I said, “gee, I bet their dads and moms aren’t happy.” He stopped a second and said, “Dad, it’s just a picture, it’s imaginary, not real.” Anyway, I’m not going to stifle his creativity because of adult ideas of political correctness. And it was nice that both the Eiffel Tower and Big Ben were in Cars 2, which we saw this weekend.
So, besides the fact that I’m not an overly protective or controlling father, what does it mean that my son gets enamored with the idea of war and weapons? I think culturally it shows how we learn to see war, weaponry and conflict. It is cool, exciting, and one can have victory! The bad guys are defeated. Death is sanitary. “It’s imaginary, not real.” The ideals of honor, heroism and strength become part of who we are. It infiltrates video games, television shows and movies.
First, an aside to those who think I should try to protect my son from that culture: I understand the concern, but disagree with trying to shield children too much. Parents who think they can control the cultural inputs and produce a child with their own exact values are naive. The more a child is protected and forced to follow paths that parents think are politically/socially/religiously correct, the more likely it is that the child will rebel or be unable to cope with the cultural forces he or she will inevitably face. Better to let the child learn the culture, but reinforce lessons along the way. For me that means talking a bit about the drawings — acknowledging how cool it looks, how “awesome” the missile launcher is, and how gross the pool of red blood looks — and then at other times talking about the difference between real and imaginary. I actually have surprisingly “grown up” conversations with Ryan about war, religion, and life. In order not to be hypnotized by the culture, one has to be able to navigate it.
Yet the danger is that the glorification of war will desensitize children as they grow, and war will be seen as a big video game, covered by CNN, abstracted to the point that the spectators have no clue of what the participants in war endure — either the civilians caught in the cross fire “over there,” or the soldiers who have to deal with the reality of death and destruction around them. In such a case, the cultural messages of war as honorable, cool, a way of showing strength, and an abstract struggle of good vs. evil will overwhelm that part of war we don’t see — the grotesque, sickening, revolting and tragically sad destruction of families, lives and even cultures.
Is war natural? I think not. Conflict is natural. Self-defense is natural. Anger is natural. Aggression is natural. Sometimes these things turn into actual fights, but rarely does a participant die.
War is different. War is a social process, and in fact a social construct. A collective group (tribe, state, nation) chooses war against another group as an abstraction. Consider: the most poignant and successful anti-war book ever was All Quiet on the Western Front. It had no overt anti-war message; it simply described WWI as it was for German soldiers on the front. War was not glorious or heroic, but mundane, ugly and sickening. The British hated the book because it portrayed Germans as being just as human and likable as the British. War requires you to imagine the other as having evil traits; they are different from you — they don’t value life, they hate freedom. In order to justify killing them, we latch all sorts of absurdities onto the collective “other.” The Nazis and German militarists hated the book because it portrayed the soldiers as normal, flawed, confused and often afraid humans — not the noble heroes the military was supposed to be. War requires myth to be embraced; the reality of war revolts the senses.
War as we might define it (two collective groups fighting) probably began about the time people started farming, which created the notion of private property. The idea of private property is non-existent in many hunter-gatherer cultures — but once you farm, you have to protect the land in order to get the benefit of your efforts. That means you protect the property.
Still, the formation of collective units is natural. Humans are social creatures, and throughout most of our history we have defined ourselves more as part of a group than as distinct individuals. Individualism is a western construct — one that is more myth than reality. So in that sense protection of and competition for resources by groups can be seen as a natural result of human progress in a world of scarcity.
So in regions where people truly lack, and there is stiff competition for scarce resources, war may indeed be a natural manifestation of the human struggle to survive. Yet in places where people have enough to survive, that explanation doesn’t cut it. In cases where war is about religion, ethnicity, ideology, conquest for the sake of glory, expansion, social darwinism or even to ‘spread democracy,’ war is a human construct, made possible by how we abstract it into something far different from what its reality entails. Calling it ‘natural’ and ‘omnipresent in human history’ rationalizes that kind of approach. How can one condemn the inevitable?
But war is rare. Most states settle all their disputes peacefully; only 2% of the population actually fights in a war. Wars make the news because they are an anomaly amid most of what’s happening in the world. Moreover, calling war a social construct does not mean we can easily choose to make it go away. All traditions, cultures, and rituals are social constructs, yet once constructed people tend to reproduce them, and social reality becomes resilient. It’s difficult to, say, end slavery or racism, or to gain equal rights for women. Those changes required changing culturally shared beliefs, and people usually hold on to their beliefs; change thus can take generations.
So most war may not be natural, but that doesn’t make it easy to overcome or something we don’t have to try to understand, learn about, deal with and at times experience. My hope in this class is that by learning about war and peace, students are able to see international conflict in a realistic light. That means both seeing through the myths of glory, honor and heroism, and also understanding that naive chants of “no more war” are unrealistic. War may be necessary at times, but if one supports any given war, one should do so understanding what war really is, with a cold sober appreciation of the immense costs and uncertainties it creates.
Lessons from Communism
Posted by Scott Erb in Barack Obama, Communism, Democrats, Philosophy, Political Economy, Republicans, Russia, US Politics, Values on June 27, 2011
Monday my two summer online courses start — “The Politics of Russia and Eastern Europe” and “War and Peace.” Surprisingly, the former has more students enrolled than the latter; usually people find Russia and Eastern Europe less interesting than international conflict. Since both are online, I know the enrollment reflects a choice about the subject rather than the time the class meets or something else that might affect a “normal” course.
The course starts with Russian politics and the Communist era. Teaching about communism today is much different than it was 20 years ago. During the Cold War, Communism was seen by most as something evil and dangerous. Some contrarians tried to defend it, or claim that western capitalism was worse, but whether Democrat or Republican, there was unity in the belief that the Cold War was real and Communism was a threat.
In the years that have followed, Communism has become something of a mystery to most students. This is made worse by the way the right wing throws out “commie” and “socialist” in political discourse. One student who likes Barack Obama asked me in 2008, “well, if he really is a socialist or a ‘commie’ as Glenn Beck says, then does that mean communism isn’t so bad — because his ideas make sense to me.” Of course, Obama isn’t anything close to a communist or true socialist — carelessly throwing terms like that around to insult one’s opponents has the side effect of making the terms seem benign.
So what can we learn by looking at the experiences of Russia and Eastern Europe? The first lesson is that horrible things can come from good intent. The early communists were reacting to a system we’d today label unfair and exploitative. They believed they were bringing not only equality, but true liberty and enlightenment rationality. Marx and the Communists had a well-researched, objective philosophy behind their effort, purporting to prove capitalism full of contradictions and doomed to failure. Even Lenin, whose centralization of power both rationalized brutality and later enabled Stalin’s atrocities, truly believed they were nearing a day when people would be more free than ever before. The road to hell, as they say, can be paved with good intentions.
The second lesson is to beware of ideologies that give you a nice neat read on reality. Marxists could point to contradictions galore that showed the irrationality of capitalism — the very principles capitalism rests on contradict each other, and point towards a different way of producing value. The ideology was so appealing, so internally consistent and persuasive, that people came to use it as a system through which to interpret reality. When people stop thinking critically about an ideology and become proponents, they lose their capacity to see their own errors. Rather than critically assessing Marx’s work (or that of similar theorists) they used it as a way to explain reality, treating it almost like a religion. Ideologies are simplistic representations of a complex reality, useful as a starting point, but not something to be believed as an article of faith. Faith in ideology caused socialists and Marxists to become blind to reality, clinging instead to slogans and interpretations that explained away anomalies. The most dangerous was the idea that communism had never truly been tried because capitalists intervened and obstructed its implementation. “It’s never really been tried” is a very lame way to hold on to something that, if it really could work, would have been tried!
Other lessons are more practical: bureaucracies are very conservative and will stagnate without an external force holding them accountable and forcing change. That’s true of every bureaucracy, even those at big corporations or labor unions. Big corporations have the market to keep them accountable — now more so than ever. At one point GM, Ford and Chrysler could get lazy because they led the world and pretty much dominated the US car market. They stagnated, and then suddenly found themselves behind Japan and other newcomers. That forced change. Under Communism such stagnation went unchecked and continued for decades.
Worse, under communism the bureaucracy was not held accountable to the political system, because that system was controlled by the bureaucratic class. Many call Soviet and East European style communism ‘bureaucratic socialism,’ because ultimately the leaders became simply entrenched bureaucrats running a system that benefited the elite nomenklatura, believing that a stagnant status quo would be enough. It was a recipe for rapid decline and failure.
Finally, there is a real lesson in how the so-called “social contract” between the Party and the people led to disaster. Under communism everything was guaranteed: a job, an apartment, health care, a retirement pension, vacation time, education, and complete security. In exchange you were supposed to simply do your part for the country and not oppose the government. This doesn’t sound so bad at first — complete security in exchange for becoming apolitical. Given how apathetic a lot of people are in the US these days, that might seem tolerable.
However, the price was far higher than one might expect. Because of the bureaucratic control, the chance for creative input into society disappeared, and personal ambition was limited. Ambitious folk found the party the only avenue of upward mobility, and there adherence to the bureaucratic status quo was the key to success. If you worked in a factory or any “normal” job, you learned to go along with the procedures in place; even suggesting a ‘better way to do things’ was dangerous. Maybe it would be seen as a plus and you’d get rewarded, but it was more likely that you’d be seen as a troublemaker.
The cost of this denial of human achievement was spiritual decay. One might think it wouldn’t be such a big deal — in medieval times humans had material stagnation and no social mobility, yet lives were arguably meaningful and full. But that era was built on community and faith. The material world was not the most important aspect of life; close-knit communities and belief in an eternal paradise gave a sense of meaning. Under communism that was gone. Since “community” was forced on people, with the secret police sowing distrust as anyone could be an informant, people retreated into small circles of family and friends, alienated from the community the communists claimed to offer. With faith in nothing but the material, life was dreary. Alcoholism rates skyrocketed, as did depression and apathy. Life was materially better than under the Czars, perhaps, but that might be Communism’s biggest lesson: materialism isn’t everything.
There are other lessons. But it irks me when people call American “liberals” communist or socialist. “Real existing socialism” was brutal, and trivializing it by calling support for a health care system, a mildly progressive tax system or regulations on banks “socialist” dismisses the true dangers of real socialism. Social welfare states like Sweden are nothing like the Communist states of Eastern Europe; Barack Obama’s ideas are nothing like Communism. Ronald Reagan was right when he called the Soviet Union an “evil empire”; despite the allusions to Star Wars, it was an empire based on anti-human policies. We need to learn from that, especially how good ideas went so bad, and how ideological thinking can blind one. And don’t forget — Ronald Reagan grew the government and advocated tax rates higher than those being advocated by President Obama!
Finally, communism and capitalism are not opposites where one is good and the other bad. Some fall victim to thinking that if they dislike one, the other (sometimes in extreme form) is the only alternative. There are many approaches to politics and economics — dichotomous thinking is a sign of laziness. Or, as I wrote at the end of a poem inspired by seeing the after-effects of Communism in Russia ten years ago: “reality defies any ideology.”
Too Early to Predict 2012
Posted by Scott Erb in 2012 Election, Afghanistan, Barack Obama, Democrats, Economy, Hillary Clinton, Nancy Pelosi, US Politics on June 24, 2011
At this point in 2007 a few things were virtually certain about the 2008 Presidential race. First, we knew who the Democratic candidate would be. Hillary Clinton was the presumptive nominee, a front runner so far ahead in money, endorsements, “super delegates” and polling support that as long as she didn’t collapse in scandal the race in the Democratic party was to see who might be the Vice Presidential nominee. Barack Obama was a possibility, though he lacked experience.
John McCain, after appearing to be a darling of the media in the past, had faded. His support for immigration reform was a nail in his coffin in terms of getting GOP support, and he was all but written off. Mitt Romney, Rudy Giuliani and Fred Thompson were getting more buzz.
Of course, McCain ultimately cruised to the GOP nomination despite active opposition from talk radio jocks and the right wing of the party. Hillary Clinton was shocked by upstart Barack Obama, and the two fought a long, sometimes bitter and, for political junkies, extremely entertaining battle for the Democratic nomination.
In June 2007 most people assumed the Iraq war would be a major issue in 2008. And while some were warning that the subprime debacle and housing bubble could portend a major recession, most thought the economic slowdown might be over by late 2008 and wouldn’t be a major factor.
Well, the world changed tremendously between June 2007 and November 2008!
Right now President Obama looks reasonably strong against a relatively weak Republican field, yet vulnerable due to economic woes. His success in ordering the assassination of Osama Bin Laden exists alongside an unpopular intervention in Libya and ongoing conflicts in Afghanistan and Iraq.
By mid-2012, things could be much different. First, a Republican nominee could emerge who captures the attention and support of independent voters. Jon Huntsman seems the most plausible choice to fill that role. But look back: in June 1979 the Carter White House thought Ronald Reagan would be the weakest candidate they could face, and if you had told George Bush in June 1991, still enjoying high post-Desert Storm ratings, that Bill Clinton would be the Democratic nominee, he’d have considered his re-election assured. The idea that the GOP field is weak is pure speculation; in hindsight it may appear strong.
On the other hand, the economy could bounce back. Oil prices are dropping, which means gas prices will fall, and that will stimulate the economy. Uncertainty over the debt ceiling and other issues may be slowing the economy; once that is resolved, late 2011 could see some good economic news. If that’s the case, the dark fears of a double-dip recession may give way to “Morning in America II,” as Obama cruises to victory on good economic news. Romney reminds me of Mondale in some ways.
If the economy does slip into a double-dip recession, Obama’s chances start to decline dramatically, as few Presidents have ever governed during four years of recession and kept their jobs. You have to go back to Roosevelt for that. To be sure, Obama didn’t cause this recession, and it’s a stretch to say he’s done anything to prevent recovery. We’re suffering from 30 years of imbalances that can’t be cured overnight, or perhaps even over four years. But that’s a case that will be difficult if not impossible for Obama to make in 2012. If the economy isn’t looking better, he’s likely to suffer the same fate as Bush the Elder and Jimmy Carter. The assassination of Bin Laden will be as helpful to him as Desert Storm was for Bush.
I’d be shocked if Gaddafi is still holding out in Libya by the end of this year. If Libya appears a success — if the rebels overthrow Gaddafi and are reasonably successful at creating a government that is neither extremist nor anti-western — what is now a liability for Obama may become an asset. If the withdrawals from Iraq and Afghanistan continue with no major setbacks, Obama’s foreign policy could well be a strong point, perhaps enough to keep him competitive even if the economy remains sluggish.
Of course, setbacks in Iraq or Afghanistan, a shocking reversal of fortune in Libya, or crises in Iran and Pakistan could create problems. The Mideast is unpredictable. Another terror attack could help or hurt Obama, depending on what it is and how it gets handled. Instability is a liability for a sitting President.
It does seem unlikely that Obama will face a serious challenge in the primaries. Still, depending on the economy and foreign policy, even that could change. Simply, it’s too early to have a real clue on how the election will go. Obama’s campaign team is proven, can raise money and get out the vote. That will probably be enough to make it a competitive election, but in and of itself not enough to win.
So at this point predictions are predictable. Republican-leaning pundits will write columns predicting Obama’s demise and trying to paint him as the return of Jimmy Carter. Many will believe it; others are trying to shape the discourse. Democrats will do the reverse, mocking the Republican field and making it sound as if Obama has a relatively easy course ahead. I suspect fewer of them believe it; until the economy picks up, Democrats are worried about the election.
In Congress there is similar uncertainty. Democrats won the special election in New York that had appeared safe for the GOP, thanks to public reaction to GOP plans to cut Medicare. It is the kind of seat Democrats would have to win several of to gain the 24 seats necessary to return Nancy Pelosi to the office of Speaker of the House. That is possible — 24 seats aren’t that many, and given the turnout dynamics in a Presidential election, a number of districts are almost certain to shift back to the Democrats.
On the other hand, the Senate could swing to the Republicans if bad economic news persists in 2012 – the Democrats will be defending 23 Senate seats, while the Republicans will only be trying to hold on to 10. Senators don’t go down to defeat often, but the Republicans need only pick up four to get the majority, and that’s the same number of Democratic incumbents who have so far announced their retirement.
A victory for Obama with the Democrats holding the Senate and bouncing back to retake the House?
A victory for a Republican with the Republicans holding the House and taking the Senate?
Both are in my opinion plausible, though statistically the latter is more likely than the former, because gaining 24 seats in the House is tough. But there is so much uncertainty at this point that anyone making confident predictions is either faking it or a bit deluded. It’s simply too early to have much of a sense of what 2012 will bring.
A New Generation
Posted by Scott Erb in Afghanistan, Al Qaeda, Barack Obama, Communication, Economy, Education, US Politics on June 23, 2011
On Wednesday evening President Obama addressed the country to inform us that the war in Afghanistan was winding down and would be ended ‘responsibly.’ Some 10,000 troops will return this year, and another 23,000 by the middle of 2012. He neglected to say that over 65,000 would still be there, promising only to continue the drawdown as security responsibility is handed over to the Afghans, with a goal of completing the process by 2014. A NATO/Afghan conference next May will work through the details.
Thursday morning in Summer Experience the class watched a shaky YouTube video of Obama’s speech and critiqued it, having read a number of pieces about war and an article by Howard Zinn about our double standard when it comes to violence. Students were uniformly critical of the wars, though some said they understood why we went into Afghanistan in 2001, before Iraq pushed us off course. It’s interesting how in 2001-04 students showed a strong burst of patriotism and support for even the Iraq war, which by 2006 had shifted to anger about the ongoing wars, and since 2009 or so has become a kind of apathetic cynicism. One fascinating aspect of teaching is seeing how attitudes can quickly change with new groups of college students.
Another piece we read was about Kent State. Most students don’t know what happened on May 4, 1970, when the Ohio National Guard killed four students and wounded others while confronting an angry student protest. To give background I played some of President Nixon’s speech announcing the invasion of Cambodia, which he gave on April 30, 1970. That speech sparked the protests that led to the shootings. What students noticed (and I hadn’t really expected) was the similarity between some of what each President said. Nixon was also announcing a drawdown of over 100,000 troops from Vietnam. His explanation (have the Vietnamese take over responsibility for their own security – Vietnamization) and rhetoric about the US role were often similar to what Obama said. To be sure, Obama didn’t announce the invasion of another country, though one student noticed the parallel between the importance of Cambodia in that war and Pakistan in the current one.
They were shocked about the protests and especially the fact that live ammo would be used on students. One student compared that to China at Tiananmen Square, though clearly the scope was far less. They were surprised that many people even supported the shootings at the time, and said that this is another example of groups of people not understanding each other and thus rationalizing conflict and violence.
We ended up discussing the conditions my generation is handing off to them: a number of ongoing wars that need to be ended (they’ve cost over $1 trillion so far), government debt that started growing dramatically in the early eighties, private debt and credit card debt that has grown even faster (the public has mirrored the government in that regard), the current account deficit that has made the dollar and the US very vulnerable to outside shocks, and the growing gap between the rich and poor. I showed charts demonstrating that the wealthy have done very well during the last thirty years, while the middle class and poor have actually lost ground. Finally, we talked about energy and touched yesterday on the environment.
Most of the problems, especially the economic ones, are rooted in choices made in the early 80s after the last recession, when tax rates were cut and spending and debt increased. Thirty years of imbalances, and these 18-year-olds now have to face the fact that unless this gets fixed, their future will not be as comfortable as the lifestyles enjoyed by the previous generation. They expressed disdain for the ideological bickering between the political parties and said that if people listened to each other (the point of a Walter Lippmann piece they read for today), we’d realize that the problems are real and we have to solve them.
It also seems that in a world of constant communication and technical sophistication, the allure of ideological thinking is fading. The reality of the problems we face and the messes such thinking has caused in the past presents them with a challenge: their future depends on shifting our political and economic thinking in a profound manner. We discussed the naive thinking of economic ideologues — those on the left who think government can plan and run an economy without markets, and those libertarians who think markets are magic and can operate without regulation and the state. A little common sense can cure such ideological blindness, and for all the faults people find with the ‘facebook generation,’ they seem to have little patience for putting theory ahead of reality.
I’ve taught Summer Experience for 12 years now, starting in 2000 in the midst of the dot-com crash. In the late 90s many students had bragged about making money through day trading, and some thought they might never have to work since their investments could just keep multiplying. In the years since, as technology has progressed and the country has gone through extended wars and now a deep recession, I find myself more impressed than ever by the young people heading into college. There seems to be more pragmatism behind youthful idealism (I can’t imagine them burning down ROTC buildings and the like, regardless of how opposed they might be to a war), a willingness to consider and try to understand a variety of perspectives (I credit both the internet and globalization with this) and even improved knowledge about world events.
I hope my faith in the new generation is well placed, since I am losing faith in mine to actually start listening to each other and working for compromise and a pragmatic solution to the serious problems we face. If ideological screaming by the left and right continues, with elections zigzagging between parties as the public becomes frustrated by the inability to collaborate on creative solutions, we’ll need young people to come forth with new solutions. And, given their command of technology and the information revolution, they just might be able to do it — it’s not just Egypt that needs the youth to rise up and demand change!
Posted by Scott Erb in Culture, Environment, Ethics, Philosophy, Psychology, Science, Spirituality, Values on June 22, 2011
In the first comment in response to my last post Modestypress wrote: “I’ve decided to live life as if the world I sense is “real.” I don’t see any point for doing otherwise.”
That got me thinking. I did not mean to imply the world isn’t real. Rather, is reality constituted by each of us as a subject in a world populated with objects? If so, then subjectivity is a unique personal experience. We can assume that other humans are also subjects (and ethically we tend to believe we should treat them as such), but the rest of reality consists of objects of various sorts.
If we have a view of expanded subjectivity, then the nature of reality is different. We are connected at some level with that which we experience. Rather than being discrete entities navigating an external reality, we are entities enmeshed in experience, part of a deeper unity.
Such a possibility actually gets support from cutting-edge science. The most obvious example is quantum entanglement, in which measurements on particles separated by vast distances show correlations that appear instantaneous. This seems impossible: the fastest that information should be able to travel from one particle to another is the speed of light. (To read more on the science behind it, check out the Wikipedia articles on quantum entanglement and the principle of locality.)
The only way that such a result makes sense is if at some level the two particles are connected. Yet they are not connected in space-time. If they are connected it is either through something outside space-time which we cannot fathom, or space-time itself is not populated by discrete separate objects but has a deep underlying unity.
While this meshes well with many eastern religions, it also captures Neoplatonic thought, which heavily influenced Augustine and the early church. The idea that reality is a unified whole containing diverse perspectives and attributes is not that hard to imagine. I experience my body as me, an entity comprised of different physical attributes. I can sit in nature and imagine myself part of the entire scene in much the same way; poetry explores this kind of imagined connection quite often.
So what would it mean if reality actually was unified? What would it mean if the self isn’t only the thinking mind inhabiting a body, but actually is connected to and a part of all we experience?
First, everything we do to others (whether living or not) we would be doing to a part of ourselves. We would at some level be connected to all the pain and joy that exist in the world; if we cause pain or joy, we also would at some level receive it.
Death would have a new meaning. Rather than being the annihilation of the self, with the only hope of continued identity being either a transcendent supreme being or the possibility that a soul could be reincarnated into different bodies, death would simply be the cessation of one perspective of experience. That happens all the time. The person I was 20 years ago no longer exists in the sense that the perspective of experience I had then has been transformed into something completely different. Life is constantly changing perspectives.
If reality is unified, then no perspective has a privileged position or permanence. Death may be less an ending than a change of focus — rather than experiencing the world as a human living at a certain period in history, my perspective could shift, perhaps mingling with other perspectives or taking on a new manner of experience. Death may be the equivalent to finishing one book and starting another one — or turning the channel on a TV.
One’s perspective on life would alter as well. One might better know oneself by looking at the world one inhabits. What kind of reality do I experience, and why is it that I have chosen (or have been drawn to) this type of experience? What does the world around me say about who I am? Usually identity is separate from the external world; here it would be integrated. How we look at luck, coincidence and chance would change completely. Life would be a maze of interrelated coincidences, full of symbolic meaning. Rather than seeing the world as a cold harsh stage upon which one lives a short, often difficult existence, it would be rife with opportunities and possibilities that we draw to ourselves in some way.
Success and failure would alter form completely. Neither would be completely real, and certainly not permanent or all that important in the grand scheme of things. Even poverty, wealth, exploitation and violence would shift meaning – if there is unity, the “self” experiences everything at some level. The idea I’m living a comfortable life is just a focus of perspective at this moment. At a deeper level all experience is shared.
Most people would simply dismiss all this as meaningless speculation. We have jobs to do, families to raise, and the reality we experience runs by particular rules we have to navigate. However, I would argue that thinking about reality from a new perspective might actually have some beneficial consequences.
It could certainly mean letting go of a lot of stress and anxiety — just entertaining the thought that the world is not cold and cruel but rather purposeful and full of opportunity alters one’s mood. It also could cause one to consider different goals; if this moment of experiencing life through this perspective is only a partial taste of a greater reality, then striving for material success for its own sake starts to seem pointless.
The mind would shift to looking for clues in relationships and life activities that might hint at how one can enrich one’s experience at a deeper level. The world as a whole would be more important; the day-to-day struggles and dilemmas could seem more trivial. Fear of death would give way to acceptance of transitions. Hatred would become irrational, since hatred of the other would be hatred of a part of the self. Love would be the ultimate truth, in that it would entail the connection between apparent-self and apparent-other.
Human history contains many versions of reality that seemed absolutely natural to those living within them. Slavery, the superiority of one gender over another, sacrifices to gods, tribal customs, religious faith, and secular rationalism are all ways humans have conceptualized and thus interpreted reality and experience. The idea that what seems natural at this point in time is based on a misunderstanding of reality is certainly plausible.
If we are willing to try out different ways of conceiving experience and reality we can avoid being trapped in the mode of thinking dominant in our particular culture. To me, that’s liberating, and gives me some power over how I choose to interpret my experience. Rather than accepting a world view created by others, I can use reason and reflection — the heart and the head — to determine what I believe to be true, and choose how I want to live my life. That is real freedom.
The Illusion of Objectivity?
Posted by Scott Erb in Dreams, Philosophy, Psychology, Science and philosophy, Spirituality on June 21, 2011
The modern mind differentiates itself from the pre-modern by making a stark distinction between the object and the self. René Descartes takes this to its logical extreme by recognizing that consciousness is the root of the subjective self. The body and all the senses could be deceptive, but there is something that thinks – cogito ergo sum.
Yet consciousness is experience. Whether or not the world is as our senses indicate, we experience sensations of taste, touch, smell, sight and sound. Those senses constitute experience; they are all that exist to the self. From those sensations we make sense of reality, drawing conclusions, testing hypotheses and trying to figure out why the world is as it is.
Before modernism the world of the senses was part of the self. Instead of a stark distinction between object and subject, the notion of subjectivity was expanded to include objects. Existence was rife with symbols, consciousness pervaded all of what one sensed, the self mingled with all experience. Modernism broke this link, and separated the self from all of what was sensed. The self — or consciousness — thus stood outside of the rest of the world. That world consists of other conscious selves, no longer connected through shared subjectivity, but disconnected from each other as separate thinking beings.
From there sensory experience was categorized, studied and tested. Since other humans appear to be as we are, we assume that they have the same kind of conscious experiences as we do. Thus we are in a world populated with “thinking machines,” or bodies that function in service of a mind that itself may simply be a myriad of material chemical reactions — an extremely sophisticated ‘natural’ computer.
Other entities appear to have sensual experience, but apparently do not have the capacity to reason and reflect. These creatures are animal life forms. We know that animals can reason in a limited manner, but the lack of linguistic capacity and reflection means their consciousness is different. Other life forms do not seem to experience the world, they just grow — plants lack consciousness. Below that are non-life forms such as rocks, soil and inert matter. Energy (and plasmas like fire) has a different status, though we now realize that matter and energy are equivalent — matter can be converted into energy.
All this is accepted because it works in the world. We can analyze reality as if we are discrete minds experiencing a reality we somehow find ourselves in for no apparent reason. Yet this is an odd conclusion. If our minds consist of only chemical and electric reactions, with DNA shaping our personality and capabilities, then there is really no separate mind able to comprehend experience. Where does the self end and the object begin?
The self is our reflective thinking capacity. Yet where is it? Is it the brain? But the brain is made up of objects – cells, blood, chemicals and the like. What mix of chemicals, electric nerve impulses and the like constitute the self at any time?
At this level the distinction between object and self breaks down. There are lots of objects that make up the brain, but the self emerges from it in some indistinguishable form. To be sure, the self cannot live without the body (especially not the brain). It can lose some parts of the body, but if the body cannot function at a fundamental level the self disappears — the body dies and the subjectivity is lost.
But what if the modern take on subjectivity is wrong? What if it is not the case that a discrete subjective self is in a world of objects, trying to make sense of the objective reality in which the self finds itself? What if objectivity is an illusion, what if the pre-modern view of expanded subjectivity is accurate?
First, it is only habit and bias that cause some to dismiss that possibility. Since the self is constituted only by sensual experiences, the idea that the self is separate from those experiences (or the source of those experiences) is pure conjecture. In dream states, for instance, no distinction is made between the self and what one takes within the dream as sensual experience (you can touch, see, hear and taste in dreams). The dream is an illusion of objective reality during the sleep state.
There is no logical reason why waking reality could not be seen in a similar light. It is more consistent and less malleable than dream reality, but that simply means it functions differently, not that one is pure subjectivity and the other is a discrete subject adrift in a world of objects. Moreover, some speculative theories in modern physics and brain research point in a similar direction. The holographic principle posits that reality can be described as a kind of hologram, and some researchers have argued that the brain itself operates much as a hologram does, though these ideas remain contested.
The fact that it is possible that the stark subject-object distinction is an illusion doesn’t mean we should dispense with it. Practically, it has proven very useful, allowing us to manipulate the world of objects to achieve numerous goals. Pragmatically, accepting the object/subject differences makes sense.
But does that practical manipulation of reality require that we posit a subject-object dichotomy? Probably not. It does allow us to remove ethical consideration of how we deal with the world because objects that are not conscious (or in the case of animals possess lower consciousness) do not require ethical treatment. Killing a tree, killing ants and forging steel are all acts with no ethical content — we’re dealing with non-conscious objects.
If objectivity is an illusion and the self is mingled with experience, ethical issues become more complex. It still may not be wrong to poison ants or build skyscrapers, but the act of doing so could affect the subject in some non-trivial way. Moreover, how we deal with others would change. With the subject-object distinction we can assume that our choices are individual and thus have no impact on others unless there is some objective trace (e.g., we hit another person, steal their money, or something like that). Without the distinction then humans are not discrete separate selves, but may indeed be linked at some level with the rest of the world of experience. In that case, ethical systems built on the idea of discrete individuals interacting through choice have to be rethought.
Of course, one could argue that the experience of reality as objective is enough to simply make that a working assumption. Samuel Johnson used a similar argument against Bishop Berkeley (who also doubted the existence of objective reality) when he kicked a stone and declared, “I refute it thus.” The stone moved when kicked; the world operates as a world of objects. Yet as one learns in philosophy class, that doesn’t refute Berkeley or prove anything. Johnson simply gave a pragmatic reason for accepting the subject-object distinction.
An expanded view of subjectivity seems odd to the modern mind, even if that’s the “natural” state of the human mind in nature. Seeming odd doesn’t make it wrong, however. It also wouldn’t mean the modern mind is inferior to the pre-modern mind, only that we may have one point wrong and we should consider the implications for how we live and understand the world. This also could be an alternative to cold positivism and meaningless relativistic skepticism.
Row, row, row your boat, gently down the stream, merrily, merrily, merrily, merrily, life is but a dream…
Giving in to Voice Recognition
For the last 38 years I’ve kept a journal. It started back when I was 13, paused when I hit 15, and resumed “for good” at age 16. Up through 1985 it was pen and paper; then I started to use word processing.
From 1985 to 1989 I used a word processing program called “PaperClip” on my Commodore 64, which I had hooked up to both a printer and an old black-and-white TV as a monitor. I still have that old computer, disk drive and “PaperClip” program. I believe I have the old floppy disks (the 5¼-inch variety), but I have no clue whether it would be possible to transfer those files to Word.
Last summer I started the task of typing up my old journals. I am a fast typist. In 7th grade I took typing and had homeroom in the typing room, so I practiced a lot. Back at Patrick Henry Junior High in Sioux Falls in 1973 I was one of only three guys in my typing class. The reason the girls outnumbered us 10 to 1 was that typing just wasn’t a skill boys were expected to learn. Most guys took shop courses (woodworking, metals, drafting, etc.) while the girls took things like typing. You see, guys would likely end up in an office with a dictating machine, while the girls would be the secretaries who would have to type up the work. Why would a boy want to develop typing skills? A male secretary would be, well, weird.
I’d love to say I was enlightened enough to oppose sexual discrimination back in the 8th grade, but the truth is I loved to write even back then. I wanted to type. I was going to be a journalist, preferably a sports writer. My hero was Sid Hartman, an insider for the Minneapolis Tribune (now the Star Tribune) who wrote daily columns about the goings-on inside the Twins, Vikings, Gophers and North Stars (the hockey team of that era) clubhouses. I could imagine myself following sports for a living. So I learned to type, and I was one of the fastest in the class — my typing teacher was thrilled to have a boy learn to type and be so good! In junior high, otherwise lost years for me academically, typing was my best subject.
I remember writing about the Ali-Foreman fight, various football games, and handing my “column” (I’d get to school early so I could write) to friends and have them comment and often argue about my effusive praise of Fran Tarkenton or prediction that Bert Blyleven would be a superstar. I learned and wrote on an Underwood manual typewriter, and still remember those drills to strengthen the little fingers, slapping the carriage return bar, and making sure that I didn’t type past the little mark representing the one inch bottom margin.
By college I had my own Royal electric typewriter (I still have it, though I have no idea if I could get a ribbon for it), and for long papers I would go into my dad’s office to use his secretary’s IBM Selectric. That was a sweet machine, and I fantasized about owning one. It had a backspace button that automatically whited out a mistake — and if you backspaced ten or so times, it would remember which letters to white out. It was sleek and easy to type on. I hit 100 WPM with no errors at one point.
Of course, I never bought a Selectric. Shortly after college the technology revolution brought the PC age, and at age 25 I got my Commodore 64. That’s also when I shifted to typing my journals.
Last summer I started retyping my old journals, getting 1973 to 1975 complete. But as I look at the stacks of paper representing the journals between 1975 and 1989 and consider the aches and pains of constant typing, I realize that I lack the time to quickly type them all up. I’m also not sure my hands and wrists could take it. So today I went online and ordered voice recognition software.
The typist in me has been resisting that, the same way I resist texting. I don’t have that many skills in life, but typing is one of them! To be sure, most of my writing will still be done via keyboard. I think through my fingers. To me typing is the process of writing; I don’t do well with pen and paper, or by dictating. I could never create blog entries with voice recognition software — my fingers on the keyboard are integral to the creative process. But copying already written material? Yeah, I can see just reading it aloud.
I also have “dream journals” to copy. These were made from 1986 to 1990 and contain thousands of dreams. I would become what I called “dream aware” (I’ve since learned the official term is lucid dreaming) and then do experiments, waking up to jot down the ideas I’d type up (on my Commodore 64) the next morning.
I’m not sure how well voice recognition will work, if I find that I’ll use it more often than expected, or end up hating it — I’ll blog the result when that happens. And who knows — maybe I’ll balance giving in to this new technology with a purchase down the line of an old Underwood manual typewriter. I’m sure my fingers (especially the pinkies) have gotten lazy and soft being used to these sensitive PC keyboards. My fingers could use a good workout!
My garage is full of chemicals. I can kill wasps, ants, and weeds. I can kill weeds but keep the grass alive. I can weed and feed, I can fertilize, I can clean. I can clean grease, stains, and various surfaces of my car.
I have spent most of my life under the illusion that most of these materials sold to make life easier are relatively safe. I know you shouldn’t mix various cleaning materials, with some you should wear gloves (something I’ve generally neglected to do), and that too much could be bad. Yet certainly the companies that make all these things for home and garden have tested them out. And even if the market couldn’t prevent dangerous items from being made available, government regulations must focus on safety.
I’ve been using Speed Stick deodorant, Nivea shaving cream and moisturizer, and Pantene shampoo and conditioner, and the kids have enjoyed yummy-smelling colorful soaps and plastic tub toys. We’ve been cooking with nonstick pans and drinking sodas from plastic bottles, even if they’ve been sitting in the car on hot days. It’s just life. We don’t smoke, we try to have healthy habits, but plastics, chemicals and additives are ubiquitous. The idea that I should have a lush green lawn not marred by clover and crabgrass seems normal. Everybody, save a few old hippies, lives this way.
I’ve now come to the realization that how we live with chemicals is analogous to how the Romans lived with lead poisoning. We’re poisoning ourselves and our children (look how autism and ADHD rates have skyrocketed) by introducing massive amounts of undertested chemicals into every aspect of our lives. I’m not sure how much poison we’re getting. People still live long, but chemicals seem to play a role in making us fatter and creating expensive health problems. Rates of cancer are up, as are a variety of other health problems. Moreover, the generation now in their 20s and 30s is far more affected by these chemicals than people my age, since widespread usage really got going in the 80s and 90s. We may not know the real impact for quite some time.
This creates a perfect storm. There is a danger that seems obvious, but it can’t be verified with certainty, and the exact impact is unknown. Moreover, studies can always be criticized (and a very well-oiled chemical industry lobbying machine responds to everything), and it may be decades before we know for sure just what the impact of all this is. Humans tend to ignore issues like that — if the problems aren’t obvious, why bother?
That’s been me for years. I’ve known the chemicals in our lives are dangerous and undertested. I’ve had suspicions that my use of them was probably not all that wise. But those have been fleeting thoughts at the back of my head. They haven’t been strong enough to get me to actually dig into the science and think about changing my ways. There are classes to teach, kids to take care of, things to do. So it’s grab what’s at Walmart and, whether in cleaning or lawn care, take the easy route.
In May I read What’s Gotten Into Us by McKay Jenkins, a book that gave an excellent and well-supported overview of the scope of the problem, showing clearly how little we know, how studies are often hidden and attacked, and just how powerful the chemical lobby is in the US. In the EU there are far stricter regulations (so I’m going to keep using my made-in-Germany Nivea skin care products); here the burden of proof is not on the chemical companies to prove their products are safe, but on others to give definitive proof that there are dangers.
My wife read the book as well, and was convinced. She replaced most of our household cleaning and personal hygiene materials with Seventh Generation, Tom’s of Maine, or Burt’s Bees. I went along with it, figuring that at the very least this is the kind of thing companies will pay attention to — if the market shifted towards low-chemical alternatives, businesses would have to follow. But I was still skeptical. The thing about books with a mission — and Jenkins is clearly convinced of his argument and has a desire to open people’s eyes — is that they are prone to overstatement. The book cited science, but the author is not a scientist. So I’ve been investigating further.
But the more I read, the more convinced I am that not only is there a lot of uncertainty about what these chemicals are doing to us, but that the massive increase in chemical use and our exposure, especially children’s, is by definition extremely risky. Chemicals are powerful, and they have side effects that often aren’t understood. The websites and blogs criticizing those concerned about chemicals tend to have weak arguments. They either deride “anti-chemical” folk as too alarmist, or claim that the problems blamed on chemicals might be caused by something else.
Both of those points may be true, but that doesn’t lead to the conclusion that chemicals aren’t a problem. Same for the statement that ‘just because it’s not natural doesn’t make it bad.’ That’s true. But it doesn’t mean they are good or harmless either. Defenders of chemical usage rest on the arguments that: a) they are beneficial in helping us achieve our goals; and b) studies haven’t definitively shown the specific harm being done. Most of the evidence against chemicals is circumstantial.
OK. Consider this chart:
This rapid and alarming increase in obesity corresponds to the rise of chemical additives in food, many of which affect hormones and other aspects of the body that influence fat retention. Is this circumstantial? Yes. The causes of increased obesity are many, and despite numerous links connecting additives and obesity (see The Body Restoration Plan by Dr. Paula Baillie-Hamilton for a good argument on this issue), definitive proof remains elusive.
The same goes for the rapidly growing rates of autism (and other neurological disorders) in children, increases in depression and psychological problems in adults, hypersensitivity issues, increased allergies, and increased incidences of cancer (even if medical science is much better at curing it). If I demanded total proof that these chemicals are harmful before deciding not to use them, I could find arguments that cast doubt on every claim of harm made.
But my health is at stake. The lives and health of my children are at stake. And the evidence that harm is likely is very persuasive, even if absolute proof is elusive (it was also elusive for cigarette smoking for decades, while another powerful lobby fought the connection). Simply the fact that chemical usage has risen so dramatically is enough to get me to realize it is irrational not to be concerned — throw that many new, artificial, undertested chemicals into our bodies and homes, and it’s clear no one knows exactly what the result will be.
So my habits are changing. Some changes are gradual, others are instantaneous. I won’t try for a perfect lawn, so no more weed and feed. I will finish my Speed Stick before switching to a natural (no aluminum) deodorant. I’ll try to hit the farmers’ market more frequently and buy organic or local as much as possible, but now and then we’ll still get some fast food. Most importantly, I’m going to start thinking about these things, investigating the products we use and the food we eat. Big agribusiness doesn’t care about us. Regulators lack staff, and the politicians listen to the lobbyists of big money more than they listen to scientists or citizens. We have to educate ourselves and hope that enough people see the danger that we can have an impact on the market.
Chemicals are everywhere; no matter how much I change at home, our world is defined by chemical usage at work, by local governments, stores, and other people. This is something we have to live with. But making some different choices at home can’t hurt — and may yield long-term benefits.
Posted by Scott Erb in 2012 Election, Democrats, Economy, Republicans, Unions, US Politics on June 15, 2011
The argument that the US is heading for collapse is probably overstated, but sometimes when I look at the nature of political debate in the US it seems plausible. Rather than taking a cool, pragmatic view of the problems and potential solutions, politicians are increasingly locked in an ideological fog, looking for sound bites and “easy” solutions. Rather than recognizing the complex array of problems that brought us to this point they prefer to point to some villain who has foisted harmful policies on the US which, if eradicated, would eliminate our problems.
For instance, it’s become a mantra for some on the right to cast unions as some all-powerful bad guy. The proposition is so absurd as to be laughable. As this story shows, US workers are lower paid than ever. One reason is that unions have become impotent and are smaller than ever. There is no denying that relative income has shifted away from workers (who have gone from ‘middle class’ to ‘working poor’) towards the wealthy, and that unions have become rare and ineffective. This is a problem. With the working class receiving an ever smaller share of income, not only is the divide growing in the US, but economic growth becomes more difficult. Companies are earning record profits as worker income declines.
If people were rational and objective, they’d see this and come to the obvious conclusion: corporations and businesses with resources are structuring the game in their favor (and the favor of their shareholders) while workers, no longer protected by effective unions, pay the cost. Instead, driven by irrational ideological faith, many argue that unions are evil and holding back growth and that the government is unfair to big business. Moreover, it’s clear that the wealthiest have had an explosion of income growth in recent years, while the poor and middle class have not. An objective, rational analysis would say that given high debt and deficits, those who have benefited so much at the expense of others should pay more taxes. Instead, efforts are underway to cut taxes on the wealthiest and cut services to the poor, who have become worse off over the last 30 years.
As I noted yesterday, the argument that these policies trickle wealth down to the poor has been discredited. Europe, which has stronger labor unions and more wealth equality, has also created more jobs in the last 12 years than has the US. We’ve seen a net shift of wealth towards the already wealthy. Moreover, there is little class mobility in the US. It’s not as if the successful are rewarded for hard work and innovation; rather, those with wealth are being rewarded for having wealth.
What’s surreal about this is the way that especially the right in the US ignores the data and continues to argue for tax cuts while demonizing labor unions. There is no evidence supporting such claims; such views are driven solely by ideological propaganda. But the left is not without blame either. Its focus has been increasingly on politically powerful interest groups, meaning the working poor are often left out. Moreover, the left has ignored the problems of rising debt and ineffective social welfare programs, giving the right ammunition to point out the failings of big government.
In short, the left has been AWOL in truly identifying and offering solutions to the working poor’s loss of wealth, status and opportunity, defending existing government programs instead. After all, what political clout do the working poor possess? They won’t win elections for the Democrats, and defending them only gives fodder to the GOP to scare the wealthier into thinking Democrats want to “transfer wealth” to the poor.
Transferring wealth won’t work. The problem is not unequal outcomes, but warped opportunities and power structures. Corporate profits shouldn’t skyrocket while worker income plummets. There needs to be more fairness in the structure of pay and power in the marketplace. This doesn’t call for massive government transfers of wealth, but rather government support of labor unions, transparency in the market, and regulation of big business. The goal is not to combat the free market, but to make it work effectively.
That’s the most surreal point of all. The free market has not been working well in the US because it’s been manipulated by insiders with information and power to control how it functions. The state is supposed to prevent that and protect the market, but increasingly both parties listen to lobbyists representing the current “winners.” Left unchecked, this will lead to a bifurcated class structure in the US and a potential backlash. Left unchecked, the US will lose out to others in the global market. Republicans and Democrats should together recognize the problem; instead, ideology trumps pragmatism, and problems go unsolved.
It feels like I’m watching a great power decline, even though there are ample opportunities to change and regain vibrancy and growth. Yelling at each other, lost in ideological fog, we ignore those opportunities. Instead of problem solving, opponents are demonized in a weird effort to pin problems on a particular group. The way some Republicans talk about “liberals” — as if all the problems the country faces are caused by ‘liberals’ — is dangerous. Rather than seeing problems that need solutions, others are demonized and uncritically blamed for all that is wrong.
As someone who is critical of both the left and the right, and who believes that pragmatic solutions require compromises from each side, I hope the surreal political debate gives way to more rational, practical problem solving. If not, then perhaps the best days of America are behind us… or at the very least, will take a while to return.