Archive for January, 2010

Out of the Woods?

Last quarter the economy grew at 5.7%, one of the highest rates of growth in six years. With inflation nearing 3%, it appears to many that the economy is starting to move forward again. Moreover, the dollar is showing surprising strength, going under the $1.40 per Euro rate for the first time in a long time. The current account deficit has stabilized at 3.0% of GDP, a vast improvement on the 7% of GDP figure reached in 2006. New jobless claims, a leading indicator, are starting to fall. Could we be seeing the light at the end of the tunnel?

In a word: No.   But a qualified no.   President Obama had a bad hand dealt to him when he took the oath of office.  The US economy faced collapse in September 2008, saved only by a massive bailout.   For all its downsides, that bailout probably prevented a catastrophic seizing of the credit markets.  As unemployment was rapidly climbing and the US sinking fast into recession, Obama gambled.   He used political capital to rush through an emergency stimulus package designed to push the US into economic growth by the end of 2009.   It appears the stimulus has worked, and the US economy has turned around.

All other things being equal, one could expect this growth to take hold and the economy to start another cycle. However, all other things are not equal: structural problems that existed before this crisis, and in part caused it, need to be addressed before we can say we've achieved a sustainable economy.

The biggest problem, of course, is the debt. There is no easy way to cut spending, but already US debt and deficits are at the border of what can be financed without risking a steep decline in the value of the dollar. Increasing the debt was the stimulus gamble — the idea that we have to spur growth again before structural reform of the economy can begin. If there is economic growth, it should be possible to start significantly trimming deficits to get the budget back to a sustainable level. However, there is a risk: the spending cuts and tax increases needed to balance the budget (or at least limit deficits) could stifle growth.

Politically the problem is made worse by the fact that tax increases have less of a harmful effect on growth than spending cuts. Spending injects money directly into the economy, and thus economists tend to see government spending as a much more effective stimulus than a tax cut; by the same token, spending cuts slow growth more than tax increases do. Yet spending cuts are easier to sell politically than tax increases, so Obama will have to find a politically acceptable balance. And, while these priorities might cause Republicans to howl (if military spending is cut and taxes increased), it's likely liberals who will be most put off by cuts in government programs moving forward. Obama will have the continuing attacks from the right to fend off, while trying to keep together his Democratic coalition — all of this in an election year which is likely to see Republican gains.

One thing Obama can't do is simply take the Reagan approach — continue to increase the debt and not worry about deficits. Right now Obama is experiencing a situation similar to Reagan's in 1982 — increased debt is helping stimulate the economy, creating short term unpopularity (Reagan's approval numbers were in the 40s in his first term) but setting up a rebound — something Reagan called "Morning in America." But while Reagan had a country with relatively low debt and a low current account deficit, Obama starts with staggering debts. While Reagan was governing as the baby boomers were heading into their most productive and prosperous years, Obama is governing as they start to retire and pull money out of the system rather than put it in. Unlike Reagan in his second term, Obama does not have the luxury of simply continuing to borrow and spend.

It will take bold leadership to pull this off, and it's not yet clear Obama is up to the task, if anyone is. I have more faith in his capacity to make these tough decisions than I've had for any recent President, however, so there is hope. First, Obama has to finish the job of ending US involvement in Iraq, get Afghanistan to a point where we can cut commitments there, and then start downsizing the US military. It's a fallacy that we need to spend half the world's military budget for our own defense – our interventionism probably does more to make us a target than keep us safe. Bases need to close, Congress has to stop funding projects the Pentagon doesn't want, and the US has to accept that we can't afford to try to be a dominant superpower.

Second, somehow Republicans and Democrats have to set up a sustainable set of priorities for budget cuts and tax increases. Both sides will have to swallow some bitter medicine they've avoided in the past. I do not know if this is possible. Until recently the "compromise" was that Republicans would accept spending increases in exchange for tax cuts. Each side gets something, even as the country drifts further into debt. Now they have to find a way to share pain and avoid economic populism. If Obama can facilitate that, he would thereby become one of the greats.

Third, the US has to make a real shift in economic policies to support new technologies that will position us for the inevitable shift away from oil towards solar and other alternative energy sources. The EU has already marched ahead of us on that, thanks to its implementation of the Kyoto accords. The US complained that the agreement would hurt the economy, but in the EU it clearly boosted new economic activity. We can't be left behind. Real productive capacity in new technology needs to be a core of US economic activity — real production, not just consumption and wealth 'on the cheap.'

Finally, the US has to avoid slipping back into 'bubble' economies where speculators bought into the "something for nothing" mentality that you can get rich quick if only you invest in the hot market of the day. People have been trying to get rich by being clever — investing in stocks, flipping real estate, etc. — rather than by actually producing. Investment has its role, but it alone cannot produce widespread prosperity without increases in productive capacity. The bubble economy hid the structural imbalances that have been building for the last thirty years, and contributed to our current problem of massive public and private debt and unsustainable budgetary practices. The US needs tough financial regulations, closer monitoring of markets (especially things like derivatives), and transparency in the banking sector.

If Obama and the Congress can move on these fronts, we may start solving these problems.  But even if we make all the right moves, the challenges ahead are so deep that it may take more crises to truly force the changes that need to take place.  Obama has made a good first step, but it was the easiest step.   The path forward will be very difficult.

1 Comment

Hypnotized Humanity?

We underestimate the power of suggestion.  If you’ve ever been to a performance by a hypnotist, it is obvious just how much power suggestions can have over our experience of reality.    People can be convinced they have snakes crawling over them, are in danger, take on very different personas, and do things that they would otherwise not choose to do.   It can be a bit creepy.

Yet you do not need to be in a hypnotic trance for suggestions to be powerful. Advertisers know this. That is why McDonalds spends less time talking about its food than creating a mood. "You deserve a break today." Cute scenes of kids, friends, people laughing, "lovin' it"! These are meant to place suggestions in your mind about how to feel when you see a McDonalds, making you more likely to think, "gee, let's go eat at McDonalds." You haven't really compared dining possibilities; you just feel like having McDonalds. You feel loyalty to a particular brand. Mountain Dew seems to be a more exciting soft drink than Sierra Mist. It used to be that advertisers would place subliminal messages in ads, brief bits with words or suggestions that went by too fast to be consciously perceived, but expected to have a subconscious impact. Suggestions.

In life we are constantly bombarded by suggestions. Politicians making statements, cultural biases and fads, claims on the news, in the media, within a subculture. What has value? What is common sense? What will make me happy? For example, for a long time our culture was convinced that blacks should be treated differently than whites, and that the races should not mix in marriage. That's the way the world was; that's what people grew up with. All customs, beliefs, political justifications and the like reinforced the suggestion that this is how the world should be.

Those with the power (usually whites) had little reason to resist these suggestions. The suggestions made it seem like there was something better about their race, rationalized unequal treatment, and gave a nice narrative to why conditions in the US were as they were. Even progressives found it hard to break through the programming. Blacks more easily rejected the suggestions because they were the ones hurt by the social structure, whose injustices they felt most keenly. Even then, the power of these suggestions made it hard to mount a serious challenge to the existing social structure.

Martin Luther King's "Letter from Birmingham Jail," wherein he expressed disappointment that white clergy and liberals were saying "slow down" or "don't demand too much," was really a shout to "wake up!" You've been hypnotized into accepting as normal a state of affairs that should enrage you. In fact, if we look back at history we see slavery, sexism, and an atrocious state of human rights treated as normal throughout most of human history. If we could go back to Savannah, Georgia in 1824 and yell "wake up, slavery is evil," we'd be marginalized as radical abolitionists — dangerous and misguided.

Political struggles are often efforts to yell “wake up” loud enough to shake people into questioning the suggestions being fed them, and get them to see things from a different perspective.  Yet it is always easier to stick with the world we’ve become accustomed to.  The suggestions that define how we interpret reality are comfortable, we’re used to them, and as we get older it becomes ever more difficult to truly question the world view we hold, especially if it has been rather stable.  Alternative suggestions sound weird, radical, dangerous or contrary to all we hold true.   That is one reason why change tends to be generational — a new generation may be open to new suggestions in a way older folk are not.

How do we break this hypnosis? The first step, I think, is to recognize that it exists — to recognize that every political, religious, social and cultural perspective we hold is in part shaped by the world around us. Once we recognize that, we can start reflecting on those beliefs and perspectives. The second is to acknowledge the capacity of those with power and money to make and reinforce suggestions about how to think. Political propaganda, advertising, and media messages that stick within narrow narratives (even given the diversity between, say, Fox and MSNBC) all have enormous power to shape how people understand their world. Together they create a pervasive discourse, a set of 'normal' meanings and understandings through which reality is interpreted.

All of this means we have to don some humility. Each of us may think ourselves clever, smart, critical and independent, but we are all to some extent products of our culture. If we were born in Cairo or Tehran we'd think differently about the world than we do now; if we lived in the 1700s our world view would be fundamentally different. Much of who we are depends on where and when we are. Yet, I do think we can wake up.

Once we recognize that we've been subjected to massive suggestions about the world from our culture, friends, family and media, we can start the reflective process that gives us the capacity to be ourselves and claim our own identity. It's not that we can come up with a totally autonomous worldview — the impact of culture and the suggestions we "listen to" every day is always there, and always a part of us. But we can resist their ability to shape us, and learn to question the suggestions being thrown our way.

In advertising: as we pull into McDonalds we can ask, "do I really want this? why do I feel I want it? is it due to advertising?" and start thinking of alternatives. At least make the decision consciously. The same is useful in thinking about politics. In recent debates on this site about taxes and the role of government, I think it's important for all sides to think seriously about the other's perspective. By that I don't just mean entertain the arguments, but really listen and try to comprehend how a perspective can be different from one's own. The more we do that across all issues, the more likely we are to have a clearer view of the world, and of how suggestions may be leading us to embrace things we actually don't believe in.

This also explains the importance of both art and satire. Art can reflect creativity, the ability to explore aspects of life beyond conventional thought and the status quo. It can be an impetus for "waking up" from a series of comfortable suggestions. Since the Roman Empire and satirists like Juvenal, satire has had as its core method a refusal to accept conventional norms and the rules of discourse, and a willingness to play with strange takes and perspectives on things. Sometimes that allows the satirist to see reality more clearly than the serious pundit.

It may be comfortable to live hypnotized by the world around us, being programmed to want certain things, define success a certain way, and go with the flow. But when we do that, we give up living an autonomous life, and soon feel alienated from life. That emptiness can come out as depression, anxiety, hopelessness, or a desire for thrills, something to make us feel alive. And, though we can never be completely outside our culture and the suggestions it provides, I do think it's possible to resist and reflect. Learning to view things from diverse perspectives, taking seriously the world views of those who think differently, and never being too settled in one's own beliefs and ideas about the world is one way to do that. Living "awake" may be the key to a really satisfying life.

8 Comments

This Time, This Place

Commenting on the last post, Jim Sullivan expresses his frustration with being coerced to pay taxes and have his property taken from him — a form of theft, he claims. When I respond that it is a kind of fee for the numerous services that create stability and prosperity, he can correctly point to the lack of an opt-out. Even if he is much better off because of government, he never had a choice in the matter.

Others on all parts of the political spectrum will point to numerous other social ills: poverty, starvation, genocide, wars throughout the third world, first world imperialism (funded through taxes), and all sorts of conditions and actions that are intolerable. When there is an industry worth over $60 billion a year turning young girls into sex slaves and destroying their lives (sometimes they are sold into it by their parents), alongside torture and human rights violations across the globe, one has to wonder what is going on in this world. Compared to much of the evil on the planet, I find taxation not to be so bad. I live in material comfort and security — most of us in the US are in the top 1 or 2% of the planet in material well-being. Clearly, it's not so bad.

Or is it? Are we in a gilded cage, living lives guided by trend and advertising fashion — the 'culture industry' — in a way that dulls our desire for achievement and meaning? Do we medicate ourselves (whether with alcohol or with legal or illegal drugs) and indulge in various distractions simply because the world around us seems to have a lot of toys, but little of substance? And what about Jim's objection to the lack of choice — he's born into a world where taxation is mandatory, and none of us can do much about it, like it or not.

Ever since we left our hunter-gatherer roots and started to form communities, the problem of individual interests vs. collective interests has been there. Traditional societies used tradition, norms, and religious rule to coerce people to adhere to the rules of the community. These were pervasive, and often people were simply indoctrinated to follow them. But it worked — for the most part people willingly did things that benefited the community, often at their own expense, ranging from providing labor to sacrificing one's life to the gods.

Moreover, in smaller communities people see directly the negative effects of looking out only for oneself and not for others in the community. So people often choose to pull together to provide for the common good out of clear self-interest as well. That creates a sense of security and meaning — we're there for each other; we are stronger as a community than as individuals. When the polity grows in size, and when factory life alienates workers from the products of their labor, there is a de-personalization of the community. It is too big and diffuse to really identify with. You don't know who made the chair you're sitting on, who grew the apple you've just eaten, or even who drove the plow that just cleared the road outside your house.

This de-personalization makes it harder to find incentives to contribute or participate. If you bought apples from the orchard down the road and the family had a severe problem, you'd be tempted to help out — you know these folk, they provide you food! But if a big corporation employs large numbers of apple pickers at low wages, there is no connection. Thus the collective action problem grows — the community needs support from citizens to survive and thrive, but the old ways of providing it (voluntarily, through churches, or communities pulling together) become sparse, as people no longer have those connections. The result is a series of needs for the community to maintain itself, while people have no incentive to respond to those needs; they don't see the connection. The leaders of the community (now a large sovereign state) turn to force and taxation as a necessary evil to prevent collapse.

Yet this de-personalization also means that it becomes very easy for both politicians and citizens to separate their actions from the consequences of those actions. Cut taxes while raising spending? Sure, that will get votes, and we can find ways to manipulate monetary and fiscal policies to put off having to actually deal with the imbalances this leaves behind. This also makes war easier to support — it's abstract, you can construct a caricatured enemy, and it's a media show. Only when the reality becomes hard to avoid due to consequences that can be felt and seen (as happened with the Iraq war by 2006) does the public start asking hard questions. Hyper-consumption is embraced without concern for the environment.

In short, de-personalization creates abstraction, and abstraction allows people to replace real human concerns with concepts that can be rationalized through arguments which appear reasonable and common-sensical.    Within this framework, politics can be easily manipulated by the powerful (big business and big government), and average folk get increasingly alienated from the “big” decisions.  Politics becomes spectacle and entertainment at best, delusion and subterfuge at worst.

So within this framework there are a host of injustices that vary in seriousness, yet each has validity. It isn't right to sell women into sex slavery, genocide is wrong, war for oil or ethnic conquest is wrong, taking other people's property is wrong, having some live in abject poverty while others live in opulence is wrong, and destroying our environment is just plain stupid, since it will limit the ability of future generations to have a quality life.

We can focus on which of these "evils" to combat, and that will cause political division. To combat poverty and human rights violations, government power and taxation are often the means to that end. To combat the ability of powerful actors (government, supported by big money) to take one's property would mean restricting government to the point that it could not act on many of those other problems. Those of us who accept government power and taxation do so as the 'lesser of two evils': not allowing them would mean no action to protect others and work against serious problems.

At this time and place in history, that's where we're at. Sovereign states are the form of government which exists, and while someday they will be replaced by something else, that won't be anytime soon. Poverty will exist, warfare will continue, people will be sold into slavery and often forced prostitution, lives will be destroyed, and children violated and exploited. People will be taxed, and will not be free to choose how they want to organize their lives. Pragmatically, the best we can hope for is to keep those with power accountable to both the public and the rule of law. Unfortunately, that's an imperfect solution, and more often than not governments become corrupt and fall into the hands of elites. It's a nexus of big money and big government, with the irony that the right often ignores the misdeeds of big money, while the left excuses the misdeeds of big government. Yet the two are in this together, complicit in driving the world we have.

Perhaps there is no "right" political answer to this dilemma, just a constant balancing of pragmatic concerns, with a goal of moving, over time, closer to an ideal of real liberty for all — liberty from government intrusion, as well as from poverty caused by class difference, exploitation, and the abuse of power by either government or big money. We have a long way to go. Sometimes I think we live in the pre-history of humankind, a dark, violent and dangerous era. Sometime in the future humans will look back at our era and be thankful they did not endure this time and this place.

Yet, despite that all, I love life and enjoy every day, and see beauty all around me.  I do not let myself get burdened psychologically by the political and social ills of this world — I cannot change the whole, I can only live my life in a way that can try to spread a little love and kindness.  Somehow, the big issues will take care of themselves over time.  We can vote, participate, and contribute — but most importantly, if we live right, we can have a good life in this time and place, and perhaps make small steps towards a better future.

18 Comments

Rights and Privileges

Perhaps one of the most bizarre aspects of the debate on health care reform is about whether or not health care should be considered a “right.”    The whole concept of “rights” and the difference between “rights” and “privileges” is tricky.   There are various ways to approach the issue.

Natural rights theory: Natural rights theory, dating back to Locke, seeks to find rights inherent in nature. To be human, you need to eat, have shelter from the elements, and be able to fend for yourself. From that Locke deduces a natural right to life (you need to be alive to be human), liberty (freedom to take care of your needs), and property (the ability to have the stuff needed to survive). This theory seems persuasive, though it ultimately rests on certain unfalsifiable assumptions and culturally specific interpretations.

First, of course, is the fact that despite a claim to be natural, this is a normative view of rights. It is not a theory that posits rights as transcending human volition or action. Compare rights with gravity. Gravity affects you no matter what; you cannot get around its pull. "Natural" rights have been violated throughout history, and there is no force leading people to respect those rights. Therefore, claims about natural rights are normative "ought" claims rather than testable "is" claims. Often such claims rely on appeal to emotion for support. If someone asks "don't you think everyone has a right to live?" with the idea that saying 'no' seems weird or cruel, that's not a true philosophical argument. Other times people smuggle linguistic tricks into the argument. One person once asked "don't you think you own your own body," suggesting that if I didn't, then someone else would have the right to ownership. Of course, ownership is a construct of a culture with private property, where all objects can be owned. Since I am a subject, and my body simply is, I do not own it; I simply exist. No one "owns" it; it is not property.

Positive rights theory: The initial response to the weaknesses of natural rights theory is to put forth positivism — rights are whatever the legal authorities say they are. If the Constitution and the government declare a right to health care, then ipso facto, it is a right. It need not rest on nature, religion, or any other transcendent condition. The problem here, of course, is that this seems to lend legitimacy to all sorts of questionable practices (slavery, the treatment of Jews in Nazi Germany, etc.). The problem, bluntly, is that positivism takes rights from being a normative statement about what we think "ought" to be to a descriptive statement about what "is." What "is" is almost always a creation of the most powerful and often corrupt forces in a polity. Do we really want to sacrifice the normative power of human rights arguments for a cold descriptive "is" statement?

Democratic theory: Another view of rights is overtly political — the only valid rights are those that allow a democratic polity to continue.  You need freedom of speech, of religion, of assembly, equal rights to vote, and all sorts of things to assure a participatory democracy.   Usually these folk are overt about the normative aspect of their project — they believe in democracy, defend that belief, and argue that it makes sense to tie a concept of normative rights to the goal of having a functional democratic polity.   But how far do you take this?   Since the wealthy can buy more lobbyists and political influence, does a democracy require limits on free speech, or perhaps income redistribution?   Must universal health care be a right so that all are able to be healthy enough to participate democratically?   And what about social democracy as opposed to liberal democracy?  The former focuses more on economic conditions, the latter on political freedoms.

Perhaps the most persuasive theory is Rawls's notion of the veil of ignorance. If you were ignorant of what place in society you would find yourself in — the beggar, the laborer, the wealthy, the powerful — what kind of rights would you support? The idea here is that people would want to have optimal rights and policies for everyone in society, since theoretically each individual could end up in any social role. I find this approach quite compelling, but it is subjective. And, of course, it's hard for people to really think as if they were behind a veil of ignorance.

Social constructivist theories: This approach looks at rights as human concepts, grounded in nothing but human choice, yet it differs from positivism in embracing a strong critical component. Consider: in objective or "natural" reality, rights do not exist. That reality simply is — actions have consequences, people have to deal with those consequences, and fate can hand out wealth or devastation regardless of how one has lived. A subjective understanding of rights, on the other hand, is an individual's personal point of view — what normative rights he or she thinks should exist. One might be a positivist, a natural rights proponent, a Rawlsian, or simply go from the gut. One's subjective ideals are fine so long as one is alone; once you deal with others who have different beliefs, you have to compromise or compete. A society or community cannot reflect any one person's specific subjective beliefs, and a society where everyone thinks the same is likely impossible and undesirable.

In the realm of social reality, therefore, people build sets of rights that they want to guarantee for their people. This can include life, liberty and property. It can also include a right to education or a right to health care. It might include a right to paid vacation, or a right to shoot people of another race, deeming them subhuman. Since any social construct is a contingent human creation, all conceptions of rights must first be created; then they can be either reproduced or transformed. The right to own slaves was transformed into a right for all to be free in 19th century America. Unlike positivism, social constructivist notions of rights do not take a given set of rights as legitimate just because it exists. In fact, this view calls on people to critique, try to change, and, if driven, violate existing rights in order to transform what they see as wrong (or defend what they see as right). There is no final measure as to what rights are "best" or ultimately "should" exist, only human freedom to create normative rights, and then change our minds about what we've created.

Individuals are the players in this drama: each must choose whether to reproduce existing rights or try to transform them. But no individual can do this alone; you need to convince others to think differently if there is to be a cultural shift in the notion of what is or is not a right.

So in the health care debate the question should not be "is health care a right or a privilege?" Rather, it should be "do we want a right to health care, and if so, what does that right mean?" Does it mean absolutely equal treatment, a right to basic health care (with the wealthy able to buy better access), access to 'affordable' care, or what? Does it mean a single payer government system? Those are questions we are free to answer however we want as individuals, and if we can convince others to agree, then we can construct or transform existing systems of rights. Sometimes we'll make things worse by doing so, sometimes better. But I like to think that over time we humans generally learn from our mistakes.

7 Comments

A Better Health Care Alternative?

Senator-elect Scott Brown from Massachusetts is offering a different take on the health care issue. Rather than trying a one-size-fits-all approach, he argues that Massachusetts, which covers 98% of its citizens, should be put forth as a model. The current reform effort, which has yielded a bill that essentially tries to cut costs by limiting disbursements, doesn't deal with the real problem: the health care system as we know it is unsustainable. Anyone looking at the data and trends cannot deny that. A lot of people think that as long as they're covered things are OK — but that's shutting one's eyes to reality.

The health care compromise reached in the Senate did not satisfy anyone.  I didn’t like it, but argued last month that it was the best possible given the realities of Washington DC, and it would be better to pass something and tweak it later than let health care reform die.    Commentators on the left and right disagreed, albeit for different reasons.

Scott Brown's idea, however, is intriguing — and a sign of how he was able to defeat a Democrat in the state of Massachusetts. He isn't opposing health care reform, and he certainly isn't giving in to the "tea party" rhetoric of some on the far right. The fact that he holds up his own state's health care plan as a model is something many on the right probably find abhorrent. But he is challenging the Democrats' effort to create a single national plan.

I think Brown may be right. I believe health care should be a right in advanced industrialized countries. Every other advanced industrialized state has a system that covers everyone, provides quality care, and costs much less than ours. Moreover, our current system is unsustainable, as prices keep spiraling upward. Health insurance costs have gone up 120% since 1999 while average income has gone up only 34%. Projections are that costs could double again within the next six years. Moreover, many people have little or no coverage, depending on their position in society. As the boomers retire, and as the country's health weakens and addiction to prescription drugs grows, this will only get worse. We can't sustain our current system. But we can't agree on a national reform.
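As a rough back-of-the-envelope check on what those figures imply (a sketch, assuming the 120% rise covers roughly the ten years from 1999 to 2009), the implied annual growth rates follow from the compound-growth formula:

$$(1+r)^{10} = 2.20 \;\Rightarrow\; r \approx 8.2\%\ \text{per year (premiums)}$$

$$(1+r)^{10} = 1.34 \;\Rightarrow\; r \approx 3.0\%\ \text{per year (incomes)}$$

$$(1+r)^{6} = 2.00 \;\Rightarrow\; r \approx 12.2\%\ \text{per year (if costs double again in six years)}$$

Premiums compounding at nearly three times the rate of incomes is precisely the kind of gap that cannot continue indefinitely.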

Yet the states in Europe are smaller than the US.  They do not need a bureaucracy or a network the size of ours, nor do they have such a diversity in conditions as we do from Maine to California to Mississippi to Wyoming.  In Germany or France they can have one system.   One system for the entire US may be a  pipe dream.   And, of course, it also appears to be politically out of reach.

There is a clear message in the vote last night in Massachusetts. People want change, and they want the two parties to work together. Talk to most anyone from the Bay State, and you'll hear that Brown didn't win because he came on like Glenn Beck or Rush Limbaugh. Quite the contrary: he was complimentary to Obama, said he agreed we need health care reform, and staked out an independent stance, willing to cooperate with Democrats. That's what people want. They punished the GOP for riding roughshod over the Democrats during the Bush years; now they are punishing the Democrats for partisanship in Obama's first term. Regardless of which party is most to blame — the Republicans clearly didn't reach out to work with Obama — the public is telling both parties to work together. Solve our problems. Stop bickering like children.

Senator-elect Brown's idea of the federal government incentivizing initiatives to create health care reform at the state level might not only bring substantive reform, but could also bring Democrats and Republicans together. Any real reform needs to focus on improving our health (wellness programs, fighting obesity, etc.), cutting overuse of prescription drugs, cutting benefits for unnecessary operations, limiting payment for procedures that make little sense (such as extensive surgery of little benefit for an elderly patient), and controlling prices.

It's hard to imagine a federal program effectively achieving these goals. But effective reform could conceivably happen through state initiatives, provided that the federal government helps states have the resources to do this, most obviously by re-directing federal tax money back to the states. Moreover, states could develop different models, and we could compare how each model works — states could learn from each other.

So I retreat from the position that this plan is the best we can do — let’s pass it and tweak it — to an embrace of a new direction, one put forth by the Republican who won in Massachusetts yesterday.    Let’s shift from a big national universal reform, to an effort to create universal health care at the state level, with federal support.  Moreover, let’s make this the first big bi-partisan effort to help the country recover from over thirty years of political and economic mismanagement.

10 Comments

William James and Walter Lippmann

As my on line course ends and my research sabbatical begins, I delve into a project which is both exciting and intimidating. I made the decision not to stick with German foreign policy or the European Union for my next research project, and instead to focus on my current interest: why are we in the industrialized West in the predicament that we are in?

The predicament is multifaceted, and my research won’t touch all of it.  It includes militarism and warfare, degradation of the environment, hyper-materialism, high levels of depression and anxiety, consumerism, and a desire for ‘something for nothing,’ as if being in the world entitled us to luxury and security.   It shows itself everywhere from greed on Wall Street to desire for the government to protect us from everything.   It’s crass capitalism and stifling socialism.  It’s amusement park religion and cold atheism.

A core assumption I make is that reality, being a social construction, reflects how we think. Our thoughts guide our acts, our acts create our world. That is true at an individual level, and it is true at a societal level. Moreover, at each level there are shallow and deep ways to think about it. "Positive thinking" at the individual level is often shallow, used as a way to "make yourself wealthy" or "get what you want through creative visualization." There the effort isn't really to reflect on life, but to delude oneself. The deeper way is to question values and anxieties and find a genuinely positive outlook, one where values rather than external conditions provide meaning. At a cultural level the pop approach is politics — Obama will bring change, Bush will keep us safe, etc. The pop politics approach is superficial, and generally does not solve the deeper societal problems any more than happy thoughts will send money pouring our way (and even if they do, that money will probably only yield an unquenchable desire for more).

If our cultural "way of thinking" is warped, why is that so? One possible answer is to look at the general way of thinking we've embraced — enlightenment style rationality. Do we simply worship reason and science, yielding a materialist approach to life that defies true value reflection? Perhaps. But then again, the famous atheist Denis Diderot saw reason and rationality as having an ethical core — if we are responsible for our world, not God, then don't we have a responsibility to do what is necessary to have the world we want?

Perhaps it's reason combined with something else. Thinking theoretically about this, my first glance was at the Frankfurt School, and the work of Adorno and Horkheimer. They were German Jews who managed to escape before the war, and were horrified by what their country had embraced. How could the center of culture and enlightenment philosophy so wholeheartedly embrace the essence of anti-rationality, fascism? The enlightenment was supposed to be about liberation; why did it go wrong? Their argument was complex, but what I got from it was this: the enlightenment itself does not provide a true sense of what to value. There are no first principles that give you a clear answer, and thus the discourse is inherently open to interpretation. In that realm the powerful — those who control political parties, the media, and of course advertising — are able to manipulate people to think a particular way. Instead of liberation, there is a new form of enslavement, albeit, in capitalist societies at least, a kind of gilded cage.

Alongside Adorno and Horkheimer (along with a dose of neo-Freudian thought) I looked at the work of Antonio Gramsci, an Italian Communist jailed by Mussolini between the wars. Gramsci realized that Marx had made fundamental errors, most importantly his neglect of culture and politics. For Gramsci, the fascists won because they had created a 'hegemonic discourse' that defined reality in ways that seemed to be "the way things are" and "common sense" to the masses. In short, the discourse constructed their understanding of self-interest and meaning, thus making it seem natural for laborers to support the fascists, even as their incomes declined and the business class blossomed. I find this connection between the Frankfurt School and Gramsci relatively compelling. It trades the "hypodermic" model of media studies, which says the media simply injects ideas into the public, for a broader sense of a socially constructed discourse, wherein people have their understandings of reality subtly manipulated in order to serve the needs of the elite, governmental or business, capitalist or socialist.

Yet for a long time I could not figure out how to find a way out of this. Adorno ended up simply turning to the arts as his solace, while Gramsci's idea of creating an 'alternative discourse' seemed uncompelling. That would simply create a battle of discourses or narratives fought out in the political realm — people would remain manipulated, but they might be manipulated by side "X" rather than side "Y."

Lately I've realized that while the Europeans have defined the problem well, American philosophers may have the best take on the answer, developed in a uniquely American philosophy: pragmatism. In fact, pragmatism may be the American philosophy, sharing many roots with the more exotic Nietzschean perspectivism and Foucauldian post-modernism, but avoiding a slip into the abstract world of philosophical navel gazing and arguing over how many angels are dancing on the head of a pin. Instead, pragmatism provides a means of coping with the issues of values and social communication.

James was a physician turned philosopher, often mistrusted by the philosophical elite because he spoke in ways that everyday folk could understand. He thought philosophy was useless if it could only be understood by a small cadre of well-educated elites. His pragmatism and cosmopolitanism reflected a fundamentally open mind — he would look at every claim and statement fairly, assessing its worth and core values. Yet in so doing he did not give up his own capacity to hold positions with fierce conviction, standing on his beliefs with as much strength as a dogmatist. He could accept that he might be wrong, and yet still fight for what he held true.

Lippmann (whom I've written about before) was a student of James and others at Harvard, but found the ivory tower world a bit too boring, and went into the action-packed world of journalism. Lippmann was critical of propaganda and media manipulation (though he worked during the war making propaganda for the US government), and shared James' cosmopolitanism and belief in open-mindedness and open communication. That pragmatic approach evades the "enlightenment" desire to find a "true" system of ethics or values based on some kind of rational argument. Rather than seeking ultimate truth, one compares ideas, examines results, and makes a pragmatic choice about what works in the world.

However, that kind of process requires thought and is time consuming. Politics as a product for the masses, with campaigns run on slogans and competing narratives, works against the kind of thoughtful approach people like James and Lippmann promoted. Ideological jihad is the antithesis of pragmatism. So I feel like part of my research is to define how the media operates to construct discourses (hegemonic or competing), to show how this manipulates, and to critique it from a pragmatic perspective.

So at least I feel like the research has theoretical (if not yet methodological) direction.  That direction may change, but for now, it’s underway…

3 Comments

Teaching and Change

Those who read this blog regularly will have noticed that I have not been very prolific in my posts so far this month. I've been even worse at following other people's blogs. There is a reason for that – I'm teaching a winter term course on line (German and Italian Politics), a purely web-based course. It has been a work-intensive experience, but one which is going well, and something I'll probably do a lot of in the future. All this has me thinking about the way technology has changed education over the years.

My benchmark is my own undergraduate career.   In some ways I was at the end of the ‘classic era’ of post-war education, graduating from Augustana College in Sioux Falls in 1982, just before technology started to change universities in a fundamental manner.

On an average day I'd get up, shower, and with a few friends from the dorm floor (the dorms were mixed gender, but floors were either male or female) head to breakfast at the cafeteria. If I had an 8:00 class I could literally time it so that my alarm went off at 7:53, and I'd brush my teeth, pee (those two at the same time — multi-tasking), get dressed and be in class on time. Our cafeteria was not that much different than a high school cafeteria. There were two lines; for breakfast you'd get toast, cereal, maybe pancakes, eggs and sausage. To drink there was orange juice and milk. Lunch and dinner usually had a choice of two entrées (perhaps a pasta or some chicken dish) and there was no going back for seconds. That's changed dramatically — now students have salad bars, pizza bars, sandwich centers, main courses, ice cream, self-serve soda, and can graze on whatever they want. We could only fantasize about such college cuisine — yet I think students now complain just as much as we did. Augustana was served by the ARA company, which merged with another to form Aramark, which serves UMF.

In class the main "technologies" involved either chalk on a board or an overhead projector. I hated overheads. When I started teaching they were still widely used, but I refused. They have since all but disappeared. I do not miss them. Every once in a while a film would be shown, usually from a reel on a projector. VCRs existed, but in those early years they cost nearly $1000. Televisions were not used in classrooms, though I still remember going to the dorm lobby on Thursdays as everyone crowded around to watch the new classic — Mork and Mindy. Nanu Nanu!

Research had to be done in the library. It was important to know how to use a few tools — the card catalogue, which involved rows of drawers of index cards, and the Readers' Guide to Periodical Literature. The library was short on journals (it was not a wealthy private school), so we didn't use journal articles very much. We had a few — I remember Foreign Affairs — but most of the research was through books. That limited what you could do. Interlibrary loan was not around, and though personal computers existed, they didn't even have rudimentary word processing programs yet, let alone anything useful for research.

Still, we wrote papers. Revising a paper was a major affair. You could not simply go in and edit and make changes like one can with Word. I would type a draft (not 'print out' — we didn't have access to 'printers'), and then make major revisions by hand on that draft. Any time I revised anything I'd have to type the whole paper over. If the change was small, maybe I could retype just the page, or use white-out. Sometimes I'd go to my dad's office and use his secretary's IBM Selectric (you could use a key to erase the previous letter). But it was time consuming. Then if I didn't pay attention and went past the pencil mark indicating I was an inch from the bottom, I'd have to retype the page. Luckily, I was a fast typist (still am), but for some the effort meant the first draft would be the last draft; revising was too difficult.

Exams and handouts were all mimeographed, or "ditto'd." The 'ditto machine' was hand cranked, and would make copies quickly and cheaply. Handouts were sometimes called 'dittos.' Xeroxes (that's what photocopies were called, thanks to the ubiquitous nature of the Xerox company at the time) were far too expensive. The library had a xerox machine that charged ten cents a copy — payable in dimes or nickels, of course. If I was going to do some research, I'd often have to stop by the bank and get a couple rolls of dimes.

Teachers had little “grade books” where grades were marked.   Changes in schedules or assignments were announced in class — if you weren’t there, you missed it.    No internet, no texting, no Facebook, no e-mail, no cell phones (though each dorm room did have its own phone).

Nowadays just about every classroom is "smart" – you can show DVDs, hook up your computer, do PowerPoint presentations (with video and web access), and look things up right away if a student asks a question the teacher doesn't have the answer for. E-mail notices keep students connected to professors, and often replace the old fashioned visit to the office to ask about an assignment. Research involves on line databases, interlibrary loan, and simple web research. A student at a small rural campus has access to information unimaginable in the past. Research I traveled to Germany to do as a graduate student could be done easily now by an undergraduate on line.

On line courses are the new trend, growing far beyond the rather gimmicky market-driven approach that "Phoenix University" had in the 90s. Using software like "Blackboard" (a term students might soon not comprehend), students can submit assignments, get feedback, have grades posted, participate in discussions, and even view PowerPoints, video recordings, podcasts and the like. You can use chat sessions, share websites, and engage the material and each other without leaving home. Students who work can participate around their work schedules.

A few faculty don't like this — especially since a public liberal arts university like ours has built its reputation on close contact between students and faculty, personal relationships that seem endangered by technology. After doing an on line course I say to that: rubbish. In some ways, I think students have engaged the content of the course more intensely than in a normal course — they have to, since they need to post daily work that gets an almost immediate grade in order to pass. They can't float through and then cram for an exam. The discussion board will probably hit nearly 800 posts for three weeks — that's 40 posts a student on average, talking about course content. I'm in constant contact by e-mail or through Blackboard with students about research and questions concerning the course. I read much more work than I do in a normal semester, because it's all based on writing. I feel like I've gotten to know new students and have had quality interactions.

Watching the rapid pace of technology-driven change in the field of higher education is exciting. In the 90s some thought web based courses would drive universities out of business. Few believe that any more, especially since quality institutions are integrating web courses into their own offerings. Students are now able to take courses from different institutions (multi-campus students), and while more information is available, some students still simply google what they need to know at the moment. Like all technology, it's a mistake to either condemn it as bad or praise it as good. Technology provides tools, and if you use tools correctly, the results are good.

3 Comments