Archive for June, 2011
Social Welfare Programs Should Liberate
Modern social welfare programs began under Bismarck’s conservative German government as a means of undercutting the growing socialist movement. If workers saw that the state would help alleviate poverty and provide needed services, then the workers would not support Marxism. Moreover, conservatives tend to view the state as an “organic entity,” a collective bound together as a community. That means that it is in the interest of the state to make sure that people aren’t suffering or being exploited.
Other reasons for social welfare programs include ending poverty and suffering as an end in and of itself. This was behind Johnson’s “Great Society” programs, most of which were actually implemented in the Nixon and Ford administrations. In Europe, the left and right reached the great compromise, whereby the left would accept market capitalism in exchange for the right accepting that the state guarantee health care, pensions and a safety net. This led to unprecedented peace and prosperity for Europe, settling past ideological battles between left and right.
However, as demographic change makes most of these systems unviable in the long run, and high debt forces reconsideration of how governments spend money, it is time to rethink the purpose behind social welfare spending.
Bismarck’s goal of stability remains. Societies that see vast gaps between the rich and poor tend towards either authoritarianism (as the rich want to protect their share) or revolt (as the poor get angry about class differences). The “great compromise” was a brilliant solution; putting that at risk would threaten the core stability of western civilization.
Goals of ending poverty or equalizing wealth are suspect, in part because they are too vaguely defined. You could end poverty by simply transferring wealth to the poor, but what good does that actually do for the people themselves? It gives them more money, and may help them feed their families, but the goal is at too high a level of analysis. We should focus on social welfare programs for the sake of the people who are on them.
In the industrialized world people are generally responsible for their success in life. It’s a lesson I try to teach my children and my students: don’t blame others for the world you create for yourself; take responsibility. You can’t choose your circumstances, but you can take action and make choices to change them. Whining about injustice only increases the total whine volume. Claim your life! It’s yours to make; if you’re in college you have every opportunity to succeed, so take control! That is a liberating experience: it frees you from the shackles of low expectations and low self-esteem.
But what happens when we just give money to people? I’m reminded of the scene in Syriana where the oil tycoon talks about the money he’s made and how it’ll “probably ruin my children.” When you look at the children of the very wealthy, they have as many if not more problems than others, despite the wealth. When young athletes or film stars suddenly get large amounts of money, it often creates more problems for them than solutions. Some can handle it, many can’t. The reason: money itself does not help a person understand how to live life.
Many of the poor are stuck in a situation where they do not believe they can take control of their lives; they don’t see opportunities, and they haven’t had the chance to handle the risks through which we build self-esteem. If you just give them money, there is a real danger they’ll become addicts. Not drug addicts, but rather addicted to ‘free money.’ That will feed into a sense of victimization and entitlement. Rather than taking control of their lives, they’ll lose control of their lives and teach those lessons to their children, who will start out psychologically unprepared for the demands of the real world.
So I would restate the goal of social welfare programs as being one of liberation. I do not mean this in a Marxian sense of ending exploitation. Rather, a person should be able to develop the confidence to grab opportunities and take control of his or her life. It should liberate a person to rise out of their circumstances, to provide positive role models to their children and community, and ultimately create a sustainable growing economy in communities once suffering economic stagnation.
Unlike some on the right, who take the approach that “if you cut the money they’ll be forced to pull themselves up by their bootstraps,” I do not believe just ending social welfare programs can work, nor do I think private donations would adequately do the trick. So I reject the dichotomy that says “either you give away money to the poor or you don’t.” Rather, we have to figure out ways to design a system that creates opportunities, works with communities, and helps people empower themselves.
Education must be part of this (and I think access to affordable health care is necessary too). People don’t automatically have confidence and self-esteem. Self-esteem cannot be gained just by being praised — it comes from learning one has the capacity to overcome obstacles. In fact, I’d say you can’t really gain self-esteem unless you risk failure and even have to overcome failure. In that sense, education has to be combined with opportunity.
But this needs to be more than job training or even workfare. To really function and become sustainable, opportunity has to connect with community. In that sense one of the most important roles is that of a community organizer, someone who can come in and bring a community together around opportunities for growth. Receiving any social welfare help should be linked to participation in some kind of community venture.
As communities arise, they will provide the opportunities and feedback for people to build confidence, have higher self- and other-expectations, and develop real self-esteem. They will take pride in what they build, and ultimately that will lead to them taking control of their own individual lives and recognizing that they have the power to make choices that will make it much less likely they’ll need assistance. In a recession no one is immune from some hard times, but ultimately the key not only to cutting social welfare spending but also regaining economic momentum and growth is to have people in society making good choices and wanting to be productive.
Community and opportunity based social welfare programs could succeed where bureaucratic programs fail. A community organizer in the field working with people is far more likely to help than a welfare caseworker sitting in an office asking questions and making sure the proper forms are filled out. And given the economic and budgetary crunch, now is the time to reassess our approach.
Is War Natural?
Posted by Scott Erb in Children, Children and war, Education, Political Science, War, World Affairs on June 29, 2011
My second online course is “War and Peace,” looking at theories of why people go to war, and how peace can be built. I am on principle opposed to military action and war in most cases. The costs of war in human lives, social stability, and the psychological state of both soldiers and the populations involved are immense. Most of the time wars could be avoided through better communication, diplomacy and clear signals of intent. I’m not foolish enough to think humanity is at a point where war can simply be abolished — but I also don’t think war is natural.
My eight-year-old son is right now fascinated by war. He draws detailed pictures of various weapons and scenes, including a soldier with some kind of missile launcher destroying the Eiffel Tower saying “USA Rocks!” While drawing it he asked me what the German word for their army was, so I told him “Bundeswehr.” He wrote that in front of the guys defending the Eiffel Tower. (The Eurocorps, perhaps?) He later drew the same kind of scene with Big Ben, with the clock falling on the defending forces below.
I have friends who would be shocked if their children drew those kinds of pictures, but he’s eight — and he does know the difference between imagination and reality. One time when Ryan showed me a picture of some dead soldiers I said, “gee, I bet their dads and moms aren’t happy.” He stopped a second and said, “Dad, it’s just a picture, it’s imaginary, not real.” Anyway, I’m not going to stifle his creativity because of adult ideas of political correctness. And it was nice that both the Eiffel Tower and Big Ben were in Cars 2 that we saw this weekend.
So, besides the fact that I’m not an overly protective or controlling father, what does it mean that my son gets enamored with the idea of war and weapons? I think culturally it shows how we learn to see war, weaponry and conflict. It is cool, exciting, and one can have victory! The bad guys are defeated. Death is sanitary. “It’s imaginary, not real.” The ideals of honor, heroism and strength become part of who we are. It infiltrates video games, television shows and movies.
First, an aside to those who think I should try to protect my son from that culture: I understand the concern, but disagree with trying to shield children too much. Parents who think they can control the cultural inputs and produce a child who shares their exact values are naive. The more a child is protected and forced to follow paths that parents think are politically/socially/religiously correct, the more likely it is that the child will rebel or be unable to cope with the cultural forces that he or she will inevitably face. Better to let the child learn the culture, but reinforce lessons along the way. For me that means talking a bit about the drawings — acknowledging how cool it looks, how “awesome” the missile launcher is, and how gross the pool of red blood looks. But then at other times talking about the difference between real and imaginary. I actually have surprisingly “grown up” conversations with Ryan about war, religion, and life. In order not to be hypnotized by the culture, one has to be able to navigate it.
Yet the danger is that the glorification of war will desensitize children as they grow, and war will be seen as a big video game, covered by CNN, abstracted to the point that the spectators have no clue of what the participants in war endure — either the civilians caught in the cross fire “over there,” or the soldiers who have to deal with the reality of death and destruction around them. In such a case, the cultural messages of war as honorable, cool, a way of showing strength, and an abstract struggle of good vs. evil will overwhelm that part of war we don’t see — the grotesque, sickening, revolting and tragically sad destruction of families, lives and even cultures.
Is war natural? I think not. Conflict is natural. Self-defense is natural. Anger is natural. Aggression is natural. Sometimes these things turn into actual fights, but rarely does a participant die.
War is different. War is a social process, and in fact a social construct. A collective group (tribe, state, nation) chooses war against another group as an abstraction. Consider: the most poignant and successful anti-war book ever was All Quiet on the Western Front. It had no overt anti-war message; it simply described WWI as it was for German soldiers on the front. War was not glorious or heroic, but mundane, ugly and sickening. The British hated the book because it portrayed Germans as being just as human and likable as the British. War requires you to imagine the other as having evil traits, as different from you — they don’t value life, they hate freedom. In order to justify killing them, we latch all sorts of absurdities onto the collective “other.” The Nazis and German militarists hated the book because it portrayed the soldiers as normal, flawed, confused and often afraid humans — not the noble heroes the military was supposed to be. War requires myth to be embraced; the reality of war revolts the senses.
War as we might define it (two collective groups fighting) probably began about the time people started farming and created the notion of private property. The idea of private property is non-existent in many hunter-gatherer cultures — but once you farm you have to protect the land in order to get the benefit of your efforts. That means you protect the property.
Still, the formation of collective units is natural. Humans are social creatures, and throughout most of our history we have defined ourselves more as part of a group than as distinct individuals. Individualism is a western construct — one that is more myth than reality. So in that sense protection of and competition for resources by groups can be seen as a natural result of human progress in a world of scarcity.
So in regions where people truly lack, and there is stiff competition for scarce resources, war may indeed be a natural manifestation of the human struggle to survive. Yet in places where people have enough to survive, that doesn’t cut it. In cases where war is about religion, ethnicity, ideology, conquest for the sake of glory, expansion, social Darwinism or even to ‘spread democracy,’ war is a human construct, made possible by how we abstract it into something far different from what its reality entails. Calling it ‘natural’ and ‘omnipresent in human history’ rationalizes that kind of approach. How can one condemn the inevitable?
But war is rare. Most states settle all their disputes peacefully; only 2% of the population actually fights in a war. Wars make the news because they are an anomaly amid most of what’s happening in the world. Moreover, calling war a social construct does not mean we can easily choose to make it go away. All traditions, cultures, and rituals are social constructs. Yet once constructed people tend to reproduce them, and social reality becomes resilient. It is difficult to, say, end slavery or racism, or to gain equal rights for women. Those changes required changing culturally shared beliefs, and people usually hold on to their beliefs, so change can take generations.
So most war may not be natural, but that doesn’t make it easy to overcome or something we don’t have to try to understand, learn about, deal with and at times experience. My hope in this class is that by learning about war and peace, students are able to see international conflict in a realistic light. That means both seeing through the myths of glory, honor and heroism, and also understanding that naive chants of “no more war” are unrealistic. War may be necessary at times, but if one supports any given war, one should do so understanding what war really is, with a cold sober appreciation of the immense costs and uncertainties it creates.
Reality
Posted by Scott Erb in Culture, Environment, Ethics, Philosophy, Psychology, Science, Spirituality, Values on June 22, 2011
In the first comment in response to my last post Modestypress wrote: “I’ve decided to live life as if the world I sense is “real.” I don’t see any point for doing otherwise.”
That got me thinking. I did not mean to imply the world isn’t real. Rather, is reality constituted by each of us as a subject in a world populated with objects? If so, then subjectivity is a unique personal experience. We can assume that other humans are also subjects (and ethically we tend to believe we should treat them as such), but the rest of reality consists of objects of various sorts.
If we have a view of expanded subjectivity, then the nature of reality is different. We are connected at some level with that which we experience. Rather than being discrete entities navigating an external reality, we are entities enmeshed in experience, part of a deeper unity.
Such a possibility actually gets support from cutting-edge science. The most obvious example is how entangled particles can affect each other across vast distances instantaneously. This seems impossible: the fastest that information should be able to travel from one particle to another is the speed of light. (To read more on the science behind it, check out the Wikipedia articles on quantum entanglement and the principle of locality.)
The only way that such a result makes sense is if at some level the two particles are connected. Yet they are not connected in space-time. If they are connected it is either through something outside space-time which we cannot fathom, or space-time itself is not populated by discrete separate objects but has a deep underlying unity.
While this meshes well with many eastern religions, it also captures neo-platonic thought which heavily influenced Augustine and the early church. The idea that reality is a unified whole containing diverse perspectives and attributes is not that hard to imagine. I experience my body as me, an entity comprised of different physical attributes. I can sit in nature and imagine myself part of the entire scene in much the same way; poetry explores this kind of imagined connection quite often.
So what would it mean if reality actually was unified? What would it mean if the self isn’t only the thinking mind inhabiting a body, but actually is connected to and a part of all we experience?
First, everything we do to others (whether living or not) we would be doing to a part of ourselves. We would at some level be connected to all the pain and joy that exist in the world; if we cause pain or joy, we also would at some level receive it.
Death would have a new meaning. Rather than being the annihilation of the self, with the only hope of continued identity being either a transcendent supreme being or the possibility that a soul could be reincarnated into different bodies, death would simply be the cessation of one perspective of experience. That happens all the time. The person I was 20 years ago no longer exists in the sense that the perspective of experience I had then has been transformed into something completely different. Life is constantly changing perspectives.
If reality is unified, then no perspective has a privileged position or permanence. Death may be less an ending than a change of focus — rather than experiencing the world as a human living at a certain period in history, my perspective could shift, perhaps mingling with other perspectives or taking on a new manner of experience. Death may be the equivalent of finishing one book and starting another — or changing the channel on a TV.
One’s perspective on life would alter as well. One might better know oneself by looking at the world one inhabits. What kind of reality do I experience, and why is it that I have chosen (or have been drawn to) this type of experience? What does the world around me say about who I am? Usually identity is separate from the external world; here it would be integrated. How we look at luck, coincidence and chance would change completely. Life would be a maze of interrelated coincidences, full of symbolic meaning. Rather than seeing the world as a cold harsh stage upon which one lives a short, often difficult existence, it would be rife with opportunities and possibilities that we draw to ourselves in some way.
Success and failure would alter form completely. Neither would be completely real, and certainly not permanent or all that important in the grand scheme of things. Even poverty, wealth, exploitation and violence would shift meaning – if there is unity, the “self” experiences everything at some level. The idea I’m living a comfortable life is just a focus of perspective at this moment. At a deeper level all experience is shared.
Most people would simply dismiss all this as meaningless speculation. We have jobs to do, families to raise, and the reality we experience runs by particular rules we have to navigate. However, I would argue that thinking about reality from a new perspective might actually have some beneficial consequences.
It could certainly mean letting go of a lot of stress and anxiety — just entertaining the thought that the world is not cold and cruel but rather purposeful and full of opportunity alters one’s mood. It also could cause one to consider different goals; if this moment of experiencing life through this perspective is only a partial taste of a greater reality, then striving for material success for the sake of material success alone starts to seem pointless.
The mind would shift to looking for clues in relationships and life activities that might hint at how one can enrich one’s experience at a deeper level. The world as a whole would be more important; the day-to-day struggles and dilemmas could seem more trivial. Fear of death would give way to acceptance of transitions. Hatred would become irrational, since hatred of the other would be hatred of a part of the self. Love would be the ultimate truth, in that it would entail the connection between apparent-self and apparent-other.
Human history contains many versions of reality that seemed absolutely natural to those living within them. Slavery, the superiority of one gender over another, sacrifices to gods, tribal customs, religious faith, and secular rationalism are all ways humans have conceptualized and thus interpreted reality and experience. The idea that what seems natural at this point in time is based on a misunderstanding of reality certainly is feasible.
If we are willing to try out different ways of conceiving experience and reality we can avoid being trapped in the mode of thinking dominant in our particular culture. To me, that’s liberating, and gives me some power over how I choose to interpret my experience. Rather than accepting a world view created by others, I can use reason and reflection — the heart and the head — to determine what I believe to be true, and choose how I want to live my life. That is real freedom.
The Illusion of Objectivity?
Posted by Scott Erb in Dreams, Philosophy, Psychology, Science and philosophy, Spirituality on June 21, 2011
The modern mind differentiates itself from the pre-modern by making a stark distinction between the object and the self. Rene Descartes takes this to its logical extreme by recognizing that consciousness is the root of the subjective self. The body and all the senses could be deceptive, but there is something that thinks – cogito ergo sum.
Yet consciousness is experience. Whether or not the world is as our senses indicate, we experience sensations of taste, touch, smell, sight and sound. Those sensations constitute experience; they are all that exists for the self. From those sensations we make sense of reality, drawing conclusions, testing hypotheses and trying to figure out why the world is as it is.
Before modernism the world of the senses was part of the self. Instead of a stark distinction between object and subject, the notion of subjectivity was expanded to include objects. Existence was rife with symbols, consciousness pervaded all of what one sensed, the self mingled with all experience. Modernism broke this link, and separated the self from all of what was sensed. The self — or consciousness — thus stood outside of the rest of the world. That world contains other conscious selves, no longer connected through shared subjectivity, but disconnected from each other as separate thinking beings.
From there sensory experience was categorized, studied and tested. Since other humans appear to be as we are we assume that they have the same kind of conscious experiences as we do. Thus we are in a world populated with “thinking machines,” or bodies that function in service of a mind that itself may simply be a myriad of material chemical reactions — an extremely sophisticated ‘natural’ computer.
Other entities appear to have sensual experience, but apparently do not have the capacity to reason and reflect. These creatures are animal life forms. We know that animals can reason in a limited manner, but the lack of linguistic capacity and reflection means their consciousness is different. Other life forms do not seem to experience the world, they just grow — plants lack consciousness. Below that are non-life forms such as rocks, soil and inert matter. Energy (and plasmas like fire) has a different status, though we now realize that matter and energy are equivalent — matter can be converted into energy.
All this is accepted because it works in the world. We can analyze reality as if we are discrete minds experiencing a reality we somehow find ourselves in for no apparent reason. Yet this is an odd conclusion. If our minds consist of only chemical and electric reactions, with DNA shaping our personality and capabilities, then there is really no separate mind able to comprehend experience. Where does the self end and the object begin?
The self is our reflective thinking capacity. Yet where is it? Is it the brain? But the brain is made up of objects – cells, blood, chemicals and the like. What mix of chemicals, electric nerve impulses and the like constitute the self at any time?
At this level the distinction between object and self breaks down. There are lots of objects that make up the brain, but the self emerges from it in some indistinguishable form. To be sure, the self cannot live without the body (especially not the brain). It can lose some parts of the body, but if the body cannot function at a fundamental level the self disappears — the body dies and the subjectivity is lost.
But what if the modern take on subjectivity is wrong? What if it is not the case that a discrete subjective self is in a world of objects, trying to make sense of the objective reality in which the self finds itself? What if objectivity is an illusion, what if the pre-modern view of expanded subjectivity is accurate?
First, it is only habit and bias that cause some to dismiss that possibility. Since the self is constituted only by sensual experiences the idea that the self is separate from those experiences (or the source of those experiences) is pure conjecture. In dream states, for instance, no distinction is made between what one takes within the dream as sensual experience (you can touch, see, hear and taste in dreams) and the self. The dream is an illusion of objective reality during the sleep state.
There is no logical reason why waking reality could not be seen in a similar light. It is more consistent and less malleable than dream reality, but that simply means it functions differently, not that one is pure subjectivity and the other is a discrete subject adrift in a world of objects. Moreover, some theories in modern physics and brain research point toward the same conclusion. The holographic principle posits that reality can be described as a kind of hologram, and some researchers have argued that the brain stores and processes information much as a hologram does.
The fact that it is possible that the stark subject-object distinction is an illusion doesn’t mean we should dispense with it. Practically, it has proven very useful, allowing us to manipulate the world of objects to achieve numerous goals. Pragmatically, accepting the object/subject differences makes sense.
But does that practical manipulation of reality require that we posit a subject-object dichotomy? Probably not. It does allow us to remove ethical consideration of how we deal with the world because objects that are not conscious (or in the case of animals possess lower consciousness) do not require ethical treatment. Killing a tree, killing ants and forging steel are all acts with no ethical content — we’re dealing with non-conscious objects.
If objectivity is an illusion and the self is mingled with experience, ethical issues become more complex. It still may not be wrong to poison ants or build skyscrapers, but the act of doing so could affect the subject in some non-trivial way. Moreover, how we deal with others would change. With the subject-object distinction we can assume that our choices are individual and thus have no impact on others unless there is some objective trace (e.g., we hit another person, steal their money, or something like that). Without the distinction then humans are not discrete separate selves, but may indeed be linked at some level with the rest of the world of experience. In that case, ethical systems built on the idea of discrete individuals interacting through choice have to be rethought.
Of course, one could argue that the experience of reality as objective is enough to simply make that a working assumption. Samuel Johnson used a similar argument against Bishop Berkeley (who denied that matter exists independently of perception) when he kicked a stone and declared, “I refute it thus.” The stone moved when kicked; the world operates as a world of objects. Yet as one learns in philosophy class, that doesn’t refute Berkeley or prove anything. Johnson simply gave a pragmatic reason for accepting the subject-object distinction.
An expanded view of subjectivity seems odd to the modern mind, even if that’s the “natural” state of the human mind in nature. Seeming odd doesn’t make it wrong, however. It also wouldn’t mean the modern mind is inferior to the pre-modern mind, only that we may have one point wrong and we should consider the implications for how we live and understand the world. This also could be an alternative to cold positivism and meaningless relativistic skepticism.
Row, row your boat, gently down the stream, merrily, merrily, merrily, merrily, life is but a dream…
Giving in to Voice Recognition
For the last 38 years I’ve kept a journal. It started back when I was 13, paused when I hit 15, and started “for good” at age 16. Up through 1985 it was pen and paper; then I started to use word processing.
From 1985 to 1989 I used a word processing program called “Paper Clip” with my Commodore 64, which I had hooked up to both a printer and an old black and white TV as a monitor. I still have that old computer, disk drive and “Paper Clip” program. I believe I have the old floppy disks (the 5¼ inch variety), but I have no clue if it would be possible to transfer those files to Word.
Last summer I started the task of typing up my old journals. I am a fast typist. In 7th grade I took typing and had homeroom in the typing room, so I practiced a lot. Back at Patrick Henry Junior High in Sioux Falls in 1973 I was one of only three guys in my typing class. The girls outnumbered us 10 to 1 because typing just wasn’t a skill boys were expected to learn. Most guys took shop courses (woodworking, metals, drafting, etc.) while the girls took things like typing. You see, guys would likely end up in an office with a dictating machine, while the girls would be the secretaries who would have to type up the work. Why would a boy want to develop typing skills? A male secretary would be, well, weird.
I’d love to say I was enlightened enough to oppose sexual discrimination back in the 8th grade, but the truth is I loved to write even back then. I wanted to type. I was going to be a journalist, preferably a sports writer. My hero was Sid Hartman, an insider for the Minneapolis Tribune (now the Star Tribune) who wrote daily columns about the goings on inside the Twins, Vikings, Gophers and North Stars (the hockey team of that era) club houses. I could imagine myself following sports for a living. So I learned to type, and I was one of the fastest in the class — my typing teacher was thrilled to have a boy learn to type and be so good! In junior high, otherwise lost years for me academically, typing was my best subject.
I remember writing about the Ali-Foreman fight, various football games, and handing my “column” (I’d get to school early so I could write) to friends, having them comment and often argue about my effusive praise of Fran Tarkenton or my prediction that Bert Blyleven would be a superstar. I learned and wrote on an Underwood manual typewriter, and still remember those drills to strengthen the little fingers, slapping the carriage return bar, and making sure that I didn’t type past the little mark representing the one inch bottom margin.
By college I had my own Royal Electric typewriter (I still have it, though I have no idea if I could get a ribbon for it), and for long papers I would go into my dad’s office to use his secretary’s IBM Selectric. That was a sweet machine, and I fantasized about owning a Selectric. It had a backspace button that automatically whited out a mistake — and if you backspaced ten or so times, it would remember which letters to white out. It was sleek and easy to type on. I hit 100 WPM with no errors at one point.
Of course, I never bought a Selectric. Shortly after college the technology revolution brought the PC age, and at age 25 I got my Commodore 64. That’s also when I shifted to typing my journals.
Last summer I started retyping my old journals, getting 1973 to 1975 complete. But as I look at the stacks of paper representing journals between 1975 and 1989, and consider the aches and pains of constant typing, I realize that I lack the time to type them all up quickly. I’m also not sure my hands and wrists could take it. So today I went online and ordered voice recognition software.
The typist in me has been resisting that, the same way I resist texting. I don’t have that many skills in life, but typing is one of them! To be sure, most of my writing will still be done via keyboard. I think through my fingers. To me, typing is the process of writing; I don’t do well with pen and paper, or by talking it out. I could never create blog entries with voice recognition software; my fingers on the keyboard are integral to the creative process. But copying already written material? Yeah, I can see just reading it aloud.
I also have “dream journals” to copy. These were made from 1986 to 1990 and contain thousands of dreams. I would become what I called “dream aware” (I’ve since learned the official term is lucid dreaming) and then do experiments, waking up to jot down the ideas I’d type up (on my Commodore 64) the next morning.
I’m not sure how well voice recognition will work — whether I’ll use it more often than expected, or end up hating it. I’ll blog the result when that happens. And who knows — maybe I’ll balance giving in to this new technology with a purchase down the line of an old Underwood manual typewriter. I’m sure my fingers (especially the pinkies) have gotten lazy and soft being used to these sensitive PC keyboards. My fingers could use a good workout!
Changing Habits
My garage is full of chemicals. I can kill wasps, ants, and weeds. I can kill weeds but keep the grass alive. I can weed and feed, I can fertilize, I can clean. I can clean grease, stains, and various surfaces of my car.
I have spent most of my life under the illusion that most of these materials sold to make life easier are relatively safe. I know that you shouldn’t mix various cleaning materials, that with some you should wear gloves (something I’ve generally neglected to do), and that too much could be bad. Yet certainly the companies that make all these things for home and garden have tested them out. And even if the market couldn’t prevent dangerous items from being made available, government regulations must focus on safety.
I’ve been using Speed Stick deodorant, Nivea shaving cream and moisturizer, and Pantene shampoo and conditioner, and the kids have enjoyed yummy smelling colorful soaps and plastic tub toys. We’ve been cooking with nonstick pans, and drinking sodas from plastic bottles, even if they’ve been sitting in the car on hot days. It’s just life. We don’t smoke, we try to have healthy habits, but plastics, chemicals and additives are ubiquitous. The idea that I should have a lush green lawn not marred by clover and crabgrass seems normal. Everybody, save a few old hippies, lives this way.
I’ve now come to the realization that how we live with chemicals is analogous to how the Romans lived with lead poisoning. We’re poisoning ourselves and our children (look how autism and ADHD rates have skyrocketed) by injecting massive amounts of undertested chemicals into every aspect of our lives. I’m not sure how much poison we’re getting. People still live long, but chemicals seem to play a role in making us fatter and creating expensive health problems. Rates of cancer are up, as are a variety of other health problems. Moreover, the generation now in their 20s and 30s is far more affected by these chemicals than people my age, since widespread usage really got going in the 80s and 90s. We may not know the real impact for quite some time.
This creates a perfect storm. There is a danger that seems obvious, but it can’t be verified with certainty, and the exact impact is unknown. Moreover, studies can always be criticized (and a very well oiled chemical industry lobbying machine responds to everything) and it may be decades before we know for sure just what the impact of all this is. Humans tend to ignore issues like that — if the problems aren’t obvious, why bother?
That’s been me for years. I’ve known the chemicals in our lives are dangerous and undertested. I’ve had suspicions that my use of them was probably not all that wise. But those have been fleeting thoughts at the back of my head. They haven’t been strong enough to get me to actually dig into the science and think about changing my ways. There are classes to teach, kids to take care of, things to do. So it’s grab what’s at Walmart and, whether in cleaning or lawn care, take the easy route.
In May I read What’s Gotten Into Us by McKay Jenkins, a book that gave an excellent and well supported overview of the scope of the problem, showing clearly how little we know, how studies are often hidden and attacked, and just how powerful the chemical lobby is in the US. In the EU there are far stricter regulations (so I’m going to keep using my made-in-Germany Nivea skin care products); here, the burden of proof is not on the chemical companies to prove their products are safe, but on others to give definitive proof that there are dangers.
My wife read the book as well, and was convinced. She replaced most of our household cleaning and personal hygiene materials with Seventh Generation, Tom’s of Maine, or Burt’s Bees. I went along with it, figuring that at the very least this is the kind of thing that companies will pay attention to — if the market shifted towards low chemical alternatives, businesses would have to follow. But I was still skeptical. The thing about books with a mission — and Jenkins clearly is convinced of his argument and has a desire to open peoples’ eyes — is that they are prone to overstatement. The book cited science, but the author is not a scientist. So I’ve been investigating further.
But the more I read, the more convinced I am that not only is there a lot of uncertainty about what these chemicals are doing to us, but that the massive increase in chemical use and our exposure, especially for children, is by definition extremely risky. Chemicals are powerful; they have side effects that often aren’t understood. The websites and blogs criticizing those concerned about chemicals tend to have weak arguments. They either deride “anti-chemical” folk as too alarmist, or accuse them of blaming things on chemicals that might be caused by something else.
Both of those points may be true, but that doesn’t lead to the conclusion that chemicals aren’t a problem. Same for the statement that ‘just because it’s not natural doesn’t make it bad.’ That’s true. But it doesn’t mean they are good or harmless either. Defenders of chemical usage rest on the arguments that: a) they are beneficial in helping us achieve our goals; and b) studies haven’t definitively shown the specific harm being done. Most of the evidence against chemicals is circumstantial.
OK. Consider this chart:
This rapid and alarming rise in obesity corresponds to the rise of chemical additives in food, many of which affect hormones and other aspects of the anatomy that influence fat retention. Is this circumstantial? Yes. The causes of increased obesity are many, and despite the many links connecting the two (see The Body Restoration Plan by Dr. Paula Baillie-Hamilton for a good argument on this issue), definitive proof remains elusive.
The same goes for the rapidly growing rates of autism (and other neurological disorders) in children, increases in depression and psychological problems in adults, hypersensitivity issues, increased allergies, and increased incidences of cancer (even if medical science is much better at curing it). If I demanded total proof that these chemicals are harmful before deciding not to use them, I could find arguments that cast doubt on every claim of harm made.
But my health is at stake. The lives and health of my children are at stake. And the evidence that harm is likely is very persuasive, even if absolute proof is elusive (it was also elusive for cigarette smoking for decades, where another powerful lobby tried to fight making such a connection). Simply the fact that chemical usage has risen so dramatically is enough to make me realize it is irrational for me not to be concerned — throw that many new, artificial, undertested chemicals into our bodies and homes, and it’s clear no one knows exactly what the result will be.
So my habits are changing. Some changes are gradual, others are instantaneous. I’ll not try to get a perfect lawn, so no more weed and feed. I will finish my Speed Stick before using the natural (no aluminum) deodorant. I’ll try to hit the Farmers’ Market more frequently and buy organic/local as much as possible, but now and then we’ll still get some fast food. Most importantly I’m going to start thinking about these things, investigating the products we use and the food we eat. Big agribusiness doesn’t care about us. Regulators lack staff and the politicians listen to the lobbyists of big money more than they listen to scientists or citizens. We have to educate ourselves and hope that enough people see the danger that we can have an impact on the market.
Chemicals are everywhere; no matter how much I change at home, our world is defined by chemical usage at work, by local governments, stores, and other people. This is something we have to live with. But making some different choices at home can’t hurt — and may yield long term benefits.
Surreal America
Posted by Scott Erb in 2012 Election, Democrats, Economy, Republicans, Unions, US Politics on June 15, 2011
The argument that the US is heading for collapse is probably overstated, but sometimes when I look at the nature of political debate in the US it seems plausible. Rather than taking a cool, pragmatic view of the problems and potential solutions, politicians are increasingly locked in an ideological fog, looking for sound bites and “easy” solutions. Rather than recognizing the complex array of problems that brought us to this point they prefer to point to some villain who has foisted harmful policies on the US which, if eradicated, would eliminate our problems.
For instance, it’s become a mantra for some on the right to see unions as some all-powerful bad guy. The proposition is so absurd as to be laughable. As this story shows, US workers are lower paid than ever. One reason is that unions have become impotent and are smaller than ever. There is no possible way to deny that relative income has shifted away from workers (who have gone from ‘middle class’ to ‘working poor’) towards the wealthy, and that unions have become rare and ineffective. This is a problem. With the working class receiving less and less income, not only is the divide growing in the US, but economic growth becomes more difficult. Companies are earning record profits as worker income declines.
If people were rational and objective, they’d see this and come to the obvious conclusion: corporations and businesses with resources are structuring the game in their favor (and the favor of their shareholders) while workers, no longer protected by effective unions, pay the cost. Instead, driven by irrational ideological faith, many argue that unions are evil and holding back growth and that the government is unfair to big business. Moreover, it’s clear that the wealthiest have had an explosion of income growth in recent years, while the poor and middle class have not. An objective, rational analysis would say that given high debt and deficits, those who have benefited so much at the expense of others should pay more taxes. Instead, efforts are underway to cut taxes on the wealthiest and cut services to those poor who have become worse off in the last 30 years.
As I noted yesterday, the argument that these policies trickle down wealth to the poor has been discredited. Europe, which has stronger labor unions and more wealth equality, has also created more jobs in the last 12 years than has the US. We’ve seen a net shift of wealth towards the already wealthy. Moreover, there is little class mobility in the US. It’s not as if the successful are rewarded for hard work and innovation; rather, those with wealth are being rewarded for having wealth.
What’s surreal about this is the way that especially the right in the US ignores the data and continues to argue for tax cuts while demonizing labor unions. There is no evidence supporting such claims; such views are driven solely by ideological propaganda. But the left is not without blame either. Their focus has been increasingly on politically powerful interest groups, meaning the working poor are often left out. Moreover, they’ve ignored the problem of higher debt and ineffective social welfare programs, giving ammo to the right to show the problems of big government.
In short, the left has been AWOL in terms of truly identifying and offering solutions to the problem of this loss of wealth, status and opportunity of the working poor, defending instead existing government programs. After all, what political clout do the working poor possess? They won’t win elections for the Democrats, and defending them only gives fodder to the GOP to scare the wealthier into thinking Democrats want to “transfer wealth” to the poor.
Transferring wealth won’t work. The problem is not unequal outcomes, but warped opportunities and power structures. Corporate profits shouldn’t skyrocket while worker income plummets. There needs to be more fairness in the structure of pay and power in the market place. This doesn’t call for massive government transfers of wealth, but rather government support of labor unions, transparency in the market, and regulation of big business. The goal is not to combat the free market, but to make it work effectively.
That’s the most surreal point of all. The free market has not been working well in the US because it’s been manipulated by insiders with information and power to control how it functions. The state is supposed to prevent that and protect the market, but increasingly both parties listen to lobbyists representing the current “winners.” Left unchecked, this will lead to a bifurcated class structure in the US and a potential backlash. Left unchecked, the US will lose out to others in the global market. Republicans and Democrats should together recognize the problem; instead, ideology trumps pragmatism, and problems go unsolved.
It feels like I’m watching a great power decline, even though there are ample opportunities to change and regain vibrancy and growth. Yelling at each other and lost in ideological fog, those opportunities are ignored. Instead of problem solving, opponents are demonized in a weird effort to pin problems on a particular group. The way some Republicans talk about “liberals” — as if all the problems the country faces are caused by ‘liberals’ — is dangerous. Rather than seeing problems that need solutions, others are demonized and uncritically blamed for all that is wrong.
As someone who is critical of both the left and the right, and who believes that pragmatic solutions require compromises from each side, I hope the surreal political debate gives way to more rational, practical problem solving. If not, then perhaps the best days of America are behind us…or at the very least, will take a while to return.