(First delivered at the UU of Franklin, June 19, 2016)

Talk in everlasting words
And dedicate them all to me
And I will give you all my life
I’m here if you should call to me
You think that I don’t even mean
A single word I say
It’s only words, and words are all I have
To take your heart away

I sometimes talk to my cats. Okay, I’ll admit it, I often talk to my cats. And even though one of them is named Chomsky, none of them are linguists. I know they don’t understand a word I say, but I talk to them. I even talk to my dog, a Chihuahua named Bernie, who is deaf as a box of rocks.

I’d guess you talk to your pets if you have pets. My guess is based on the fact that we all talk, all the time. Mostly we talk to ourselves, of course. But we talk.

If two strangers meet at a bus stop, pretty soon they talk. Usually they'd start with the weather, perhaps the bus schedule if theirs is running late, then move on to other impersonal trivia. But the need to connect is very real. Most of us want to be accepted and to be accepting socially, most of the time.

Psychologists have found that one of the most difficult tasks they can give to volunteers is to put two people in a room and tell them not to talk to each other. It rarely takes very long for a conversation to begin.

Naturally enough, most people when asked would offer the opinion that the whole point of language is communication with others. We chat, we bare our souls, we argue, we opinionate, we instruct or give orders, we cajole and we flatter. We say all kinds of things for all kinds of reasons, and listen and read and reach agreement or find inspiration or end up thinking that the other person is hopelessly stupid. And sometimes we do all of these things on Facebook.

But modern language theory suggests that communication, which of course means communication with others, is a minor and secondary function. The deepest thinkers about thinking now tend to believe that language is first and foremost an internal matter. In this view our language ability is principally a benefit to thought. Furthermore, it is argued that most language never emerges from our brains.

If you think about it for a few moments – by which I mean, if you talk to yourself about it – that immediately becomes obvious. We incessantly carry on conversations with ourselves – at least until we take up Buddhist meditation and try to make our monkey brains stop talking. Although my experience with meditation some decades ago suggests to me that no matter how successful we might be in stopping the internal dialogue, it comes back with a vengeance when we quit saying "Om!"

We talk to ourselves. We argue with ourselves. We lapse into sing-song when an ear-worm infects us with a favorite song. We think about what we should have said or what we ought to say. We remember past conversations and imagine future ones.

But it goes much deeper than that. It seems that our innate ability for language, the so-called "language gene," has equipped us with a language that is deeper than the sum of all the words we know. There exists an interior "knowing" that is expressed in our thoughts but which very often fails when we attempt public expression.

Have you ever seen the movie version of a book you have previously read and loved? My own experience, and an experience I have often heard repeated by others, is that the movie version fell short in some way. That falling short, despite the best efforts of screenwriters, directors and actors, is, I think, because we have created an interior version, triggered by the author's words, that is deeper and richer and more nuanced than the attempted transcription. Our interior version is expressed in ideas we can't easily articulate, because the language of exterior communication is so much more limited than our personal internal vocabulary. The pictures in our heads are better than the pictures on the screen.

Imagine for a moment what it might have been like to be the first human being with a language gene, an innate ability to put thoughts together in a row. Of course, when I say "imagine for a moment" I mean talk to yourself for a moment. Our closest living primate relatives, the chimpanzees, have been tested extensively and show not the slightest evidence that they possess the faculty of linguistic thought. They can learn some sign language, for example, but are completely unable to distinguish between specifying an apple, the place where an apple might be stored, the knife that cuts the apple, the person providing the apple, and often the difference between an apple and some other treat.

So at some point after our line of hominids veered off from the chimps one person suddenly had some sort of ability to use what we call language. Evolutionary change never happens in groups because genetic variations are individual. It takes a single individual change to begin the process of wider adaptation.

So at some point one person began to formulate ideas in sequences that we would have to call words. Abruptly what we think of as thought became possible. Alone among her tribe she would have begun to use her brain in a new way. Before that point her people would have operated as almost all other animals do, following what we call instinct, following the food supply through the seasons, knowing in the same sense that your dog knows it is dinner time or a bird knows when to fly south. Suddenly ideas began to string together via an internal language, an internal calculation. As the first person with the ability there was no possibility of talking things over with others.

Surely the first glimmer of internal thought was a small step, but hard to imagine from our own place in evolution. So it was first one and then her children who had this huge advantage in considering their actions and the future. And in turn their children had the ability as the genetic inheritance spread. Very gradually, and much later, a spoken language emerged.

Over time language blossomed into all the many tongues that have been spoken over many thousands of years; new ones emerged or combined with others while some died out. But here's the thing – linguists have discovered that all human languages follow similar syntactical rules, core ways of expression that are apparently innate. One could say we are hardwired to use language. Babies quickly pick up on the spoken language that surrounds them, and it doesn't matter whether it is Mandarin or Spanish or Swahili or English.
It’s often observed that youngsters seem to learn new languages more easily than adults. Perhaps that’s because they haven’t formed preconceptions about communication and are simply open to fitting new words into that preexisting framework. Once we are older and set in our ways we might think that Italian is going to be way different from English or Japanese and focus on trying to learn words instead of just accepting that it can all be natural and normal. I’m no expert on that, but it could be so.

In any event we started talking to ourselves perhaps 100,000 years ago and haven’t stopped since. In a previous talk here I mentioned a theory offered by Unitarian Universalist psychologist Julian Jaynes regarding that inner dialogue. He posited that the two hemispheres of our brains weren’t initially as coordinated as they seem to be today and that when one hemisphere heard the other talking it was often attributed to gods or angels. His theory is that we didn’t realize that we were creating those voices until the advent of alphabetic language, when we began to replicate not just what we thought and what others thought, but also the sound, and could share those thoughts and sounds across time and space. Jaynes believes that what we regard as consciousness began at that point.

So it’s interesting to consider the origin of written language. Our earliest writing took the form of pictures that gradually became stylized in the form of Egyptian hieroglyphics and then complicated characters as in China. On another front it seems to have started as counting marks that evolved into cuneiform. Only people with special knowledge could interpret those early forms and literacy was limited. The big leap came with alphabetic writing that permitted anyone who understood the letter sounds to replicate the voice of the originator. In a sense, alphabetic writing was the first form of sound recording. At first the few literate people in a community would read messages and texts aloud to others, but literacy spread.

Thinking of reading aloud on this Father's Day weekend calls up one of my fondest memories of my Dad, who read aloud to me and my brother night after night. I think most of the books were from his own youth. Each night we'd get a chapter or two before we fell asleep and then be eager for the story to continue the next evening. The Three Musketeers, Captains Courageous, Treasure Island, the Oz books and more. In later years I wondered if Dad geared the reading level to my personal developmental level, since I became a constant reader and my brother, two years younger, did not. I wondered if he got left behind, or if we were just very different people. In any event, that love of books and reading has continually enriched my life, the greatest gift my father could have bestowed.

When I recall that memory I tell myself a story about it, and an interesting sidelight is that we change our memories when we remember them – in a sense playing that childhood game of telephone with ourselves, passing along the tale from past to future but changing it a little each time. Today I’d tell you that my Dad read to us nightly for years, but it couldn’t have been more than a few, because I was soon reading on my own – with a flashlight under the covers because I was supposed to be asleep. And it may have only been in the winter months when early darkness curtailed after dinner outdoor activity. We now know that the more often we remember something the less accurate it gets.

One of my favorite characters was Dr. Dolittle in a series of books written by Hugh Lofting. Dolittle's ability to communicate with animals utterly fascinated me, together with his strange adventures in Puddleby-on-the-Marsh or in Africa. The possibility of really communicating with animals has tantalized me ever since.
As I came to know over the years, we can’t actually communicate in a human sense with any other animals. Of course some animals can learn commands and some seem to know their names. Some certainly know our voices and can tell us apart, and we can read their behaviors and sounds. I know when my cats are hungry or when my dog wants to go out. And to an extent they have learned behaviors that elicit responses they want from me, pretty much limited to food and petting.

But we’ve pretty much hit a brick wall in terms of syntactical communication – stringing together ideas with verbs and nouns and modifiers, discussing future and past and so forth. Some gorillas and chimps have famously learned some sign language, but as I mentioned earlier the meanings are blurry and a lot depends on the interpretation of the trainer.

The most intriguing exceptions in the animal kingdom are cetaceans: the dolphins and whales. Their brains are as big as or bigger than ours and more complex at the neurological level. They very clearly communicate with each other, and the more we study them, the more complex their communication seems to be.

Rather oddly, in my view, Noam Chomsky, deemed the greatest linguist of the modern era by many, and one of the deepest thinkers about thought who has ever lived, flatly denies that the cetaceans have the sort of capacity for language that we do.

I know I don’t have the academic credentials or standing to challenge him, but I can’t help but think he shows a singular lack of imagination. The fact that we can’t understand dolphins doesn’t mean they aren’t discussing all manner of things, both inside their heads – talking to themselves like we do – and in the wide ocean. Due to physiology they can’t display facial expressions or talk with their hands and have no need or ability for writing – but we do know they can carry on conversations with each other on two frequencies at once. That would be like me delivering two talks on different subjects simultaneously and you understanding both. We do know they have names for each other and researchers other than Chomsky believe they have discovered syntax in killer whale language, phrases that seem to ask questions and answer them. Though again, we don’t know what they are saying.

I stumbled on that discovery of syntax while I was researching my book Whale Falls, and thinking about why some people regard dolphins and whales as our peers and others think of them as sushi. That led to the theme of my subsequent novel, She Walks on Water, in which I imagined how actual communication with dolphins might play out.

The ability to communicate emotion in some form and how we react to it, how intelligent we deem a creature to be, has a good bit to do with our willingness to eat them. The taboo against cannibalism is nearly universal, and even those few cannibalistic peoples like the Anasazi in the American southwest, or some New Guineans, generally only ate their enemies, and those enemies almost certainly spoke a different language. Most of us in this room are probably very unlikely to eat dogs and cats, or gorillas and chimps, but they are dietary items in other parts of the world, as are dolphins and whales.

As an aside, it's interesting how sensibilities change. The Dr. Dolittle books reflected the sensibilities of the 1920s, and included some stereotyping of African people that is considered offensive today. In a reissue of the books in 1988, Lofting's son expurgated the stories, after long deliberation about whether his father would approve. I understand the choice, but it left out some lovely and pointed humor. In the original, when Prince Bumpo was sent by his father the African king to England to attend Oxford, he was afraid he'd be eaten by white cannibals. That's missing in the new version, and what the modern reader misses is a wry commentary on cultural assumptions. Bumpo also expressed a desire to become a white man at one point, which offered another potent bit of cultural commentary, and that's missing in the rewrite.

So we form judgements about other people and other animals based on external communication whether it is language or signals. And those judgements are processed via our internal language in thinking patterns that never fully emerge from our mouths or pens or keypads. Yet we do learn to read into what people say, to read between the lines as the saying goes.

Very specifically we can read a great deal in other people’s eyes, and looking another person in the eye has powerful connotations. To begin with, we don’t ever look acquaintances in the eye for very long – it is too intimate, or too threatening. Generally speaking, long gazes are reserved for those we love. A long stare is considered rude at best and often aggressive. Eyes and facial expressions often reveal when a person is lying, and we talk about con men who can lie with a straight face. Or card players who maintain a poker face. Because we are all talking to ourselves all the time we know that everyone else is as well. We know they aren’t saying everything they are thinking, and pretty certainly CAN’T say everything, because much of it can’t be put into words. Even if you never really consciously thought about it before I mentioned it at the start of this talk, you know you’ve known that your whole life.

Probably you’ve had the experience of being silent for a spell and having someone, usually someone dear to you, ask: “What are you thinking?”

The answer, at least in my experience, is approximately impossible. Only the most immediate thought is available, and answering leads to a lot more about that immediate thing than I was actually thinking when asked, and completely ignores a dozen or a thousand other things that I had been thinking before I was interrupted. And all of that doesn’t touch the filtering that might go on if I was thinking something I didn’t think I wanted to share.

And ultimately these thoughts about language and thought arrive at a very deep question. People seem drawn to the idea of body and soul, but if I say “my body and my soul” there is a piece missing. Whose body and soul am I referencing? If there is an “I” who possesses that body and soul, it is something different from either of those identifications. So now there is a third player. This must be the thinking part, the part possessing language, the part able to think about bodies and souls. And is that thinking part a function of the physical brain, or something beyond? What would beyond mean in that question? Then one step further when we understand that everything we experience as physical is actually mostly space, since there is more space than electrons, protons and neutrons in every object we normally identify as solid. And then, is our thinking part a function of all those subatomic particles whirling around in our bodies, or is it located somewhere else in some realm we have not yet defined?

Now, all these thinky thoughts about thinking suggest to me that much of what we enjoy doing we enjoy because of the internal discussion the activity stirs up. To take the most obvious, crossword puzzles and Scrabble are quite popular. Searching our mental storehouse for words we don’t use all the time triggers cascades of internal dialogue. Song lyrics and poetry do the same, as do longer form written works. But that’s only the beginning. Whether we are sitting in a boat with a fishing pole, or sitting in a stadium full of action, or baking cookies, or mowing the grass, or attending an opera or looking at paintings in the Louvre, or shouting out loud at a football game or watching a Sunday morning talk show or spending Sunday morning at the UU in Franklin, we are constantly telling ourselves a story about our lives. We make it up as we go along.

I hope your story is a good one.



(Delivered in a forum on humane animal agriculture at the VeganFest in Asheville, June 12, 2016)

I have been an organic gardener and an active recycler for more than 40 years. I lived off the grid in a solar powered house built largely of recycled materials for 22 years and pooped in a composting toilet to recover my waste as fertilizer. Today I live in a grid-connected, all electric home with a full solar array. I confess to using a flush toilet. I’m approximately net zero, and this summer I’ll add enough additional solar panels to charge an electric car. I ate an ovo-lacto vegetarian diet for about twenty years and was a vegan for eight. I have written books dealing with the ethics of our diet and our relationship to animals and the earth, and as a member of Asheville’s City Council have done my utmost to reduce the City’s energy use, to increase recycling, to reduce pesticide use, to make Asheville the first Bee City USA, to facilitate farmers’ markets and to find ways to make public land available for food production.

I have tried throughout my life to live up to something I learned from my father when I was a child – a lesson bolstered by my years as a Boy Scout. Always leave a campsite cleaner than you found it. Or in the wider world, always leave the place you live better than when you arrived.

But there is one thing I haven’t mentioned that has had and will have more impact on the future of planet earth than everything else I have done put together. I chose not to have children.

There is no problem confronting us today that is not made worse by population growth. It is the scale of human numbers that is creating the climate crisis, the phenomenon of world-wide drought, the poisoning of waterways and the chemical changes in the ocean, life threatening air pollution, the death of coral reefs, the mass extinction of species and the constant pressure toward war. In wild animal populations the food supply is always a limiting factor. We humans have gamed the system.

If a single lifestyle choice has any relevance to the human future, it is for millions of people to decide not to have children.

But this weekend event is focused on diet, so I should probably discuss my current thinking regarding food, though it greatly hinges on our burgeoning numbers.

No vegan who is also a gardener can easily escape the reality that agriculture kills animals. If I go out in the yard with a shovel I am signing up as an executioner. Of course at the personal level it is mostly earthworms and other soil creatures that die, though this spring I inadvertently killed a baby snake as I was turning over the soil. Then too, I pick off pests and have very occasionally resorted to so-called organic pesticides to get rid of a pestiferous infestation. I have done that reluctantly and with full knowledge that I was killing a whole lot more than the target bugs, possibly including birds, small mammals, reptiles and amphibians somewhere in the food chain.

Last summer the netting I strung up for snow peas caught a sparrow, dead before I discovered it. And the year before I trapped a ground hog that was mowing down my garden and released it several miles away in a woods. I then felt bad all summer having cheated the critter of his well dug habitat and having released it in a place that had much less of the food it needs to thrive. But this year I moved another. It was wiping out my garden.

Looking down from 30,000 feet one can reasonably argue that agriculture, not eating apples, was our original sin. We escaped the bounds of nature and set about transforming the earth.

Of course most vegan apologists would argue that the worms and millipedes and ants and beetles and so forth are low forms of life and that the sparrow’s death was an unfortunate accident. But taking such a narrow view elides the truth. Living does not demand cruelty, but it inevitably requires dying. Agriculture displaces preexisting natural systems. The death of many animals, even extinction of some species, is inherent in our diversion of land and water to our own use. The ground hog I moved is just one small example.

Rodents, to take another example, do immense damage to our food supply, not to mention the rat-borne diseases that have occasionally wiped out hundreds of thousands of humans. There is no large scale food system that does not rely on eradication of rodents. Once again our lives depend on death.

I recall many years ago visiting Kings Canyon in California, near Sequoia National Park, and witnessing the incredible power of the Kings River with a current so forceful that boulders were being tossed into the air. And then learning that the river no longer reached the Pacific Ocean – diverted to agriculture. Back then I visited the Grand Canyon and the amazingly huge Colorado River, only to learn that it no longer reaches Mexico and that we have drilled wells to pump water into the river to meet our treaty obligations with our southern neighbor. By some accounts we now use or divert more than half of the fresh water on earth to human enterprises, and we have entered what appears to be a permanent de facto drought. Water we use is generally not available to other creatures, and certainly not in the way it was before. Whether it is hot water pouring out of a power plant cooling system, agricultural run-off with its soup of nutrients and pesticides, the effluent from sewage systems, or warm water lakes behind dams on formerly cold rivers, and on and on – we have twisted the hydrological cycle to our own ends.

Furthermore, the agriculture that feeds 7 or 8 billion people is entirely dependent on the oil industry, a business that is very hard on animal life even without the Exxon Valdez and the BP oil platform explosion. The fertilizer that made the so-called Green Revolution possible is manufactured from natural gas. The tractors in the fields and the trucks that deliver food run on oil and gas. And yes, we may be able to shift a great deal of our energy production to solar and wind, but I haven’t heard a plausible argument for a large-scale nitrogen fertilizer alternative in the foreseeable future. Modern sewage sludge is so toxic it ranks as a hazardous waste.

Perhaps the massive destruction of the natural world could be minimized if we each grew all of our own food using only the rain that falls on our gardens and hand tools. We could use our own waste for fertilizer as I did for twenty years with my composting toilet. But I don’t see personal gardening as a realistic option given our numbers and the massive concentration of human beings in cities. Even there, as I’ve noted, we are displacing animals.

This touches on an environmental argument favoring veganism, which involves the idea that it takes a lot more land area to support an omnivorous diet. There is some truth in that, particularly with grain fed beef. That argument spoke to me 30 years ago, but I’m less certain today. Animal manure used to be the principal nitrogen fertilizer source on farms; today, as I mentioned, it has been replaced by fertilizer made from natural gas. Manure is much healthier for the soil than the chemicals used today. And conversion of sunlight via grass into manure while producing protein is the natural way to preserve topsoil health. We are all, inextricably, dependent on topsoil to live. Any vegan who buys local produce from a small farm is almost certainly benefitting from manure or other animal products. If you buy organic fertilizer, check the label – it generally includes feathers, bones and blood.

On another track I have followed the work of many biologists, ethologists and evolutionary researchers and found this to be true. Hominid apes are omnivores. I recall how surprised Jane Goodall was when she discovered that chimpanzees hunt. Volunteering each week at the WNC Nature Center I’ve had the chance to show children the skulls of various animals and discuss their diets. Strict carnivores have fangs and cutting teeth. Strict herbivores have biting and grinding teeth. Omnivores like humans and chimpanzees have both.

Moreover, all of the higher functioning animals are either omnivores or carnivores – which makes a bit of sense, since it presumably takes more cunning to stalk prey than to run. An interesting corollary to this is that our brains need fats to function well, and there is strong evidence that low fat diets contribute to Alzheimer’s and other brain disorders. Animals are, of course, not the only source of fats, but they contain a higher concentration of fat than virtually all vegetable foods. Mothers’ milk is a very high-fat animal-based food that is perfect for a quickly developing brain.

While researching and writing my book Whale Falls: An exploration of belief and its consequences, I discovered the only other animals on this planet who seem to have brains as complex as ours and which have developed syntactical language are the dolphins and whales – all primarily carnivores. I would note that the animals we tend to cherish as pets are also carnivores or omnivores and even chickens, which some Ashevillians hold dear, love nothing better than frogs. At least that was my experience when I had free range chickens and lived near a swamp.

So we kill to live. Beyond that the dietary discussion is reduced to where we draw our lines. As I described in Whale Falls, cultural decisions fall all over the map. Some Jews don’t eat pork, others don’t eat shellfish and some keep strict Kosher – separate containers and serving ware for different foods. Catholics weren’t allowed to eat meat on Fridays so they served fish, while some Native American cultures held a proscription against eating fish at all. In China cats are a normal dietary item and in Japan they eat whales. One Middle Eastern religion abjures lettuce, and rainforest tribes tend to eat a lot of insects. There is very little meat below the forest canopy in rainforests, so they invented blowguns and occasionally bring down a monkey. Neanderthals didn’t understand that fish were edible, and our direct ancestors apparently ate Neanderthals.

Another dietary argument repeatedly offered in favor of veganism involves health. It is plausibly argued that eating meat contributes to heart disease and stroke, and less plausibly to a long list of other ill effects. The problem with this view is, first, that it assumes good health is everyone’s highest goal, and it demonstrably is not. People do all kinds of things that are more or less likely to shorten their lives. On the flip side, while personal experience is hard to generalize, I know that when we became vegan my then-partner was going through menopause. We ate a lot of soy products. Before she died of estrogen-positive breast cancer, one line of research I read indicated that her high intake of soy estrogen might very well have accelerated her very aggressive cancer. Would she have survived if we hadn’t become vegans at the wrong time in her life? There’s no way to know.

Personally I favor decent treatment of the animals I eat. I am appalled at the horrible conditions and practices that are often justified in the name of commerce. But I have come to accept that my living requires dying and I am comfortable with my decision to eat meat.

I fully understand that those who choose to attempt veganism are well intended, but when it is held out as a form of moral superiority I get very uncomfortable. I’m embarrassed today by the holier-than-thou attitude I somewhat embraced during my vegan years, laying a head trip on people who didn’t share in my purity. I am way over myself as an authority figure. A lot of true believers seem to fall into that trap, and it’s probably even easier for those who give up something they like: Hey, I’m suffering for this moral superiority, unlike you sinners. Priestly celibacy comes to mind.

But I also firmly believe that it is impossible to be fully vegan in the sense of not participating at all in the killing of animals. There is approximately no way around complicity. Plastic bags, shampoo, tires for your car or bike or the bus you ride to work, the threads in your garments, transportation fuel, your walls, your roof, heating, cooling, your cell phone, your alfalfa sprouts … all of it has a bad impact on other living creatures. Echoing the philosopher Albert Camus one might plausibly argue that the only serious philosophical question for a determined vegan is suicide.

The dominant life on earth began once as far as we can tell – though life might have emerged and failed multiple times before things finally worked out in our favor. Everything since then has been part of an immense food chain that ebbs and flows through photosynthesis, metabolism, growth and decay. In a very real sense the whole planet is one organism, and it is that planetary organism that is threatened by the current dominance of one species that learned to rig the game in its favor. Our 10,000 year experiment with agriculture has been devastating to all of our cohabitants on planet earth.

I greatly fear we will not be among the survivors.


Addendum: I should probably have been more specific – pursuant to the above, I believe an organic diet is better for the planet than a strict vegan diet.


Here’s the basic text of the message I delivered to the Unitarian Universalist congregation in Franklin, NC, August 17, 2014. (The lyrics marked with a * are sung, not spoken.)

*15 men on a dead man’s chest
Yo Ho Ho and a Bottle of Rum*

On July 11 I woke up at 4:30 a.m. with a great title for today’s talk: “Yo Ho Ho and a Bottle of Rum!” Together with the subtitle: “Black death, white sugar and the quest for a living wage.”

Three weeks later I sat down to actually write this thing and abruptly realized that I was going to have to connect a whole lot of dots over about seven centuries. To begin with, I should probably have said “brown sugar” even though the imagery of black death and white sugar seemed pretty strong. So I did the only thing a reasonable person can do when faced with that sort of problem. I went outside and pulled weeds.

Later I tried again. The first, obvious, question to ask is what were 15 men doing on a dead man’s chest? Was he still breathing when they sat down? Thinking back to my childhood I recalled that my immediate assumption when I first heard that song was that it must have been a treasure chest. But Wikipedia set me straight. There’s an island in the West Indies called Dead Chest Island. It’s a rocky little bump with no trees or water which looks a little like a floating body. Legend has it that Blackbeard once left several unruly pirates on the island as punishment. Each man was supposedly given only a single bottle of rum. As the story goes, when the ship returned at the end of a month, a few of the pirates were still alive. Robert Louis Stevenson wrote the song for his novel Treasure Island, and turned Dead Chest into Dead Man’s Chest.

Good story, but it seems that Blackbeard was actually quite a gentleman and ran his boat with the support and consent of his crew, whom he apparently paid pretty well. He paid what we could call a “living wage,” or at least a fair crew-share of the proceeds. He avoided violence while cultivating a violent image because he believed fear was better than murder in achieving his goal, which was looting merchant ships from the Indies to coastal Carolina.

Piracy was one reason that a lot of those merchant ships were carrying molasses. Not many pirates wanted barrels of molasses, which is a sticky mess after you shoot up the boat with a cannon. And there wasn’t much of a black market, or maybe you’d call it a brown market, for molasses.

I could see I was getting ahead of myself, so I went back to weeding and pretty soon I realized I should have started with Christopher Columbus.

In the late 15th Century European ships had improved to the point that exploration and trade were becoming popular with Queens and Kings. The marvelous goodies that had come from the Far East via the Silk Road had dried up when the Turks captured Constantinople in 1453.

So the Portuguese were exploring the African coast looking for an eastern route to China, and Columbus convinced the King and Queen of Spain that he could beat the Portuguese by sailing west. He promised to make them very, very rich, which is something Queens and Kings like even better than spices and silk.

Columbus promised gold, but in the course of his voyages he didn’t find much. So he switched to slaves, which were also becoming popular in Europe, with a regular trade developing along the Gold Coast of Africa.

Slavery had always enjoyed some popularity in Europe, but there was a new reason for the demand.

In the 14th and 15th centuries the Black Death swept Europe. One third to two thirds of the people died over the course of about 100 years. Historians still debate the numbers. The principal disease itself, bubonic plague, was only the beginning of the problem. Many farmers quit planting crops, believing that the end times had come, so starvation ensued.

The germ theory of disease was still far in the future, and the Jewish populations of whole towns were murdered because they were thought to be poisoning wells.

Witchcraft was blamed, so witches were burned and cats were exterminated because they were obviously involved in witchcraft. My four cats and I have long thought that was one of the highest ironies of that era, since rat fleas were the carriers of the disease and cats were and are one of the most effective rodent control systems on earth.

The results of the Black Death were extremely beneficial for most survivors. There were a lot of empty houses. Demand for goods collapsed so prices fell. And labor was in short supply, so wages rose. Landlords desperate for workers were outbidding each other. Serfs who didn’t like their treatment simply left, knowing they could find other work. The first strikes occurred and in some places serfs revolted and took over whole towns and regions.

You can see why there was a burgeoning demand for slaves.

So when gold failed to materialize, despite the reasonable rule laid down by Columbus that natives would deliver set amounts of gold each year or have their arms cut off, jolly old Christopher started shipping slaves back to Spain.

Big problem. Over half of each boatload died en route, and the survivors didn’t last long. Other than the Vikings, way up north, there hadn’t been any contact between European and Asian germs and Western Hemisphere natives for tens of thousands of years. Bubonic plague, smallpox, measles, and other diseases for which Europeans and Africans had developed some immunity were lethal.

Columbus and crew also apparently took home syphilis, which was new to Europe. Not sure the Kings and Queens were wild about that.

Between cutting off arms, disease and horribly abusive slavery, Columbus and his followers quickly depopulated every island they visited.

This continued everywhere Europeans landed, and disease ran ahead of advancing troops and settlers, ravaging two continents. Cortez conquered the Aztecs before they took sick, but most Incas were dead before Pizarro arrived in Peru, and most North American tribes were felled before they ever saw a white face.

What to do? Well, one of the other things that Queens and Kings had taken a fancy to, and that the Turks had cut off, was sugar. Sugar cane had been domesticated in Asia a couple thousand years ago, and then the process for deriving sugar crystals was invented in India a thousand years later. Later still sugar cane was planted in Mesopotamia, but now the Turks controlled the candy and the candy store.

Portugal began growing sugar in Brazil, and then Spain and England recognized that conditions were perfect in the newly conquered islands. Soon the islands had been converted to huge monocrops of sugar cane, with smaller plantations of limes, which were also in short supply since the old lime groves were in Persia. Unfortunately the potential local workers were dropping like flies, so pretty quickly the same ships that delivered sugar to Europe were delivering African slaves to the islands.

Then someone invented the daiquiri. Actually, what happened is this. Fermented sugar cane had been consumed for thousands of years, but in the 17th century slaves in Brazil and the West Indies discovered that distilling the brew made it much tastier and of course, much stronger. Soon enough there was a thriving rum trade. Kings and Queens and nobles and tradesmen and everybody else who could afford it thought it was a great addition to the bar. Pirates and sailors liked it too.

It seems that sailors really couldn’t be trusted with barrels of rum and some of it inevitably disappeared en route. Worse still, pirates both enjoyed it and knew where to sell it.

There was another problem as well. Distillation requires a lot of fuel for boiling and fuel was getting scarce in the islands. But lo and behold, New England was covered in hardwood forests just aching to be clearcut for farmland and sheep pastures, and the wood was going to waste.

Soon molasses, which sailors didn’t drink and pirates didn’t steal, was being shipped in quantity to Boston, where it was converted into rum. In short order there was more rum than the locals could drink, although anyone who’s been to a Red Sox game might doubt that, and shiploads of rum were sent to Europe and Africa.

The sailors still drank some, but piracy is a lot less likely on a cross-Atlantic trip than sailing up the coast from the Indies to Boston. Poor Blackbeard was out of luck. Now the New England traders could exchange rum for slaves in Africa, whom they took to the Caribbean where they traded the slaves for molasses, and everyone was happy. Except the slaves, of course.

Although modern Americans mostly remember the Tea Act which resulted in the Boston Tea Party, and the Stamp Act which precipitated the American Revolution, we often forget that the first tax protests were against the Molasses Act, a tax on molasses from non-British colonies. This was a price support measure intended to force New Englanders to buy British molasses for their rum production. As with all such efforts, smuggling was the result. The American colonials mostly ignored the law.

In regard to the American Revolution, I’d also note that the Continental Congress borrowed huge sums of money from France in 1781 to keep the war effort going. Soldiers hadn’t been paid for months and were threatening mutiny, so one of the first military supply purchases was 300 barrels of rum.

Along the way, sugar also became more and more available, and was tremendously popular among the tea drinking English and their American colonists.

*So drill, ye tarriers, drill
And drill, ye tarriers, drill
Oh it’s work all day for the sugar in your tay
Down beyond the railway
So drill, ye tarriers, drill.*

In fact, over the years, it became abundantly clear to rulers around the globe that assuring their populations of a steady supply of sugar and other sweeteners, along with alcohol, was a very good way to dampen discontent and revolutions and other unpleasantness. When was the last time you were in a government office where the clerical desks didn’t sport candy dishes? And have you taken a good look at the amount of real estate in Ingles devoted to candy, cookies, soft drinks, beer and wine? Not to mention the corn sweetener in pretty much every prepared food item on the shelves. Sometimes we seem to act just like the hummingbirds at my hummingbird feeder, aggressively chasing each other away in order to protect our sugar supply.

Next came cotton. The invention of the cotton gin made large scale production possible, but picking cotton remained a manual task until the 1950s. So the well established slave trade began to supply workers to the American south. After the Civil War, sharecropping took the place of slavery, and due to a lack of opportunity elsewhere, the system continued to depress wages in the South until mechanization of farms and industrial growth in the North began to erode the sharecropping system.

During the Civil War somewhere between 650,000 and 850,000 men died, most from disease. I haven’t been able to trace the specific effect on wages of this enormous loss. However, the South lost more workers than the North, and plantation owners were soon complaining about a labor shortage. Adding to that was a sudden shift away from field labor by many black women, no longer slaves, who saw more benefit to their families in tending to children, raising and processing food for the home and so forth. Meanwhile, many northern widows entered the work force, which helped offset losses of labor there.

What is clear is that unions began to gain strength in the late 1800s, hundreds or thousands of labor strikes occurred each year, the National Guard and federal troops were often called in to break strikes, and many organizers were gunned down or executed. Populism and socialism found tens of thousands of advocates. In 1900 there were 2 million union members in America, less than three percent of the labor force. By 1920 that had risen to more than 12 percent.

Fifty years after the Civil War another plague swept the world: the flu pandemic of 1918, sometimes called the Spanish flu, though Spain had nothing to do with it. In the U.S. an estimated 675,000 died. Globally it killed more people in one season than the Black Death had killed in a century. Unlike the strains of flu we are familiar with today, it was most deadly for young adults, age 20-30, and so it had a tremendous effect on the labor force.

According to an in-depth study of the effect of the flu on economies, the resulting labor shortage drove up wages. Workers were less mobile in the 1920s than today, so wage rates were more local. In states hardest hit by the pandemic, the average income of survivors increased much more than in states where the disease was less prevalent.

During the 1920s powerful business interests fought off unions with open shop rules, like the ones still in place in North Carolina, but after the Depression unions successfully pressed for federal legislation and greatly improved wages and benefits for most American workers.

The pandemic was coupled with the devastation of World War I, in which somewhere between 9 and 15 million people died. Because the physical destruction never reached the United States, we benefited enormously in the aftermath, with industry taking up the slack in Europe. This was repeated again following WWII. Wages rose with the help of a strengthening union movement operating in a rising economy.

The greatest downward pressure on wages today is arguably mechanization. As one wag has it, the factory of tomorrow will be run by one man and one dog. The man is there to feed the dog and the dog is there to keep the man from touching any of the machines. Automation is coupled with global population growth and the ease with which employers can change location.

While factory jobs offered a way out of the south in earlier generations, leading most noticeably to the so called Great Migration of African Americans to the industrial north, today’s factory jobs require far fewer people. The new automobile factories across the South use robotics, and southern anti-labor laws keep wages low. Just like the poor whites who fought for the Confederacy, hoping to preserve the slave system that was helping to keep them poor, today’s southern voters keep voting for politicians who support labor laws that depress their wages. They seem to have forgotten where their sugar comes from.

Today’s living wage campaigns face enormous hurdles thrown up by both mechanization and politicians reliant on corporate donations. As Elizabeth Warren pointed out last February, “If the minimum wage had kept pace with productivity over the last several years, the minimum wage today would be $22 an hour. Productivity went up, but wages didn’t.”

In the same conversation, economist Robert Reich said, “I think that Sen. Warren’s $22 is certainly defensible, but it’s at least $15 an hour.”

According to Just Economics, based in Asheville: “A ‘living wage’ is the minimum amount that a worker must earn to afford his or her basic necessities, without public or private assistance. In short, a living wage is the real, just, minimum wage.”

“The living wage for a single individual living in Western North Carolina for 2014 is $11.85/hour without employer provided health insurance, or $10.35/hour with health insurance provided by the employer.”

While large companies are mostly very resistant to raising base pay, small businesses tend to be more in touch with their employees. Just Economics has certified well over 200 businesses in WNC as Living Wage Employers.

The cities of Asheville, Montreat and Weaverville have all adopted living wage rates for full time employees as well. In Asheville we even voted to make a living wage requirement part of all City contracts, but the General Assembly killed that idea last year, banning any pay restrictions in municipal contracts.

One of the early names for rum was Kill-Devil, memorialized in this state in the name of Kill Devil Hills, where the Wright Brothers first flew in 1903. The area got its name because shipwrecks were once common in the area and enterprising locals often salvaged barrels of rum which they then buried in the sand dunes for later recovery.

Interestingly, when Orville returned to Kill Devil Hills in 1911 to set a new world glider record, he glided into the wind for more than 10 minutes but made almost no forward progress. Looking at the plight of working people through the centuries, that could be said of the struggle toward a living wage. Sometimes the demand for increased wages and more benefits gets airborne, but the aircraft is as likely to move backwards as forwards.

Today in the United States the wealth gap, that is the disparity between the rich and the poor, is arguably the highest it has ever been. One percent of the people control 25 percent of the wealth, and globally the richest one percent own 45 percent of everything. In former colonial territories around the globe, as fast as countries shook off colonial rule, powerful elites took over and diverted wealth to Swiss bank accounts.

In China and Russia communists once promised to level society, but when the old dictatorship model collapsed, the politically powerful engineered exactly the same result.

Meanwhile increasingly automated factories and farms need fewer and fewer workers, and industry moves around the globe to employ whichever work force will labor for the lowest price.

In conclusion, and playing the devil’s advocate, a not illogical conclusion one might reach is that the best hope for a general pay increase for the workers of the world is another devastating pandemic.

H1N1 anyone? (In sort of a call-and-response a few voices in the congregation added “Ebola?”)

*Yo Ho Ho and a bottle of rum.*

Read Full Post »

For immediate release: 1/1/13

From: Cecil Bothwell, Asheville City Council

Subject: Gun shows on City owned property

Contact: cecil@braveulysses.com, 828-713-8840


Bothwell demands enforcement of City gun ordinance

Asheville City Council member Cecil Bothwell today called for the City of Asheville to ban gun shows from City-owned properties, including the WNC Agricultural Center.

“Our municipal code specifically prohibits the carrying of weapons on City owned properties. I don’t understand why that law is not being enforced,” Bothwell said.

The City of Asheville’s Civic Center and WNC Agricultural Center have both been rented to gun show promoters in recent years, despite this long-standing ban.

Bothwell explained, “Many citizens have contacted Council members asking for action in the wake of the Newtown school murders, but the City has very little ability to regulate guns, permitting or background checks under North Carolina and United States law. However, we do have the power to enforce the laws that are on the books.”

“Gun shows not only promote the ownership and use of weapons, including the glamorization of the assault-type, semi-automatic killing machines used in too many mass murders, but sellers at shows are not required to perform background checks on buyers. That means that guns intended for rapid fire killing may easily fall into the hands of persons who are mentally unstable or who have criminal intent.”

“This is one place we can easily draw the line,” Bothwell added. “The law is already on the books.”



Section 12-42 of Asheville’s City Ordinances reads as follows:

(a) No person shall possess, use or carry any firearm, gun, rifle, pistol, air rifle, spring gun or compressed air rifle or pistol, or other similar device or weapon which impels or discharges with force any bullet, shot or pellet of any kind, including arrows with metallic tips or sharp tips of any nature, designated to penetrate and propel a bow or spring device, in any park or other city-owned facility. Further, no person shall possess, use or carry any knife, other than an ordinary pocket knife, which means a small knife, designed for carrying in a pocket or purse and which has a cutting edge and point entirely enclosed by its handle and that may not be opened by throwing, explosive or spring action, or a kitchen knife, when it is used or intended to be used for its ordinary purposes, in any park or other city-owned facility.


It goes on to exempt those holding concealed-carry permits from the restriction on parks (as mandated last year by the General Assembly) and law enforcement officers.

Read Full Post »

It is well to consider during our deliberations the amorality of corporations, created to shield investors from liability and legally obligated to generate profit. It isn’t that they are inherently bad, but being artificial persons they have no conscience. Short-term profit always trumps long-term societal good. In the words of Thomas Jefferson,

“Merchants have no country. … I hope we shall crush in its birth the aristocracy of our monied corporations which dare already to challenge our government to a trial by strength, and bid defiance to the laws of our country.”

The founders were unable to crush that aristocracy, though they tried. Abraham Lincoln observed:

“I see in the near future a crisis approaching that unnerves me and causes me to tremble for the safety of my country. . . . corporations have been enthroned and an era of corruption in high places will follow, and the money power of the country will endeavor to prolong its reign by working upon the prejudices of the people until all wealth is aggregated in a few hands and the Republic is destroyed.”

Fifty years later, President Teddy Roosevelt broke up the too-big-to-fail corporations of his era, and thirty years on, President Franklin Delano Roosevelt acted again to rein in corporate power.

The American Dream is ever threatened by greed, and ever defended by true patriots. May we always, in this chamber, be ready to defend the life, liberty and happiness of the people we serve.

Read Full Post »

McArthur Wheeler attempted to rob two banks in Pittsburgh. He walked into the banks without a mask or other disguise and he was openly carrying a gun. He smiled directly at the security cameras and went to the teller windows to demand money. Several hours later the images were broadcast on TV, the police were informed of his identity by numerous callers, and he was arrested that night.

When questioned by the police, Wheeler expressed shocked amazement that he had been identified and caught. “I used the juice,” he said.

It turned out that he had been told by friends that if he rubbed lemon juice on his face he would be invisible on camera. He didn’t take them at their word, so he applied lemon juice and took a picture of himself with a Polaroid camera and to his surprise he wasn’t in the picture! Apparently he accidentally aimed the camera at the ceiling. But, it was enough to make a believer of him, and he proceeded with his crime spree.

As I write in my latest book, Whale Falls, we believe we are rational beings but have very little proof to offer in defense of that belief, apart from a low-grade resistance to magical thinking that a vanishingly small subset of our number calls up from time to time.

During the last decade of the last century, I referred to the fantasies amorphously embraced by the label “New Age” as “woo-woo.”

In conversation this emerged as “She’s into that woo-woo stuff,” or, “Sounds pretty woo-woo to me!”

The “woo-woo” wasn’t really meant to be harsh or unkind. It was more in the way of gentle sarcasm, triggered not so much by the particular beliefs espoused (since we are all entitled to believe what we will) as by the mercantile slant of many of its practitioners. Sometimes, popular New Age cosmology at the turn of the century seemed like the first fundamentally mail-order religion. Snake oil used to be peddled off the back of wagons, but the business had diversified and gone digital.

The underlying skepticism, however, was more consequential. We are entitled to beliefs, but that doesn’t guarantee their truth or utility.

Those who question the dominant paradigm of corporate greed, mercenary wars, boundless consumerism, upward mobility and other pillars of unbridled capitalism seem to split into two camps. On one side of the river reside the woo-woos. Over here on my side, we practice the bunny-hug.

Bunny- (or tree-) hugging is the manifestation of a core belief entirely opposite to that embraced by the Woos. However many attempts are made to bridge the divide, peaceful coexistence involves a large measure of good-natured tolerance. Those who pretend to embrace both perspectives are lost in the fog of a comfortable delusion.

This schism invokes the same issues which spurred Martin Luther to nail his ninety-five theses to the door of his local indulgence-monger. Are we saved by our faith, or by our works?

Orthodox Woos clearly come down on the side of faith. I know generalizations ignore subtle wrinkles, but the bedrock remains: Woos place the inner world first and believe that changing the self will change the rest.

Devout Huggers believe in salvation through work. For us changing the world is physical and political, and the changes in self necessary to achieve that work are also physical and political. Sacralizing work may be useful as a motive force, but in any case, the outer work must be done.

No doubt many Woos are vegetarian bicycling recyclers, while many huggers entertain deep spiritual beliefs, but the practical behavioral divide is as real and deep as a river.

This difference emerged in a conversation with a Woo of my acquaintance. We were discussing the concept of embracing abundance—the belief that the universe will provide for all of our needs if we simply open ourselves to that truth. A simple example of this would be the use of visualization to manifest a loaf of bread, which my friend believed could really happen.

Then my friend suggested, “Existence is not a zero-sum game.”

The idea here is that everyone can enjoy abundance without anyone else giving up anything. Reality is a bottomless cornucopia. We’ll make more! I skidded to a halt.

“Wrong,” I thought. “It is.”

Here is the hurdle: If the world is not a zero-sum game, then faith alone might set us free. If it is, faith will not suffice. In a physically limited system on an increasingly crowded and resource-poor planet we need to curb our appetites and impose pollution controls. Protecting whole watersheds and building bicycles instead of cars become imperative. In short, we need to work, not meditate, if reality is circular—that is, if the loops of hydrology, nutrients, and energy are closed.

The best evidence is that the total biomass of our planet has not changed since the last ice age. That is, if you compare the total mass of all living things 20,000 years ago to the total mass of all living things today, they’re about equal. Back then there were more mastodons and giant ground sloths and whales, today there are more cattle and a whole lot more people. But the sum total has not changed. The game is zero-sum.

Life depends on sunlight and the amount of sunlight striking our planet each year is fairly constant. We now divert more than half of the sunlight that falls on the earth’s land mass to human use and that use is expanding fast. The rest of the earth’s species are headed for extinction at our hands.

Hold up your hands, make two fists and take a look. At the current rate of extinction there will be vanishingly few wild creatures on earth larger than your fists 100 years from today, unless we change our course.

To the Hugger, the Woo embraces pretty illusions which might bring personal joy, but permit the world to die. To the Woo, the Hugger focuses on negative images that block personal joy—and, thus, prevent a perfect world from manifesting.

Environmentalism is the philosophic stance taken by those who believe that we are likely doomed but might be saved by our work; therefore the work must be done. We have no choice.

In everyday life, Huggers and Woos can get along, and do. Both might equally appreciate a sunny summer day, the glory of gladiolus and cosmos and roses, a fresh breeze off the ocean and the spark in loving eyes. They may well agree with Alice that, at the end of the game, we can all be Kings and Queens together. But still, the divide remains. Work or faith?

In reading Thom Hartmann’s well-considered and deeply disturbing volume about our oil-dependency, The Last Hours of Ancient Sunlight, I was brought up short by his conclusion that the most important step in addressing our pending energy-starved doom is meditation. He states, “It’s amazing to think that it’s possible to change the world by changing ourselves, by changing the way we think and live and experience every moment, but that’s been the core message of virtually every religion in history, from the most ancient and primal to the most modern and recent. You can change and save the world by changing yourself.”

Well, it may be amazing to think that, but it would be more amazing if anything came of it. Religion has always failed us as a practical approach to problem solving. Magical thinking is magical thinking, no matter how it’s done up.

This harkens back to the thoroughly debunked “proof” that prayer by strangers for patients who don’t know they’re being prayed for affects medical outcomes.

Meditation is a fine practice for those who find it rewarding, as is prayer, but demonstrable success in changing the world is sadly missing. Woos give their mystical practices credit when things work out and then allow that the desired outcome must not be God’s will when the belief-train leaves the tracks.

A deeper difference of opinion embodied in the Woo-Hugger debate involves death and survival. Woos see spirit as separate from flesh and generally believe in some sort of transcendence of this earthly plane—whether that means personal salvation, reincarnation or dissolution into Krishna consciousness and liberation. Such beliefs accord a high degree of centrality to Homo sapiens sapiens, and consequently feed the idea that we are apart from nature, that we are somehow special. At the same time, the idea that the true self will survive death must devalue life. If you believe in a glorious heaven, boundless enlightenment, permanent liberation, why hang around this vale of tears? Why not strap a bomb to your chest?

Clear-eyed Huggers see our consciousness as a function of our brains and therefore terminal. This life is the only life we will experience, so we’d best make the most of it. Making this world better for everyone, helping those we love and those in need, sharing our joy and ideas and creations, listening to the stories of others, all of it will end far too soon. Time’s a-wasting!

Furthermore, Huggers’ acceptance of science, and most particularly evolution, cuts hubris down to size. Our species is new in the history of our planet, and temporary. The sun is a middle-aged star. As cosmologist Martin Rees, of Cambridge University, once observed, “Most educated people are aware that we are the outcome of nearly 4 billion years of Darwinian selection, but many tend to think that humans are somehow the culmination. … It will not be humans who watch the sun’s demise, six billion years from now. Any creatures that then exist will be as different from us as we are from bacteria or amoebae.”

As will their consciousness. From their distant perspective we will be just one among many species that came and went from this planetary stage, if they are aware of us at all.

Tor Norretranders wrote a fascinating book titled The User Illusion: Cutting Consciousness Down to Size, back in 1999.

Among his surprising insights is the idea that there is more information in a mess than in order. The expensive part of knowledge is not gaining new information but getting rid of the old. Calculation involves eliminating irrelevance—the total on your grocery bill contains less information than all of the individual item prices taken separately, and is therefore more useful. The value of any piece of information is directly related to how much exformation (discarded data) resulted during its creation.

The brain receives about 11 million bits of information per second from sensory sources, but conscious thought can handle at most about 40 bits per second (15 to 25 is more likely). There is an awful lot going on that you are completely unaware of, and which you cannot possibly ever notice.
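The grocery-bill idea can be sketched in a few lines of Python. The prices and basket size here are invented purely for illustration: the point is that many different itemized lists collapse onto a single total, and that many-to-one collapse is the discarded detail Norretranders calls exformation.

```python
from itertools import product

# Illustrative only: three possible item prices, three items per basket.
prices = [1, 2, 3]
baskets = list(product(prices, repeat=3))  # every possible itemized list

# Group baskets by their bill total; many baskets share one total.
totals = {}
for basket in baskets:
    totals.setdefault(sum(basket), []).append(basket)

print(len(baskets))  # 27 distinct itemized lists
print(len(totals))   # only 7 distinct totals (3 through 9)
```

The total is more useful than the itemized list precisely because it carries less information; the 27-to-7 collapse is the exformation created in computing it.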

The Illusion of this work’s title is drawn from the user illusion you are experiencing right now listening to me.

If you use a computer you are probably aware that the documents on your screen, the file folders, the cascading menus, the trash can, the pictures of your children, and all the rest, are illusory in the sense that they do not exist inside your computer. They only exist on the screen. Inside, one would find a network of impossibly complicated electrical circuits processing apparently endless strings of binary numbers.
As a computer user you don’t care how the innards work, as long as they do. You interact with a surface illusion which allows you to accomplish work or play. What you see doesn’t need to be accurate or real; it needs to offer a manageable working hypothesis.

In the same way, suggests Norretranders, our consciousness is the result of one half second of processing by the most powerful computer known—the human brain. The world we interact with is entirely a simulation, a very detailed user interface, in which almost all inputs and computation are hidden. It is very deep, resulting as it does from the creation of massive exformation. (Remember that we process about 11 million bits of sensory input per second, plus whatever signals such input creates internally, and consciously experience at most about 40 bits per second.) But we experience that depth as surface, just as we experience our computer “desktop” rather than the quick flicker of binary code inside the CPU.

Life is largely a non-conscious experience.
Consciousness is far too slow to save us. When a car veers into your lane, when you swing a bat or sit on a tack, your “Me” takes over and your “I” finds out the result. The order is: input, action, consciousness.

The most troubling aspect of this unfolding of modern brain research, math, physics and information theory involves free will. It turns out that conscious free will consists of veto power. Conscious thought can halt a hand, but it cannot un-wish the urge to slap the silly grin off a face. This is profoundly at odds with the usual illusion that “I am in charge here.” (For example: it flies in the face of the Christian notion that one can choose not to think sinful thoughts.)

Norretranders’ concluding chapter is entitled “The Sublime.” Heaven is all around us, he suggests … it exists one half second in your past. Just as a map offers the barest outline of a journey, and the computer screen a pleasant workplace, consciousness provides only a hint of the depth and richness and wonder of human experience.

David Dunning, a Cornell professor of social psychology, has observed that
Psychologists over the past 50 years have demonstrated the sheer genius people have at convincing themselves of congenial conclusions while denying the truth of inconvenient ones. You can call it self-deception, but it also goes by the names rationalization, wishful thinking, defensive processing, self-delusion, and motivated reasoning. There is a robust catalogue of strategies people follow to believe what they want to, and research psychologists are hardly done describing the shape or the size of that catalogue. All this rationalization can lead people toward false beliefs, or perhaps more commonly, to tenaciously hang on to false beliefs they should really reconsider.

An interesting result of our tenacious adherence to belief over reason is that we often seem to judge others by their expressed beliefs rather than by their evident behavior.

It seems to me that belief has very little to do with the good or bad results we leave in our wake. Mother Teresa’s legacy is her charitable work, minus whatever one knows about her dark side, not her Catholicism. The Dalai Lama is an atheist, but that doesn’t make a whit of difference in his work to free the Tibetan people or, more broadly, to sow peace around the globe. Gandhi practiced Hinduism but asserted that all religions were equal—still, it is his development of nonviolent resistance that changed India and the world. Martin Luther King, Jr.’s legacy is equality under the law, and it was nonviolent resistance and community organizing, not prayer, that brought the changes King achieved.
In the same way, the hanging of the Salem witches or the torture of unbelievers during the Spanish Inquisition is repellent to us today not because of the beliefs of the practitioners, but because of their acts. Islamic suicide bombers are not a threat because of their religious tenets, but due to explosives strapped to their chests, and U.S. predator drones that target wedding parties are not made moral by the prayers of Senators and Congressmen.

As one of my favorite songwriters, Carrie Newcomer, once put it, “We shall surely be known forever by the tracks we leave.”
Yet, all too often, we forget that profession of belief is not of much use in evaluating the world or our fellow beings. The problem has always been the things we don’t know. There are questions about our ultimate origin and our ultimate destiny that, so far at least, are beyond the reach of scientific inquiry. Our questions and fears are sometimes soothed by woo-woo practitioners who claim to know more, to have seen more clearly, to have received stone tablets or golden records, or heard angels, or been taken for a ride in a flying saucer.
The real problems emerge when we fail to question our beliefs. Francis Bacon said it almost 400 years ago: if you begin in certainty you are likely to end in doubt, but if you begin in doubt you can gradually build to certainty. As UUs we have placed a reminder right up front in our fourth principle, in which we promote a free and responsible search for truth and meaning. When we find that truth we can act on it, and move our world toward health, happiness, inclusion and justice.

As my fellow non-theist Noam Chomsky has written, “We are after all biological organisms not angels . . . If humans are part of the natural world, not supernatural beings, then human intelligence has its scope and limits, determined by initial design.  We can thus anticipate certain questions will not fall within [our] cognitive reach, just as rats are unable to run mazes with numerical properties, lacking the appropriate concepts.  Such questions, we might call ‘mysteries-for-humans’ just as some questions pose ‘mysteries-for-rats.’ Among these mysteries may be questions we raise, and others we do not know how to formulate properly or at all.”

I have no brief against Woo-woos who choose to believe that faith can feed the world, but if I were a sad and hungry little bunny, I think I’d opt for a carrot and a hug over prayer. We might laugh at McArthur Wheeler for believing that lemon juice would make him invisible, or feel some pity for his evident ignorance, but what we decry is not his belief but his criminal action. Will we leave our grandchildren and great-great-grandchildren an abundant or a barren earth? One hundred years from now, or one thousand, our professions of faith will ring hollow and we will surely be praised or damned for what we did or didn’t do.

May it be so.


In 2010 I was lucky enough to be asked to address conventions, civic groups, and bookstore crowds, and to conduct church services in: Newark, NJ; Asheville, Black Mountain, Boone, Burnsville, Charlotte, Franklin, Raleigh, and Tryon, NC; Charleston, Columbia and Spartanburg, SC; Minneapolis and Denver. To date, 2011 has taken me to UNCA’s Reuters Center, Burnsville, Hendersonville, Greensboro, Knoxville, Tenn., Cambridge, Mass., Des Moines, Iowa, and Washington, DC.

