October 23, 2013 Rituals

Click here for a pdf version.

Probing Culture’s Secrets, From Capuchins to Children

Michael Balter

LONDON—Scientists once designated culture as the exclusive province of humans. But that elitist attitude is long gone, as evidenced by a recent meeting* here on how culture, usually defined as the passing on of traditions by learning from others, arises and changes. The 700 attendees, a mixture of researchers and members of the public, heard talks on cultural transmission in fish, meerkats, birds, and monkeys, as well as in extinct and living humans. Researchers probed questions such as what sparks cultural trends and how complex traditions are transmitted, and most agreed that studies of both animals and children will provide important clues. “The field of cultural evolution ranges from fish to humans and includes child development,” says meeting co-organizer Andrew Whiten, a psychologist at the University of St. Andrews in the United Kingdom.
But why do certain cultural trends, such as fashions, begin and catch on? Even science finds it hard to answer that question. At the meeting, anthropologist Susan Perry of the University of California (UC), Los Angeles, described her team’s work observing white-faced capuchin monkeys since the early 1990s at several sites in Costa Rica. The monkeys have adopted a number of local traditions, some directly related to foraging for food, such as either cracking or rubbing woody capsules of Luehea fruits to get out their seeds. But other traditions have no clear survival purpose, such as sniffing each other’s fingers and inserting them into a companion’s nose, or biting off a big chunk of another monkey’s fur and holding it in the mouth while he or she playfully tries to get it back. Although foraging traditions tend to be long-lasting, Perry has found that, perhaps like some human fashions, these more mysterious capuchin trends tend to last only about 10 years or so before fading.
In one group of capuchins, the team’s long-term observations have allowed them to witness a rare event: the emergence of a new tradition. In what Perry calls a “bizarre” and “high-risk” ritual, the monkeys poke each other’s eyeballs. One monkey will insert his or her long, sharp, dirty fingernail deep into the eye socket of another animal, between the eyelid and the eyeball, up to the first knuckle. In videos Perry played for the meeting, the monkeys on the receiving end of the fingernail, typically social allies, could be seen to grimace and bat their eyelids furiously (as did many members of the audience) but did not attempt to remove the finger or otherwise object to the treatment. Indeed, during these eye-poking sessions, which last up to an hour, monkeys insisted on the finger being reinserted if it popped out of the eye socket.
Why would the monkeys do something potentially dangerous? Perry suggests that capuchins, which, like humans, are highly cooperative and live in large groups, use this apparently pain-inflicting behavior to test the strength of their social bonds. Back in the 1970s, evolutionary biologist Amotz Zahavi of Tel Aviv University in Israel suggested that some animals engage in certain behaviors to solidify alliances, and researchers have observed some examples. For example, some male baboons will hold each other’s testicles before teaming up to fight higher-ranking individuals, apparently to establish trust before going into battle.
When it comes to the capuchins, “this is a plausible hypothesis,” Whiten says, especially because more functional explanations do not seem to explain the eye poking. Nevertheless, Whiten adds, “it is difficult to test directly.”
Perry notes that capuchin behaviors such as eye poking and cracking fruit capsules are true traditions, but they don’t ratchet up into the kinds of complex culture prevalent in every human society, from language to literature to sophisticated technology. Animal traditions lack this cumulative cultural evolution.
How do humans wind up the cultural ratchet? At the meeting, Derek Lyons, a developmental psychologist at UC Irvine, presented new data on a phenomenon in young children that he and others think may be key to humans’ faithful transmission of complex culture: “overimitation,” or the tendency to copy the actions of an adult even when they are unnecessary for achieving a goal. No other animal has been shown to copy in this way, Lyons and others say.
Lyons’s work builds on a landmark 2005 study by Whiten and primatologist Victoria Horner, now at Emory University in Atlanta. They demonstrated that when young chimpanzees and children were shown how to retrieve a reward from a box using a series of both relevant and irrelevant steps, the chimps skipped the unnecessary steps, whereas the children tended to imitate everything. Recent work by another team suggests that overimitation is universal in human children (http://news.sciencemag.org/sciencenow/2010/05/kids-overimitate-adults-regardle.html). Lyons and his co-workers reported further work in 3- to 5-year-old children in 2007 in the Proceedings of the National Academy of Sciences. For example, children were shown how to retrieve toy turtles from transparent plastic containers using irrelevant steps such as tapping the container with a feather and relevant steps such as opening the container’s door. The children continued to overimitate even when they were led to believe that the experiment was over or when they were explicitly told to avoid “silly” extra steps.
Why do children do this? In London, Lyons played a new series of videotaped experiments with children of the same ages in which he attempted to, as he put it, “snap them out of” their overimitative tendencies. In one experiment, a puppet orangutan named Felix, stationed at an opening on the other end of the box, competed with the children to see who could get the toy turtle out of the box first. Again, Lyons showed each child how to get the turtle while mixing in irrelevant actions such as tapping the box and pushing unnecessary levers. The children, who could not see what Felix was doing, continued to perform most of Lyons’s irrelevant actions, even when Felix kept winning and getting the turtle.
The only way to avoid overimitation, Lyons found, was to convey that one of his actions was unintentional. When he pretended to get a call from his mother on his cell phone and “accidentally” flipped a useless lever while gesturing during the supposed conversation, the children did not flip that lever.
These findings are inconsistent with earlier hypotheses that children overimitate to please adults, Lyons said. Rather, he concluded, they support something he called “automatic causal encoding” (ACE), in which a child assumes that the adult knows what he or she is doing and that each step in the procedure is necessary. “ACE is an important mechanism kids use to bootstrap their knowledge of complex artifacts,” he says. Archaeologist Dietrich Stout of Emory University, who studies prehistoric tool making, says ACE may have been important for the cultural transmission of stone-tool technologies in early hominins. “Certain things, like the internal workings of the plastic box or the precise force with which to hit a stone core, are not directly available to the observer,” Stout says. He agrees with Lyons that such a strategy is “a logical approach when confronted with a complicated, unfamiliar artifact.”
Uta Frith, a cognitive neuroscientist at University College London, concurs. “This is an example of actions for which we cannot see rhyme or reason but which we believe are important and relevant to us,” Frith says. “I am persuaded that this is the secret of the evolution of human culture.”
———

Mind and Matter: How Irrational Rituals Bring Us Together

Alison Gopnik

Human beings love rituals. Of course, rituals are at the center of religious practice. But even secularists celebrate the great transitions of life with arbitrary actions, formalized words and peculiar outfits. To become part of my community of hard-headed, rational, scientific Ph.D.s, I had to put on a weird gown and even weirder hat, walk solemnly down the aisle of a cavernous building, and listen to rhythmically intoned Latin.

Our mundane actions are suffused with arbitrary conventions, too. Grabbing food with your hands is efficient and effective, but we purposefully slow ourselves down with cutlery rituals. In fact, if you’re an American, the chances are that you cut your food with your fork in your left hand, then transfer the fork to your right hand to eat the food, and then swap it back again. You may not even realize that you’re doing it. That elaborate fork and knife dance makes absolutely no sense.

But that is the central paradox of ritual. Rituals are intentionally useless, purposefully irrational. So why are they so important to us?

The cognitive psychologist Cristine Legare at the University of Texas at Austin has been trying to figure out where rituals come from and what functions they serve. One idea is that rituals declare that you are a member of a particular group.

Everybody eats, but only Americans swap their knives and forks. (Several spy movies have used this as a plot point.) Sharing your graduation ceremony marks you as part of the community of Ph.D.s more effectively than the solitary act of finishing your dissertation.

The fact that rituals don’t make practical sense is just what makes them useful for social identification. If someone just puts tea in a pot and adds hot water, then I know only that they are a sensible person who wants tea. If instead they kneel on a mat and revolve a special whisk a precise number of times, or carefully use silver tongs to drop exactly two lumps into a china cup, I can conclude that they are members of a particular aristocratic tea culture.

It turns out that rituals are deeply rooted, and they emerge early. Surprisingly young children are already sensitive to the difference between purposeful actions and rituals, and they adopt rituals for themselves.

In a new paper forthcoming in the journal Cognition, Dr. Legare and colleagues showed 3- to 6-year-old children a video of people performing a complicated sequence of eight actions with a mallet and a pegboard. Someone would pick up the mallet, place it on one side, push up a peg with her hand, and so on. Then the experimenters gave the children the mallet and pegboard and said, “Now it’s your turn.”

You could interpret this sequence of actions as an intelligent attempt to bring about a particular outcome, pushing up the pegs. Or you could interpret it as a ritual.

Sometimes the children saw a single person perform the actions twice. Sometimes they saw two people perform the actions simultaneously. The identical synchronous actions suggested that the two people were from the same group.

When they saw two people do exactly the same thing at the same time, the children produced exactly the same sequence of actions themselves. They also explained their actions by saying things like “I had to do it the way that they did.” They treated the actions as if they were a ritual.

When they saw the single actor, they were much less likely to imitate exactly what the other person did. Instead, they treated it like a purposeful action. They would vary what they did themselves to make the pegs pop up in a new way.

Dr. Legare thinks that, from the time we are very young children, we have two ways of thinking about people – a “ritual stance” and an “instrumental stance.” We learn as much from the irrational and arbitrary things that people do as from the intelligent and sensible ones.
———

Sense and Superstition

By JANE L. RISEN and A. DAVID NUSSBAUM

SUPERSTITIOUS people do all sorts of puzzling things. But it’s not just the superstitious who knock on wood. From time to time, we all rap our knuckles on a nearby table if we happen to let fate-tempting words slip out. “The cancer is in remission, knock on wood,” we might say.

In fact, it’s so common we often don’t think about it. But it’s worth asking: why do people who do not believe that knocking on wood has an effect on the world often do it anyway? Because it works.

No, knocking on wood won’t change what happens. The cancer is no more likely to stay in remission one way or the other. But knocking on wood does affect our beliefs, and that’s almost as important.

Research finds that people, superstitious or not, tend to believe that negative outcomes are more likely after they “jinx” themselves. Boast that you’ve been driving for 20 years without an accident, and your concern about your drive home that evening rises. The superstitious may tell you that your concern is well founded because the universe is bound to punish your hubris. Psychological research has a less magical explanation: boasting about being accident-free makes the thought of getting into an accident jump to mind and, once there, that thought makes you worry.

That makes sense intuitively. What’s less intuitive is how a simple physical act, like knocking on wood, can alleviate that concern.

In one study, to be published in the Journal of Experimental Psychology: General, one of us, Jane L. Risen, and her colleagues Yan Zhang and Christine Hosey induced college students to jinx themselves by asking half of them to say out loud that they would definitely not get into a car accident this winter. Compared with those who did not jinx themselves, these students, when asked about it later, thought it was more likely that they would get into an accident.

After the “jinx,” in the guise of clearing their minds, we invited some of these students to knock on the wooden table in front of them. Those who knocked on the table were no more likely to think that they would get into an accident than students who hadn’t jinxed themselves in the first place. They had reversed the effects of the jinx.

Knocking on wood may not be magical, but superstition proved helpful in understanding why the ritual was effective. Across cultures, superstitions intended to reverse bad luck, like throwing salt or spitting, often share a common ingredient. In one way or another, they involve an avoidant action, one that exerts force away from oneself, as if pushing something away.

This pushing action turns out to be important, because people’s beliefs are often influenced by bodily feelings and movements. For example, other research shows that people tend to agree with the same arguments more when they hear them while they are nodding their head up and down (as if they were saying “yes”) rather than shaking it from side to side (as if they were saying “no”).

Because people generally push bad things away, we suggest that they may have built up an association between pushing actions and avoiding harm or danger. This led us to speculate that when people knock on wood, or throw salt, or spit, the ritual may help calm the mind, because such avoidant actions lead people to simulate the feelings, thoughts and sensations they experience when they avoid something bad.

To test this, in our knocking-on-wood experiment we asked some people to knock down on the table and away from themselves, while we had others knock up on the underside of the table, toward themselves. Those who knocked up engaged in an approach action, not an avoidant one. Despite knocking on wood, people who knocked up failed to reverse the perceived jinx; if anything, their concerns were made worse compared with people who did not knock at all.

Next we tested whether avoidant movements would have the same effect in situations free from the baggage of superstition. Instead of having participants knock down on wood after jinxing themselves, we had them throw a ball (also an avoidant action, but not one associated with a superstition). We conducted two studies, one in Chicago and another in Singapore. We found that the act of throwing a ball also reduces people’s concerns following a jinx, in either culture. Even pretending to throw a ball has the same effect as actually throwing it.

While almost any behavior can be turned into a superstitious ritual, perhaps the ones that are most likely to survive are those that happen to be effective at changing how we feel. We can seek to rid ourselves of superstitions in the name of enlightenment and progress, but we are likely to find that some may be hard to shake because, although they may be superficially irrational, they may not be unreasonable. Superstitious rituals can really work — but it’s not magic, it’s psychology.

Jane L. Risen and A. David Nussbaum are, respectively, an associate professor of behavioral science and an adjunct assistant professor of behavioral science at the Booth School of Business at the University of Chicago.

October 9, 2013 Without God (Rerun from Dec 2009)

Click here for a pdf version.

Without God

By Steven Weinberg

In his celebrated 1837 Phi Beta Kappa Oration at Harvard, titled “The American Scholar,” Ralph Waldo Emerson predicted that a day would come when America would end what he called “our long apprenticeship to the learning of other lands.” His prediction came true in the twentieth century, and in no area of learning more so than in science. This surely would have pleased Emerson. When he listed his heroes he would generally include Copernicus and Galileo and Newton along with Socrates and Jesus and Swedenborg. But I think that Emerson would have had mixed feelings about one consequence of the advance of science here and abroad—that it has led to a widespread weakening of religious belief.[1]

Emerson was hardly orthodox—according to Herman Melville, he felt “that had he lived in those days when the world was made, he might have offered some valuable suggestions”—but he was for a while a Unitarian minister, and he usually found it possible to speak favorably of the Almighty. Emerson grieved over what he saw in his own time as a weakening of belief, as opposed to mere piety and churchgoing, in America and even more so in England, though I can’t say that he attributed it to the advance of science.

The idea of a conflict between science and religion has a long pedigree. According to Edward Gibbon, it was the view of the Byzantine church that “the study of nature was the surest symptom of an unbelieving mind.” Perhaps the best-known portrayal of this conflict is a book published in 1896 by Cornell’s first president, Andrew Dickson White, with the title A History of the Warfare of Science with Theology in Christendom.

In recent times there has been a reaction against talk of warfare between science and religion. White’s “conflict thesis” was attacked in a 1986 paper by David Lindberg and Ronald Numbers, both well-known historians of science, who pointed out many flaws in White’s scholarship. The Templeton Foundation offers a large prize to those who argue that there is no conflict between science and religion. Some scientists take this line because they want to protect science education from religious fundamentalists. Stephen Jay Gould argued that there could be no conflict between science and religion, because science deals only with facts and religion only with values. This certainly was not the view held in the past by most adherents of religion, and it is a sign of the decay of belief in the supernatural that many today who call themselves religious would agree with Gould.

Let’s grant that science and religion are not incompatible—there are after all some (though not many) excellent scientists, like Charles Townes and Francis Collins, who have strong religious beliefs. Still, I think that between science and religion there is, if not an incompatibility, at least what the philosopher Susan Haack has called a tension, that has been gradually weakening serious religious belief, especially in the West, where science has been most advanced. Here I would like to trace out some of the sources of this tension, and then offer a few remarks about the very difficult question raised by the consequent decline of belief, the question of how it will be possible to live without God.

1.

I do not think that the tension between science and religion is primarily a result of contradictions between scientific discoveries and specific religious doctrines. This is what chiefly concerned White, but I think he was looking in the wrong direction. Galileo remarked in his famous letter to Grand Duchess Christina that “the intention of the Holy Ghost is to teach us how to go to heaven, not how heaven goes,” and this was not just his opinion; he was quoting a prince of the Church, Cardinal Baronius, the Vatican librarian. Contradictions between scripture and scientific knowledge have occurred again and again, and have generally been accommodated by the more enlightened among the religious. For instance, there are verses in both the Old and New Testament that seem to show that the earth is flat, and as noted by Copernicus (quoted by Galileo in the same letter to Christina) these verses led some early Church fathers like Lactantius to reject the Greek understanding that the earth is a sphere, but educated Christians long before the voyages of Columbus and Magellan had come to accept the spherical shape of the earth. Dante found the interior of the spherical earth a convenient place to store sinners.

What was briefly a serious issue in the early Church has today become a parody. The astrophysicist Adrian Melott of the University of Kansas, in a fight with zealots who wanted equal time for creationism in the Kansas public schools, founded an organization called FLAT (Families for Learning Accurate Theories). His society parodied creationists by demanding equal time for flat earth geography, arguing that children should be exposed to both sides of the controversy over the shape of the earth.

But if the direct conflict between scientific knowledge and specific religious beliefs has not been so important in itself, there are at least four sources of tension between science and religion that have been important.

The first source of tension arises from the fact that religion originally gained much of its strength from the observation of mysterious phenomena—thunder, earthquakes, disease—that seemed to require the intervention of some divine being. There was a nymph in every brook, and a dryad in every tree. But as time passed more and more of these mysteries have been explained in purely natural ways. Explaining this or that about the natural world does not of course rule out religious belief. But if people believe in God because no other explanation seems possible for a whole host of mysteries, and then over the years these mysteries were one by one resolved naturalistically, then a certain weakening of belief can be expected. It is no accident that the advent of widespread atheism and agnosticism among the educated in the eighteenth century followed hard upon the birth of modern science in the previous century.

From the beginning, the explanatory power of science worried those who valued religion. Plato was so horrified at the attempt of Democritus and Leucippus to explain nature in terms of atoms without reference to the gods (even though they did not get very far with this) that in Book Ten of the Laws he urged five years of solitary confinement for those who deny that the gods exist or that they care about humans, with death to follow if the prisoner is not reformed. Isaac Newton, offended by the naturalism of Descartes, also rejected the idea that the world could be explained without God. He argued for instance in a letter to Richard Bentley that no explanation but God could be given for the distinction we observe between bright matter, the sun and stars, and dark matter, like the earth. This is ironic, because of course it was Newton and not Descartes who was right about the laws of motion. No one did more than Newton to make it possible to work out thoroughly nontheistic explanations of what we see in the sky, but Newton himself was not in this sense a Newtonian.

Of course, not everything has been explained, nor will it ever be. The important thing is that we have not observed anything that seems to require supernatural intervention for its explanation. There are some today who cling to the remaining gaps in our understanding (such as our ignorance about the origin of life) as evidence for God. But as time passes and more and more of these gaps are filled in, their position gives an impression of people desperately holding on to outmoded opinions.

The problem for religious belief is not just that science has explained a lot of odds and ends about the world. There is a second source of tension: that these explanations have cast increasing doubt on the special role of man, as an actor created by God to play a starring part in a great cosmic drama of sin and salvation. We have had to accept that our home, the earth, is just another planet circling the sun; our sun is just one of a hundred billion stars in a galaxy that is just one of billions of visible galaxies; and it may be that the whole expanding cloud of galaxies is just a small part of a much larger multiverse, most of whose parts are utterly inhospitable to life. As Richard Feynman has said, “The theory that it’s all arranged as a stage for God to watch man’s struggle for good and evil seems inadequate.”

Most important so far has been the discovery by Charles Darwin and Alfred Russel Wallace that humans arose from earlier animals through natural selection acting on random heritable variations, with no need for a divine plan to explain the advent of humanity. This discovery led some, including Darwin, to lose their faith. It’s not surprising that of all the discoveries of science, this is the one that continues most to disturb religious conservatives. I can imagine how disturbed they will feel in the future, when at last scientists learn how to understand human behavior in terms of the chemistry and physics of the brain, and nothing is left that needs to be explained by our having an immaterial soul.

Note that I refer here to behavior, not consciousness. Something purely subjective, like how we feel when we see the color red or discover a physical theory, seems so different from the objective world described by science that it is difficult to see how they can ever come together. As Colin McGinn has said in these pages:

The problem is how to integrate the conscious mind with the physical brain—how to reveal a unity beneath this apparent diversity. That problem is very hard, and I do not believe anyone has any good ideas about how to solve it.[2]

On the other hand, both brain activity and behavior (including what we say about our feelings) are in the same world of objective phenomena, and I know of no intrinsic obstacle to their being integrated in a scientific theory, though it is clearly not going to be easy. This does not mean that we can or should forget about consciousness, and like B.F. Skinner with his pigeons concern ourselves only with behavior. We know, as well as we know anything, that our behavior is partly governed by our consciousness, so understanding behavior will necessarily require working out a detailed correspondence between the objective and subjective. This may not tell us how one arises from the other, but at least it will confirm that there is nothing supernatural about the mind.

Some nonscientists seize on certain developments in modern physics that suggest the unpredictability of natural phenomena, such as the advent of quantum mechanics or chaos theory, as signs of a turn away from determinism, of the sort that would make an opening for divine intervention or an incorporeal soul. These theories have forced us to refine our view of determinism, but not I think in any way that has implications for human life.

A third source of tension between science and religious belief has been more important in Islam than in Christianity. Around 1100, the Sufi philosopher Abu Hamid al-Ghazzali argued against the very idea of laws of nature, on the grounds that any such law would put God’s hands in chains. According to al-Ghazzali, a piece of cotton placed in a flame does not darken and smolder because of the heat of the flame, but because God wants it to darken and smolder. Laws of nature could have been reconciled with Islam, as a summary of what God usually wants to happen, but al-Ghazzali did not take that path.

Al-Ghazzali is often described as the most influential Islamic philosopher. I wish I knew enough to judge how great was the impact on Islam of his rejection of science. At any rate, science in Muslim countries, which had led the world in the ninth and tenth centuries, went into a decline in the century or two after al-Ghazzali. As a portent of this decline, in 1194 the Ulama of Córdoba burned all scientific and medical texts.

Nor has science revived in the Islamic world. There are talented scientists who have come to the West from Islamic countries and do work of great value here, among them the Pakistani Muslim physicist Abdus Salam, who in 1979 became the first Muslim scientist to be awarded a Nobel Prize, for work he did in England and Italy. But in the past forty years I have not seen any paper in the areas of physics or astronomy that I follow that was written in an Islamic country and was worth reading. Thousands of scientific papers are turned out in these countries, and perhaps I missed something. Still, in 2002 the periodical Nature carried out a survey of science in Islamic countries, and found just three areas in which the Islamic world produced excellent science, all three directed toward applications rather than basic science. They were desalination, falconry, and camel breeding.

Something like al-Ghazzali’s concern for God’s freedom surfaced for a while in Christian Europe, but with very different results. In Paris and Canterbury in the thirteenth century there was a wave of condemnations of those teachings of Aristotle that seemed to limit the freedom of God to do things like create a vacuum or make several worlds or move the heavens in straight lines. The influence of Thomas Aquinas and Albertus Magnus saved the philosophy of Aristotle for Europe, and with it the idea of laws of nature. But although Aristotle was no longer condemned, his authority had been questioned—which was fortunate, since nothing could be built on his physics. Perhaps it was the weakening of Aristotle’s authority by reactionary churchmen that opened the door to the first small steps toward finding the true laws of nature at Paris and Lisieux and Oxford in the fourteenth century.

There is a fourth source of tension between science and religion that may be the most important of all. Traditional religions generally rely on authority, whether the authority is an infallible leader, such as a prophet or a pope or an imam, or a body of sacred writings, a Bible or a Koran. Perhaps Galileo did not get into trouble solely because he was expressing views contrary to scripture, but because he was doing so independently, rather than as a theologian acting within the Church.

Of course, scientists rely on authorities, but of a very different sort. If I want to understand some fine point about the general theory of relativity, I might look up a recent paper by an expert in the field. But I would know that the expert might be wrong. One thing I probably would not do is to look up the original papers of Einstein, because today any good graduate student understands general relativity better than Einstein did. We progress. Indeed, in the form in which Einstein described his theory it is today generally regarded as only what is known in the trade as an effective field theory; that is, it is an approximation, valid for the large scales of distance for which it has been tested, but not under very cramped conditions, as in the early big bang.

We have our heroes in science, like Einstein, who was certainly the greatest physicist of the past century, but for us they are not infallible prophets. For those who in everyday life respect independence of mind and openness to contradiction, traits that Emerson admired—especially when it came to religion—the example of science casts an unfavorable light on the deference to authority of traditional religion. The world can always use heroes, but could do with fewer prophets.

The weakening of religious belief is obvious in Western Europe, but it may seem odd to talk about this happening in America. No one who expressed doubt about the existence of God could possibly be elected president of the United States. Nevertheless, though I don’t have any scientific evidence on this point, on the basis of personal observation it seems to me that while many Americans fervently believe that religion is a good thing, and get quite angry when it is criticized, even those who feel this way often do not have much in the way of clear religious belief. Occasionally I have found myself talking with friends, who identify themselves with some organized religion, about what they think of life after death, or of the nature of God, or of sin. Most often I’ve been told that they do not know, and that the important thing is not what you believe, but how you live. I’ve heard this even from a Catholic priest. I applaud the sentiment, but it’s quite a retreat from religious belief.

Though I can’t prove it, I suspect that when Americans are asked in polls whether they believe in God or angels or heaven or hell they feel that it is a religious duty to say that they do, whatever they actually believe. And of course hardly anyone today in the West seems to have even the slightest interest in the great controversies—Arians vs. Athanasians, monophysites vs. monothelites, justification by faith or by works—that used to be taken so seriously that they set Christians at each other’s throats.

I have been emphasizing religious belief here, the belief in facts about God or the afterlife, though I am well aware that this is only one aspect of the religious life, and for many not the most important part. Perhaps I emphasize belief because as a physicist I am professionally concerned with finding out what is true, not what makes us happy or good. For many people, the important thing about their religion is not a set of beliefs but a host of other things: a set of moral principles; rules about sexual behavior, diet, observance of holy days, and so on; rituals of marriage and mourning; and the comfort of affiliation with fellow believers, which in extreme cases allows the pleasure of killing those who have different religious affiliations.

For some there is also a sort of spirituality that Emerson wrote about, and which I don’t understand, often described as a sense of union with nature or with all humanity, that doesn’t involve any specific beliefs about the supernatural. Spirituality is central to Buddhism, which does not call for belief in God. Even so, Buddhism has historically relied on belief in the supernatural, specifically in reincarnation. It is the desire to escape the wheel of rebirth that drives the search for enlightenment. The heroes of Buddhism are the bodhisattvas, who, having attained enlightenment, nevertheless return to life in order to show the way to a world shrouded in darkness. Perhaps in Buddhism too there has been a decline of belief. A recent book by the Dalai Lama barely mentions reincarnation, and Buddhism is now in decline in Japan, the Asian nation that has made the greatest progress in science.

The various uses of religion may keep it going for a few centuries even after the disappearance of belief in anything supernatural, but I wonder how long religion can last without a core of belief in the supernatural, when it isn’t about anything external to human beings. To compare great things with small, people may go to college football games mostly because they enjoy the cheerleading and marching bands, but I doubt if they would keep going to the stadium on Saturday afternoons if the only things happening there were cheerleading and marching bands, without any actual football, so that the cheerleading and the band music were no longer about anything.

2.

It is not my purpose here to argue that the decline of religious belief is a good thing (although I think it is), or to try to talk anyone out of their religion, as eloquent recent books by Richard Dawkins, Sam Harris, and Christopher Hitchens have. So far in my life, in arguing for spending more money on scientific research and higher education, or against spending on ballistic missile defense or sending people to Mars, I think I have achieved a perfect record of never having changed anyone’s mind. Rather, I want just to offer a few opinions, on the basis of no expertise whatever, for those who have already lost their religious beliefs, or who may be losing them, or fear that they will lose their beliefs, about how it is possible to live without God.

First, a warning: we had better beware of substitutes. It has often been noted that the greatest horrors of the twentieth century were perpetrated by regimes—Hitler’s Germany, Stalin’s Russia, Mao’s China—that while rejecting some or all of the teachings of religion, copied characteristics of religion at its worst: infallible leaders, sacred writings, mass rituals, the execution of apostates, and a sense of community that justified exterminating those outside the community.

When I was an undergraduate I knew a rabbi, Will Herberg, who worried about my lack of religious faith. He warned me that we must worship God, because otherwise we would start worshiping each other. He was right about the danger, but I would suggest a different cure: we should get out of the habit of worshiping anything.

I’m not going to say that it’s easy to live without God, that science is all you need. For a physicist, it is indeed a great joy to learn how we can use beautiful mathematics to understand the real world. We struggle to understand nature, building a great chain of research institutes, from the Museum of Alexandria and the House of Wisdom of Baghdad to today’s CERN and Fermilab. But we know that we will never get to the bottom of things, because whatever theory unifies all observed particles and forces, we will never know why it is that that theory describes the real world and not some other theory.

Worse, the worldview of science is rather chilling. Not only do we not find any point to life laid out for us in nature, no objective basis for our moral principles, no correspondence between what we think is the moral law and the laws of nature, of the sort imagined by philosophers from Anaximander and Plato to Emerson. We even learn that the emotions that we most treasure, our love for our wives and husbands and children, are made possible by chemical processes in our brains that are what they are as a result of natural selection acting on chance mutations over millions of years. And yet we must not sink into nihilism or stifle our emotions. At our best we live on a knife-edge, between wishful thinking on one hand and, on the other, despair.

What, then, can we do? One thing that helps is humor, a quality not abundant in Emerson. Just as we laugh with sympathy but not scorn when we see a one-year-old struggling to stay erect when she takes her first steps, we can feel a sympathetic merriment at ourselves, trying to live balanced on a knife-edge. In some of Shakespeare’s greatest tragedies, just when the action is about to reach an unbearable climax, the tragic heroes are confronted with some “rude mechanical” offering comic observations: a gravedigger, or a doorkeeper, or a pair of gardeners, or a man with a basket of figs. The tragedy is not lessened, but the humor puts it in perspective.

Then there are the ordinary pleasures of life, which have been despised by religious zealots, from Christian anchorites in the Egyptian deserts to today’s Taliban and Mahdi Army. Visiting New England in early June, when the rhododendrons and azaleas are blazing away, reminds one how beautiful spring can be. And let’s not dismiss the pleasures of the flesh. We who are not zealots can rejoice that when bread and wine are no longer sacraments, they will still be bread and wine.

There are also the pleasures brought to us by the high arts. Here I think we are going to lose something with the decline of religious belief. Much great art has arisen in the past from religious inspiration. For instance, I can’t imagine the poetry of George Herbert or Henry Vaughan or Gerard Manley Hopkins being written without sincere religious belief. But nothing prevents those of us who have no religious belief from enjoying religious poetry, any more than not being English prevents Americans from enjoying the patriotic speeches in Richard II or Henry V.

We may be sad that no more great religious poetry will be written in the future. We see already that little English-language poetry written in the past few decades owes anything to belief in God, and in some cases where religion does enter, as with poets like Stevie Smith or Philip Larkin, it is the rejection of religion that provides their inspiration. But of course very great poetry can be written without religion. Shakespeare provides an example; none of his work seems to me to show the slightest hint of serious religious inspiration. Given Ariel and Prospero, we see that poets can do without angels and prophets.

I do not think we have to worry that giving up religion will lead to a moral decline. There are plenty of people without religious faith who live exemplary moral lives (as for example, me), and though religion has sometimes inspired admirable ethical standards, it has also often fostered the most hideous crimes. Anyway, belief in an omnipotent omniscient creator of the world does not in itself have any moral implications—it’s still up to you to decide whether it is right to obey His commands. For instance, even someone who believes in God can feel that Abraham in the Old Testament was wrong to obey God in agreeing to sacrifice Isaac, and that Adam in Paradise Lost was right to disobey God and follow Eve in eating the apple, so that he could stay with her when she was driven from Eden. The young men who flew airplanes into buildings in the US or exploded bombs in crowds in London or Madrid or Tel Aviv were not just stupid in imagining that these were God’s commands; even thinking that these were His commands, they were evil in obeying them.

The more we reflect on the pleasures of life, the more we miss the greatest consolation that used to be provided by religious belief: the promise that our lives will continue after death, and that in the afterlife we will meet the people we have loved. As religious belief weakens, more and more of us know that after death there is nothing. This is the thing that makes cowards of us all.

Cicero offered comfort in De Senectute by arguing that it was silly to fear death. After more than two thousand years his words still have not the slightest power to console us. Philip Larkin was much more convincing about the fear of death:

This is a special way of being afraid
No trick dispels. Religion used to try,
That vast moth-eaten musical brocade
Created to pretend we never die,

And specious stuff that says No rational being
Can fear a thing it will not feel, not seeing
That this is what we fear—no sight, no sound,
No touch or taste or smell, nothing to think with,
Nothing to love or link with,
The anaesthetic from which none come round.

Living without God isn’t easy. But its very difficulty offers one other consolation—that there is a certain honor, or perhaps just a grim satisfaction, in facing up to our condition without despair and without wishful thinking—with good humor, but without God.

The New York Review of Books
Volume 55, Number 14 · September 25, 2008

September 25, 2013 Science and Big Data

Click here for a pdf version.

To Know, but Not Understand: David Weinberger on Science and Big Data

by David Weinberger (original)

Thomas Jefferson and George Washington recorded daily weather observations, but they didn’t record them hourly or by the minute. Not only did they have other things to do; such data didn’t seem useful. Even after the invention of the telegraph enabled the centralization of weather data, the 150 volunteers who received weather instruments from the Smithsonian Institution in 1849 still reported only once a day. Now there is a literally immeasurable, continuous stream of climate data from satellites circling the earth, buoys bobbing in the ocean, and Wi-Fi-enabled sensors in the rain forest. We are measuring temperatures, rainfall, wind speeds, CO2 levels, and pressure pulses of solar wind. All this data and much, much more became worth recording once we could record it, once we could process it with computers, and once we could connect the data streams and the data processors with a network.

This would not be the first time. For example, when Sir Francis Bacon said that knowledge of the world should be grounded in carefully verified facts about the world, he wasn’t just giving us a new method to achieve old-fashioned knowledge. He was redefining knowledge as theories that are grounded in facts. The Age of the Net is bringing about a redefinition at the same scale. Scientific knowledge is taking on properties of its new medium, becoming like the network in which it lives.

In this excerpt from my new book, Too Big To Know, we’ll look at a key property of the networking of knowledge: hugeness.

In 1963, Bernard K. Forscher of the Mayo Clinic complained in a now famous letter printed in the prestigious journal Science that scientists were generating too many facts. Titled “Chaos in the Brickyard,” the letter warned that the new generation of scientists was too busy churning out bricks — facts — without regard to how they go together. Brickmaking, Forscher feared, had become an end in itself. “And so it happened that the land became flooded with bricks. … It became difficult to find the proper bricks for a task because one had to hunt among so many. … It became difficult to complete a useful edifice because, as soon as the foundations were discernible, they were buried under an avalanche of random bricks.”

If science looked like a chaotic brickyard in 1963, Dr. Forscher would have sat down and wailed if he were shown the Global Biodiversity Information Facility at GBIF.org. Over the past few years, GBIF has collected thousands of collections of fact-bricks about the distribution of life over our planet, from the bacteria collection of the Polish National Institute of Public Health to the Weddell Seal Census of the Vestfold Hills of Antarctica. GBIF.org is designed to be just the sort of brickyard Dr. Forscher deplored — information presented without hypothesis, theory, or edifice — except far larger because the good doctor could not have foreseen the networking of brickyards.

How will we ever make sense of scientific topics that are too big to know? The short answer: by transforming what it means to know something scientifically.

Indeed, networked fact-based brickyards are a growth industry. For example, at ProteomeCommons.org you’ll find information about the proteins specific to various organisms. An independent project by a grad student, Proteome Commons makes available almost 13 million data files, for a total of 12.6 terabytes of information. The data come from scientists from around the world, and are made available to everyone, for free. The Sloan Digital Sky Survey — under the modest tag line Mapping the Universe — has been gathering and releasing maps of the skies gathered from 25 institutions around the world. Its initial survey, completed in 2008 after eight years of work, published information about 230 million celestial objects, including 930,000 galaxies; each galaxy contains millions of stars, so this brickyard may grow to a size where we have trouble naming the number. The best known of the new data brickyards, the Human Genome Project, in 2001 completed mapping the entire genetic blueprint of the human species; it has been surpassed in terms of quantity by the International Nucleotide Sequence Database Collaboration, which as of May 2009 had gathered 250 billion pieces of genetic data.

There are three basic reasons scientific data has increased to the point that the brickyard metaphor now looks 19th century. First, the economics of deletion have changed. We used to throw out most of the photos we took with our pathetic old film cameras because, even though they were far more expensive to create than today’s digital images, photo albums were expensive, took up space, and required us to invest considerable time in deciding which photos would make the cut. Now, it’s often less expensive to store them all on our hard drive (or at some website) than it is to weed through them.

Second, the economics of sharing have changed. The Library of Congress has tens of millions of items in storage because physics makes it hard to display and preserve, much less to share, physical objects. The Internet makes it far easier to share what’s in our digital basements. When the datasets are so large that they become unwieldy even for the Internet, innovators are spurred to invent new forms of sharing. For example, Tranche, the system behind ProteomeCommons, created its own technical protocol for sharing terabytes of data over the Net, so that a single source isn’t responsible for pumping out all the information; the process of sharing is itself shared across the network. And the new Linked Data format makes it easier than ever to package data into small chunks that can be found and reused. The ability to access and share over the Net further enhances the new economics of deletion; data that otherwise would not have been worth storing have new potential value because people can find and share them.

Third, computers have become exponentially smarter. John Wilbanks, vice president for Science at Creative Commons (formerly called Science Commons), notes that “[i]t used to take a year to map a gene. Now you can do thirty thousand on your desktop computer in a day. A $2,000 machine — a microarray — now lets you look at the human genome reacting over time.” Within days of the first human being diagnosed with the H1N1 swine flu virus, the H1 sequence of 1,699 bases had been analyzed and submitted to a global repository. The processing power available even on desktops adds yet more potential value to the data being stored and shared.

The brickyard has grown to galactic size, but the news gets even worse for Dr. Forscher. It’s not simply that there are too many brick-facts and not enough edifice-theories. Rather, the creation of data galaxies has led us to science that sometimes is too rich and complex for reduction into theories. As science has gotten too big to know, we’ve adopted different ideas about what it means to know at all.

For example, the biological system of an organism is complex beyond imagining. Even the simplest element of life, a cell, is itself a system. A new science called systems biology studies the ways in which external stimuli send signals across the cell membrane. Some stimuli provoke relatively simple responses, but others cause cascades of reactions. These signals cannot be understood in isolation from one another. The overall picture of interactions even of a single cell is more than a human being made out of those cells can understand. In 2002, when Hiroaki Kitano wrote a cover story on systems biology for Science magazine — a formal recognition of the growing importance of this young field — he said: “The major reason it is gaining renewed interest today is that progress in molecular biology … enables us to collect comprehensive datasets on system performance and gain information on the underlying molecules.” Of course, the only reason we’re able to collect comprehensive datasets is that computers have gotten so big and powerful. Systems biology simply was not possible in the Age of Books.

The result of having access to all this data is a new science that is able to study not just “the characteristics of isolated parts of a cell or organism” (to quote Kitano) but properties that don’t show up at the parts level. For example, one of the most remarkable characteristics of living organisms is that we’re robust — our bodies bounce back time and time again, until, of course, they don’t. Robustness is a property of a system, not of its individual elements, some of which may be nonrobust and, like ants protecting their queen, may “sacrifice themselves” so that the system overall can survive. In fact, life itself is a property of a system.

The problem — or at least the change — is that we humans cannot understand systems even as complex as that of a simple cell. It’s not that we’re awaiting some elegant theory that will snap all the details into place. The theory is well established already: Cellular systems consist of a set of detailed interactions that can be thought of as signals and responses. But those interactions surpass in quantity and complexity the human brain’s ability to comprehend them. The science of such systems requires computers to store all the details and to see how they interact. Systems biologists build computer models that replicate in software what happens when the millions of pieces interact. It’s a bit like predicting the weather, but with far more dependency on particular events and fewer general principles.
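To give a concrete flavor of what such a model looks like in software, here is a minimal sketch of a signaling cascade written as ordinary differential equations and integrated with SciPy. The two-component cascade, the species names, and the rate constants are all invented for illustration; real systems-biology models couple thousands of such equations and are fit to experimental data.

```python
# A toy two-step signaling cascade: a stimulus activates a receptor (R),
# and active receptor activates a downstream kinase (K). All names and
# rate constants here are invented for this sketch.
from scipy.integrate import solve_ivp

def cascade(t, y, stimulus=1.0):
    R, K = y                                   # fraction of each species in the active state
    dR = 2.0 * stimulus * (1 - R) - 1.0 * R    # receptor: activation minus deactivation
    dK = 4.0 * R * (1 - K) - 0.5 * K           # kinase: driven by active receptor
    return [dR, dK]

# Integrate from a resting state for 10 time units and read off the result.
sol = solve_ivp(cascade, (0, 10), [0.0, 0.0])
print(sol.y[:, -1])                            # activation levels reached by t = 10
```

Even in this toy, the quantity of interest, how strongly the kinase responds to a given stimulus, emerges from running the integration rather than from inspecting any one equation.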

Models this complex — whether of cellular biology, the weather, the economy, even highway traffic — often fail us, because the world is more complex than our models can capture. But sometimes they can predict accurately how the system will behave. At their most complex these are sciences of emergence and complexity, studying properties of systems that cannot be seen by looking only at the parts, and cannot be well predicted except by looking at what happens.

This marks quite a turn in science’s path. For Sir Francis Bacon 400 years ago, for Darwin 150 years ago, for Bernard Forscher 50 years ago, the aim of science was to construct theories that are both supported by and explain the facts. Facts are about particular things, whereas knowledge (it was thought) should be of universals. Every advance of knowledge of universals brought us closer to fulfilling the destiny our Creator set for us.

This strategy also had a practical side, of course. There are many fewer universals than particulars, and you can often figure out the particulars if you know the universals: If you know the universal theorems that explain the orbits of planets, you can figure out where Mars will be in the sky on any particular day on Earth. Aiming at universals is a simplifying tactic within our broader traditional strategy for dealing with a world that is too big to know by reducing knowledge to what our brains and our technology enable us to deal with.

We therefore stared at tables of numbers until their simple patterns became obvious to us. Johannes Kepler examined the star charts carefully constructed by his boss, Tycho Brahe, until he realized in 1605 that if the planets orbit the Sun in ellipses rather than perfect circles, it all makes simple sense. Three hundred fifty years later, James Watson and Francis Crick stared at x-rays of DNA until they realized that if the molecule were a double helix, the data about the distances among its atoms made simple sense. With these discoveries, the data went from being confoundingly random to revealing an order that we understand: Oh, the orbits are elliptical! Oh, the molecule is a double helix!

With the new database-based science, there is often no moment when the complex becomes simple enough for us to understand it. The model does not reduce to an equation that lets us then throw away the model. You have to run the simulation to see what emerges. For example, a computer model of the movement of people within a confined space who are fleeing from a threat — they are in a panic — shows that putting a column about one meter in front of an exit door, slightly to either side, actually increases the flow of people out the door. Why? There may be a theory or it may simply be an emergent property. We can climb the ladder of complexity from party games to humans with the single intent of getting outside of a burning building, to phenomena with many more people with much more diverse and changing motivations, such as markets. We can model these and perhaps know how they work without understanding them. They are so complex that only our artificial brains can manage the amount of data and the number of interactions involved.
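To make the mechanics of such a simulation concrete, here is a minimal agent-based sketch, not the published column experiment: agents on a toy grid move greedily toward an exit, at most one agent may occupy a cell, and a single blocked cell stands in for the column. The room size, agent count, obstacle position, and movement rule are all invented for illustration; real crowd models, such as social-force models, are far richer.

```python
# Toy agent-based evacuation: agents on a grid step greedily toward the exit;
# each cell holds at most one agent, so congestion emerges at the door.
import random

W, H = 20, 20
EXIT = (10, 0)                                  # door in the middle of the top wall
OBSTACLE = {(10, 2)}                            # stand-in for the "column" near the exit

agents = set()
while len(agents) < 80:                         # 80 agents start in the back of the room
    agents.add((random.randrange(W), random.randrange(5, H)))

def dist(p):                                    # Manhattan distance to the exit
    return abs(p[0] - EXIT[0]) + abs(p[1] - EXIT[1])

def step(agents):
    new = set()
    for pos in sorted(agents, key=dist):        # agents nearest the door move first
        x, y = pos
        moves = sorted(((x + dx, y + dy)
                        for dx, dy in ((0, -1), (1, 0), (-1, 0), (0, 1))), key=dist)
        for m in moves:
            if m == EXIT:                       # reached the door: the agent escapes
                break
            if (0 <= m[0] < W and 0 <= m[1] < H and m not in OBSTACLE
                    and m not in new and m not in agents):
                new.add(m)                      # move into a free cell
                break
        else:
            new.add(pos)                        # boxed in: stay put this step
    return new

for _ in range(200):
    agents = step(agents)
print(80 - len(agents), "of 80 agents escaped after 200 steps")
```

How the obstacle changes the escape count in this toy depends entirely on the movement rule, which is exactly the point: the answer comes out of running the model, not out of an equation you can read off.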

The same holds true for models of purely physical interactions, whether they’re of cells, weather patterns, or dust motes. For example, Hod Lipson and Michael Schmidt at Cornell University designed the Eureqa computer program to find equations that make sense of large quantities of data that have stumped mere humans, including cellular signaling and the effect of cocaine on white blood cells. Eureqa looks for possible equations that explain the relation of some likely pieces of data, and then tweaks and tests those equations to see if the results more accurately fit the data. It keeps iterating until it has an equation that works.
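Eureqa itself is built on genetic programming (Schmidt and Lipson describe the approach in a 2009 Science paper), but the loop described above, propose candidate equations, score them against the data, tweak the survivors, repeat, fits in a page of Python. The sketch below is a deliberately crude stand-in: random expression trees scored against data generated from a law the search is never told. The hidden law, the operators, and every search setting are invented for illustration.

    import math
    import random

    random.seed(1)

    # "Observed" data from a law the searcher is not told: y = 3x^2 + 5.
    DATA = [(0.5 * i, 3 * (0.5 * i) ** 2 + 5) for i in range(-10, 11)]
    OPS = ["+", "-", "*"]

    def random_expr(depth=0):
        """Build a random expression tree over x and small constants."""
        if depth > 2 or random.random() < 0.3:
            return "x" if random.random() < 0.5 else round(random.uniform(-10, 10), 1)
        return (random.choice(OPS), random_expr(depth + 1), random_expr(depth + 1))

    def evaluate(expr, x):
        if expr == "x":
            return x
        if isinstance(expr, float):
            return expr
        op, a, b = expr
        a, b = evaluate(a, x), evaluate(b, x)
        return a + b if op == "+" else a - b if op == "-" else a * b

    def error(expr):
        """Squared error of the candidate against the observations."""
        return sum((evaluate(expr, x) - y) ** 2 for x, y in DATA)

    def mutate(expr):
        """Tweak a candidate by regrowing random subtrees."""
        if random.random() < 0.2:
            return random_expr(depth=1)
        if isinstance(expr, tuple):
            op, a, b = expr
            return (op, mutate(a), mutate(b))
        return expr

    best = random_expr()
    best_err = error(best)
    for _ in range(20000):
        cand = mutate(best) if random.random() < 0.7 else random_expr()
        cand_err = error(cand)
        if cand_err < best_err:
            best, best_err = cand, cand_err

    print("best expression tree:", best)
    print("squared error:", round(best_err, 3))

A run of this toy often ends near the hidden law, but notice that nothing in the loop explains anything; it only finds an expression that fits, which is the gap the Suel anecdote that follows makes vivid.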

Dr. Gurol Suel at the University of Texas Southwestern Medical Center used Eureqa to try to figure out what causes fluctuations among all of the thousands of different elements of a single bacterium. After chewing over the brickyard of data that Suel had given it, Eureqa came out with two equations that expressed constants within the cell. Suel had his answer. He just doesn’t understand it and doesn’t think any person could. It’s a bit as if Einstein dreamed E = mc², and we confirmed that it worked, but no one could figure out what the c stands for.

No one says that having an answer that humans cannot understand is very satisfying. We want Eureka and not just Eureqa. In some instances we’ll undoubtedly come to understand the oracular equations our software produces. On the other hand, one of the scientists using Eureqa, biophysicist John Wikswo, told a reporter for Wired: “Biology is complicated beyond belief, too complicated for people to comprehend the solutions to its complexity. And the solution to this problem is the Eureqa project.” The world’s complexity may simply outrun our brains’ capacity to understand it.

Model-based knowing has many well-documented difficulties, especially when we are attempting to predict real-world events subject to the vagaries of history; a Cretaceous-era model of that era’s ecology would not have included the arrival of a giant asteroid in its data, and no one expects a black swan. Nevertheless, models can have the predictive power demanded of scientific hypotheses. We have a new form of knowing.

This new knowledge requires not just giant computers but a network to connect them, to feed them, and to make their work accessible. It exists at the network level, not in the heads of individual human beings.

September 11, 2013 Critical Thought Under Attack

 Click here for a pdf version.

The United States Is Awash in Public Stupidity, and Critical Thought Is Under Assault

Henry A. Giroux July 22, 2013

America has become amnesiac – a country in which forms of historical, political, and moral forgetting are not only willfully practiced but celebrated. The United States has degenerated into a social order that is awash in public stupidity and views critical thought as both a liability and a threat. Not only is this obvious in the presence of a celebrity culture that embraces the banal and idiotic, but also in the prevailing discourses and policies of a range of politicians and anti-public intellectuals who believe that the legacy of the Enlightenment needs to be reversed. Politicians such as Michele Bachmann, Rick Santorum and Newt Gingrich, along with talking heads such as Bill O’Reilly, Glenn Beck and Ann Coulter, are not the problem; they are symptomatic of a much more disturbing assault on critical thought, if not rational thinking itself. Under a neoliberal regime, the language of authority, power and command is divorced from ethics, social responsibility, critical analysis and social costs.

These anti-public intellectuals are part of a disimagination machine that solidifies the power of the rich and the structures of the military-industrial-surveillance-academic complex by presenting the ideologies, institutions and relations of the powerful as commonsense. [1] For instance, the historical legacies of resistance to racism, militarism, privatization and panoptical surveillance have long been forgotten and made invisible in the current assumption that Americans now live in a democratic, post-racial society. The cheerleaders for neoliberalism work hard to normalize dominant institutions and relations of power through a vocabulary and public pedagogy that create market-driven subjects, modes of consciousness, and ways of understanding the world that promote accommodation, quietism and passivity. Social solidarities are torn apart, furthering the retreat into orbits of the private that undermine those spaces that nurture non-commodified knowledge, values, critical exchange and civic literacy. The pedagogy of authoritarianism is alive and well in the United States, and its repression of public memory takes place not only through the screen culture and institutional apparatuses of conformity, but is also reproduced through a culture of fear and a carceral state that imprisons more people than any other country in the world. [2] What many commentators have missed in the ongoing attack on Edward Snowden is not that he uncovered information that made clear how corrupt and intrusive the American government has become – how willing it is to engage in vast crimes against the American public. His real “crime” is that he demonstrated how knowledge can be used to empower people, to get them to think as critically engaged citizens rather than assume that knowledge and education are merely about the learning of skills – a reductive concept that substitutes training for education and reinforces the flight from reason and the goose-stepping reflexes of an authoritarian mindset. [3]

Since the late 1970s, there has been an intensification in the United States, Canada and Europe of neoliberal modes of governance, ideology and policies – a historical period in which the foundations for democratic public spheres have been dismantled. Schools, public radio, the media and other critical cultural apparatuses have been under siege, viewed as dangerous to a market-driven society that considers critical thought, dialogue, and civic engagement a threat to its basic values, ideologies, and structures of power. This was the beginning of an historical era in which the discourse of democracy, public values, and the common good came crashing to the ground. Margaret Thatcher in Britain and soon after Ronald Reagan in the United States – both hard-line advocates of market fundamentalism – announced that there was no such thing as society and that government was the problem, not the solution. Democracy and the political process were all but sacrificed to the power of corporations and the emerging financial service industries, just as hope was appropriated as an advertisement for the whitewashed world, a culture whose capacity to critique oppressive social practices was greatly diminished. Large social movements fragmented into isolated pockets of resistance mostly organized around a form of identity politics that largely ignored a much-needed conversation about the attack on the social and the broader issues affecting society such as the growing inequality in wealth, power and income.

August 28, 2013 Criminal Mind

 Click here for a pdf version.

The Criminal Mind

Adrian Raine

Advances in genetics and neuroscience are revolutionizing our understanding of violent behavior—as well as ideas about how to prevent and punish crime

April 26, 2013 (website); The Wall Street Journal, April 27–28, 2013, pp. C1–C2

The scientific study of crime got its start on a cold, gray November morning in 1871, on the east coast of Italy. Cesare Lombroso, a psychiatrist and prison doctor at an asylum for the criminally insane, was performing a routine autopsy on an infamous Calabrian brigand named Giuseppe Villella. Lombroso found an unusual indentation at the base of Villella’s skull. From this singular observation, he would go on to become the founding father of modern criminology.

Lombroso’s controversial theory had two key points: that crime originated in large measure from deformities of the brain and that criminals were an evolutionary throwback to more primitive species. Criminals, he believed, could be identified on the basis of physical characteristics, such as a large jaw and a sloping forehead. Based on his measurements of such traits, Lombroso created an evolutionary hierarchy, with Northern Italians and Jews at the top and Southern Italians (like Villella), along with Bolivians and Peruvians, at the bottom.

These beliefs, based partly on pseudoscientific phrenological theories about the shape and size of the human head, flourished throughout Europe in the late 19th and early 20th centuries. Lombroso was Jewish and a celebrated intellectual in his day, but the theory he spawned turned out to be socially and scientifically disastrous, not least by encouraging early-20th-century ideas about which human beings were and were not fit to reproduce—or to live at all.

The racial side of Lombroso’s theory fell into justifiable disrepute after the horrors of World War II, but his emphasis on physiology and brain traits has proved to be prescient. Modern-day scientists have now developed a far more compelling argument for the genetic and neurological components of criminal behavior. They have uncovered, quite literally, the anatomy of violence, at a time when many of us are preoccupied by the persistence of violent outrages in our midst.

The field of neurocriminology—using neuroscience to understand and prevent crime—is revolutionizing our understanding of what drives “bad” behavior. More than 100 studies of twins and adopted children have confirmed that about half of the variance in aggressive and antisocial behavior can be attributed to genetics. Other research has begun to pinpoint which specific genes promote such behavior.

Brain-imaging techniques are identifying physical deformations and functional abnormalities that predispose some individuals to violence. In one recent study, brain scans correctly predicted which inmates in a New Mexico prison were most likely to commit another crime after release. Nor is the story exclusively genetic: A poor environment can change the early brain and make for antisocial behavior later in life.

Most people are still deeply uncomfortable with the implications of neurocriminology. Conservatives worry that acknowledging biological risk factors for violence will result in a society that takes a soft approach to crime, holding no one accountable for his or her actions. Liberals abhor the potential use of biology to stigmatize ostensibly innocent individuals. Both sides fear any seeming effort to erode the idea of human agency and free will.

It is growing harder and harder, however, to avoid the mounting evidence. With each passing year, neurocriminology is winning new adherents, researchers and practitioners who understand its potential to transform our approach to both crime prevention and criminal justice.

The genetic basis of criminal behavior is now well established. Numerous studies have found that identical twins, who have all of their genes in common, are much more similar to each other in terms of crime and aggression than are fraternal twins, who share only 50% of their genes.

(Donta Page’s brain scan, left, shows the reduced functioning of the ventral prefrontal cortex—the area of the brain that helps regulate emotions and control impulses—compared to a normal brain, right.)

In a landmark 1984 study, my colleague Sarnoff Mednick found that children in Denmark who had been adopted from parents with a criminal record were more likely to become criminals in adulthood than were other adopted kids. The more offenses the biological parents had, the more likely it was that their offspring would be convicted of a crime. For biological parents who had no offenses, 13% of their sons had been convicted; for biological parents with three or more offenses, 25% of their sons had been convicted.

As for environmental factors that affect the young brain, lead is neurotoxic and particularly damages the prefrontal region, which regulates behavior. Measured lead levels in our bodies tend to peak at 21 months—an age when toddlers are apt to put their fingers into their mouths. Children generally pick up lead in soil that has been contaminated by air pollution and dumping.

Rising lead levels in the U.S. from 1950 through the 1970s neatly track increases in violence 20 years later, from the ’70s through the ’90s. (Violence peaks when individuals are in their late teens and early 20s.) As lead in the environment fell in the ’70s and ’80s—thanks in large part to the regulation of gasoline—violence fell correspondingly. No other single factor can account for both the inexplicable rise in violence in the U.S. until 1993 and the precipitous drop since then.

Lead isn’t the only culprit. Other factors linked to higher aggression and violence in adulthood include smoking and drinking by the mother before birth, complications during birth and poor nutrition early in life.

Genetics and environment may work together to encourage violent behavior. One pioneering study in 2002 by Avshalom Caspi and Terrie Moffitt of Duke University genotyped over 1,000 individuals in a community in New Zealand and assessed their levels of antisocial behavior in adulthood. They found that a genotype conferring low levels of the enzyme monoamine oxidase A (MAOA), when combined with early child abuse, predisposed the individual to later antisocial behavior. Low MAOA has been linked to reduced volume in the amygdala—the emotional center of the brain—while physical child abuse can damage the frontal part of the brain, resulting in a double hit.

Brain-imaging studies have also documented impairments in offenders. Murderers, for instance, tend to have poorer functioning in the prefrontal cortex—the “guardian angel” that keeps the brakes on impulsive, disinhibited behavior and volatile emotions.

Of course, not everyone with a particular brain profile is a murderer—and not every offender fits the same mold. Those who plan their homicides, like serial killers, tend to have good prefrontal functioning. That makes sense, since they must be able to regulate their behavior carefully in order to escape detection for a long time.

So what explains coldblooded psychopathic behavior? About 1% of us are psychopaths—fearless antisocials who lack a conscience. In 2009, Yaling Yang, Robert Schug and I conducted structural brain scans on 27 psychopaths whom we had found in temporary-employment agencies in Los Angeles. All got high scores on the Psychopathy Checklist, the “gold standard” in the field, which assesses traits like lack of remorse, callousness and grandiosity. We found that, compared with 32 normal people in a control group, psychopaths had an 18% smaller amygdala, which is critical for emotions like fear and is part of the neural circuitry underlying moral decision-making. In subsequent research, Andrea Glenn and I found this same brain region to be significantly less active in psychopathic individuals when they contemplate moral issues. Psychopaths know at a cognitive level what is right and what is wrong, but they don’t feel it.

What are the practical implications of all this evidence for the physical, genetic and environmental roots of violent behavior? What changes should be made in the criminal-justice system?

Let’s start with two related questions: If early biological and genetic factors beyond the individual’s control make some people more likely to become violent offenders than others, are these individuals fully blameworthy? And if they are not, how should they be punished?

Take the case of Donta Page, who in 1999 robbed a young woman in Denver named Peyton Tuthill, then raped her, slit her throat and killed her by plunging a kitchen knife into her chest. Mr. Page was found guilty of first-degree murder and was a prime candidate for the death penalty.

Working as an expert witness for Mr. Page’s defense counsel, I brought him to a lab to assess his brain functioning. Scans revealed a distinct lack of activation in the ventral prefrontal cortex—the brain region that helps to regulate our emotions and control our impulses.

In testifying, I argued for a deep-rooted biosocial explanation for Mr. Page’s violence. As his files documented, as a child he suffered from poor nutrition, severe parental neglect, sustained physical and sexual abuse, early head injuries, learning disabilities, poor cognitive functioning and lead exposure. He also had a family history of mental illness. By the age of 18, Mr. Page had been referred for psychological treatment 19 times, but he had never once received treatment. A three-judge panel ultimately decided not to have him executed, accepting our argument that a mix of biological and social factors mitigated Mr. Page’s responsibility.

Mr. Page escaped the death penalty partly on the basis of brain pathology—a welcome result for those who believe that risk factors should partially exculpate socially disadvantaged offenders. But the neurocriminologist’s sword is double-edged. Neurocriminology also might have told us that Mr. Page should never have been on the street in the first place. At the time he committed the murder, he had been out of prison for only four months. Sentenced to 20 years for robbery, he was released after serving just four years.

What if I had been asked to assess him just before he was released? I would have said exactly what I said in court when defending him. All the biosocial boxes were checked: He was at heightened risk for committing violence for reasons beyond his control. It wasn’t exactly destiny, but he was much more likely to be impulsively violent than not.

This brings us to the second major change that may be wrought by neurocriminology: incorporating scientific evidence into decisions about which soon-to-be-released offenders are at the greatest risk for reoffending. Such risk assessment is currently based on factors like age, prior arrests and marital status. If we were to add biological and genetic information to the equation—along with recent statistical advances in forecasting—predictions about reoffending would become significantly more accurate.
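The claim in this paragraph is statistical, so it can at least be illustrated with synthetic numbers. The sketch below invents a cohort whose reoffending truly depends on age, prior arrests, and one biological marker, then compares the held-out AUC of a logistic model with and without the marker. Every coefficient and effect size is fabricated; this shows only the arithmetic of the argument, not the real effect sizes reported in the studies described below, and says nothing about whether such data should be used this way.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n = 5000

    age = rng.uniform(18, 60, n)
    priors = rng.poisson(2.0, n).astype(float)
    brain = rng.normal(0.0, 1.0, n)        # stand-in for, e.g., a scan-derived score

    # Assumed ground truth: youth, priors, and a low "brain" score all raise risk.
    logit = -1.5 - 0.05 * (age - 18) + 0.35 * priors - 0.8 * brain
    reoffend = rng.random(n) < 1 / (1 + np.exp(-logit))

    X_base = np.column_stack([age, priors])           # the usual actuarial inputs
    X_full = np.column_stack([age, priors, brain])    # plus the biological marker
    train, test = train_test_split(np.arange(n), test_size=0.3, random_state=0)

    for name, X in [("actuarial only", X_base), ("plus biomarker", X_full)]:
        model = LogisticRegression(max_iter=1000).fit(X[train], reoffend[train])
        auc = roc_auc_score(reoffend[test], model.predict_proba(X[test])[:, 1])
        print(f"{name}: test AUC = {auc:.3f}")

The gain appears only because the simulated marker genuinely carries signal; with real data, whether it does is the entire scientific question.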

In a 2013 study, Kent Kiehl of the University of New Mexico, looking at a population of 96 male offenders in the state’s prison system, found that in the four years after their release, those with low activity in the anterior cingulate cortex—a brain area involved in regulating behavior—were twice as likely to commit another offense as those who had high activity in this region. Research soon to be published by Dustin Pardini of the University of Pittsburgh shows that men with a smaller amygdala are three times more likely to commit violence three years later.

Of course, if we can assess criminals for their propensity to reoffend, we can in theory assess any individual in society for his or her criminal propensity—making it possible to get ahead of the problem by stopping crime before it starts. Ultimately, we should try to reach a point where it is possible to deal with repeated acts of violence as a clinical disorder.

Randomized, controlled trials have clearly documented the efficacy of a host of medications—including stimulants, antipsychotics, antidepressants and mood stabilizers—in treating aggression in children and adolescents. Parents are understandably reluctant to have their children medicated for bad behavior, but when all else fails, treating children to stabilize their uncontrollable aggressive acts and to make them more amenable to psychological interventions is an attractive option.

Treatment doesn’t have to be invasive. Randomized, controlled trials in England and the Netherlands have shown that a simple fix—omega-3 supplements in the diets of young offenders—reduces serious offending by about 35%. Studies have also found that early environmental enrichment—including better nutrition, physical exercise and cognitive stimulation—enhances later brain functioning in children and reduces adult crime.

Over the course of modern history, increasing scientific knowledge has given us deeper insights into epilepsy, psychosis and substance abuse, and has promoted a more humane perspective. Just as mental disorders were once viewed as a product of evil forces, the “evil” you see in violent offenders today may someday be reformulated as a symptom of a physiological disorder.

There is no question that neurocriminology puts us on difficult terrain, and some wish it didn’t exist at all. How do we know that the bad old days of eugenics are truly over? Isn’t research on the anatomy of violence a step toward a world where our fundamental human rights are lost?

We can avoid such dire outcomes. A more profound understanding of the early biological causes of violence can help us take a more empathetic, understanding and merciful approach toward both the victims of violence and the prisoners themselves. It would be a step forward in a process that should express the highest values of our civilization.

—Dr. Raine is the Richard Perry University Professor of Criminology, Psychiatry and Psychology at the University of Pennsylvania and author of “The Anatomy of Violence: The Biological Roots of Crime,” to be published on April 30 by Pantheon, a division of Random House.

A version of this article appeared April 27, 2013, on page C1 in the U.S. edition of The Wall Street Journal, with the headline: The Criminal Mind.

———————

Comments:
———————

Timothy Brown  << Conservatives worry that acknowledging biological risk factors for violence will result in a society that takes a soft approach to crime, holding no one accountable for his or her actions. Liberals abhor the potential use of biology to stigmatize ostensibly innocent individuals. >>
—–
If there were a Pulitzer Prize for sweeping generalizations, I think this passage might deserve the honor.

I, for one, am a conservative, and my worry is that liberals will make brain scans and genetic screening mandatory prerequisites to getting a permit to own a firearm.
Which would NOT prevent the criminally insane from actually owning or using guns.
But it would create a huge barrier for those who are neither sociopathic nor criminal: “So, you want to own a gun, do you? I’m from the government, and I think you might be crazy. Therefore, you need to have your head examined.”

Harold Richard  Hey Man, Ima Demo and I think the same as you on this. I also own a gun now and want to buy another firearm. 
Why don’t we just dream up some Pre-Cog computer software and install it on one of those Superduper Supercomputers. Then we can just lock up people at birth based on their genetics and brain shape. Oh wait, maybe we already have and haven’t been told about it yet.
Minority Report here we come

John Mcrae  Concepts of the impulsive violent psychotic, the impulsive violent mentally ill, and the impulsive violent evil person (possessed) have interested many but remained largely elusive for the past several centuries. In 1843, the M’Naghten Rule separated the insane from the criminal at common law, and is still largely in vogue ……although probably not valid. 
Responsibility after a violent act, whether the impulse be psychotic, criminal, or evil, requires protective custody for the sake of society’s safety. The social fetish of need to assign criminal responsibility in accord with a rigorous protocol while ignoring the dangerous mentally ill, or confusing criminality with evil, needs update…..as evidence mounts of multiple genetic, developmental, and imposed conditions triggering the impulse.
As the death penalty becomes increasingly challenged, so too should the effect of the Miranda warning, and petition for habeas corpus. Responsibility can be judged and protective custody imposed….but then a prudent delay in sentencing, even of a few years, to ferret out and weigh all factors of uncertainty where the criminal code calls for prolonged incarceration or execution should be routine. A few offenders will soon be identified who can be treated…..at the discretion of the supervising court. The cost for such would not be greater than the current costs of exhaustive appeal of capital crimes, nor lifetime imprisonment expense.
The difficulty is that no one can demonstrate the efficacy, the risk, or the benefit of such until it is tried…..being human behavior and unperfected hypothetical science. In the 1950s, small almost anecdotal experience with early stereotactic ablative ‘psychosurgery’ showed promising results with far less recidivism. Then came the ‘liberated’ 60s, doing away with two centuries of improving, more humane protective custody of the mentally ill as well as the career criminal. Today, sixty years later, technology has greatly intensified the power of such inquiry…..and the social consequences of the libertine ignoring of the problem are demanding answers. The facts are that there are psychotics, criminals, and evil persons.

check out http://healthland.time.com/2013/04/23/qa-criminologist-adrian-raine-on-the-marathon-bombs-the-biology-of-violence/

Jay Martin  “No other single factor can account for both the inexplicable rise in violence in the U.S. until 1993 and the precipitous drop since then.”
I believe the authors of “Freakonomics” laid out a much more believable theory regarding the drop in crime starting in the 90s. That being Roe v Wade 1973

Kevin Fisher   Assuming that all behavior is based on brain physics, it’s not a total surprise that technology has developed to the point that behavioral aberrations and variations can be “seen” with modern scanning methods.
That doesn’t free any of us from responsibility for what our brain is doing. Those with a criminally aberrant brain must be dealt with as criminals if (and only if) they commit a crime. 
However, it may suggest a certain amount of compassion toward criminals whose brain configuration was caused by factors outside of their control – whatever that means, with lots of interesting questions regarding free will. In the end, that doesn’t imply leniency in sentencing or other measures needed to protect the public.

Kevin Kilty   This was interesting to read, but when the author says that this knowledge puts us on difficult terrain he is a master of understatement. In addition to the worry about eugenics, and the potential to justify permanent incarceration using narrow and not very robust measures, there is also a worry about using it to deal with inconvenient differences of opinion. One does not have to search very hard for Prof. Raine’s colleagues in academia referring to “conservative” or libertarian thinking as a pathology.

Kevin Kilty   “Randomized, controlled trials in England and the Netherlands have shown that a simple fix—omega-3 supplements in the diets of young offenders—reduces serious offending by about 35%….” How many people were involved in the trial, has it been replicated, when was it done, and is the measure a risk measure or an odds measure? How am I to view such statements if they are so vague?

Paul S. Boyer  Well, removing revenge from the hands of the aggrieved is one of the functions of punishment. It has the function of preventing continuous vendetta.
Revenge is one of the legitimate functions of punishment. It is also possible in some societies for the victim (or the family of the victim) to forgive the perpetrator, which they sometimes do (usually for a price). This turns simple revenge into forgiveness, or a fine. The point is that the case is then closed, and a vendetta avoided.
It used to be considered an advance in civilization to consider crime not just as a personal wrong to the victim, but an offense to “society.” Thus the criminal would pay a “debt to society.” This, again, would take revenge away from the aggrieved, and put it into “official” hands which are supposed to be more dispassionate, and to apply more uniform, accepted standards.
It is foolish Utopianism to imagine that the revenge motive can simply be dreamed away. Not even brain research can eliminate it, for it is in human nature. Indeed, it is in the nature of our closest primate relatives, as well as in many other, less closely related animals. There is a clear reason in evolution for the revenge behavior, for it acts to remove a proven threat.

Lisa Partridge   “No other single factor can account for both the inexplicable rise in violence in the U.S. until 1993 and the precipitous drop since then.”
Weren’t the Grateful Dead active from the late 60s until the early to mid 90s?
Look, mom! I’m a neurocriminologist!

Ezra Blumenthal   Violent behavior is NOT criminal, -in fact the real criminals of the world are seldom violent. For example, when the Catholic Pope encouraged forced conversion of native Americans into Catholicism, the Pope (whoever it was) could hardly be considered ‘violent’, although his decision resulted in much pain and death among native Americans. Similarly, when the Americans dropped Atomic Bombs and killed millions of civilians in Japan, the American general was probably calm and calculating, as were the generals giving the order to drop Napalm on Vietnamese people who had done nothing wrong against America. 
If we consider financial crooks, like the Swiss Bankers (UBS, for example), who help Americans and Germans to cheat on taxes, we can really see the difference between ‘criminal’ and ‘violent’. The average Swiss is probably well dressed, and pretty civilized, although the Swiss are the biggest criminals in the world as they go around helping thugs and dictators hide their loot :)
Catching a genetically violent person may be useful, but it would be far more useful if we could catch white-collar criminals, …. or scan the brain of a politician or a banker or a lawyer, to tell us when he is lying! Well, some would say they are always lying…. but still, maybe we could tell when they’re really, really lying 🙂

August 14, 2013 Embrace Apocalypse

Click here for a pdf version.

We Have to Embrace Apocalypse If We’re Going to Get Serious About Sticking Around on This Planet

Robert Jensen

To think apocalyptically is not to give up on ourselves, but only to give up on the arrogant stories we modern humans have been telling about ourselves.

This is an excerpt from We Are All Apocalyptic Now: On the Responsibilities of Teaching, Preaching, Reporting, Writing, and Speaking Out, in print at Amazon.com and on Kindle (CreateSpace Independent Publishing Platform, 2013):

Here’s my experience in speaking apocalyptically about the serious challenges humans face: No matter how carefully I craft a statement of concern about the future of humans, no matter how often I deny a claim to special gifts of prognostication, no matter how clearly I reject supernatural explanations or solutions, I can be certain that a significant component of any audience will refuse to take me seriously. Some of those people will make a joke about “Mr. Doom and Gloom.” Others will suggest that such talk is no different than conspiracy theorists’ ramblings about how international bankers, secret cells of communists, or crypto-fascists are using the United Nations to create a one-world government. Even the most measured and careful talk of the coming dramatic change in the place of humans on Earth leads to accusations that one is unnecessarily alarmist, probably paranoid, certainly irrelevant to serious discussion about social and ecological issues. In the United States, talk of the future is expected to be upbeat, predicting expansion and progress, or at least maintenance of our “way of life.”

Apocalyptic thinking allows us to let go of those fanciful visions of the future. As singer/songwriter John Gorka puts it: “The old future’s gone/We can’t get to there from here.” The comfortable futures that we are accustomed to imagining are no longer available to us because of the reckless way we’ve been rolling the dice; there is nothing to save us from ourselves. Our task is to deal with our future without delusions of deliverance, either divine or technological. This planet is not a way station in a journey to some better place [emphasis is mine; this is our history: settle in, use up the environment, move on]; it is our home, the only home we will know. We will make our peace with ourselves, each other, and the larger living world here.

The first step in thinking sensibly about the future, of course, is reviewing the past. The uncertainty of our future will be easier to accept and the strength to persevere will be easier to summon if we recognize:

–We are animals. For all our considerable rational capacities, we are driven by non-rational forces that cannot be fully understood or completely controlled. Even the most careful scientist is largely an emotional creature, just like everyone else.

–We are band/tribal animals. Whatever kind of political unit we live in today, our evolutionary history is in small groups; that’s how we are designed to live.

–We are band/tribal animals living in a global world. The consequences of the past 10,000 years of human history have left us dealing with human problems on a global scale, and with 7 billion people on the planet, there’s no point in fantasizing about a retreat to Eden.

With that history in mind, we should go easy on ourselves. As Wes Jackson said, we are a species out of context, facing the unique task of being the first animals who will have to self-consciously impose limits on ourselves if we are to survive, reckoning not just with what we do in our specific place on the planet but with what other people are doing around the world. This is no small task, and we are bound to fail often. We may never stop failing, and that is possibly the most daunting challenge we must face: Can we persevere in the quest for justice and sustainability even if we had good reasons to believe that both projects ultimately will fail? Can we live with that possibility? Can we ponder that and yet still commit ourselves to loving action toward others and the non-human world?

Said differently: What if our species is an evolutionary dead end? What if those adaptations that produced our incredible evolutionary success — our ability to understand certain aspects of how the world works and manipulate that world to our short-term advantage — are the very qualities that guarantee our human systems will degrade the life-sustaining systems of the world? What if that which has allowed us to dominate will be that which destroys us? What if humanity’s story is a dramatic tragedy in the classical sense, a tale in which the seeds of the hero’s destruction are to be found within, and history is the unfolding of the inevitable fall?

We love stories of individual heroes, and collectively we tend to think of ourselves as the heroic species. The question we might ask, uncomfortably, about those tales of heroism: Is Homo sapiens an epic hero or a tragic one? Literature scholars argue over the specific definitions of the terms “epic” and “tragedy,” but in common usage an epic celebrates the deeds of a hero who is favored by, and perhaps descended from, the gods. These heroes overcome adversity to do great things in the service of great causes. Epic heroes win.

A tragic hero loses, but typically not because of an external force. The essence of tragedy is what Aristotle called “hamartia,” an error in judgment made because of some character flaw, such as hubris. That excessive pride of protagonists becomes their downfall. Although some traditions talk about the sin of pride, most of us understand that taking some pride in ourselves is psychologically healthy. The problem is excessive pride, when we elevate ourselves and lose a sense of the equal value of others. When we fall into hubris individually, the consequences can be disastrous for us and those around us. When we fall into that hubris as a species — when we ignore the consequences of the exploitation on which our “lifestyle” is based — the consequences are more dramatic.

What if our task is to give up the dream of the human species as special? And what if the global forces set in motion during the high-energy/high-technology era are beyond the point of no return? Surrounded by the big majestic buildings and tiny sophisticated electronic gadgets created through human cleverness, it’s easy for us to believe we are smart enough to run a complex world. But cleverness is not wisdom, and the ability to create does not guarantee we can control the destruction we have unleashed. It may be that there is no way to rewrite this larger epic, that too much of the tragedy has already been played out.

But here’s the good news: While tragic heroes meet an unhappy fate, a community can learn from the protagonist’s fall. Even tragic heroes can, at the end, celebrate the dignity of the human spirit in their failure. That may be our task, to recognize that we can’t reverse course in time to prevent our ultimate failure, but that in the time remaining we can recognize our hamartia, name our hubris, and do what we can to undo the damage.

That may be the one chance for us to be truly heroic, by learning to leave center stage gracefully, to stop trying to run the world and to accept a place in the world. We have to take our lives seriously but take Life more seriously.

We certainly live in a dangerous time, if we take seriously the data that our vast intellectual enterprises have produced. Ironically, the majority of intellectuals who are part of those enterprises prefer to ignore the implications of that data. The reasons for that will of course vary, and there is no reason to pretend these issues are simple or that we can line up intellectuals in simple categories of good/bad, brave/cowardly, honest/dishonest. Reasonable people can agree on the data and disagree on interpretation and analysis. Again, my argument is not that anyone who does not share my interpretation and analysis is obviously wrong or corrupt; many of the assertions I have made require more lengthy argument than is available in this space.

But I hold to one point without equivocation: When the privileged intellectuals subsidized by the institutions of the dominant culture look away from the difficult issues that we face today, they are failing to meet their moral obligations. The more privileged the intellectual, the greater the responsibility to use our resources, status, and autonomy to face these issues. There is a lot riding on whether we have the courage and the strength to accept that danger, joyfully. This harsh assessment, and the grief that must accompany it, is not a rejection of joy. The two, grief and joy, are not mutually exclusive but, in fact, rely on each other, and define the human condition. As Wendell Berry puts it, we live on “the human estate of grief and joy.”

This inevitably leads to the question: where can we find hope? My short answer: Don’t ask someone else where to find it. Create it through your actions. Hope is not something we find, but is something we earn. No one has the right to be hopeful until they expend energy to make hope possible. Gorka’s song expresses this: “The old future’s dead and gone/Never to return/There’s a new way through the hills ahead/This one we’ll have to earn/This one we’ll have to earn.”

Berry speaks repeatedly of the importance of daily practice, of building a better world in practical ways that nurture real bonds in real communities that know their place in the world. He applies this same idea to a discussion of hope: “[Y]ou’re not under any obligation to construct a hope for the whole human race. What you are required to do is to be intelligent. And that means you’ve got to have an array of examples you want more or less to understand. Some are not perfect, and others are awful, and to be intelligent you’ve got to know why some are better than the others.”

If people demand that intellectuals provide hope — or, worse, if intellectuals believe it is their job to give people hope — then offering platitudes about hope is just another way of avoiding the difficult questions. Clamoring for hope can be a dangerous diversion. But if the discussion of hope leads to action, even in the face of situations that may be hopeless, then we can hold onto what Albert Camus called a “stubborn hope”: Tomorrow the world may burst into fragments. In that threat hanging over our heads there is a lesson of truth. As we face such a future, hierarchies, titles, honors are reduced to what they are in reality: a passing puff of smoke. And the only certainty left to us is that of naked suffering, common to all, intermingling its roots with those of a stubborn hope.

I would call this a hope beyond hope, the willingness not only to embrace that danger but to find joy in it. The systems that structure our world have done more damage than we can understand, but no matter how dark the world grows, there is a light within. That is the message of the best of our theological and secular philosophical traditions, a recurring theme of the best of our art. Wendell Berry has been returning to this theme for decades in essays, fiction, and poetry, and it is the subject of one of his Sabbath poems:

It is hard to have hope. It is harder as you grow old, for hope must not depend on feeling good
and there is the dream of loneliness at absolute midnight. You also have withdrawn belief in the present reality of the future, which surely will surprise us, and hope is harder when it cannot come by prediction any more than by wishing. But stop dithering. The young ask the old to hope. What will you tell them? Tell them at least what you say to yourself.

This is what I say to myself: Whatever our chances of surviving, we define ourselves in the present moment by what we do. There are two basic tasks in front of us. First, we should commit some of our energy to movements that focus on the question of justice in this world, especially those of us with the privilege that is rooted in that injustice. As a middle-class American white man, I can see plenty of places to continue working, in movements dedicated to ending white supremacy, patriarchy, capitalism, and U.S. wars of domination.

I also think there is important work to be done in experiments to prepare for what will come in this new future that we can’t yet describe in detail. Whatever the limits of our predictive capacity, we can be pretty sure we will need ways of organizing ourselves to help us live in a world with less energy and fewer material goods. We all have to develop the skills needed for that world (such as farming and gardening with fewer inputs, food preparation and storage, and basic tinkering), and we will need to recover a deep sense of community that has disappeared from many of our lives. This means abandoning a sense of ourselves as consumption machines, which the contemporary culture promotes, and deepening our notions of what it means to be humans in search of meaning. We have to learn to tell different stories about our sense of self, our connection to others, and our place in nature. The stories we tell will matter, as will the skills we learn.

Berry’s basis for hope begins with a recognition of where we are and who we are, at our best:

Found your hope, then, on the ground under your feet. Your hope of Heaven, let it rest on the ground underfoot. Be lighted by the light that falls
freely upon it after the darkness of the nights
and the darkness of our ignorance and madness.
Let it be lighted also by the light that is within you, which is the light of imagination. By it you see
the likeness of people in other places to yourself
in your place. It lights invariably the need for care toward other people, other creatures, in other places
as you would ask them for care toward your place and you.

In my own life, I continue to work on those questions of justice in existing movements, but I have shifted a considerable amount of time to helping build local networks that can create a place for those experiments. Different people will move toward different efforts depending on talents and temperaments; we should all follow our hearts and minds to apply ourselves where it makes sense, given who we are and where we live. After offering several warnings about arrogance, I’m not about to suggest I know best what work other people should do. If there is any reason for hope, it will be in direct proportion to our capacity for humility and seeing ourselves as part of, not on top of, the larger living world. Berry ends that Sabbath poem not with false optimism but a blunt reminder of how easy it is for us to fall out of right relation with ourselves, others, and the larger living world:

No place at last is better than the world. The world is no better than its places. Its places at last are no better than their people while their people continue in them. When the people make dark the light within them, the world darkens.

The argument I have made rests on an unsentimental assessment of the physical world and the life-threatening consequences of human activity over the past 10,000 years. We would be wise not to plan on supernatural forces or human inventions to save us from ourselves. It is unlikely that we will be delivered to a promised land by divine or technological intervention. Wishing the world were less harsh will not magically make it less harsh. We should not give in to the temptation to believe in magic. As James Howard Kunstler puts it, people should stop “clamoring desperately for rescue remedies that would allow them to continue living exactly the way they were used to living, with all the accustomed comforts.”

But we should keep telling stories. Our stories do not change the physical world, but they have the potential to change us. In that sense, the poet Muriel Rukeyser was right when she said, “The universe is made of stories, not of atoms.”

Whatever particular work intellectuals do, they are also storytellers. Artists tell stories, but so do scientists and engineers, teachers and preachers. Our work is always both embedded in a story and advancing a story. Intellectual work matters not just for what it discovers about how the world works, but for what story it tells about those discoveries.

To think apocalyptically is not to give up on ourselves, but only to give up on the arrogant stories we modern humans have been telling about ourselves. Our hope for a decent future — indeed, any hope for even the idea of a future — depends on our ability to tell stories not of how humans have ruled the world but how we can live in the world. The royal must give way to the prophetic and the apocalyptic. The central story of power — that the domination/subordination dynamic is natural and inevitable — must give way to stories of dignity, solidarity, equality. We must resist not only the cruelty of repression but the seduction of comfort.

The songs we sing matter at least as much as the machines we build. Power always assumes it can control. Our task is to resist that control. Gorka offers that reminder, of the latent power of our stories, in the fancifully titled song “Flying Red Horse”:

They think they can tame you, name you and frame you, Aim you where you don’t belong.
They know where you’ve been but not where you’re going, And that is the source of the songs.

 

July 24, 2013 Racist Profiling

Click here for a pdf version.

The Banality of Richard Cohen and Racist Profiling

Ta-Nehisi Coates, Jul 17, 2013, 7:00 AM ET, The Atlantic

Yesterday Richard Cohen wrote this:

“In New York City, blacks make up a quarter of the population, yet they represent 78 percent of all shooting suspects — almost all of them young men. We know them from the nightly news.

Those statistics represent the justification for New York City’s controversial stop-and-frisk program, which amounts to racial profiling writ large. After all, if young black males are your shooters, then it ought to be young black males whom the police stop and frisk.

Still, common sense and common decency, not to mention the law, insist on other variables such as suspicious behavior. Even still, race is a factor, without a doubt. It would be senseless for the police to be stopping Danish tourists in Times Square just to make the statistics look good.

I wish I had a solution to this problem. If I were a young black male and were stopped just on account of my appearance, I would feel violated. If the police are abusing their authority and using race as the only reason, that has got to stop. But if they ignore race, then they are fools and ought to go into another line of work.”

It is very important to understand that no one is asking the NYPD to “ignore race.” If an officer is looking for a specific suspect, no one would ask that the NYPD not include race as part of the description. But “Stop And Frisk” is not concerned with specific suspects, but with a broad class of people who are observed making “furtive movements.”

With that said, we should take a moment to appreciate the import of Cohen’s words. They hold that neither I, nor my twelve-year-old son, nor any of my nephews, nor any of my male family members deserve to be judged as individuals by the state. Instead we must be seen as members of a class more inclined to criminality. It does not matter that the vast, vast majority of black men commit no violent crime at all. Cohen argues that that majority should unduly bear the burden of police invasion, because of a minority who happens to live among us.

Richard Cohen concedes that this is a violation, but it is one he believes black people, for the good of their country, must learn to live with. Effectively he is arguing for a kind of racist public safety tax. The tax may, or may not, end with a frisking. More contact with the police, and people who want to be police, necessarily means more deadly tragedy. Thus Cohen is not simply calling for my son and me to bear the brunt of “violation,” he is calling for us to run a higher risk of death and serious injury at the hands of the state. Effectively he is calling for Sean Bell’s fiancée, Trayvon Martin’s parents, Amadou Diallo’s mother, Prince Jones’ daughter, and the relatives of Kathryn Johnston to accept the deaths of their loved ones as the price of doing business in America.

The unspoken premise here is chilling — the annihilation of the black individual. To wit:

“Jews are a famously accomplished group. They make up 0.2 percent of the world population, but 54 percent of the world chess champions, 27 percent of the Nobel physics laureates and 31 percent of the medicine laureates.”

I think we would concede that it would be wrong of me to assume that every Jewish person I meet is good at chess, physics or medicine. This year I am working at MIT, where a disproportionate number of the students are Asian-Americans. It would be no more wise for me to take from that experience that individual Asian-Americans are good at math than it would be for anyone to look at the NBA and assume I am good at basketball. And we would agree with this because we generally hold that people deserve to be seen as individuals. But by Cohen’s logic, the fact of being an African-American is an exception to this.

Perhaps the standards should be different when it comes to public safety and violence. But New York City’s murder rate is as low as it has been in 50 years. How long should a racist public-safety tax last? Until black people no longer constitute a disproportionate share of our violent criminals, one assumes. But black people do not constitute such a group — victims of hundreds of years of racist state policy constitute that group. “Black on Black” crime is the racecraft by which the fact of what was done to us disappears, and the fact of our DNA becomes criminalized.

I think Richard Cohen knows this:

“The problems of the black underclass are hardly new. They are surely the product of slavery, the subsequent Jim Crow era and the tenacious persistence of racism. They will be solved someday, but not probably with any existing programs. For want of a better word, the problem is cultural, and it will be solved when the culture, somehow, is changed.”

This paragraph is the American approach to racism in brief. Cohen can name the root causes. He is not blind to history. But he cannot countenance the import of his own words. So he retreats to cynicism, pronouncing the American state too bankrupt to clean up a problem which it created, and, by an act of magic, lays it at the feet of something called “culture.”

To paraphrase the old Sidney Harris cartoon, the formula for weak-sauce goes something like this:

(Forced Labor + Mass Rape) × AUCTIONING YOUR CHILDREN
+ (Poll tax + Segregation + Grandfather clause) × THE KLAN
+ (Redlining + Blockbusting + Race Riots) × CUTTING YOU OUT OF THE NEW DEAL
– THEN A MIRACLE OCCURS
= “Meh, you figure it out.”

A capricious anti-intellectualism, a fanatical imbecility, a willful amnesia, an eternal sunshine upon our spotless minds, is white supremacy’s gravest legacy. You would not know from reading Richard Cohen that the idea that blacks are more criminally prone is older than the crime stats we cite, that it has been cited since America’s founding to justify the very kinds of public safety measures Cohen now endorses. Black criminality is more than myth; it is socially engineered prophecy. If you believe a people to be inhuman, you confine them to inhuman quarters and inhuman labor, and subject them to inhuman policy. When they then behave inhumanely to each other, you take it as proof of your original thesis. The game is rigged. Because it must be.

You should not be deluded into thinking Richard Cohen an outlier. The most prominent advocate of profiling our current pariah classes — black people and Muslim Americans — is now being mentioned in conversations to lead the Department of Homeland Security. Those mentions received an endorsement from our president:

Kelly hasn’t spoken about whether he wants the post, but in an interview with Univision, the president said he’d want to know if Kelly was considering a job change.

“Ray Kelly’s obviously done an extraordinary job in New York,” Obama said. “And the federal government partners a lot with New York, because obviously, our concerns about terrorism often times are focused on big-city targets, and I think Ray Kelly’s one of the best there is.”

What you must understand is that when the individual lives of those freighted by racism are deemed less than those who are not, all other inhumanities follow. That is the logic of Richard Cohen. It is the logic of Barack Obama’s potential head of the DHS. This logic is not new, original or especially egregious. It is the logic of the country’s largest city. It is the logic of the American state. It is the logic scribbled across the lion’s share of our history. And it is the logic that killed Trayvon Martin.

———————

Comments:
———————
Josh Jasper

When I was living in Singapore, I was in the minority as a white person, but when a white person committed a crime, white teens were not pulled over and harassed by the police. Were they doing it wrong? Of course not. No white American living anywhere outside the US would tolerate being treated the way our police treat black people.

Cohen’s argument is easily demolished, and has a clear core of white supremacy.

Cohen has won 4 Pulitzer Prizes. My mind stops after that fact. I can go no further. I’m lost.

Tyler

I lived in South Africa for about two years and experienced some pretty terrifying official harassment. While on holiday in Mozambique, my wife and I traveled to the capital, Maputo. On the very first night, we walked out of the hotel to find a place to have a drink and not fifty yards from the entrance, a soldier with an AK-47 (or some other sort of assault rifle) shouted at us, ran across the avenue, and demanded to see our passports (in English, no less). As we naturally didn’t have them on us, we were threatened with a fine, and failing to pay that was going to result in jail. A truck with some seats bolted in the cab then pulled up, along with three to four other soldiers with AK-47s. After a few tense minutes of negotiation where he told me to get into the truck and I refused to pay or go to jail, they left. Twice in the next three days this happened again (though this time with police officers who only had pistols). I also watched other white families get harassed and pay fines.

Harassment of white Americans happens

caton

You weren’t harassed because you were white; you were extorted because you were a foreign tourist in a country that is still recovering from a pretty ruinous civil war, and your whiteness marked you out as someone who probably had a little more money than others. The fact that this happened right outside a hotel backs this premise up. This sort of thing happens even in developed countries; try going through customs in Dubai without being shaken down for a few grand.

Tyler

I actually have been to Dubai, but didn’t get shaken down. Been to lots of places, and the shakedowns only happened in India and southern Africa. But I don’t see how their civil war, which ended more than 20 years ago (though they are no doubt still recovering from it), excuses such behaviour.

I may not have been harassed because I was white, but I was profiled because I was white; indeed, I was accurately profiled as being a foreigner and having money to pay fines/bribes. Similar to what Mr Coates was arguing, to these officials, I was reduced to the characteristics of my race.

The wrongs are different, as you note, and do not compare to systematic profiling. I don’t mean to be pedantic, but at no point did I argue they did, either; I was responding to a previous commentator who argued that it never happens to white Americans.

exitr

But your story somewhat confirms Josh Jasper’s point. Yes, harassment of white Americans happens. But you, in fact, did not “put up with” the treatment by the Mozambican police officers – and you were able to enforce your rights in large part due to your status as an American citizen. See how that works for a black teenager stopped by a cop in the U.S.

Tyler

Ah, yes, rereading it now, I agree with you; I did stand my ground, knowing they were not very likely to do anything to me. That said, I cannot stress enough that four soldiers with AK-47s are enough to get most Americans handing over all their cash.

June 26, 2013 What Science Really Says About the Soul

To download a pdf click here.

What Science Really Says About the Soul

by Stephen Cave

Nathalie was hemorrhaging badly. She felt weak, cold, and the pain in her abdomen was excruciating. A nurse ran out to fetch the doctor, but by the time they arrived she knew she was slipping away. The doctor was shouting instructions when quite suddenly the pain stopped. She felt free—and found herself floating above the drama, looking down at the bustle of activity around her now still body.

“We’ve lost her,” she heard the doctor say, but Nathalie was already moving on and upwards, into a tunnel of light. She first felt a pang of anxiety at leaving her husband and children, but it was soon overwhelmed by a feeling of profound peace, a feeling that it would all be okay. At the end of the tunnel, a figure of pure radiance was waiting with arms wide open.

This, or something like it, is how millions imagine what it will be like to die. In 2009, over 70 percent of Americans said they believe that they, like Nathalie, have a soul that will survive the end of their body.1 That figure may well now be higher after the phenomenal success of two recent books describing vivid near-death experiences: one from an innocent—the four-year-old Todd Burpo—the other from the opposite: a Harvard scientist and former skeptic, neurosurgeon Dr. Eben Alexander.2 Both argue that when their brains stopped working, their souls floated off to experience a better place.

This is an attractive view and a great consolation to those who have lost loved ones or who are contemplating their own mortality. Many also believe this view to be beyond the realm of science, to concern a different dimension into which no microscope can peer. Dr. Alexander, for example, said in an interview with the New York Times, “Our spirit is not dependent on the brain or body; it is eternal, and no one has one sentence worth of hard evidence that it isn’t.”3

But he is wrong. The evidence of science, when brought together with an ancient argument, provides a very powerful case against the existence of a soul that can carry forward your essence once your body fails. The case runs like this: with modern brain-imaging technology, we can now see how specific, localized brain injuries damage or even destroy aspects of a person’s mental life. These are the sorts of dysfunctions that Oliver Sacks brought to the world in his book The Man Who Mistook His Wife For A Hat.4 The man of the title story was a lucid, intelligent music teacher, who had lost the ability to recognize faces and other familiar objects due to damage to his visual cortex.

Since then, countless examples of such dysfunction have been documented—to the point that every part of the mind can now be seen to fail when some part of the brain fails. The neuroscientist Antonio Damasio has studied many such cases.5 He records a stroke victim, for example, who had lost any capacity for emotion; patients who lost all creativity following brain surgery; and others who lost the ability to make decisions. One man with a brain tumor lost what we might call his moral character, becoming irresponsible and disregarding of social norms. I saw something similar in my own father, who also had a brain tumor: it caused profound changes in his personality and capacities before it eventually killed him.

The crux of the challenge then is this: those who believe they have a soul that survives bodily death typically believe that this soul will enable them, like Nathalie in the story above, to see, think, feel, love, reason and do many other things fitting for a happy afterlife. But if we each have a soul that enables us to see, think and feel after the total destruction of the body, why, in the cases of dysfunction documented by neuroscientists, do these souls not enable us to see, think and feel when only a small portion of the brain is destroyed?

To make the argument clear, we can take the example of sight. If either your eyes or the optic nerves in your brain are sufficiently badly damaged, you will go blind. This tells us very clearly that the faculty of sight is dependent upon functioning eyes and optic nerves.

Yet curiously, when many people imagine their soul leaving their body, they imagine being able to see—like Nathalie, looking down on her own corpse surrounded by frantic doctors.6 They believe, therefore, that their soul can see. But if the soul can see when the entire brain and body have stopped working, why, in the case of people with damaged optic nerves, can’t it see when only part of the brain and body have stopped working? In other words, if blind people have a soul that can see, why are they blind?

So eminent a theologian as Saint Thomas Aquinas, writing 750 years ago, believed this question had no satisfactory answer.7 Without its body—without eyes, ears and nose—he thought the soul would be deprived of all senses, waiting blindly for the resurrection of the flesh to make it whole again. Aquinas concluded that the body-less soul would have only those powers that (in his view) were not dependent upon bodily organs: faculties such as reason and understanding.

But now we can see that these faculties are just as dependent upon a bodily organ—the brain—as sight is upon the eyes. Unlike in Aquinas’s day, we can now keep many people with brain damage alive and use neuroimaging to observe the correlations between that damage and their behavior. And what we observe is that the destruction of certain parts of the brain can destroy those cognitive faculties once thought to belong to the soul. So if he had had the evidence of neuroscience in front of him, we can only imagine that Aquinas himself would have concluded that these faculties also stop when the brain stops.

In fact, evidence now shows that everything the soul is supposed to be able to do—think, remember, love—fails when some relevant part of the brain fails. Even consciousness itself—otherwise there would be no general anesthetics. A syringe full of chemicals is sufficient to extinguish all awareness. For anyone who believes something like the Nathalie story—that consciousness can survive bodily death—this is an embarrassing fact. If the soul can sustain our consciousness after death, when the brain has shut down permanently, why can it not do so when the brain has shut down temporarily?

Some defenders of the soul have, of course, attempted to answer this question. They argue, for example, that the soul needs a functioning body in this world, but not in the next. One view is that the soul is like a broadcaster and the body like a receiver—something akin to a television station and a TV set. (Though as our body is also the source of our sensory input, we have to imagine the TV set also has a camera on top feeding images to the distant station.)

We know that if we damage our TV set, we get a distorted picture. And if we break the set, we get no picture at all. The naive observer would believe the program was therefore gone. But we know that it is really still being transmitted; that the real broadcaster is actually elsewhere. Similarly, the soul could still be sending its signal even though the body is no longer able to receive it.

This response sounds seductive, but helps little. First, it does not really address the main argument at all: Most believers expect their soul to be able to carry forward their mental life with or without the body; this is like saying that the TV signal sometimes needs a TV set to transform it into the picture, but once the set is kaput, it can make the picture all by itself. But if it can make the picture all by itself, why does it sometimes act through an unreliable set?

Second, changes to our bodies impact on our minds in ways not at all analogous to how damage to a TV set changes its output, even if we take into account damage to the camera too. The TV analogy claims there is something that remains untouched by such damage, some independent broadcaster preserving the real program even if it is distorted by bad reception. But this is precisely what the evidence of neuroscience undermines. Whereas damage to the TV set or camera might make the signal distorted or fuzzy, damage to our brains much more profoundly alters our minds. As we noted above, such damage can even change our moral views, emotional attachments, and the way we reason.

Which suggests we are nothing like a television, but much more like, for example, a music box: the music is not coming from elsewhere, but from the workings within the box itself. When the box is damaged, the music is impaired; and if the box is entirely destroyed, then the music stops for good.

There is much about consciousness that we still do not understand. We are only beginning to decipher its mysteries, and may never fully succeed. But all the evidence we have suggests that the wonders of the mind—even near-death and out-of-body experiences—are the effect of neurons firing. Contrary to the beliefs of the vast majority of people on Earth, from Hindus to New Age spiritualists, consciousness depends upon the brain and shares its fate to the end.

References

  1. What People Do and Do Not Believe In, The Harris Poll, December 15, 2009
  2. Burpo, T. and Vincent, L. 2010. Heaven is For Real: A Little Boy’s Astounding Story of His Trip to Heaven and Back. Thomas Nelson Publishers; Alexander, Eben. 2012. Proof of Heaven: A Neurosurgeon’s Journey into the Afterlife. Simon & Schuster.
  3. Kaufman, L. 2012. “Readers Join Doctor’s Journey to the Afterworld’s Gates.” The New York Times, November 25, page C1.
  4. Sacks, Oliver. 1985. The Man Who Mistook His Wife For A Hat. New York: Simon & Schuster.
  5. Damasio, Antonio. 1994. Descartes’ Error: Emotion, Reason, and the Human Brain. New York: Putnam Publishing.
  6. Descriptions of heaven also involve being able to see, from Dante to Heaven is For Real, cited above.
  7. Aquinas’s views on the soul can be found in his Summa Theologica and elsewhere. Particularly relevant to the question of the soul’s limited faculties are Part 1, question 77, article 8 (“Whether all the powers remain in the soul when separated from the body?”) and supplement to the Third Part, question 70, article 1 (“Whether the sensitive powers remain in the separated soul?”), in which he writes: “Now it is evident that certain operations, whereof the soul’s powers are the principles, do not belong to the soul properly speaking but to the soul as united to the body, because they are not performed except through the medium of the body—such as to see, to hear, and so forth. Hence it follows that such like powers belong to the united soul and body as their subject, but to the soul as their quickening principle, just as the form is the principle of the properties of a composite being. Some operations, however, are performed by the soul without a bodily organ—for instance to understand, to consider, to will: wherefore, since these actions are proper to the soul, the powers that are the principles thereof belong to the soul not only as their principle but also as their subject. Therefore, since so long as the proper subject remains its proper passions must also remain, and when it is corrupted they also must be corrupted, it follows that these powers which use no bodily organ for their actions must needs remain in the separated soul, while those which use a bodily organ must needs be corrupted when the body is corrupted: and such are all the powers belonging to the sensitive and the vegetative soul.”

http://www.skeptic.com/eskeptic/13-03-20/

 

June 12, 2013 I Am Not “Spiritual,” but I Am Religious

Download a pdf version of this article.

by Sarah Oelberg

I claim to be religious — not just spiritual. Spirituality is nice – it can be comforting, awesome, beautiful and sustaining. But it is a lonely endeavor. I choose religion so that I can continue to stimulate my mind and continually ask and try to answer important life questions; so I can be a member of a religious community that gives form and structure to my belief system and enables me to work together on the problems and challenges of the times; so my family can partake of rites of passage and celebrations that fit with our beliefs and values; and so I can enjoy the support and companionship of people who share similar beliefs and values.

I’m not spiritual – at least not in the way many say they are …

Spirituality is a word and concept I have largely avoided, thinking there are better ways to describe what is important in the human condition. One reason is that it means something different to everyone, so it is not a word which gives clarity to conversation. The old chestnut is true: if you ask ten UUs what “spirituality” means to them, you will get dozens of answers. It is hard to put a finger on what this spirituality is that so many profess to seek, believe in, or to be – as in “I am not religious, but I am very spiritual.” It has come to be something of a garbage word, possibly signifying just about anything from astrology to Zen Buddhism.

Part of my (and many Humanists’) resistance to the concept of spirituality comes from some of the meanings it holds for some people – meanings which do not speak to my experience. For example, to some, spirituality is an act, such as accepting Jesus Christ as Lord and Savior. To many, it is the equivalent of theology and metaphysics. Traditional notions of spirituality deal with a nonphysical realm of the world separate from earth and its inhabitants – a realm full of gods, spirits, ghosts, and the like.

It is also used to refer to some transcendental spirit or figure which is supposedly understandable to those who believe in it, but unavailable to the rest of us who don’t buy into their particular views, or have not shared the kind of ethereal experience which has given them this belief. I tend to be suspicious, even resentful, when persons or groups try to claim exclusive knowledge or ownership of something which they say is wonderful, but which is not made accessible to others.

I have also noticed that the word spirituality is often applied to everything lumped into the category New Age: i.e., crystals, guardian angels, channeling, entities, various divinations, out-of-body experiences, ritual transformation, psychic healing, trance states, etc. As a humanist, I have difficulty with these fanciful non-material notions. I have also noted that the spirituality peddled in bookstores and at retreats and on TV talk shows tends to be kind of wispy and misty and rich in appeal to narcissism. You know – if it feels right, or is something that one instinctively knows or which makes one content, then it is spiritual and good – at least for the person experiencing the feeling.

In my experience, the “very spiritual” people who hold forth in these venues are often not the kind of folks who join with others to staff homeless shelters or carry out other works of love; they often despise organized religion, preferring personal evanescence, and many “don’t play well with others.” I know that one’s system of beliefs is supposed to be a very personal thing – didn’t Jesus say that we should pray alone – but I don’t think he meant that our beliefs should remove us from being involved in society. I think he wanted us to contemplate the state of the world, so that we can more effectively enter into it. It is not about us as individuals; it is about how we move and live and serve in the world around us. I am not convinced that much of what passes as “spirituality” does that.

In fact, I sense that, for some, “spirituality” serves as a form of escapism. It seems not to be grounded, at least not in the real world; not in what we know in our time about the nature of the world and the nature of the universe. It appears, often, to be a retreat into some pristine, past, foreign or imaginary world. And it seems to me that an authentic spirituality would require us to boldly and bravely face our world, the world of our time, the world as we know it today – to face it and embrace it.

I also find that some use the word to express their disaffection with organized religion. They’ll say, “I’m not religious. I don’t go to church or synagogue, but I’m very spiritual!” I think this might mean: “I have had a bad experience with organized religion, or I think it is all suspect, but I enjoy feeling a sense of awe beneath the stars by myself.”

I think everyone is religious in some way. The religious impulse is, apparently, embedded in our very being. Yes, we find different ways to express it and nurture it, but it is there. And the fact that these church-avoiders often have a need to find some other kind of group to fulfill their need for meaning and companionship – like a twelve-step group, or a Course in Miracles, or a Covenant group, or a study-of-angels class – tells me that the human need to be part of something beyond themselves is also very strong. It is quite apparent that, for many, these “alternative” groups have become the equivalent of church, and their teachings a form of religion.

But some claim spirituality in ways I could live with…

Despite problems with using the word spirituality, I find it does not have to be negative. There have been many wonderful things written in the name of spirituality. I especially like one from humanist John Dietrich, which I found long ago in one of his many sermons in my attic, and have used many times since: “there is an energy which springs from the heart of humanity. What it is we do not know, any more than we know what electricity is. How it works we cannot say… but that it is real, that it produces results, is as certain as that we can breathe.”1

“Spirituality” might be an acceptable word to point to an indescribable happening like the smell of a rose, walking alone in a quiet wood, being in love, being moved by a beautiful poem or piece of music, or the sense of awe when we see or experience something wonderful. It might be how astronauts have felt when they looked down upon the earth from space, and drank in the glory of what they saw.

Maybe it is how we connect with the source – whatever process made this universe and everything in it. Or perhaps spirituality is the feeling of connection we have to each other and to the all. It is the idea that we are never alone, really, that no matter how isolated and atomistic we might feel, we are part of a vast interdependent web of being; we are a small, but important, cog in the wheel of life. We are never actually separate from the very ground of existence, and what moves one part affects us all. The basis of spirituality, says Sam Harris, is that the range of possible human experience far exceeds the ordinary limits of our subjectivity.

Richard Erhardt suggests that spirituality is about how we live our lives. He asks questions such as: Are we focused or scattered? Are we continually challenging ourselves, our world views, our attitudes and outlooks? Or are we so afraid of being challenged that we hold on frantically against the tides of change?

The spiritual question is really, are we tossed about by every single wind that blows our way, or are we grounded firmly and calmly where we are? A person who is in touch with her own spirituality may say that there is an inner strength that keeps her centered and whole when, all around her, the world is trying to pull her into fragments.

So could I be spiritual?

My experience of what I might call the spiritual dimension of life comes out of my engagement with the natural world and my, albeit limited, knowledge of how that world works. It reminds me that just outside my normal range of vision there is a world of truth which I seldom seek, but which influences my life daily and wholly. The spiritual dimension helps bring together the different aspects of life, which it is all too tempting to keep separate. It reminds me that there are other ways of knowing, other ways of seeing other realities which have the possibility of changing us as we cannot deliberately change ourselves.

I would add that spirituality can provide meaning and values without a god telling us what is right and wrong. It may be a substitute for being godly – or maybe it is the same thing – being “goodly”! It may be Kierkegaard’s “power of a person’s understanding over his or her life,” or Matthew Fox’s reminder of the tension between mysticism (awe) and the prophetic tradition, the struggle for justice. We must always balance that tension so that spirituality does not become an escape from working toward justice, or from the trials of living in the world.

And should I expect my minister to help with that?

One of the things that still bothers me about spirituality is that people expect ministers to “give” it to them. Often parishioners will say that they want “more spirituality” in services. I suspect that what they mean is that they want to feel more – feelings of connection, relief, forgiveness, belonging, contentment, joy, emotion. Sometimes it is a code word for the use of historic rituals and art forms such as prayers, litanies, special holidays, flower communion, bells, sacraments, choirs and hymns, vestments, candles – in short, everything sensual and colorful.

Farley Wheelwright, one of our oldest and most respected humanist ministers, would have nothing to do with this last idea, that disciplinary practices and liturgy have anything to do with spirituality. “It either happens to us or it does not… It is bred in the bones and defies translation or definition.”2 I agree. For me, insofar as spirituality exists, it does so when it becomes the better part of a good person’s life. I don’t believe it can be packaged in piety, or in meditation, isms, dogma or definition. Spirituality has no necessary connection to religious faiths; it has everything to do with humanity. Spirituality is that indefinable something which we all feel but cannot manufacture.

There are some expressions and definitions of spirituality in which I find some solace and meaning. I enjoy the lovely things of life as much as anyone; I experience great joy in art, music, literature, human kindness, and so on. I find a sense of peace when I connect with nature – stalking the wild asparagus or walking in the woods or prairie. I feel awe when I see a newborn child, or a cloud in a bright blue sky. These are wonderful things, and I am glad I can appreciate them. But for me, what passes for spirituality is not enough.

In all the various descriptions, definitions and explanations of spirituality, it is always very personal. It is an inner experience, which can be experienced only by an individual alone. It does not connect people, because everyone experiences things differently. It does not form community, but rather encourages separatism.

What, then, do I want my minister (and church) to do, to help with?

From the clergy, from the ministries of the church, I need more than (and something different from) spirituality; I need religion. There is a reason why religion has been around virtually as long as humankind has existed; it fulfills a basic human need. From the beginning of time, people have needed a way to explain the world, to find answers to perplexing questions, to understand how the world works, where we came from, what the nature of god and humanity is, what happens when we die, how life began, and so on.

Many different answers have been found to these questions, depending on the times, the place, the needs of the people, etc. And so we have many, many different religions. But what they all have in common is that people derived them by trying to figure out answers to difficult questions. The Bible Dictionary says that “religion may be thought of as a system embodying the means of attaining and expressing in conduct the values deemed characteristic of the ideal life.”3

In other words, one’s religion is how one views the world and one’s place in it. It is the result of experiences, study, reason and thoughtfulness. It involves using one’s mind to come to an understanding of how to live. This is one of the major differences between religion and spirituality, but one which is very important, for we cannot live to the fullest only on instinct and good feeling. A.C. Grayling writes: “Religion offers something ‘higher,’ something overarching, something that seems to make sense of things, to organize the inchoate nature of experience and the world into a single framework of apparent meaning.”4

In the introduction to his wonderful book, Religion is Not about God, Loyal Rue writes: “If Religion is not about God, then what on earth is it about (for heaven’s sake)? It is about manipulating our brains so that we might think, feel, and act in ways that are good for us, both individually and collectively. Religious traditions work like the bow of a violin, playing upon the strings of human nature to produce harmonious relations between individuals and their social and physical environments.”5

As Sophia Fahs said, “one’s religion is the construct (or gestalt) of all his or her smaller specific beliefs. It is the philosophy of life that gathers up into one emotional whole… all the specific beliefs one holds about many kinds of things in many areas of life.”6 As a liberal religious educator, she advocated for children being exposed to many points of view, learning about nature and science, and using reason to decide what to believe. Both I and my children were raised with her wonderful curriculum. I guess that is one reason why I claim to be religious – I believe what I believe because it makes sense and seems reasonable.

As with all religions, however, I find that my beliefs, although unique to me, have some correspondence with those of many others, and I have found great satisfaction in joining with those of similar beliefs. This is another aspect of religion that people over eons have found compelling. There are many good reasons to gather together; a community can provide comfort and assistance when it is needed; a group of people with similar outlook and values can bond together to accomplish much more than individuals can. It is much easier to put one’s religious values into practice when you are doing it with others – and it will probably have a much greater effect.

Charles Vail suggests in an as yet unpublished paper that religion satisfies the basic human needs of congregation, communion, creed and covenant – people coming together, sharing their thoughts and feelings, seeking and formalizing a consensus of the ideals they share in common, and pledging themselves to honor those shared ideals.7 This is why I affiliate with UU churches – they provide a place where I can find people of similar interests and values, who work for causes and issues that match my values. It gives me a community. I often say that my religion is humanism; my community is UU.

Even though we may no longer worship the supernatural, there is still value in celebrations, in meditations, in the use of the arts to extend and deepen our feelings, our sense of significance and meaning. The person who has no need of celebrations, whether sacred or secular, natural or supernatural, is a dull person.

Religious celebration that meets our individual needs but takes place in a community is the most composite and complete of all the arts, being the full celebration of life itself. As we create and shape our ideal ends, we should be able to project them into the friendly and demonstrative forms of poetry, song, dance, drama, prayer and ritual. This is why I go to church, for I could never experience the quality and range of the arts by myself, no matter how spiritual I feel.

Religion also offers rituals and routines for dealing with the more significant of life’s transitions, from the arrival of a child, to marriage, to death – rituals which match the values and ideals of its members. In UU churches, we provide child dedications that are not based on a notion that children are born in sin; we offer coming of age programs that help kids wrestle with the issues and problems and challenges of their lives and decide what they believe, not what someone else tells them to believe; we perform weddings tailored to the beliefs and ideals of the couple; and we have memorial services that celebrate and affirm the lives of the dead and uplift their immortality in terms of their accomplishments and presence here on earth. Sharing a somewhat common lexicon and symbology provides a means of engendering wonder and offers consoling explanations to ease experiences of hardship.

All of these are reasons why I claim to be religious – not just spiritual. Spirituality is nice – it can be comforting, awesome, beautiful and sustaining. But it is a lonely endeavor. I choose religion so that I can continue to stimulate my mind and continually ask and try to answer important life questions; so I can be a member of a religious community that gives form and structure to my belief system and enables me to work together on the problems and challenges of the times; so my family can partake of rites of passage and celebrations that fit with our beliefs and values; and so I can enjoy the support and companionship of people who share similar beliefs and values. My religion is centered in myself as a human being, but it also encourages me to be part of a larger community outside myself.

So, as a religious humanist, I say, “I am not very spiritual, but I am very religious.” Actually, I believe everyone is religious, if it is defined properly, and not just connected to belief in god, or accepting certain dogma, or being the property of one church. We can be religious without god; we can be good without god. But we all need community, celebration, and answers to life’s unanswerable questions, whether we claim to be primarily spiritual, secular, or religious.

Notes

  1. John Dietrich, from an unpublished sermon manuscript. I can no longer find the exact place where Dietrich said this – but I have it written down, and have used it enough as a quotation to be fairly certain of it.
  2. Farley Wheelwright, my written lecture notes, unknown date.
  3. Madeleine S. and J. Lane Miller, Bible Dictionary, 1958, Harper Bros., New York, p. 608.
  4. A.C. Grayling, from the essay “Debating Humanism,” in Humanism, Religion and Ethics, 2006, Oxford University Press, pp. 47-54
  5. Loyal Rue, Religion is Not about God, 2005, Rutgers University Press, Introduction, p. 1.
  6. Sophia Lyons Fahs, Today’s Children and Yesterday’s Heritage: A Philosophy of Creative Religious Development, 1952, Beacon Press
  7. Charles Vail, posted on the Humanists list (uu lists) on July 20, 2012.

Taken from: Religious Humanism, Volume XLIII, Number 1, Fall 2012.
