March 11, 2015 Raising a Moral Child

By Adam Grant  APRIL 11, 2014


What does it take to be a good parent? We know some of the tricks for teaching kids to become high achievers. For example, research suggests that when parents praise effort rather than ability, children develop a stronger work ethic and become more motivated.

Yet although some parents live vicariously through their children’s accomplishments, success is not the No. 1 priority for most parents. We’re much more concerned about our children becoming kind, compassionate and helpful. Surveys reveal that in the United States, parents from European, Asian, Hispanic and African ethnic groups all place far greater importance on caring than achievement. These patterns hold around the world: When people in 50 countries were asked to report their guiding principles in life, the value that mattered most was not achievement, but caring.

Despite the significance that it holds in our lives, teaching children to care about others is no simple task. In an Israeli study of nearly 600 families, parents who valued kindness and compassion frequently failed to raise children who shared those values.

Are some children simply good-natured — or not? For the past decade, I’ve been studying the surprising success of people who frequently help others without any strings attached. As the father of two daughters and a son, I’ve become increasingly curious about how these generous tendencies develop.

Genetic twin studies suggest that anywhere from a quarter to more than half of our propensity to be giving and caring is inherited. That leaves a lot of room for nurture, and the evidence on how parents raise kind and compassionate children flies in the face of what many of even the most well-intentioned parents do in praising good behavior, responding to bad behavior, and communicating their values.

By age 2, children experience some moral emotions — feelings triggered by right and wrong. To reinforce caring as the right behavior, research indicates, praise is more effective than rewards. Rewards run the risk of leading children to be kind only when a carrot is offered, whereas praise communicates that sharing is worthwhile for its own sake. But what kind of praise should we give when our children show early signs of generosity?

Many parents believe it’s important to compliment the behavior, not the child — that way, the child learns to repeat the behavior. Indeed, I know one couple who are careful to say, “That was such a helpful thing to do,” instead of, “You’re a helpful person.”

But is that the right approach? In a clever experiment, the researchers Joan E. Grusec and Erica Redler set out to investigate what happens when we commend generous behavior versus generous character. After 7- and 8-year-olds won marbles and donated some to poor children, the experimenter remarked, “Gee, you shared quite a bit.”

The researchers randomly assigned the children to receive different types of praise. For some of the children, they praised the action: “It was good that you gave some of your marbles to those poor children. Yes, that was a nice and helpful thing to do.” For others, they praised the character behind the action: “I guess you’re the kind of person who likes to help others whenever you can. Yes, you are a very nice and helpful person.”

A couple of weeks later, when faced with more opportunities to give and share, the children were much more generous after their character had been praised than after their actions had been. Praising their character helped them internalize it as part of their identities. The children learned who they were from observing their own actions: I am a helpful person. This dovetails with new research led by the psychologist Christopher J. Bryan, who finds that for moral behaviors, nouns work better than verbs. To get 3- to 6-year-olds to help with a task, rather than inviting them “to help,” it was 22 to 29 percent more effective to encourage them to “be a helper.” Cheating was cut in half when instead of, “Please don’t cheat,” participants were told, “Please don’t be a cheater.” When our actions become a reflection of our character, we lean more heavily toward the moral and generous choices. Over time it can become part of us.

Praise appears to be particularly influential in the critical periods when children develop a stronger sense of identity. When Grusec and Redler praised the character of 5-year-olds, any benefits that may have emerged didn’t have a lasting impact: They may have been too young to internalize moral character as part of a stable sense of self. And by the time children turned 10, the differences between praising character and praising actions vanished: Both were effective. Tying generosity to character appears to matter most around age 8, when children may be starting to crystallize notions of identity.

Praise in response to good behavior may be half the battle, but our responses to bad behavior have consequences, too. When children cause harm, they typically feel one of two moral emotions: shame or guilt. Despite the common belief that these emotions are interchangeable, research led by the psychologist June Price Tangney reveals that they have very different causes and consequences.

Shame is the feeling that I am a bad person, whereas guilt is the feeling that I have done a bad thing. Shame is a negative judgment about the core self, which is devastating: Shame makes children feel small and worthless, and they respond either by lashing out at the target or escaping the situation altogether. In contrast, guilt is a negative judgment about an action, which can be repaired by good behavior. When children feel guilt, they tend to experience remorse and regret, empathize with the person they have harmed, and aim to make it right.

In one study spearheaded by the psychologist Karen Caplovitz Barrett, parents rated their toddlers’ tendencies to experience shame and guilt at home. The toddlers received a rag doll, and the leg fell off while they were playing with it alone. The shame-prone toddlers avoided the researcher and did not volunteer that they broke the doll. The guilt-prone toddlers were more likely to fix the doll, approach the experimenter, and explain what happened. The ashamed toddlers were avoiders; the guilty toddlers were amenders.

If we want our children to care about others, we need to teach them to feel guilt rather than shame when they misbehave. In a review of research on emotions and moral development, the psychologist Nancy Eisenberg suggests that shame emerges when parents express anger, withdraw their love, or try to assert their power through threats of punishment: Children may begin to believe that they are bad people. Fearing this effect, some parents fail to exercise discipline at all, which can hinder the development of strong moral standards.

The most effective response to bad behavior is to express disappointment. According to independent reviews by Professor Eisenberg and David R. Shaffer, parents raise caring children by expressing disappointment and explaining why the behavior was wrong, how it affected others, and how they can rectify the situation. This enables children to develop standards for judging their actions, feelings of empathy and responsibility for others, and a sense of moral identity, which are conducive to becoming a helpful person. The beauty of expressing disappointment is that it communicates disapproval of the bad behavior, coupled with high expectations and the potential for improvement: “You’re a good person, even if you did a bad thing, and I know you can do better.”

As powerful as it is to criticize bad behavior and praise good character, raising a generous child involves more than waiting for opportunities to react to the actions of our children. As parents, we want to be proactive in communicating our values to our children. Yet many of us do this the wrong way.

In a classic experiment, the psychologist J. Philippe Rushton gave 140 elementary- and middle-school-age children tokens for winning a game, which they could keep entirely or donate some to a child in poverty. They first watched a teacher figure play the game either selfishly or generously, and then preach to them the value of taking, giving or neither. The adult’s influence was significant: Actions spoke louder than words. When the adult behaved selfishly, children followed suit. The words didn’t make much difference — children gave fewer tokens after observing the adult’s selfish actions, regardless of whether the adult verbally advocated selfishness or generosity. When the adult acted generously, students gave the same amount whether generosity was preached or not — they donated 85 percent more than the norm in both cases. When the adult preached selfishness, even after the adult acted generously, the students still gave 49 percent more than the norm. Children learn generosity not by listening to what their role models say, but by observing what they do.

To test whether these role-modeling effects persisted over time, two months later researchers observed the children playing the game again. Would the modeling or the preaching influence whether the children gave — and would they even remember it from two months earlier?

The most generous children were those who watched the teacher give but not say anything. Two months later, these children were 31 percent more generous than those who observed the same behavior but also heard it preached. The message from this research is loud and clear: If you don’t model generosity, preaching it may not help in the short run, and in the long run, preaching is less effective than giving while saying nothing at all.

People often believe that character causes action, but when it comes to producing moral children, we need to remember that action also shapes character. As the psychologist Karl Weick is fond of asking, “How can I know who I am until I see what I do? How can I know what I value until I see where I walk?”

Adam Grant is a professor of management and psychology at the Wharton School of the University of Pennsylvania and the author of “Give and Take: Why Helping Others Drives Our Success.”

——————————————

Praise for intelligence can undermine children’s motivation and performance.
Mueller, Claudia M.; Dweck, Carol S.
Journal of Personality and Social Psychology, Vol 75(1), Jul 1998, 33-52. doi: 10.1037/0022-3514.75.1.33
Praise for ability is commonly considered to have beneficial effects on motivation. Contrary to this popular belief, six studies demonstrated that praise for intelligence had more negative consequences for students’ achievement motivation than praise for effort. Fifth graders praised for intelligence were found to care more about performance goals relative to learning goals than children praised for effort. After failure, they also displayed less task persistence, less task enjoyment, more low-ability attributions, and worse task performance than children praised for effort. Finally, children praised for intelligence described it as a fixed trait more than children praised for hard work, who believed it to be subject to improvement. These findings have important implications for how achievement is best encouraged, as well as for more theoretical issues, such as the potential cost of performance goals and the socialization of contingent self-worth.

Value Hierarchies Across Cultures
Taking a Similarities Perspective
Shalom H. Schwartz, Anat Bardi
Beyond the striking differences in the value priorities of groups is a surprisingly widespread consensus regarding the hierarchical order of values. Average value hierarchies of representative and near representative samples from 13 nations exhibit a similar pattern that replicates with school teachers in 56 nations and college students in 54 nations. Benevolence, self-direction, and universalism values are consistently most important; power, tradition, and stimulation values are least important; and security, conformity, achievement, and hedonism are in between. Value hierarchies of 83% of samples correlate at least .80 with this pan-cultural hierarchy. To explain the pan-cultural hierarchy, the authors discuss its adaptive functions in meeting the requirements of successful societal functioning. The authors demonstrate, with data from Singapore and the United States, that correctly interpreting the value hierarchies of groups requires comparison with the pan-cultural normative baseline.

Cultural Bases for Self-Evaluation
Seeing Oneself Positively in Different Cultural Contexts
Maja Becker, Vivian L. Vignoles, Ellinor Owe, Matthew J. Easterbrook, Rupert Brown, Peter B. Smith,
Michael Harris Bond, Camillo Regalia, Claudia Manzi, Maria Brambilla, Said Aldhafri, Roberto González,
Diego Carrasco, Maria Paz Cadena, Siugmin Lay, Inge Schweiger Gallo, Ana Torres, Leoncio Camino,
Emre Özgen, Ülkü E. Güner, Nil Yamakoğlu, Flávia Cristina Silveira Lemos, Elvia Vargas Trujillo,
Paola Balanta, Ma. Elizabeth J. Macapagal, M. Cristina Ferreira, Ginette Herman, Isabelle de Sauvage,
David Bourguignon, Qian Wang, Márta Fülöp, Charles Harb, Aneta Chybicka, Kassahun Habtamu Mekonnen,
Mariana Martin, George Nizharadze, Alin Gavreliuc, Johanna Buitendach, Aune Valk, Silvia H. Koller
Personality and Social Psychology Bulletin May 1, 2014 40: 657-675
Several theories propose that self-esteem, or positive self-regard, results from fulfilling the value priorities of one’s surrounding culture. Yet, surprisingly little evidence exists for this assertion, and theories differ about whether individuals must personally endorse the value priorities involved. We compared the influence of four bases for self-evaluation (controlling one’s life, doing one’s duty, benefitting others, achieving social status) among 4,852 adolescents across 20 cultural samples, using an implicit, within-person measurement technique to avoid cultural response biases. Cross-sectional and longitudinal analyses showed that participants generally derived feelings of self-esteem from all four bases, but especially from those that were most consistent with the value priorities of others in their cultural context. Multilevel analyses confirmed that the bases of positive self-regard are sustained collectively: They are predictably moderated by culturally normative values but show little systematic variation with personally endorsed values.

Position and Disposition: The Contextual Development of Human Values
Kyle C. Longest, Steven Hitlin, Stephen Vaisey
Social Forces June 1, 2013 91: 1499-1528
Research on the importance of values often focuses primarily on one domain of social predictors (e.g., economic) or limits its scope to a single dimension of values. We conduct a simultaneous analysis of a wide range of theoretically important social influences and a more complete range of individuals’ value orientations, focusing both on value ratings and rankings. Results indicate that traditional institutions such as religion and parenthood are associated with more concern for the welfare of others and maintaining the status quo, whereas more individually oriented occupational factors like higher income and self-employment are linked to achievement and change-related values. Yet several factors, such as education and gender, have complex associations when individual values are examined as part of a coherent system rather than in isolation.

What Defines the Good Person? Cross-Cultural Comparisons of Experts’ Models With Lay Prototypes
Kyle D. Smith, Seyda Türk Smith, John Chambers Christopher
Journal of Cross-Cultural Psychology May 1, 2007 38: 333-360
“Good” is a fundamental concept present in all cultures, and experts in values and positive psychology have mapped good’s many aspects in human beings. Which aspects do laypersons typically access and consider as they make everyday judgments of goodness? Does the answer vary with culture? To address these questions, the authors compiled prototypes of the good person from laypersons’ free-listings in seven cultures and used experts’ classifications to content-analyze and compare the prototypes. Benevolence, conformity, and traditionalism dominated the features that laypersons frequently attributed to good people. Other features—competence in particular—varied widely in their accessibility across cultures. These findings depart from those obtained in research using expert-designed self-report inventories, highlighting the need to consider everyday accessibility when comparing cultures’ definitions of the good person.

February 25, 2015 The American Way of Equality



Income inequality is on the rise. The rich are getting better at passing their advantages on to their kids. Lifestyle and values gaps are widening between the educated and uneducated. So the big issue is: Will Americans demand new policies to reverse these trends — to redistribute wealth, to provide greater economic security? Are we about to see a mass populist movement in this country?

Nobody was smarter on this subject than Seymour Martin Lipset, the eminent sociologist who died at 84 on New Year’s Eve. Lipset had been a socialist in the hothouse atmosphere of City College during the 1940s, and though he later became a moderate Democrat, he continued to wonder, with some regret, why America never had a serious socialist movement, why America never adopted a European-style welfare state.

Lipset was aware of the structural and demographic answers to such questions. For example, racially diverse nations tend to have lower levels of social support than homogeneous ones. People don’t feel as bound together when they are divided on ethnic lines and are less likely to embrace mutual support programs. You can have diversity or a big welfare state. It’s hard to have both.

But as he studied these matters, Lipset moved away from structural or demographic explanations (too many counterexamples). He drifted, as Tocqueville and Werner Sombart had before him, to values.

America never had a feudal past, so nobody has a sense of social place or class-consciousness, Lipset observed. Meanwhile, Americans have inherited from their Puritan forebears a sense that they have a spiritual obligation to rise and succeed.

Two great themes run through American history, Lipset wrote in his 1963 book “The First New Nation”: achievement and equality. These are often in tension because when you leave unequally endowed people free to achieve, you get unequal results.

Though Lipset never quite put it this way, the clear message from his writings is that when achievement and equality clash in America, achievement wins. Or to be more precise, the achievement ethos reshapes the definition of equality. When Americans use the word “equality,” they really mean “fair opportunity.” When Americans use the word “freedom,” they really mean “opportunity.”

Lipset was relentlessly empirical, and rested his conclusions on data as well as history and philosophy. He found that Americans have for centuries embraced individualistic, meritocratic, antistatist values, even at times when income inequality was greater than it is today.

Large majorities of Americans have always believed that individuals are responsible for their own success, Lipset reported, while people in other countries are much more likely to point to forces beyond individual control. Sixty-five percent of Americans believe hard work is the key to success; only 12 percent think luck plays a major role.

In his “American Exceptionalism” (1996), Lipset pointed out that 78 percent of Americans endorse the view that “the strength of this country today is mostly based on the success of American business.” Fewer than a third of all Americans believe the state has a responsibility to reduce income disparities, compared with 82 percent of Italians. Over 70 percent of Americans believe “individuals should take more responsibility for providing for themselves” whereas most Japanese believe “the state should take more responsibility to ensure everyone is provided for.”

America, he concluded, is an outlier, an exceptional nation. And though his patriotism pervaded his writing, he emphasized that American exceptionalism is “a double-edged sword.”

Political movements that run afoul of these individualistic, achievement-oriented values rarely prosper. The Democratic Party is now divided between moderates — who emphasize individual responsibility and education to ameliorate inequality — and progressive populists, who advocate an activist state that will protect people from forces beyond their control. Given the deep forces in American history, the centrists will almost certainly win out.

Indeed, the most amazing thing about the past week is how modest the Democratic agenda has been. Democrats have been out of power in Congress for 12 years. They finally get a chance to legislate and they push through a series of small proposals that are little pebbles compared to the vast economic problems they described during the campaign.

They grasp the realities Marty Lipset described. They understand that in the face of inequality, Americans have usually opted for policies that offer more opportunity, not those emphasizing security or redistribution. American domestic policy is drifting leftward, but there are sharp limits on how far it will go.

January 28, 2015 Drones and the Democracy Disconnect

By Firmin DeBrabander


With President Obama’s announcement that we will open a new battlefront in yet another Middle Eastern country — in Syria, against ISIS (the Islamic State in Iraq and Syria) — there is widespread acknowledgement that it will be a protracted, complex, perhaps even messy campaign, with many unforeseeable consequences along the way. The president has said we will put “no boots on the ground” in Syria; he is wary of simply flooding allies on the ground with arms, for fear that they will fall into the wrong hands — as they already have. Obama wants to strike against ISIS in a part of Syria that is currently outside the authority of the Syrian government, which the president has accused of war crimes, and is thus, in our eyes, a legal no-man’s land. He has also made clear that he is ready to go it alone in directing attacks on ISIS — he has asked for Congress’s support, but is not seeking their authorization. All these signs point to drones playing a prominent role in this new war in Syria.

Increasingly, this is how the United States chooses to fight its wars. Drones lead the way and dominate the fight against the several non-state actors we now engage — Al Qaeda, the Shabab in Somalia and now ISIS. Drones have their benefits: They enable us to fight ISIS without getting mired on the ground or suffering casualties, making them politically powerful and appealing. For the moment, the American public favors striking ISIS; that would likely change if our own ground forces were involved.

If any group deserves drone strikes, it may well be ISIS.

This fundamentalist Muslim group, so brutal that even Al Qaeda shunned it, has taken to forcibly converting and exterminating Christians and other minority religious groups; one such minority, the Yazidis, may have narrowly escaped genocide at ISIS’ hands. The West has received vivid proof of the group’s ferocity. On Saturday it released its third video of a beheading — this time of a British aid worker — after two other such videos of the beheadings of American journalists within the past month.

The use of drones raises not just strategic and political problems, but ethical questions as well. In particular, what does our use of and reliance on drones say about us? How do drones affect the nation that endorses them — overtly, or, as is more often the case, tacitly? Are drones compatible with patriotism? With democracy? Honor? Glory? Or do they, as I fear, represent — and exacerbate — a troubling, even obscene disconnect between the American people and the wars waged in our name?

Writing in The Guardian in 2012, George Monbiot declared the United States’ drone strikes in Pakistan cowardly. He echoed the howls of many Pakistanis on the ground, who suffer the drone onslaught firsthand, while those who carry it out are safely removed thousands of miles away. The new breed of warriors is strange indeed: They are safely ensconced here in the United States, often commuting to work like ordinary citizens, and after a day spent monitoring and perhaps striking enemy targets, they return home to kids, homework and dinner.

Drone apologists, and many defense experts, claim drones are a reasonable development in warfare technology. The Slate commentator Jamie Holmes argues that extreme complaints about military innovations are hardly new. To people like Monbiot, or the BBC commentator Jeremy Clarkson, who scoffs that medals for drone pilots “should feature an armchair and a Coke machine or two crossed burgers,” Holmes says, “the hyperventilating about heroism being killed by machines misses the point. For one, the list of weapons once considered ‘cowardly’ … include[s] not only the submarines of World War I but also the bow and arrow and the gun. The point of each of these technologies was the same: to gain an asymmetrical advantage against adversaries and reduce risk.”

There are few philosophers more clear-eyed, frank, even cynical when it comes to war than Niccolò Machiavelli. In “The Prince,” he asserts that war is inescapable, inevitable. He praises the Romans for understanding the danger in putting it off. To the simple question of when you should go to war, Machiavelli’s simple answer is, “When you can” — not when it is just, or “right.” And yet, in another work, “The Art of War,” Machiavelli reveals that how a nation goes to war, how a nation chooses to fight is just as critical, perhaps even more so. At this point, the issue of military technology is pertinent, and Machiavelli’s discussion of the topic is highly reminiscent of our current debate about drones and character.

In “The Art of War,” Machiavelli again praises the ancient Romans, for their battlefield exploits, and states his worry that newly introduced artillery “prevents men from employing and displaying their virtue as they used to do of old.” Machiavelli ultimately dismisses such fears — though he was only contemplating cannons at the time. But elsewhere he declares that “whoever engages in war must use every means to put himself in a position of facing his enemy in the field and beating him there,” since a field war “is the most necessary and honorable of all wars.” Why is this? Because on the battlefield, military discipline and courage are exhibited and forged, and your opponent gets a true taste of what he’s up against — not only the army, but the nation he is up against.

For Machiavelli, military conduct is a reflection, indeed an extension — better yet, the root and foundation of a nation’s character, the bravery and boldness of its leaders, the devotion and determination of its citizens. Military conduct is indelibly linked to civic virtue, which is why he argues that nations should reject a professional army, much less a mercenary one, in favor of a citizen militia. Every citizen must get a taste of military discipline — and sacrifice. Every citizen must have a stake, an intimate investment, in the wars his nation fights.

Machiavelli was highly sensitive to the role military glory plays in inspiring the public and uniting, even defining, a nation. Great battles and military campaigns forged the identity, cohesion and indomitable pride of the Roman Republic, Machiavelli maintains — across the different social classes — and stoked the democratic energy of the people. Haven’t they served a similar purpose in our own republic? War has offered iconic images of our national identity: George Washington crossing the Delaware with his ragtag soldiers; marines hoisting the flag at Iwo Jima. These images are inherently democratic — they offer no king on his steed, lording over kneeling troops. To that extent, they nourish and reinforce our democratic identity and sensibilities.

This is no longer the case in the age of drones. I have strained to imagine the great battles drones might fight, which the public might rally around and solemnly commemorate. But this is a silly proposition — which cuts to the heart of the matter.

Never have the American people been more removed from their wars, even while we are the most martial nation on earth, and drones are symptoms, and drivers, of this troubling alienation. The United States has been engaged in two expensive and protracted wars in the past decade, as well as the seemingly endless war on terror spread the world over. The war in Afghanistan — where drones have made their mark as never before — is the longest in the nation’s history, and we have spent more money rebuilding Afghanistan than we did on Europe after World War II. Through all our recent wars in the region, however, most Americans have hardly felt a thing. Given the extent of our military engagement, unparalleled in the world, that is astounding, shameful even, and politically treacherous.

Critics have long warned that drones put too much war making power in the hands of few government actors, who increasingly operate on their own, or in the shadows. Many felt we saw a preview of political abuses to come when President Obama unilaterally ordered a drone strike against an American citizen in Yemen. This new technology has already emboldened our government to openly wage war in countries against which we have not officially declared war. We operate there with the tacit, and dubious, assent of a few ruling interests.

Perhaps it is not inevitable that drones are linked to arbitrary, centralized government; perhaps drone warfare can be waged transparently, democratically, legally, though it is admittedly hard to imagine what that would look like. What is certain, however, is that drone technology offers manifold temptations to those who would expand the borders of our wars, or wage war according to their own agenda, independent of the will, or interest, or attention of the American public.

Most American citizens are quick to let someone or something else bear the brunt of our wars, and take up the fight. Hence there is less worry about whether a given incursion is necessary, justified, logical or humane. Drones point to a new and terrible kind of cruelty — violence far removed from the perpetrator, and easier to inflict in that regard. With less skin in the game — literally — we can be less vigilant about the darker tendencies of our leaders, the unintended consequences of their actions, and content to indulge in private matters.

The United States is gradually becoming a warring nation with fewer and fewer warriors, and few who know the sacrifices of war. Drones represent the new normal, and are an easy invitation to enter into and wage war — indefinitely. This is a state of affairs Machiavelli could not abide, and neither should we. It is antithetical to a democracy for its voting public to be so aloof from the wars it fights. It is a feature, I fear, of a democracy destined to lose that title.

Firmin DeBrabander, an associate professor of philosophy at the Maryland Institute College of Art, Baltimore, is the author of “Spinoza and the Stoics” and a forthcoming book critiquing the gun rights movement.


Gemli, Boston
People have been killing each other for a long time. The grim history of warfare is summed up succinctly in Kubrick’s “2001: A Space Odyssey,” when the film jumps in a single frame from a proto-human realizing that a thigh bone can crush a skull to an orbiting nuclear weapon. The intervening millennia are skipped because the details are unimportant. People will kill each other with whatever is handy. Thigh bones, drones, it’s all the same if you’re on the receiving end.

It’s hard to place a value on words like honor, glory and patriotism since I’m sure the Nazis used these as well. These words are rhetorical recruitment tools, employed whether we’re actually defending ourselves from evil or merely taking other people’s stuff. Sometimes it’s hard to tell.

We allowed the “darker tendencies of our leaders” to get us into two wars. These wars were terrible and cruel, and the consequences were unintended but not unpredictable. Drones didn’t lull us into these pointless and ruinously costly conflicts. We were drawn in the old-fashioned way, with tales of WMDs and promises of a quick and easy victory. We ultimately left, but not before more soldiers were dying of suicide from moral injury than were being killed in battle.

ISIS believes in beheadings, female circumcision and stonings for trivial offenses. Honor is off the table. Machiavelli isn’t here. I don’t think he’ll mind if we send in the drones.

Steve Fankuchen     Oakland, Calif.
There is absolutely no relationship one way or the other between the use of drones and democracy. Democracy is about the way decisions get made. Drones are simply a weapon with which some decisions, right or wrong, are carried out. One would expect DeBrabander, a professor of philosophy, to be more precise in the use of language.

Calling the use of drones “cowardly” is absurd! War and sanctioned killing are not about a “fair fight”; they are about winning, however that may be defined.

As to being removed from intimate contact with the effects of one’s weapons, drones are nothing new. Two thousand years ago catapults were wreaking mayhem within walled villages, the results not immediately apparent to those who launched projectiles with them. Guns allowed individual soldiers not to see the close-up damage, and artillery, whether on land or on boats, carried the removal further. With the advent of planes and bombs, especially the B-52s and high-altitude bombers, you could look at a gadget, push a button, and be done with it. Drones merely go a step farther along a well-established continuum.

The author claims drones are cruel. How in the world is getting blown up by a drone any worse than getting shot by a gun, turned to hamburger by a mine or a V-2 rocket, or having one’s head cut off?

A fair fight? That is called sports unless, of course, you think Alexander Hamilton and Aaron Burr had the right idea, and we should bring back American honor and glory with duels.

MLP     Pittsburgh
If Machiavelli was right, if military conduct is indeed a reflection of a nation’s character, then drone warfare may well be the quintessential expression of contemporary American character: a nation of couch potatoes playing “video games” where real flesh and blood human beings are the targets but who are too insensitive and clueless to comprehend the inevitable consequences of their actions. Sooner or later, others will have drone technology capable of inflicting harm upon the United States, and when that happens the couch potatoes will call it “terrorism” and moan “why do they hate us?”

Bob Garcia     Miami
Drones are an example of our self-blindness, which is an important element of our Exceptionalism. For example, why aren’t drones considered weapons of terror? Look at the parallels between the use of hijacked planes on 9/11 and our now routine use of drones. The main difference is that by definition nothing we do is considered terrorism, whether with drones, torture, or kicking in doors of peasants at midnight. Or the widespread use of contractors who operate outside all legal accountability, subject only to possible recall by their employers.

And have we legitimized the use of drones? For example, if the Chinese unilaterally decided to use drones to kill alleged Uighur or Tibetan terrorists in the United States, would it be reasonable for them to carry out such strikes in the United States — limited only by the practical matter of avoiding being shot out of the sky? Would it be accepted that sometimes they’d make a mistake and blow up a wedding party? Would it be OK for them to negotiate with Mexico or Honduras to site drone bases?

Harold V. McCoy       Pinehurst, NC, USA
If Mr. DeBrabander is looking for glory, honor and national character in war, he has obviously never been in one.

Chuck     S C
I can’t help but think of what Robert E. Lee said: “It is well that war is so terrible, otherwise we would grow too fond of it.”

The further our technology removes us from the God-awful stench of war, the more likely we are to grow fond of it. That will lead to greater hubris and the nemesis that will inevitably follow will be devastating.

PogoWasRight     Melbourne Florida
It appears that The President, as many others before him, is continuing to weave a tangled web, one in which we ourselves could become entangled. As an old retired career military person, I have never been able to understand that if we drop a bomb on a home or factory or office in some foreign country, will we create a friend or will we create an enemy? All I am sure of is what I would become if it were done to me. A situation most times called “a no-brainer”. And that is not the end of all possibilities. Consider what is in store for us in the future when missile-armed drones become affordable and available to terrorists around the world, as they certainly will. Instead of flying from “here” to “there”, those drones will be flying from “there” to “here”. Not a pleasant thought. But an inevitable outcome. As Eliza Doolittle said: “Just You Wait!”

Hoff     Philadelphia, PA
The author neglects the reality of the modern battlefield when bemoaning our lack of intimacy with the opposition. The enemy we face doesn’t form ranks and march across the desert. He hides in civilian homes in areas removed from global society and conspires to release his own versions of remote weapons: theology-addled recruits on conscienceless suicidal missions to achieve a glorious afterlife. And he does not hesitate to use his tools to their maximum range and effect. This is a fundamental asymmetry in ‘honorable’ war-making, and to match the threat we spend multitudes of fortunes trying to stay on the high road.
To say that we feel nothing in the West is absurd in the extreme. We reap daily harvests of horrors through the press, our economy struggles to overcome the $trillions we spend on these conflicts, and our politics have devolved to fighting over the bloody scraps from administration after administration of strategic losses to nearly invisible enemies. You’ll have to forgive me for my lack of guilt over finding a way to score a win out of range of the stench of my adversaries’ death.

Bruce Balfe     Valparaiso
I fail to see the distinction between drones and ships lobbing ordnance from 30 miles out at sea or planes dropping bombs from 35,000 feet. It is all long-distance warfare. If we can conduct our wars, however distasteful they are in the first place, in a way that minimizes putting our troops in danger’s way, then why not?

Chris Koz     Portland, OR

There needs to be a distinction made between the use of words like “patriotism,” “honor” and “glory” by civilian leadership and by military personnel. Civilians often use these words, even with the best of intentions, as a recruitment tool, a source of rhetoric, and to beat the drums of nationalism. The jingoism sold by politicians, the military-industrial complex, and armchair warriors is largely self-serving.

Conversely, these words are the very vehicle by which the military chain of command survives. To argue, as some commentators have, that honor does not matter when fighting an honorless enemy is simply to misunderstand this code; the “disconnect” the writer speaks of grows out of the former.

May 28, 2014 Millennials

My So-Called Opinions

April 6, 2014, By ZACHARY FINE, The Stone

Critics of the millennial generation, of which I am a member, consistently use terms like “apathetic,” “lazy” and “narcissistic” to explain our tendency to be less civically and politically engaged. But what these critics seem to be missing is that many millennials are plagued not so much by apathy as by indecision. And it’s not surprising: Pluralism has been a large influence on our upbringing. While we applaud pluralism’s benefits, widespread enthusiasm has overwhelmed desperately needed criticism of its side effects.
By “pluralism,” I mean a cultural recognition of difference: individuals of varying race, gender, religious affiliation, politics and sexual preference, all exalted as equal. In recent decades, pluralism has come to be an ethical injunction, one that calls for people to peacefully accept and embrace, not simply tolerate, differences among individuals. Distinct from the free-for-all of relativism, pluralism encourages us (in concept) to support our own convictions while also upholding an “energetic engagement with diversity,” as Harvard’s Pluralism Project suggested in 1991. Today, paeans to pluralism continue to sound throughout the halls of American universities, private institutions, left-leaning households and influential political circles.
Those of us born after the mid-1980s grew up amid a new orthodoxy of multiculturalist ethics and ‘political correctness.’
However, pluralism has had unforeseen consequences. The art critic Craig Owens once wrote that pluralism is not a “recognition, but a reduction of difference to absolute indifference, equivalence, interchangeability.” Some millennials who were greeted by pluralism in this battered state are still feeling its effects. Unlike those adults who encountered pluralism with their beliefs close at hand, we entered the world when truth-claims and qualitative judgments were already on trial and seemingly interchangeable. As a result, we continue to struggle when it comes to decisively avowing our most basic convictions.
Those of us born after the mid-1980s whose upbringing included a liberal arts education and the fruits of a fledgling World Wide Web have grown up (and are still growing up) with an endlessly accessible stream of texts, images and sounds from far-reaching times and places, much of which was unavailable to humans for all of history. Our most formative years include not just the birth of the Internet and the ensuing accelerated global exchange of information, but a new orthodoxy of multiculturalist ethics and “political correctness.”
These ideas were reinforced in many humanities departments in Western universities during the 1980s, where facts and claims to objectivity were eagerly jettisoned. Even “the canon” was dislodged from its historically privileged perch, and since then, many liberal-minded professors have avoided opining about “good” literature or “high art” to avoid reinstating an old hegemony. In college today, we continue to learn about the byproducts of absolute truths and intractable forms of ideology, which historically seem inextricably linked to bigotry and prejudice.
For instance, a student in one of my English classes was chastened for his preference for Shakespeare over the Haitian-American writer Edwidge Danticat. The professor challenged the student to apply a more “disinterested” analysis to his reading so as to avoid entangling himself in a misinformed gesture of “postcolonial oppression.” That student stopped raising his hand in class.
I am not trying to tackle the challenge as a whole or indict contemporary pedagogies, but I have to ask: How does the ethos of pluralism inside universities impinge on each student’s ability to make qualitative judgments outside of the classroom, in spaces of work, play, politics or even love?
In 2004, the French sociologist of science Bruno Latour intimated that the skeptical attitude which rebuffs claims to absolute knowledge might have had a deleterious effect on the younger generation: “Good American kids are learning the hard way that facts are made up, that there is no such thing as natural, unmediated, unbiased access to truth, that we are always prisoners of language, that we always speak from a particular standpoint, and so on.” Latour identified a condition that resonates: Our tenuous claims to truth have not simply been learned in university classrooms or in reading theoretical texts but reinforced by the decentralized authority of the Internet. While trying to form our fundamental convictions in this dizzying digital and intellectual global landscape, some of us are finding it increasingly difficult to embrace qualitative judgments.
Matters of taste in music, art and fashion, for example, can become a source of anxiety and hesitation. While clickable ways of “liking” abound on the Internet, personalized avowals of taste often seem treacherous today. Admittedly, many millennials (and nonmillennials) might feel comfortable simply saying, “I like what I like,” but some of us find ourselves reeling in the face of choice. To affirm a preference for rap over classical music, for instance, implicates the well-meaning millennial in a web of judgments far beyond his control. For the millennial generation, as a result, confident expressions of taste have become more challenging, as aesthetic preference is subjected to relentless scrutiny.
Philosophers and social theorists have long weighed in on this issue of taste. Pierre Bourdieu claimed that an “encounter with a work of art is not ‘love at first sight’ as is generally supposed.” Rather, he thought “tastes” function as “markers of ‘class.’ ” Theodor Adorno and Max Horkheimer argued that aesthetic preference could be traced along socioeconomic lines and reinforce class divisions. To dislike cauliflower is one thing. But elevating the work of one writer or artist over another has become contested territory.
This assured expression of “I like what I like,” when strained through pluralist-inspired critical inquiry, deteriorates: “I like what I like” becomes “But why do I like what I like? Should I like what I like? Do I like it because someone else wants me to like it? If so, who profits and who suffers from my liking what I like?” and finally, “I am not sure I like what I like anymore.” For a number of us millennials, commitment to even seemingly simple aesthetic judgments has become shot through with indecision.
It seems especially odd because in our “postcritical” age, as the critic Hal Foster termed it, a diffusion of critical authority has elevated voices across a multitude of Internet platforms. With Facebook, Twitter and the blogosphere, everyone can be a critic. But for all the strident young voices heard across social media, there are so many more of us who abstain from being openly critical: Every judgment or critique has its weakness, making criticism seem dangerous at worst and impotent at best.
This narrative runs counter to the one that has been popularized in the press about the indefatigable verbiage of blog-hungry millennials, but it is a crucial one. The proliferation of voices has made most of them seem valueless and wholly interchangeable, even for important topics. To use social media to publicly weigh in on polarized debates, from the death of Trayvon Martin to the Supreme Court’s striking down of the Defense of Marriage Act, seems to do nothing more than provide fodder for those who would attack us. This haunts many of us when we are eager to spill ink on an issue of personal importance but find the page to be always already oversaturated.
Perhaps most crucially, the pluralistic climate has confused stances on moral judgment. Even though “difference” has historically been used, according to the philosopher Cornel West, as a “justification for degradation and a justification for subordination,” we millennials labor to relish those differences and distances separating individuals, exalting difference at all costs.
We anxiously avoid casting moral judgment. Because with absolute truths elusive, what claims do we have to insist that our moral positions are better than those of someone from a different nation or culture?
Consider the challenge we might face when confronted with videos from the popular youth-oriented news outlet Vice. Here, viewers can watch videos of communities, from across the globe, participating in a host of culturally specific activities, ranging from excessive forms of eating to ritual violence to bestiality. While the greater Western culture may denounce these acts, a substantial millennial constituency would hesitate to condemn them, in the interest of embracing “difference.”
We millennials often seek refuge from the pluralist storm in that crawlspace provided by the expression “I don’t know.” It shelters the speaking-subject, whose utterances are magically made protean and porous. But this fancy footwork will buy us only so much time. We most certainly do not wish to remain crippled by indecision and hope to one day boldly stake out our own claims, without trepidation.
Zachary Fine is a junior at the New York University Gallatin School of Individualized Study.

Northstar5     Los Angeles
It’s interesting how many commentators say that the whole point of a college education is to learn other ways of seeing things. That’s just not true. That is only one part of the purpose of a college education.
Another huge and fundamental reason is to learn the great canon of human knowledge—to learn it, to carry it forward, so that the legacy of human civilization is learned and passed on generation to generation. It is to be part of the continuum of learning, and learning involves absorbing known information. Science, philosophy, history, literature. Facts. Ways of thinking. Sound reasoning.
Learning to open your mind is important, but we must safeguard and treasure actual facts and knowledge that people toiled intensely to discover in the first place.
Don’t forsake truth and actual facts and knowledge in the name of political correctness and being ‘open.’
In the words of the great Richard Dawkins: Don’t open your mind so much that your brain falls out.

John Ombelets   Boston, MA
I’d recommend “Zen and the Art of Motorcycle Maintenance.” It puts questions of personal taste and the validity of “just what I like” in excellent perspective.

Mike in Colorado      Denver
Of all that you can say about “Millennials” (I am on the younger edge of the Baby Boom), this entire essay only addresses a single attribute – what I call the “whatever” generation – where any idea can be diminished by implying that nothing really matters. I never thought it a result of indecision, but a relativism that says that nothing is important except personal wants. It is a world view where peripheral vision is unimportant, usually “Boring!” The attribute of Millennials more distasteful to me is their implicit desire to be immediately recognized at a young age and adored for their abilities and how little they value the maturity gained with experience. There used to be a general acceptance that regardless of education, skills, and abilities, many necessary soft skills took time and experience – the ability to manage and lead people, for example. What I see from Millennials is, “I’m great, I’m talented, and I want it now” – the result, I think, of a generation of parents saying “You’re great, you’re talented, you deserve it now.” My own parents were more likely to say, “You’re smart enough, but really, you haven’t got a clue.” Now in my 50s I realize that they were right.

David Gutting     St. Louis
I have been personally responsible for overseeing an enormous amount of research on millennials, and I am convinced that this entire discussion is overblown.
Millennials are not that different from any younger generation that came before, especially around the issue Mr. Fine brings to light. When people are young they usually haven’t developed deeply informed opinions, and this often leads to either indecisiveness or to narrow minded views. (Advancing deeper into adulthood often doesn’t improve this situation.)
In the last 20 years, there has been a significant change in the way society as a whole weighs in on certain social issues, notably on marriage equality. The liberal tendency in that time has less to do with millennials leading the charge and more to do with the fact that more and more people have personal experience with gay people in their own friend and family circles–and this, in turn, has been a by-product of increasing openness in our culture. A much more open gay culture drove this with a hard-fought effort for many years.
Millennials, at best, are followers on social issues. While marriage equality has seen a leftward movement, reproductive choice has definitely turned in the other direction–with right wing interests succeeding in winning more and more restrictions. On paper and in the polls, millennials are more “pro-choice.” But they have done little to personally take up a pro-choice cause and push back against this trend.

T Cecil      Silicon Valley
As I was reading this I had a thought running through my head similar to your statement that “I like what I like” becomes “But why do I like what I like?”
If the results that follow from this self-inquiry lead to pluralism or whatever, so be it, but I think the ability and instinct to question and analyze one’s likes and dislikes is a great evolutionary step for any generation.

Steven    NYC
So you’re suggesting that if there’s an imperfection in the current “system” then we should lunge back to the old straight-white-male hegemony of the past? I think we can work with pluralism. With any of its imperfections it’s still better than what we had in the past.      In reply to RG

Jason      St Louis, MO
Here’s an idea: grow a spine and get some actual scruples. If you feel strongly about something: gather evidence, form an informed opinion, and then make your voice heard. To me, this article doesn’t say that Millennials don’t have opinions or are afraid to voice them, it’s that they lack the fortitude to actually disagree with someone. There is a big difference between having a strong opinion and being culturally insensitive.
Also, maybe read some science or learn some math. Find out what it’s like to interact with a field of study that does have some objective truths…

RG      Chicago
Excellent analysis. The consequences of pluralism, multiculturalism, and moral relativism are now being seen. Some commenters just aren’t willing to accept the obvious: that pluralism / moral relativism has unintended consequences. The classical teachings in religion, literature, and culture are ridiculed and persecuted by the leftists. Universities have been taken over by social activists committed to their utopian ideal of “social justice” over the original ideal of truth. Open inquiry has been squashed by the politically correct. Humanities have devolved into deconstructed nonsense. The question, “what is a good life?” asked by philosophers throughout the ages is now unanswerable according to the moral relativists. There is no truth, only your biased version of the truth. In all prior countries where the quest for social justice has replaced the ideal of truth, a predictable outcome has occurred: tyranny and dystopia.

David Wiles      Northfield, MN
I’m struck by how often a discussion of a Liberal Arts education descends into or begins with a discussion of literature courses as if everyone majors in English. I teach in a small liberal arts college that attracts a “progressive” student body taught by a faculty that is widely thought of as being “liberal.” We offer degrees in forty or so subjects only one of which is English. A look at our course catalogue (easy to see, it’s online just like everyone’s is) and our requirements shows that our students are required to take no more than one lit course unless they major in English. A sample look at our English department offerings shows courses in Medieval and Renaissance lit, Chaucer, Milton, Victorian Lit, 19th Century American Lit, Marlowe (Chris not Phil), American Transcendentalism and at least three courses in Shakespeare among many other things that critics seem to think are no longer taught.
My point is that most critiques I read of the Liberal Arts don’t seem to be informed by much if any knowledge of what actually goes on. Instead we get anecdotes about what some teacher said to some student and they become the model. We get recycled debates from 35 years ago as if everything changed then and nothing has changed since. Let me suggest looking at the readily available evidence of what’s taught. Looking at evidence. There’s a name for that: critical thinking.

ARP   Northeastern US
This article is a protracted hasty generalization. The author takes it as self-explanatory that millions of people born in an arbitrary time period are “indecisive” and chalks it up to an ethos of “pluralism.” In doing so he makes a move that is not unfamiliar to mass media writing about generations in general: take a larger cultural anxiety and project it onto an imagined monolithic group. The “millennial” stereotype is often coded white, for instance. On the other hand, George Zimmerman is a millennial who doesn’t seem tied up by “pluralism.” This problem with the article comes back when Fine uncritically invokes the myth of political correctness, which is really just a moral panic propagated by people unhappy with the expansion of the curriculum.
Fine namechecks Adorno and Horkheimer, but obscures the argument. He and anyone else considering writing sweeping claims about “generations” might wish to revisit them. It is not that taste breaks down along class lines, but that the culture industry creates groups by reducing all humans to knowable categories. The reality of concepts like “Boomer” “Gen X,” “millennial,” etc. comes from somewhere. That reality is produced partly by mass media writers opining about “generations.” But it also comes from marketers who create the “millennial” as a category some people can identify with, in order to produce value more effectively.

Siobhan    New York
I’m surprised at all the criticisms of this piece. I thought it was great.
Mr. Fine has done a wonderful job of explaining the impossible task put before him and the rest of his generation. Every opinion or judgment must be weighed for its greater cultural and moral meaning, for its implications.
Yet supposedly, no one thing is better than another. And simultaneous with that, his generation is asked why they do not take a stand on things.
When they live simply by their own values, they are called narcissists. But a preference for Shakespeare over Haitian literature–well, that’s about as loaded as it gets.
Pluralism has turned itself inside out. Respect for all has become equated with all are equal, now make a choice.
I thank him for this interesting and illuminating piece of writing.

W. Freen      New York City
Zach: Thank you for your column. I would add that, because Millennials live so much of their lives online, they are constantly confronted by commenters who pick things apart and are eager to tell them that everything they think, write, say and do is wrong. The first batch of comments here in response to your column are perfect examples. I can’t imagine what it must be like to have lived one’s young life in the face of so much manufactured negativity. Try to spend more time offline and ignore the boo-birds.

gemli      Boston
You poor dears. In an effort to provide anchors for your wobbly worldviews, here are some absolutes:
Good things, in no particular order: Coca Cola; Roger Ebert; Torvill and Dean; Science; Scientists; Northern Exposure, season 3, episode 10, “Seoul Mates”; Christopher Hitchens; Popular music before 1975; Director Mike Leigh; The Roches; David Copperfield’s Flying illusion; Jim Jarmusch; Eraserhead; 2001: A Space Odyssey; Playing an instrument; Books made of paper and ink; Good sound systems; HDTV; Grave of the Fireflies; Randy Newman; New Orleans food; Ricky Gervais; The (British) Office; Deadwood; Battlestar Galactica (2002); most things by Pixar.
Bad things: Social media; Religion; 95% of the musical guests on Saturday Night Live; All -phobes and most -isms; Republicans since Reagan; Things Republicans think; Things Republicans believe; low-information voters; new-age anything; products sold by infomercials; pseudoscience; the weather; local TV news; cheap ear buds; pretentious art; How I Met Your Mother; Generations of kids who can’t tell good from bad; e-anything (except and YouTube); teachers that tell kids all things are relative.
Make your own list. All of these are just suggestions (with the possible exception of Randy Newman).

Masaccio     Chicago
The point of going to college is to learn other ways to think about things. Maybe it was easier for me; I studied existentialism, and learned that no matter how hard the situation appeared, the responsibility for making decisions was on me and me alone.
I’m familiar with the arguments you hear in your classes; I’ve read a bunch of that stuff myself, and occasionally rail against it for the same reasons. It doesn’t matter. I’m still responsible, and I still have to act.
It’s not that hard.

Jack     NYC
In spite of the many valid criticisms below, I think this essay raises issues that should be considered seriously. Even here in the NY Times comment section, I have written non-offensive observations of cultural groups which are not published. These groups seem to have blanket protection from any form of criticism. This type of oppression, in the name of diversity or political correctness or whatever you want to call it, is preventing serious people from making observations about what is wrong with our society.
This bias against saying anything that might be considered offensive on the surface is strangling our ability to speak freely. I’m tired of having to try to figure out what’s going on by looking at the subtext beneath writing that is full of polite euphemisms.

David Underwood    Citrus Heights
Oh my, it appears to me that there is a need to study logic, rhetoric, and how to research history.
Although I went back to college in the 1970s, I do not recall any professors making such blatant contradictory claims.
“we entered the world when truth-claims and qualitative judgments were already on trial and seemingly interchangeable.” How would one know what truth meant in such a case? Unless you can define it, you can not make any statements about it.
Or this one:
“Good American kids are learning the hard way that facts are made up, that there is no such thing as natural, unmediated, unbiased access to truth, that we are always prisoners of language, that we always speak from a particular standpoint, and so on.”
Is that a fact I ask?
“We anxiously avoid casting moral judgment. Because with absolute truths elusive, what claims do we have to insist that our moral positions are better than those of someone from a different nation or culture?’
Is this a truth? If it is not, then it is a contradiction in terms.
The whole article is full of claims that contradict each other. There are no truths, is that a truth?
You can peruse the article and find several instances of using a concept to refute itself, which, as I recall, is known as fallacious or circular reasoning.
Since I was a bit older than most of my classmates, I had no compunction challenging such statements. The question, is that a fact, usually made an impression.

David Wiles     Northfield, MN
Has the author considered that the group that consists of “(t)hose of us born after the mid-1980s whose upbringing included a liberal arts education” excludes the overwhelming majority of people born after the mid-1980s?
If he has considered this, why has he gone on to generalize about how “his” generation thinks?

SDK      Boston, MA
I sincerely doubt that anyone majoring in Math, Business, Engineering or Biology can sympathize with or even understand this article. With my undergraduate degree in cultural studies and 10 years in academia, I understood it very well.
I agree with the writer that there is an issue — but it is limited to the humanities and particularly that discipline at elite schools. In my degree (’93), I learned how to deconstruct — and that was it. Pointing out how power imbalances are re-inscribed in everything that seems good at first glance is the only skill one needs. That, and the ability to hyphenate most of your nouns and adjectives.
That’s a pity, because the work of tearing down is much easier than the work of building up, taking a position, and taking action. This is not taught in the humanities but it can be learned elsewhere.
My advice to Mr. Fine is to take a few business classes, some sociology, and some chemistry. Spend some time working in your community. In these places, you can start to use your critical eye in more creative and productive ways.

May 15, 2014 Diversity


Sotomayor’s Race Dissent

The most complete explanation of Barack Obama’s and Eric Holder’s reasoning on race.
By Daniel Henninger
April 30, 2014 The Wall Street Journal

Attorney General Eric Holder, in a speech to Justice Department employees, praised Justice Sonia Sotomayor’s dissent in last week’s Supreme Court decision upholding Michigan’s ban on race-based admissions to its state universities. He called it “courageous and very personal.”

It was personal. Toward the end of her 58-page dissent, she said this about the six Justices who formed the plurality:

“More fundamentally,” Justice Sotomayor wrote, the plurality “ignores the importance of diversity in institutions of higher education and reveals how little my colleagues understand about the reality of race in America.” Those colleagues are Chief Justice Roberts and Justices Kennedy, Alito, Scalia, Breyer and Thomas.

Justice Sotomayor’s dissent in Schuette v. BAMN provides the most complete explanation I’ve seen of the reasoning behind the views on race of President Obama and Attorney General Holder. Over five years, the administration has repeatedly challenged various states over their voting practices and has intervened to alter the racial composition of public-school populations and racial patterns in housing. Disagreement between Democrats and Republicans over voter ID laws has been particularly contentious.

Some of this is politics. But some of it is belief about the status of race in America a half century after passage of landmark civil-rights legislation in 1964.

“Race matters,” Justice Sotomayor wrote. It matters “because of persistent racial inequality that cannot be ignored and that has produced stark socioeconomic disparities.”

In 2006, Michigan voters by 58% approved a constitutional amendment that forbids the use of race-based preferences for admissions to the state’s universities. Eight other states have similar bans, including California.

Michigan’s ban on race-based admissions, says Justice Sotomayor, is not the result of “invidious intent” to discriminate as in the past. Instead the Michigan “majority” resorted to something that she calls “the last chapter of discrimination.” Its admissions amendment unfairly “changed the rules” of the political process. Prior to the amendment, she says, minorities persuaded Michigan’s elected Board of Regents to use “race-sensitive” university admissions policies. The voters’ ban eliminated the Regents’ policy and therefore “burdened racial minorities.”

Some, including the Court’s majority, would say the amendment was a proper exercise of the democratic political process. Justice Sotomayor replies: “While our Constitution does not guarantee minority groups victory in the political process . . . [i]t guarantees the majority may not win by stacking the political process against minority groups permanently, forcing the minority alone to surmount unique obstacles in pursuit of its goals.” In Michigan, that goal was the value of “racial diversity” in the student body.

Equal protection, she adds, is about groups, not mere individuals: “Discrimination against an individual occurs because of that individual’s membership in a particular group.”

And so believes the Obama administration.

Last year, the Justice Department sued to stop Louisiana’s school voucher program, arguing that when black parents took their kids out of public schools to attend, say, a Catholic school, this increased “the racial identifiability” of the schools. That is, the abandoned public schools had too many white students and so were no longer diverse and had become unequal.

This presumably would also be the rationale for the Justice Department’s interventions against voter ID laws, most famously its lawsuit last year against North Carolina. By requiring an ID, the majority is “changing the rules” in a way that disadvantages black voters. In Justice Sotomayor’s words: “This means vigilantly policing the political process to ensure that the majority does not use other methods to prevent minority groups from partaking in that process on equal footing.” These are what she calls “third-generation barriers.”

In the last line—a footnote—of his concurring opinion, Justice Scalia (joined by Justice Thomas) says that Justice Sotomayor is likening the “majority” in Michigan to the same “majority” who created the Jim Crow laws. She denies that. So what is an average voter supposed to believe?

The Sotomayor dissent in Schuette, as its supporters say, is an important statement of progressive belief about race. Let’s assume they, Justice Sotomayor, President Obama and Mr. Holder wish most Americans would agree with their point of view on race and so support it. If only all could read the Sotomayor dissent to render a national opinion about their racial views.

We can guess. I think it’s fair to say that many who read her reasoning on how Michigan’s voters or other “majorities” are using the political process to harm minorities and produce inequality in every aspect of American life would say: I just don’t get the argument. They might, for instance, ask her about the four-decade catastrophe of urban public schools.

The intricate case she is making about “third-generation barriers” to equality and such—arguments developed by liberal law professors the past 25 years—is not persuasive. I doubt an open-minded majority would agree with it. It could, of course, be imposed anyway by court mandate.

One is left to conclude from the Sotomayor dissent that no matter how much progress people think has been made toward fulfilling the mandate of the 14th Amendment, an argument of some sort will be fashioned to say that equality is forever disappearing toward the horizon, and unattainable. After 50 years, where does that leave us? Polarized.

Letters May 7, 2014
Racial Diversity Seems to Have Become an End in Itself
I’ve thought of racial diversity initiatives as a means to an end, but Justice Sotomayor seems to view racial diversity as the desired end of public policy.

Regarding Daniel Henninger’s “Sotomayor’s Race Dissent” (Wonder Land, May 1): Justice Sonia Sotomayor’s dissent in Schuette v. BAMN brings into better focus the Eric Holder-President Obama understanding of race and race relations in America. I’ve thought of racial diversity initiatives as a means to an end, a way to break down the separation that resulted from de jure and de facto segregation, and enable people to experience one another as fellow human beings. In that light I think it has been useful and has enjoyed a degree of success. But Justice Sotomayor’s opinion in the Michigan case and the Justice Department’s position in the Louisiana school-voucher case and voter-ID disputes seem to view racial diversity as the desired end of public policy. Anything that has the potential of reducing racial diversity, even in the smallest way, is suspect.
This is bizarre. Doesn’t such a policy assume that there are inherent differences in people based on race? Doesn’t it assume that a majority race cannot be expected to treat a minority race fairly on its own? Doesn’t it assume racial conflict as a permanent human condition? Aren’t these points of view, in fact, racist? We’ll never free ourselves completely from conflict, but we have made significant and important progress in accepting one another as equals. Attempting to maintain diversity in appearance through fiat doesn’t help. The Court’s decision allows the social process to play itself out with due regard to the facts and circumstances of time and place.
John D. Hatch
Tarpon Springs, Fla.

That racial disparity is still a fact in America is undeniable. The Supreme Court’s ruling reminds us of our division on the question of how much government should do to try to rectify it. Let’s say we land on one side or the other of that divide based purely on whether we believe government action can be effective. Those among us who believe it cannot be are boosted by the Court’s decision. But there’s a bigger question for all of us here. If down the road we find that racial disparity continues to persist with or without government action, is that OK?
Michael Young
Port Hueneme, Calif.

Supreme Court Justice Sonia Sotomayor’s dissenting opinion in Schuette v. BAMN stands in stark opposition to the revolutionary idea that created the U.S. Constitution as a restraint upon the government. Justice Sotomayor envisions the Constitution as a restraint upon the American people.
Morgan Foster

If, as Justice Sotomayor insists, equal protection of “individuals” must be understood in the context of “membership in a particular group,” when can her version of racism be deemed defeated except when a plurality of oppressed “groups” achieves victory over the apparently monolithic “group” that does the oppressing? As the demographic trends of the so-called “minority-majority” accelerate America’s coalescence into a country with no racial majority, will the oppressing “group” still be deemed too powerful by progressive elites and their allies in federal government?

By emphasizing individual experience viewed in light of race, the Obama-Holder-Sotomayor race camp may effect a future outcome that is the opposite of what they claim to intend. Whether the Balkans in the 1990s, Czechoslovakia in the late 1930s or Ukraine in the present day, history provides stark examples of individuals acting in the name of “groups.”
Kurt Hofer
Altadena, Calif.

A black attorney speaking to my high-school law class more than 30 years ago said that “when minorities start winning the game, those in power change the rules.” That is the constitutional wrong that Justice Sotomayor addressed in her dissent and that Daniel Henninger fails to adequately address.
But it was Justice Sotomayor who got it right. Her passionate, perceptive and well-reasoned dissent reminds us that what that attorney said to my class so long ago is still a reality, but one not permitted by our Constitution.
Rick Nagel
Mercer Island, Wash.

Sotomayor has been mis-educated – thanks to “progressives” in academia and “living constitution” judges – and has herself come to accept an erroneous understanding of the “equal protection” clause.
It has NOTHING to do with any constitutional mandate to use federal authority to “make us equal”.
Example: A man applies for a liquor license, having observed he meets the qualifications for one and has fulfilled all the requirements of applying for one. His license is granted. A second man follows the same path and his license is denied. He sees that nothing other than his “race or national origin” distinguishes him from the man whose license application was approved. The “equal protection” clause can be the second man’s basis for a suit – the law was not applied equally, due only to his race or national origin. With some technical caveats, that very simple example represents the form of circumstances that the equal protection clause was written to help prevent – a law as written not being applied as written, on equal terms, no matter someone’s race or national origin. That’s all.
The Michigan state constitutional amendment fully respects the equal protection clause, while Sotomayor actually seeks to deny “equal protection” of the laws and have the law discriminate on the basis of race or national origin. Hers is not a belief in or respect for the Constitution but belief in and respect for a political agenda NOT supported by the equal protection clause.

Charles Frederick Wrote:
Sotomayor describes herself, and these are her own words, as “a wise Latina woman with the richness of her experiences (who) would more often than not reach a better conclusion than a white male who hasn’t lived that life.”
What hubris, and what a sorry explanation to rationalize her radical leftist ideology.

Jonathan Murray Replied:
Diversity is a false front for affirmative action.

Tom Painter
The “diversity” meme is inherently racist. It assumes you MUST think differently because of your race. From that erroneous meme Liberals have come to loathe “minority” Conservatives who have broken out of the Liberals’ intellectual plantation and don’t fit the mold they have tried to CREATE for “minorities”.

Richard Davidson Replied:
I can tell you only this. When I started at Motorola in 1967, located in Chicago back then, minorities in the professional jobs were almost non-existent. When a friend gave me a tour a few years ago at their Schaumburg, IL site, the technical and management people were from every conceivable race, creed, nationality, and color. It was like a tour of the UN.
Something happened in those 40 years. Diversity is the corporate mantra. You explain it.

The alleged benefits of diversity that you assert are unproven. There is evidence that diversity of thought is productive in places like businesses, but there is no evidence that diversity based on skin color confers any benefit to anybody except for affirmative action candidates and the industry that feeds off of them.

What is the evidence that diversity benefits minorities?

Michael Love Wrote:
Unfortunately, the hour is late and I have miles to go before I sleep, but please ponder the entire context of the racial progress you allege. In 1960, a dark-skinned American driving most anywhere in the USA would have a significantly higher likelihood of being pulled over by the police. That hasn’t changed today. What has changed over the years is the rate of incarceration in our nation, and especially of minorities. From 1980 to 2008 the rate of incarceration quadrupled in the United States, from roughly 500,000 people to 2.3 million people. African Americans constitute nearly 1 million of this 2.3 million population. African Americans are incarcerated at a rate six times that of Whites.
Examine the following from NACDL: Together, African Americans and Hispanics comprised 58% of all prisoners in 2008, even though African Americans and Hispanics make up approximately one quarter of the US population. According to Unlocking America, if African Americans and Hispanics were incarcerated at the same rates as whites, today’s prison and jail populations would decline by approximately 50%. One in six black men had been incarcerated as of 2001. If current trends continue, one in three black males born today can expect to spend time in prison during his lifetime. Five times as many Whites are using drugs as African Americans, yet African Americans are sent to prison for drug offenses at ten times the rate of Whites. It’s widely reported that 35% of black children in grades 7-12 have been suspended or expelled at some point in their school careers, compared to 20% of Hispanics and 15% of Whites.
So what? Well, that’s the actual progress in establishing racial fairness that has been made in America during the last several decades. The majority that Justice Sotomayor calls out is insulated or detached, and apparently unable to see the broad picture in terms of our history, or past their living room walls and further than their television sets.
Those six justices and the author of the WSJ opinion piece also fail to recall that protecting minorities from oppression by oppressive majorities is part of the fabric of our founding ideals. Racism is alive and well. Bigots are not just occasional cranks – it’s not just the moocher rancher in Nevada or the twisted NBA franchise owner in Los Angeles. Racial bias still permeates everyday America, and when you look at the criminal justice system it’s like mounting a slice of America on a slide to put under a microscope. Look objectively. Shuck your defensiveness and ideological agendas, the fashionable labels of liberal and conservative; once unencumbered, you too will see the effects of racism in action as played out in our schools and our courts. The biggest change in racism from 1960 to today is that we give equality lip service and racial bias has been camouflaged. The majority via Schuette v. Coalition to Defend Affirmative Action simply yields to ochlocracy because in their myopia they see racism as a problem of the past, now solved. What’s next? Perhaps allowing other states to reestablish segregated schools by ballot initiative? It’s a slippery slope when the Court abandons principle.
On ochlocracy, see John Adams, A Defence of the Constitutions of Government of the United States of America, Vol. 3 (London: 1788), p. 291.

You have no clue. The incarceration rate among blacks is because they COMMIT CRIMES! Have you ever been in the hood … at night? In Chicago, there are 20-30 shootings PER NIGHT on the weekends and 5-15 during the week. And that’s only the ones that hit someone. These black youth that you believe have been “railroaded” into jail have rap sheets a mile long – literally, 4 to 8 PAGES of crimes that they have been convicted of before they are 21. I have 2 good friends who work in the system: a DA and a PO. I get to hear these stories all the time. How about the one with the liberal, lenient judge who lets off the 18-year-old defendant with only probation even in the face of a long rap sheet, and he gets rearrested within an hour because HE BROKE INTO THE JUDGE’S CAR TO STEAL HIS LAPTOP AND RADIO!
And I live in a retirement city, not an urban center.

Stephen Carroll Wrote:
Sotomayor should step down. She has shown a level of bigotry that can no longer be tolerated in our country. We must end the hate in our country that this woman of the past has exposed in her thoughts. Obama is right: if you want to see just how ignorant a person is, just let them talk. Racism at any level is wrong, and coming from the uneducated it is regrettable. Coming from the educated it is inexcusable. Her sin is worse because she knows better. A step backward not only for whites but for blacks. All men are created equal. You cannot punish the sons and daughters for the sins of their parents. Obama, Holder and now Sotomayor have added mightily to racism. Are there any liberals of good heart, or are you all blinded by your hate?

Douglas Oglesby Wrote:
The Wise Latina was a leader in the National Council of La Raza. Ginsburg, the other dissenter, was general counsel of the ACLU. Ideology and, at least in Sotomayor’s case, outright racism trump objective legal analysis every time.
“More fundamentally,” Justice Sotomayor wrote, the plurality “ignores the importance of diversity in institutions of higher education and reveals how little my colleagues understand about the reality of race in America.”
Is there any evidence that diversity as practiced by elite schools such as the University of Michigan, i.e. admitting minority students with lesser academic qualifications in the interest of “diversity,” actually benefits minorities who would not have gained admission on their merits? Is there any evidence that such students would not have been better served (e.g., higher graduation rate, success in STEM programs) in different universities that would have accepted them without racial preferences? Won’t the accomplishments of minorities admitted to elite universities on their merits be diminished because such universities also admit minorities with lesser qualifications?
How do Sotomayor and Ginsburg explain the academic success of Asian students, recalling that Asians were the victims of rampant racism (e.g., Chinese laborers in California goldmines and railroad construction in the mid-to-late 1800s, internment of Japanese in WWII)? Prior to the passage of California’s Prop 209, the number of Asians admitted to the UC system was cut back from what it would have been under a race-neutral admission policy, because their admission rate based on merit would have been vastly disproportionate to their representation in the overall California population. Even today, whites are third in representation in the UC schools (36% Asian, 29% Hispanic, 27% white). Would the dissenters agree that whites should be entitled to preferential admission?

Steve Haynes Wrote:
We are fortunate that in this case the Supreme Court made the right decision; however, we all know it seems to be hit or miss.
The bigger point is that our court is being used to make moral decisions and impose a social direction on our country. You can see how upset Sotomayor was at losing this opportunity. These people were not elected, yet our country waits with bated breath for every decision they make. The framers never intended the courts to be used in this manner.
Obama and the Democrats are continuing to pack the courts so this avenue will no longer be an outlet of justice. Soon the Scalias and Thomases – the last bastions of our Constitution – will be gone. The lower courts are already filled with Sotomayors and Kagans; they legislate through emotion and agenda, and there is no rule of law anymore. This is why Harry Reid went nuclear – no more objection to judges.
Our country is in dire need of citizens who are involved and active. Federal involvement is not enough; acting locally is critical – the leaders of tomorrow come from smaller government in state and city settings.
We are in a dangerous place, but through the citizenry things can change if we work together.

John Kelly Wrote:
Explaining Sotomayor’s opinion, Henninger said, “The voters’ ban eliminated the Regents’ policy and therefore [quoting Sotomayor] ‘burdened racial minorities.’ ” Any elimination of governmental privilege, whether that privilege is earned or unearned, “burdens” the formerly privileged. As Justice Sotomayor uses this argument, she sides with former slave owners, who felt themselves unfairly burdened when the slaves were freed.

Scott Horsburgh Wrote:
Here’s a radical idea. Instead of “affirmative action” based on race, why not give an additional boost to students who overcome adversity (poverty) and achieve anyway? Isn’t that what we should be trying to accomplish: giving the benefit of the doubt to a kid who just missed the cut at an elite university but didn’t have the advantage of living in an upper-middle or high income household? Is a poor white kid less deserving of a hand up than a minority kid from a high-income family?

Anthony Brunsvold Replied:
Why is diversity merely skin deep?
Wouldn’t many of these same institutions benefit if they reached out to poor rural whites, for example? Based on my experience, most of the people at institutions of higher learning have not had much contact with such students nor come from that background. Such whites could bring insights and experiences that are rather new to many on the college campus. Yet, for all the discussion of diversity, one never hears about this kind of outreach.
Oh, based on my experience my guess is the answer (and it is a guess, since I don’t know you personally) is that the Paula Dowlings of the world don’t mind their sons and daughters hanging out with a black liberal, a Hispanic liberal… but to hang out with (much less marry) a hick would be too much.
Liberal bigotry tends to just take on different forms, but make no mistake, it still exists.

Richard Davidson Replied:
It benefits society when fairness in hiring, housing, education, and everything else is the norm.
During the enactment of the Civil Rights Act, people said you could not legislate morality. We can, and we did. That is what George F. Will, the conservative commentator, said on an ABC This Week roundtable.

April 23, 2014 Science Delusion

 The Science Delusion

A conversation with Curtis White

Curtis White pulls no punches. To readers who see in Buddhism little room for spirited debate, White’s unapologetic bluntness may seem unexpected or even jarring. But for White—Distinguished Professor Emeritus of English at Illinois State University, novelist, and author of several works of criticism including the 2003 international bestseller The Middle Mind: Why Americans Don’t Think for Themselves—there is too much at stake in our current intellectual climate to indulge in timid discussion.

White’s latest book, The Science Delusion: Asking the Big Questions in a Culture of Easy Answers, strikes out at a nimble opponent, one frequently sighted yet so elusive it often seems to dodge just out of view: scientism. White identifies scientism as an unwarranted triumphalism based on unproven premises—such as the claim that science has got the world nailed down (or soon will, anyway), that the answer to all of our human problems lies in the discovery of natural laws, or that submitting to a scientific perspective is a choiceless imperative dictated by impersonal facts. To White, this attitude is not only wrongheaded, it is dangerous and wreaks social, cultural, and political damage.

The Science Delusion takes dual aim: at scientists and critics who proclaim themselves “enemies of religion” and at certain neuroscientists and thought leaders in the popular press whose neuro-enthusiasm, White thinks, is adding spin to the facts. What these science advocates share, he says, is both an ideology promoting the scientific worldview as the single valid understanding of human phenomena and also a set of assumptions, “many of which,” he writes, “are dubious if not outright deluded.” But for White, the debate over knowledge claims is a side skirmish. There is a more urgent battle to fight that becomes evident when he asks, “In whose interest do these science popularizers and provocateurs write? And to what end?”

White writes at a moment when the arts and humanities are struggling for survival on campuses across America as they are increasingly eclipsed by the “STEM” disciplines (science, technology, engineering, and math). In White’s view, what we are witnessing is a takeover, on the part of science, of the multiple narratives of what it means to be human—narratives that have flourished throughout Western history in religion, art, literature, and philosophy. Scientism comes with its own narrative, which White puts like this: “We are not ‘free’; we are chemical expressions of our DNA and our neurons. We cannot will anything, because our brains do our acting for us. We are like computers or systems, and so is nature.” When this is what we think we are, we become quiescent cogs readily manipulated by societal forces. In White’s view, once scientism rewrites our story so that the things human beings care about—like love, wonder, presence, or play—are reduced to atoms, genes, or neurons, human lives become easy prey to corporate and political interests. We become “mere functions within systems.” White wants us to wake up and recognize that this view is not scientific discovery, it is ideology. Mistaking one for the other has profound consequences, “not just for knowledge but even more importantly for how we live.”

Western Buddhists, engaged as we are in adapting an Asian religious tradition, generally agree that it is valuable to try to understand how Buddhism has been shaped by its host cultures in Asia. But shining that light of understanding on ourselves is a much more difficult proposition. It is hard to see what presumptions we bring to the project precisely because they are our own and not someone else’s. In striking hard at some of our most deeply ingrained assumptions, White brings them to our attention. Whether or not we agree with his critique isn’t the point. White isn’t looking for agreement. He wants to challenge our complacency, and in so doing, to shift the very framework within which we determine our agreements and disagreements.

–Linda Heuman, Contributing Editor

Your latest book is entitled The Science Delusion, which is clearly a response to the title of Richard Dawkins’s book The God Delusion. What is the science delusion, and what are its implications for living a spiritually meaningful life?

There is no singular science delusion. One of the biggest challenges in writing a book that tries to question the role that science plays in our culture is being visible at all. So the title is a provocation, although an earnest one.

What I criticize is science as ideology, or scientism, for short. The problem with scientism is that it attempts to reduce every human matter to its own terms. So artistic creativity is merely a function of neurons and chemicals, religion is the result of the God gene, and faith is hardwired into our genetic makeup.

Not surprisingly, “spirit” is a forbidden word. Science writers tend to reduce believers to fundamentalists and the history of religion to a series of criminal anecdotes. Richard Dawkins is, and Christopher Hitchens was, particularly culpable in this regard. Any subtle consideration of the meaning of spirit is left out. But of course the history of religious thought is quite subtle, as anyone familiar with Buddhist philosophy knows well. Another good example is the legacy of Christian existential thinkers beginning with Kierkegaard: it seems to me shamefully dishonest not to acknowledge such work.

Both scientism and religious fundamentalism answer the human need for certainty in a rapidly shifting and disorientingly pluralistic world. To what extent are they in the same business?

As your question suggests, the drama of the confrontation between religious fundamentalism and scientism is a confrontation between things that are more alike than they know. Both fundamentalism and scientism try to limit and close down, not open up. Science tends to be vulnerable to the “closed-in” syndrome. Scientists value curiosity, and they value open-mindedness, but they are often insensible to alternative ways of thinking about the world. It’s really difficult for them to get outside of their own worldview. This problem is probably created by the way in which we educate scientists. It seems to me scientists need to have a better background in history and the history of ideas, especially if prominent figures like Stephen Hawking are going to pass judgment on that history and say things like “Philosophy is dead.”

There is a common assumption that science is not a worldview but simply “the way things are.” Along with that assumption goes another: that science derives its authority from its privileged access to how things are—that it launches off from the bedrock of the Real. The odd thing here is that science itself tells us that it does not have a privileged access to things as they are, and that the philosophical paradoxes in its discoveries, especially in physics, are an open acknowledgment of its many uncertainties.

What we have now is this very uncomfortable joining of an ideological assumption that science is fact-based with the actual work of science, something that is highly speculative and whose reality is often only mathematical. For example, physics is deeply dependent on mathematical modeling, but no one knows why mathematics seems to be so revealing about reality. As the physicists Tony Rothman and George Sudarshan point out in Doubt and Certainty, the math equation of the Black-Scholes model used by stock traders is identical to the equation that shows how a particle moves through a liquid or gas. But, as they observe laconically, in the real world there is a difference between stocks and particle movement.

Even equations as familiar as Newton’s are mathematical idealizations and, as Einstein showed, they are inadequate in important ways. And if Newtonian predictions about the movements of things as large as astral bodies are idealizations, what can be said about quanta or strings or the branes strings are said to attach to? These things are only numbers. They have no empirical presence at all.

Most Buddhists would have little argument with the statement in The Science Delusion that “the world is something we both find and invent.” How is this understanding at odds with scientism? Even now, after Heisenberg, after quantum physics, so much of the discourse of science in its public proclamations is focused on the establishment of knowledge as fact. This overlooks the paradoxical nature of scientific confirmation. Does confirmation mean positive knowledge of reality? Does it mean probability? Does it mean that something is useful? Newton’s equations have never stopped being useful, even though they have been superseded by general relativity.

Scientism is intolerant of the idea that the universe depends for its being on the participation of mind. Immanuel Kant’s Copernican Revolution was about this single fact: we have no simple access to the thing in itself. Any knowledge we have of reality is necessarily mediated by our own symbolic structures, whether they be math, philosophy, religion, or art. Even the theoretical physicist John Archibald Wheeler could say with conviction, “The universe does not exist ‘out there,’ independent of us. We are inescapably involved in bringing about that which appears to be happening.” Yet what we most often hear from scientism is “We scientists deal in knowledge of truth, and philosophers, artists, and religious believers don’t.” End of conversation.

Many assume that logic and reason lead away from religion. How can the systematic study of literature and art affirm religion? Our culture widely assumes that all reason is empirical reason: a logical development proceeding from an empirical fact. Similarly, we tend to assume that spirit concerns things that are supernatural. But this is not the only way to understand reason or spirit. The essence of the spiritual logic of Buddhism is contained in the four noble truths. There is suffering. Most of this suffering comes from self-interested desire enabled by delusion. This suffering can be stopped. The eightfold path shows how suffering can cease. This is not an appeal to the supernatural, but it is most certainly an appeal to spirit.

The ultimate religious question, the ultimate religious mystery, is not whether or not there is a God. I call myself an atheist because I think that question is silly, childish, and beside the point. The ultimate religious question is “What is compassion?” Or as Christianity puts it, “What is love?” Compassion is not a quality that can be demonstrated empirically. It is not a thing. It is something that we use flexibly. It speaks to a quality that we keep very close to us: the urgency of kindness. Compassion exists only to the extent that we invest it with the energy of our own lives—“altruism gene” be damned.

This sort of “theo-logic” also exists in the West. If there is a God principle in existential Christianity, it is in its confidence in the ultimacy of compassion. The Protestant theologian Paul Tillich argued that God is the object of our “ultimate concern.” When we are claimed by those concerns, we open ourselves to our true nature.

And art since Romanticism participates in a similar logic. Of course, the common assumption is that art is just imagination or entertainment or a waste of time. My point is that art thinks, and the history of art for the last two centuries shows that art thinks in very particular ways. Art has its own spiritual logic. It asks: How are we to transcend what Friedrich Schiller calls “the misery of culture,” meaning industrial culture in which man is “nothing but a fragment”? For Schiller and the Romantics, the multifold path of art is the way to accomplish the transcendence of this suffering. As Pablo Picasso wrote, “Painting is not made to decorate apartments. It is a weapon of offensive and defensive war against the enemy.” As Picasso’s Guernica or Goya’s The Third of May 1808 show, the “enemy” is cruelty.

Now, in any of these contexts, this is a perverse logic. If you had to judge the situation empirically, I don’t see how you could fail to conclude that the “preponderance of evidence,” as lawyers like to say, points to the idea that, as O’Brien says in Orwell’s 1984, the future is “a boot stamping on a human face—forever.” But Buddhism comes to the opposite conclusion. Our suffering is proof not of who we are—violent because of “human nature”—but of the fact that we are deluded, that we don’t know ourselves, and that if we are to end suffering we must, as Nietzsche says, become who we really are. It is the perversity of this logic that makes it spiritual because it is in no way supported by the facts on the ground. It’s like the story of the Jew who tells his Christian neighbor that he is going to Rome to see what Christianity is really like. The neighbor, of course, fears that once the man sees all of the corruption there he will not convert. But when his neighbor returns, he says, “Ah, my friend, yours is truly the greatest faith, otherwise it could not survive such cruelty and hypocrisy.”

The crucial thing to see in this process of thought is that it is a form of spiritual reason based in realism: our experience of how it is with the human world. True, it is not empirical reason driven by a notionally objective world, but neither are its conclusions dependent on supernaturalism or magical thinking. The idea that all human reason must be empirical is a story that is told to us by our masters.

When critics speak of scientism as an ideology, many seem to be thinking of an ideology as a set of beliefs—like propositions you hold in your head. Your book gave me a sense that ideology, in particular scientism, is much more deeply rooted than that. I use the word ideology in the sense that Marx used it: the stories and ideas that we live out as members of a particular culture. Needless to say, there is a neutral sense in which every culture must have ideologies. The pejorative sense of the term comes from the idea that structures of power and privilege can and do manipulate and enforce these stories in order to support their own interests. The stories stop being concerned with the question “what is the best way for us to live together?” and start being about “what stories best support our own interests?” Telling stories that you want everyone to see themselves in, but that really favor only one group, requires dishonesty. So what I am concerned with is identifying those dishonest or false elements within the ideology delivered to us by science and its patrons.

Of course, the primary ideological story told by science is that it has no relation to ideology. But that’s what every ideology says. It says, “We are only concerned with the way things really are.” And so the science of economics tells us that self-interest is rational, that it is the essence of freedom, and that it may even be a part of our genetic makeup. These become the covering fictions for stupendous destruction and cruelty. As Buddhism argues, these ideas are not skillful. They are delusions, and they do great harm.

Neuroscience’s claim to be able to understand meditation in terms of the mechanics of neurons and chemicals is another example of ideological storytelling. You can have Buddhism, this story goes, as long as you are willing to acknowledge that it can be best understood through neuroscience. Buddhism is dangerous if it can’t be made to confirm our culture’s empiricist assumptions. If Buddhism refuses to confirm those assumptions, it is a counterculture and therefore a threat to the stability of the status quo. My feeling is that if we in the West are fated to misperceive Asian Buddhism, let it be a creative misperception in the spirit of Buddhism, and not merely the repetition of a familiar and oppressive ideology.

You’ve written that we don’t only have technology, we also have technocracy—which is run by corporatists, militarists, and self-serving politicians. You see a moral urgency to this situation, where many, including many Buddhists, are much more sanguine. It is a mistake to think that we just happen to have these toys and gadgets around without trying to understand what their relationship is to the larger culture. One of the first books that spoke to me powerfully as political theory was Theodore Roszak’s The Making of a Counter Culture (1968). I reread it recently, and it still holds up very well. He wrote, “By technocracy, I mean that social form in which an industrial society reaches the peak of its organizational integration.” Theodor Adorno called it “administered society.” An administered society is one in which technological rationality and industrial organization have penetrated deeply into every aspect of how we live.

For example, by bringing personal computers into our homes, we also brought our workstations into our homes. And so, who knows how many hours a week you work? In a sense, many workers are never not at work, because now they carry their job in their pocket. Or consider service workers in the fast food industry. These workers are treated not as humans but as a part of a superefficient machine, and the skills required of them are crudely mechanical as well.

The more normalized all of this becomes, the more oppressive—and, needless to say, perversely successful—it is. The result is a culture that is “totalized.” Every aspect of the culture is made conformable to a certain technocratic and mechanistic ideal. That’s why I say that scientism is such an important part of state ideology. It is doing work for the boss.

How? Simply by normalizing the idea that everything is a machine, especially us. We are not likely to make a Thoreauvian or a Buddhist critique of technocracy if we have been convinced that we are computers. Thoreauvian critiques are disruptive and disobedient, and technocracy would prefer that we not think in that way. Ultimately, we are arguing about what it means to be human.

For the moment, the idea that we are neural computers is in ascendancy. Currently, from a very early age our children are given to understand that if they want a decent standard of living, they’re going to have to make their peace (ideally, an enthusiastic peace) with Science, Technology, Engineering, and Math, or STEM. Universities are now in the business of training people to go out into a world that is understood to be one vast mechanism, and this includes nature or, as we now say, “the ecosystem.” But that’s okay because we’re computers too. I can’t emphasize enough how oppressive this feels to many young people. As one reviewer of my book wrote, rather bitterly, “Anyone who doesn’t want to be a graphic designer, or a techie, or a slavish Apple devotee—no jobs for you!” And, I’ll add, no way to pay off your huge student loans.

Anyone who doubts the seriousness of this vision should read David Brooks’s December 2013 column for The New York Times “Thinking for the Future,” in which he predicts that the economy of the future will depend upon “mechanized intelligence.” Fifteen percent of the working population will make up a mandarin class of computer geeks and the “bottom 85%” will serve them as “greeters” or by doing things like running food trucks. And yet, Brooks claims, this vast class of servants will have “rich lives” that will be provided for them by the “free bounty of the Internet.”

In your own Thoreauvian article “The Spirit of Disobedience: An Invitation to Resistance,” you quoted Simone Weil: “The authentic and pure values—truth, beauty, and goodness—in the activity of a human being are the result of one and the same act, a certain application of the full attention to the object.” In light of this perspective, what are your thoughts about the introduction of meditation into education and industry, especially into the “creative industries” of Silicon Valley? Thoreau and Weil were writers coming out of the Romantic tradition. For me, the Romantic movement was an attempt to create a wisdom literature for the West. A good part of that wisdom had to do with returning us to the immediacy of the world. As a poetic technique this has come to be known as “defamiliarization.” What it attempts to do is to destroy the world of custom, habit, stereotype, and ideology so that we can see things for what they are, so that we can see and feel the “stone’s stoniness.” When Walt Whitman says that his poetry is about “leaves of grass,” he is essentially saying, We have not been attentive. We need to look again at this leaf of grass. He wrote, “Bring all the art and science of the world, and baffle and humble it with one spear of grass.”

Perhaps the saddest thing we can say about our culture is that it is a culture of distraction. “Attention deficit” is a cultural disorder, a debasement of spirit, before it is an ailment in our children to be treated with Ritalin.

As for Silicon Valley, it has a legitimate interest in the health of its workers, but it has little interest in Weil’s notion of “the authentic and pure values.” Its primary aim is to bring Buddhist meditation techniques (as neuroscience understands them) to the aid of corporate culture, such as in the Search Inside Yourself program developed at Google. This is from the Search Inside Yourself Leadership Institute website:

Developed at Google and based on the latest in neuroscience research, our programs offer attention and mindfulness training that build the core emotional intelligence skills needed for peak performance and effective leadership. We help professionals at all levels adapt, management teams evolve, and leaders optimize their impact and influence.

Mindfulness is enabling corporations to “optimize impact”? In this view of things, mindfulness can be extracted from a context of Buddhist meanings, values, and purposes. Meditation and mindfulness are not part of a whole way of life but only a spiritual technology, a mental app that is the same regardless of how it is used and what it is used for. It is as if we were trying to create a Buddhism based on the careful maintenance of a delusion, a science delusion. It reminds me of the Babylonian captivity in the Hebrew Bible, but now the question for Buddhists is whether or not we can exist in technological exile and still remain a “faithful remnant.”

Bringing Buddhist meditation techniques into industry accomplishes two things for industry. It does actually give companies like Google something useful for an employee’s well-being, but it also neutralizes a potentially disruptive adversary. Buddhism has its own orienting perspectives, attitudes, and values, as does American corporate culture. And not only are they very different from each other, they are also often fundamentally opposed to each other.

A benign way to think about this is that once people experience the benefits of mindfulness they will become interested in the dharma and develop a truer appreciation for Buddhism—and that would be fine. But the problem is that neither Buddhists nor employees are in control of how this will play out. Industry is in control. This is how ideology works. It takes something that has the capacity to be oppositional, like Buddhism, and it redefines it. And somewhere down the line, we forget that it ever had its own meaning.

It’s not that any one active ideology accomplishes all that needs to be done; rather, it is the constant repetition of certain themes and ideas that tend to construct a kind of “nature.” Ideology functions by saying “this is nature”—this is the way things are; this is the way the world is. So, Obama talks about STEM, scientists talk about the human computer, universities talk about “workforce preparation,” and industry talks about the benefits of the neuroscience of meditation, but it all becomes something that feels like a consistent world, and after a while we lose the ability to look at it skeptically. At that point we no longer bother to ask to be treated humanly. At that point we accept our fate as mere functions. Ideology’s job is to make people believe that their prison is a pleasure dome.

Linda Heuman is a Tricycle contributing editor. Tricycle, Spring 2014

April 9, 2014 Nationalism

The Case for Nationalism

By John O’Sullivan    March 21, 2014
(Trying to abolish or replace the nation-state is almost certain to produce more evils than it deters.)

Incessant “antifascist” propaganda from Moscow, baseless claims of attacks against Russians in Ukraine, incitement of Russian-speakers in eastern Ukraine, Russian troops without insignia seizing official buildings in Crimea, a stage-managed illegal plebiscite there and then its annexation by Russia, assurances from President Vladimir Putin that he has no further territorial designs in Europe (though, alas, he may be forced to intervene elsewhere to protect ethnic Russians)—yes, it all has the disturbing ring of the 1930s.

Isn’t this where nationalism leads—to fascism and war?

That is a common interpretation of Europe’s recent crises. It is also, coincidentally, Mr. Putin’s interpretation of events in Ukraine, which he blames on neo-fascist followers of the nationalist leader Stepan Bandera, who was murdered by the KGB in 1959. But this view is really too simple by half.

Nationalists are certainly implicated in the Ukraine crisis, but more as victims than perpetrators. The crisis began as an attempt by Moscow to rescue its stillborn concept of a Eurasian Economic Union by forcing Ukraine to join it and to reject associate membership in the European Union.

Mr. Putin, who isn’t a nationalist (see below) but the ruler of a shaky multinational empire hostile to nationalism, sparked off the crisis by closing Russia’s borders to Ukraine’s agricultural exports. He did so to compel a reluctant President Viktor Yanukovych to abandon the more popular EU option.

The Ukrainian government, encouraged by Mr. Putin, then unified the assorted democrats, nationalists and activists of the left and the right who protested this move, by firing indiscriminately on them. Mr. Yanukovych’s power crumbled almost visibly; he fled; and a new Ukrainian government that includes nationalists took over.

Nationalism was thus one impulse in this general movement. Others were love of freedom, desire for a more democratic system, economic hopes for greater prosperity through ties to Western Europe and simple human decency. The Ukrainians inspired by these aims have just sustained an (inevitable) defeat in Crimea, but they still govern most of Ukraine, which is now escaping from Moscow’s post-Soviet institutions. While that remains the case, Mr. Putin has suffered a reverse overall.

If Ukrainian nationalists have been reactive, even victimized, in this crisis, what about Mr. Putin himself? His actions have certainly been objectionable—ruthless, aggressive, deceitful, illegal, repressive, subversive. But to describe them as “nationalist” is to reduce the concept of nationalism to a politics of aggressive self-assertion. There is no reason to suppose that nations and nation-states are more prone to indulge in such folly than are federations, empires or states founded on nonnational principles.

Mr. Putin has indeed acted ruthlessly of late, but he has done so in the service of what he sees as clear state or even personal interests, not from a commitment to Russian peoplehood.

The history of the 1930s is instructive for making the necessary distinctions here. World War II began as the result of a conspiracy by Hitler and Stalin—the Nazi-Soviet Pact—to invade Poland and divide Eastern Europe and the Baltic states between them. Nazi Germany was a state built upon the ideology of racial nationalism (which places race above nationhood), the Soviet Union upon the ideology of proletarian internationalism (which rejects nationalism entirely). Both acted far more brutally and unrestrainedly than any conventional nation-state of the period.

Besides, today’s Russian Federation is itself not a nation-state but an empire. Mr. Putin’s conduct of the crisis, in addition to being aggressive, might best be described as imperialist or neo-imperialist, not nationalist. We should not illegitimately associate the nation-state with crimes that aren’t uniquely nationalist and may even be less likely to be committed by stable nation-states.

This matters because nationalism is an increasingly necessary word that is too often misused as a term of abuse. Nationalisms and nationalist movements are popping up all over Europe. These can take very different forms: left, right and ambivalent. Some are straightforward secessionist movements, like the nationalist parties in Scotland and Catalonia, striving to establish new states rooted in historic nations. Others are movements resisting further integration of their existing nation-states into the European Community, such as the True Finns party in Finland and the U.K. Independence Party in Britain.

Still others want to protect the nation and its distinctive political spirit (the National Front in France), or the welfare state (the Danish People’s Party in Denmark) or “liberal values” (the Party of Freedom led by Geert Wilders in Holland) that each feels is threatened by mass immigration. Even the mercifully cautious Germans have the Alternative for Germany party, which, though not avowedly nationalist, emits a distinctively postwar German anti-Euro economic nationalism—and should probably be renamed the Alliance of Patriotic Bankers.

Most of these parties, which didn’t exist 20 years ago, are now represented in Europe’s parliaments. They are expected to do well in May’s elections. They probably won’t win power or enter government, but they force mainstream parties to deal with such issues as the loss of national sovereignty.

In the eyes of Europe’s various political and cultural establishments—what the British call the Great and the Good—none of this should be happening. It is akin to water running uphill. For several decades now, we have heard from these precincts that the nation-state is on its way out, losing power upward to supranational institutions and downward to organized minority groups. Behind their hands, the critics of resurgent nationalism murmur that it is nothing but xenophobia, authoritarianism or even fascism, in folkloric drag. They see Europe’s rising nationalist parties as the preserve of bitter losers or those in the grip of nostalgia.

Herman Van Rompuy, the president of the European Council, expressed this view perfectly in 2010 when he announced for the umpteenth time that the nation-state was dead, adding: “The biggest enemy of Europe today is fear; fear leads to egoism, egoism leads to nationalism, and nationalism leads to war.”

This pronouncement didn’t foresee Mr. Putin’s recent actions. But it illustrates nicely how Europe’s political elites see events like the Ukraine crisis in the distorting mirror of anti-nationalism. This view persuades them to consider nationalism a threat, but a dying one. And it is, quite simply, wrong on both counts.

A practical refutation of this view lies in the fact that there are more nation-states in the world today than ever before. They have multiplied since 1945 in two great leaps forward: the decolonization period of the 1950s and 1960s, and the years following the dissolution of communism in 1989 and 1991. Some of these nations gained their independence, alas, by war and revolution—Zimbabwe, Croatia, Bosnia, Kosovo. Others did so by peaceful negotiation. Most former British colonies and Soviet republics took this route, but the most significant example of it is the “velvet divorce” that produced successful Czech and Slovak states.

This upsurge of nationhood might be dismissed as a detour on the high road to global governance if the establishment view of nationalism weren’t so absurdly crude. It elides vital distinctions and treats all forms of national loyalty as if they were the most aggressive and exclusivist type. In reality, the full spectrum of nationalist loyalties runs roughly as follows: from Nazism, which is totalitarian racial nationalism; to fascism, which is authoritarian and aggressive nationalism; to ethnic nationalism, which is exclusivist, treating ethnic minorities as second-class citizens (if that); to civic nationalism, which opens full citizenship to all born in the national territory in return for their loyalty to the nation and its institutions; and finally, to patriotism, which is that same national loyalty plus simple love of country—its scenery, its sights and sounds, its characteristic architecture, its songs and poems, its people, its wonderful familiarity.

Here, for instance, is George Orwell, perhaps the most famous critic of nationalism, upon returning to southern England from Spain: “Down here it was still the England I had known in my childhood: the railway-cuttings smothered in wildflowers, the deep meadows where the great shining horses browse and meditate, the slow-moving streams bordered by willows, the green bosoms of the elms, the larkspurs in the cottage gardens; and then the huge peaceful wilderness of outer London, the barges on the miry river, the familiar streets, the posters telling of cricket matches and Royal weddings, the men in bowler hats, the pigeons in Trafalgar Square, the red buses, the blue policemen—all sleeping the deep, deep sleep of England.”

England has changed since then, of course; men no longer wear bowler hats. But it would be as absurd to condemn such a tender patriotism as likely to lead to fascism as it would be to abstain from all interest in sex on the grounds that it might lead to promiscuity. Ordinary people, attached to reality as they must be to survive, feel exactly that sense of absurdity when they hear statements like Mr. Van Rompuy’s.

But that hasn’t hitherto affected their political behavior. Why have they suddenly begun thinking and voting in line with such sentiments?

One obvious reason is that all the ideological rivals to patriotism have been largely discredited. Orwell pointed out that those who abandoned patriotism generally adopted a more virulent ideological substitute. In our day, the most obvious rival ideologies are Europeanism in Europe and multiculturalism in the U.S., both of which seek to weaken national patriotism to change the political character of their societies.

[Photo: Scots who hope to break away from the U.K. rally in Edinburgh in September 2013, a year before a scheduled referendum on independence for Scotland. Agence France-Presse/Getty Images]

But neither of these creeds has yet become more than a niche loyalty, even though they enjoy lavish official support and the sympathy of those government officials, international bureaucrats, NGO executives, “denationalized” corporate managers and academics ambitious to be the vanguard of the new or transformed nation. Old-fashioned patriotism survives, perhaps weakened by such defections, but not seriously challenged. It remains in the shadows until tempted into the open by a 9/11, or an anniversary of D-Day or the funeral of a Margaret Thatcher. It is then suddenly recognized as the sentiment of most of the nation.

Until recently, those voters for whom patriotism and the national interest were determining issues found comfortable homes in parties of both the left and the right. But that has gradually ceased to be true.

As parties of the left swapped their working-class identity for that of middle-class liberalism, they began to think patriotism vulgar, cheap and xenophobic. At the same time, mainstream parties of the right drifted unthinkingly into a posture that treated nationalist and socially conservative voters as somewhat embarrassing elderly relatives whose views could be safely ignored. Party leaders reasoned that their atavistic voters had nowhere else to go.

The result can be seen most dramatically in Britain, where the U.K. Independence Party, having secured its base among traditional middle-class Tories, is now harvesting new votes from patriotic blue-collar Laborites. But one can see similar outcomes throughout Europe.

Another factor in this resurgence is a shift in intellectual fashion away from bigness. Fewer people in all classes are still confident that the future belongs to the big battalions. They have noticed that smaller states are likely to be richer, easier to manage and closer to the people than larger states. As the Economist magazine pointed out a few years ago: “Of the 10 countries with populations of over 100 [million], only the U.S. and Japan are prosperous.”

These economic facts remove an important obstacle to secession. And if there ever was a link between prosperity and bigness, it has been dissolved by free trade and globalization, which ensure that the size of a nation need no longer coincide with the size of the market open to it. At the same time, a government can shrink to the size that its citizens find most convenient to control.

The U.S. is the exception to these rules—it is both large and prosperous—because its federalism distributes power to states and localities, where it can be better controlled. Switzerland is another example. Europe might imitate America’s success if it were to model itself on Switzerland and distribute power downward. But the opposite is happening—in both Europe and America.

A final brief argument is perhaps the strongest: Nation-states are an almost necessary basis for democracy. A common language and culture, a common allegiance to national institutions, a common sense of destiny, all within a defined territory, with equal rights for all citizens—these seem to be the conditions that enable people with different opinions and interests to accept political defeat and the passage of laws to which they strongly object. There are a few exceptions to this rule—India, Switzerland—but many more confirmations of it.

None of these many considerations justifies supporting nationalism as a universal principle of statehood. There is no such principle. States rooted in ideas as different as popular consent and the dynastic principle have been handed down to us by history. Wholesale reconstruction of them is utopian and nearly always fails. The best we can hope for is to improve them by piecemeal reform along the grain of their history.

But trying to abolish or replace the nation-state is almost certain to produce more evils than it deters. The lesson of recent history is that nationalism is here to stay—and that secure, stable and satisfied nation-states are likely to want friendship with neighboring countries rather than their conquest. Wise political leaders anxious for peace will concentrate on shaping their people’s nationalism into an amiable patriotism rather than on submerging it in a new sovereignty and driving it toward its darker manifestations.

Mr. O’Sullivan is director of the Danube Institute in Budapest and a senior fellow of the National Review Institute in New York.
— —
Jason C. Taylor   Fayetteville, N.C.
John O’Sullivan’s positive case for nationalism rooted in patriotism flies in the face of conventional wisdom and a powerful trend toward global governance in the West, but it was wonderful to read his heretical challenge to internationalist orthodoxy (“The Case for Nationalism,” Review, March 22). Academics interested in transforming nations tend to gloss over the fact that aggressor countries in World War II—Nazi Germany, the Soviet Union and Imperial Japan—were the nations with the biggest, most powerful central governments. Their messianic missions, driven by racial purity or collectivist ideology, were possible only with authoritarian or totalitarian regimes unchecked by voters and democratic institutions.

As patriotism has become the domain of narrow-minded, right-wing extremists in the U.S. (according to leftist elites), we may forget that the overwhelming desire of most “extremist” patriots is to shrink our federal government and return power to state and local governments and individuals. There is a distinct (and potentially dangerous) isolationist streak in the conservative movement that wants a stronger national defense, but only for defense, not for intervention in world affairs. Yet the underlying logic of that streak is peace through strength.

If most of the world’s bloodiest confrontations in the past century are attributable to very powerful, non- or faux-democratic central governments, why in the world does it make sense to elevate and consolidate governance in global bodies that have little or no accountability to an electorate? The leftist goal to extinguish nation-state loyalties in favor of a commitment to an international ruling body of elites is mind-bogglingly naive. As Mr. O’Sullivan eloquently explains, nationalism is a unifying and positive sentiment in societies that cannot or do not let their government bureaucracies grow too big.

David Gallup
President and General Counsel, World Service Authority, Washington
The case for nationalism dependent on “secure, stable and satisfied nation-states” and “wise political leaders” is flawed. More than 70% of the world’s population lives in nations that aren’t stable or whose political leaders don’t respect human rights.

World law reaffirmed by the Universal Declaration of Human Rights provides a framework for global peace that is not dependent on allegiance to nation-states. Without a global legal framework, “amiable patriotism” cannot prevent nations’ “darker manifestations.”

Ronald J. Glossop   St. Louis
Mr. O’Sullivan seems not to understand that federations can be a way of preserving nations and nation-states while maintaining peace by subordinating them to a higher centralized authority. He seems to think that one must choose between having small, good nation-states and large, bad ones. He regards the U.S. and India as exceptional rather than as large federations that could act as guides for the future.

The continuing integration of the peoples of planet Earth will eventually lead to a democratic world federation that ends global anarchy while preserving nation-states and internal autonomous provinces as subordinate entities. It will produce world peace and a readiness to deal seriously with global problems, something thwarted now by unlimited national sovereignty. Loyalty to the nation and to the nation-state won’t be eliminated, but it will be subordinated to loyalty to humanity, “humatriotism.”

Basil Coukis:
As of right now, tail-chasing circumlocution in the article has inspired more than a hundred impassioned comments, many of them quoting Wikipedia and other high authorities. No need for scholarship. Human nature is not complicated.

The human condition is one of war of everyone against everyone else. Since the many always overwhelm the few, an individual is advised to seek out other individuals and form a temporary alliance with them. The alliance can then fight everyone who is not part of the alliance.

Alliances are formed by those who share the belief that if someone has to die, it is preferable that this be our neighbor. To ensure that we all understand this requires that we can talk among ourselves. Hence, the first condition for any effective alliance is a common language.

The second condition is to justify why our neighbors must die so that we live. Realizing that our alliance contains sensitive souls who abhor armed conflict, we resort to arguments attractive to sensitive souls. Which is to say, metaphysical ones. We loudly proclaim that we deplore violence, that the mere thought of war is repulsive to us, that we shall never wage war unless we are forced into it, that if we are forced into it we shall prevail because we are not fighting to oppress but to bring peace on earth and goodwill to all living creatures. In fact, we fight this war to end all war. More specifically, we fight this war to make the world safe for democracy. Obviously, the uses of metaphysics in the fabrication of alliances are considerable.

But one problem lurks. Size of the alliance. The bigger the alliance, the greater the temptation to enlarge it further. Soon enough, it may contain groups which do not speak the same language and do not subscribe to the beliefs and superstitions of the founding members. Past a certain size, all alliances become impotent and fall apart. In short, viable alliances must be ethnically, linguistically and doctrinally homogeneous.

NATO is not.  Russia is.

“The Gun Is Civilizing” by Maj. L. Caudill, USMC (Ret.)

“Human beings only have two ways to deal with one another: reason and force. If you want me to do something for you, you have a choice of either convincing me via argument or forcing me to do your bidding under threat of force. Every human interaction falls into one of those two categories, without exception. Reason or force, that’s it.

“In a truly desirable, moral and civilized society, people mainly interact through persuasion. Force has no place as a valid method of social interaction, and a thing that removes force from the menu is the personal firearm, as paradoxical as it may sound to some.

“When I carry a gun, you cannot deal with me by force. You have to use reason and try to persuade me, because I have a way to negate your threat or employment of force.

“The gun is the only personal weapon that puts a 100-pound woman on equal footing with a 220-pound mugger, a 75-year-old retiree on equal footing with a 19-year-old gang banger, and a single guy on equal footing with a carload of drunk guys with baseball bats. The gun removes the disparity in physical strength, size, or numbers between a potential attacker and a defender.

“There are plenty of people who consider the gun as the source of bad force equations. These are the people who think that we’d be more civilized if all guns were removed from society, because a firearm makes it easier for an [armed] mugger to do his job. That, of course, is only true if the mugger’s potential victims are mostly disarmed either by choice or by legislative fiat – it has no validity when most of a mugger’s potential marks are armed.

“People who argue for the banning of arms ask for automatic rule by the young, the strong, and the many, and that’s the exact opposite of a civilized society. A mugger, even an armed one, can only make a successful living in a society where the state has granted him a force monopoly.

“Then there’s the argument that the gun makes confrontations lethal that otherwise would only result in injury. This argument is fallacious in several ways. Without guns involved, confrontations are won by the physically superior party inflicting overwhelming injury on the loser.

“People who think that fists, bats, sticks, or stones don’t constitute lethal force, watch too much TV, where people take beatings and come out of it with a bloody lip at worst. The fact that the gun makes lethal force easier, works solely in favor of the weaker defender, not the stronger attacker. If both are armed, the field is level.

“The gun is the only weapon that’s as lethal in the hands of an octogenarian as it is in the hands of a weight lifter. It simply would not work as well as a force equalizer if it wasn’t both lethal and easily employable. Our social structure can help by finding effective ways to prevent ownership or ownership privileges (concealment, etc.) by felons (untrustworthy) or mentally disturbed (incapable) individuals to the best of its ability.

“When I carry a gun, I don’t do so because I am looking for a fight, but because I’m looking to be left alone. The gun at my side means that I cannot be forced, only persuaded. I don’t carry it because I’m afraid, but because it enables me to be unafraid. It doesn’t limit the actions of those who would interact with me through reason, only the actions of those who would do so by force. It removes force from the equation… which is why carrying a gun is a civilized act.”

The problem is that the loss of ethical protocols and the “amoral revolution” have pushed solutions into the realm of force, since institutions no longer recognize balance of power as a desirable status quo. When it is all about winning, in an environment where there are no constraints on the means of competition, the only other choice is force, and an arms race at all levels of society.

Peter Borregard Replied:
Sigh. The fictional Maj. Caudill again.
This was written by author and blogger Marko Kloos. As much as I enjoy his writing, he is setting up a false dichotomy. It isn’t force or reason. It’s force or persuasion. Maybe by reason, maybe not.

Emmanuel Goldstein Wrote:
>> Nation-states are an almost necessary basis for democracy. A common language and culture, a common allegiance to national institutions, a common sense of destiny, all within a defined territory, with equal rights for all citizens—these seem to be the conditions that enable people with different opinions and interests to accept political defeat and the passage of laws to which they strongly object. <<
Borders – Language – Culture
It doesn’t get much simpler than that.


You are missing Capitalism, which is also an internationalist system, based on world rule by capitalist institutions. The despotism of the international banks that control world governments can be compared to the despotism of allied world governments under Communism that control world financial institutions.

The issue is, “Which institution controls the other?” See – “The End of the Free Market: Who Wins the War Between States and Corporations?”

The ideal of balance of power among institutions, which provides checks and balances on the despotism of any one institution over others, whether private or public, means that all institutions require power not to seek monopoly but to maintain ethical relations with institutions that have become despotic.

In the current case, Russia’s switch in its economic architecture to a political institution in control of the banking institutions within its area of responsibility, rather than allowing the despotism of the Western banking system to determine Russia’s future, is really nothing more than Russia checking the power of the Western banking establishment (the New World Order, supported by NATO), which has proven to be a poor partner to Russia since the disintegration of the Soviet Union in 1991.

Russia actually has no choice but to establish an alternative Eurasian Economic Union backed by a unified military force of those member nations to combat the despotism of the NWO/NATO that refuses to admit Russia.

Nationalism is the counterforce to despotism by an internationalist institution that is destroying national governments and national economic societies worldwide. Thus nationalism can be a force for “balance of power” among institutions worldwide. Having read a number of Putin’s writings, I believe that is clearly his current philosophy.

Frank Pecarich Wrote:
The author says, “A common language and culture, a common allegiance to national institutions, a common sense of destiny, all within a defined territory, with equal rights for all citizens—these seem to be the conditions that enable people with different opinions and interests to accept political defeat and the passage of laws to which they strongly object.”

Compromise is only possible among competing interests when they can agree on an overarching goal. That has been impossible in the US. Citizens are deeply divided about who should get the benefits of government and under what conditions. This problem has been made extraordinarily difficult by the cultural diversity in the country. Many public policy analysts see no fix to that.

Dr. Byron Roth in his 2010 book “The Perils of Diversity: Immigration and Human Nature” argues that the debate over immigration policy in the Western world is critically uninformed by the sciences of evolutionary biology and psychology. In his work he examines the intersection between culture, genetics, IQ and society. Prominent among the fundamental features of human nature is a natural bias toward one’s own kind, making harmony in multi-ethnic, multi-cultural societies problematic at best. He says, “All historical evidence indicates that ‘diversity’ is not a strength, and that blood is thicker than water. Ignoring such biological realities leads to failed social experiments that may cause great human suffering.”

Roth points out that “multiculturalism denies historical and scientific evidence that people differ in important biological and cultural ways that make their assimilation into host countries problematic.” Frank Salter presents a powerful case for the adaptiveness of ethnocentrism. Different human ethnic groups and races have been separated for thousands of years, and during this period they have evolved some genetic distinctiveness. This genetic distinctiveness constitutes a storehouse of genetic interest

March 26, 2014 Reinventing Ethics

Reinventing Ethics

By HOWARD GARDNER

What’s good and what’s bad? There are plenty of reasons to believe that human nature changes slowly, if at all — all’s still fair in love and war. For millennia, religious believers have attributed our nature to God’s image, as well as to God’s plan. In recent years, evolutionary psychologists peered directly at our forerunners on the savannahs of East Africa; if human beings change, we do so gradually over thousands of years. Given little or nothing new in the human firmament, traditional morality — the “goods” and “bads” as outlined in the Ten Commandments or the Golden Rule — should suffice.

My view of the matter is quite different. As I see it, human beings and citizens in complex, modern democratic societies regularly confront situations in which traditional morality provides little if any guidance. Moreover, tenable views of “good” and “bad” that arose in the last few centuries are being radically challenged, most notably by the societal shifts spurred by digital media. If we are to have actions and solutions adequate to our era, we will need to create and experiment with fresh approaches to identifying the right course of action.

Let’s start with the Ten Commandments. We are enjoined to honor our parents, and to avoid murder, theft, adultery and dishonesty. Or consider the Golden Rule: “Do unto others.” A moment’s reflection reveals that these commandments concern how we treat those nearby — we might say the 150 persons whom, according to the anthropologist Robin Dunbar, each of us has evolved to be able to know well. For most of history, and all of pre-history, our morality has been extended to our geographical neighbors — anyone else falls outside the framework of neighborly morality.

This characterization is largely true until we reach the modern era — the last few centuries, particularly in the West. The one dramatic exception is the brief period of the Greek city-state. Citizens of Athens pledged to work for the improvement and glory of the entire society. And in extending the gamut of responsibility, the Hippocratic oath of the Periclean era enjoined physicians to extend aid and avoid mistreatment of any person in need of medical attention. As explained a century ago by the German sociologist Max Weber, professionals were no longer simply humans relating to their neighbors. Rather, the doctor, the lawyer, the architect, the educator had taken on more specified and finely articulated roles, with characteristic rights and responsibilities. Now, the morality that we direct to those living in the neighborhood and the ethics that a responsible professional should direct to all who come within his or her ambit, whether friend, foe, or someone from outside one’s customary circle, are two quite different matters.

It would be hyperbolic to maintain that “the ethics of roles” disappeared for almost two millennia. Yet this wider sense of responsibility was much less evident after classical times, when almost everyone was a peasant, guilds kept their practices secret and emerging states were hierarchical and authoritarian. Only as these trends were gradually overturned in the West in the last few centuries did the role of the responsible professional re-emerge. The rise of the Fabians in England, of the progressives in the United States or of the elite professional classes in Bismarckian and Weimar Germany, to take some familiar examples, established a cohort of individuals who were given status and a comfortable livelihood in return for the license to render complex judgments and decisions in a disinterested manner. According to the historian Kenneth Lynn, writing in the early 1960s, “Everywhere in American life, the professions are triumphant.”

But even as Lynn wrote, the hegemony of the professions was breaking down. It was not only the witty George Bernard Shaw who believed that “professions are a conspiracy against the laity.” Many saw the professions as the province of the privileged — chiefly white, primarily Anglo-Saxon in lineage, largely male. Most of us today deem the democratization — or demoticization — of the professions as a healthy development. Yet, I maintain that this trend had its costs. Specifically, the very notion of professions serving the wider community has broken down, to be replaced by a growing consensus that professions are by their nature destined to serve parochial interests.

When Anthony Kronman, a professor and former dean of Yale Law School, wrote nostalgically in 1995 about “the lost lawyer,” he had in mind the “found lawyer” who is no longer concerned with the health of the community but only with the wealth of his employers, generally large corporations. And the same waning of disinterestedness can be seen in the once-solo practitioner physician (“Marcus Welby”) who is now “managed” by the business school graduates of the health maintenance organization; the once “Mr. Smith goes to Washington” politician now under the thumbs of the most wealthy donors; the once selfless “Mr. Chips” who serves his own careerist interests rather than those of the discipline, the college or the students.

Why should this matter? If my argument is correct, the professional deals every day with issues that cannot possibly be decided simply by consulting the Bible or some other traditional moral code. At which point should the journalist protect an anonymous source? Should a lawyer continue to defend a client whom she believes to be lying? Ought a medical scientist take research support when the funds come from a convicted felon or when subjects cannot give informed consent? Alas, traditional texts don’t provide reliable answers to these questions — they don’t even raise them. And yet, if professions are to disappear, should we simply answer these vexed questions by flipping a coin or by majority vote?

Perhaps the gradual undermining of the professions was inevitable, but it has certainly been accelerated by the emergence and increasing prevalence of the digital media. At the fingertips of anyone with a digital device, one can now learn the good, the bad, and the ugly of just about any professional practitioner — without the means of determining the legitimacy of these characterizations. Moreover, one can instantly access all forms of real and faux expertise on issues ranging from the treatment of disease to the preparation of term papers to the drawing up of a will or a trust fund. Tomorrow, if not today, one will be able to gain accreditation or diplomas for the thousand-plus careers that now style themselves as “professions.” And shouldn’t we honor these sheepskins, particularly if we cannot reliably distinguish on the basis of a score on a bar exam between those who went for three years to Yale Law School and those who enrolled in Dr. Khan’s free online course in legal thinking and practice?

These forces of democratization and digitalization will not go away. Ethical dilemmas are no longer going to be decided solely by those who wear certain clothing and who have a certain professional pedigree. How then should we go about deciding which of the alternative courses of action is the right one, or at least the one that is more ethical?

My solution involves the recasting of venerable institutions into forms appropriate for the contemporary era. In ancient Greece and Rome, citizens gathered in the central square, or agora, to discuss complex issues. Much the same occurred centuries later in the fabled town hall meetings of New England. A congruent “mentality” characterized the physical “commons” in which members of a community grazed their animals. Unless each member respected the need to limit grazing time, the pasture land would not be arable.

I call on members of a professional community to create common spaces in which they can reflect on the ethical conundra of our era. For the first time in human history, it is not essential that participants occupy the same physical space. Virtual common spaces can allow all who have interest and knowledge in the area to weigh in — whether the topic is the protection of sources by journalists, the determination of which intellectual property can legitimately be downloaded and which not, or whether studies of the creation of a deadly new strain of virus should be published. Indeed, in the last decade, in professions ranging from journalism and law to medicine and science, such spaces have been created and, in some cases, have been ably curated.

Still, by themselves “virtual agoras” are limited; they can be hijacked, trivialized, or ignored. And so I recommend the reinvigoration of the role of “trustees” — individuals afforded the privilege of maintaining the standards of an institution or profession. Traditionally, trustees were drawn from the ranks of wise seniors, and such persons can offer both time and experience. But particularly in a fast-changing world, trustees should reflect a range of ages and experiences. And so, as an example, young journalists should be asked to choose as trustees both peers and veterans whom they admire; and veteran journalists should nominate both peers and younger colleagues who embody the best of the profession. These trustees should have vested in them a spectrum of powers, ranging from an identification of best practices to the institution of rules governing admission to or expulsion from the profession.

Clearly, in an era marked by fast change, the creation of attractive agoras and of respected trustees will not be easy. Nor will the relation between these spaces and these persons be straightforward. Yet, given the importance of establishing ethical practices in our time, we need starting points, and these appear to be the most promising. I’m fully confident that good trustees and well-curated agoras can improve on my recommendations!

The problem with a belief in the immutability of morality is the same as the problem with a belief that the American Constitution contains the answers to all legal disputes. Like the Ten Commandments (or the code of Hammurabi or the Analects of Confucius), the Constitution is a remarkable document for its time. But it’s absurd to believe that the text magically contains the answers to complex modern issues: the definition of what it means to be alive, or how the commerce clause or the right to bear arms amendment should be interpreted; or whether a corporation is a person. By the same token, while we can draw inspiration from the classical texts and teachings of neighborly morality, we cannot expect that dilemmas of professional life will be settled by recourse to these sources. But we need not tackle these alone. If we can draw on wise people across the age spectrum, and enable virtual as well as face-to-face discussion, we are most likely to arrive at an ethical landscape adequate for our time.

Howard Gardner is the Hobbs Professor of Cognition and Education at the Harvard Graduate School of Education. His most recent book is “Truth, Beauty and Goodness Reframed: Educating for the Virtues in the Age of Truthiness and Twitter.”

Ross Williams Grand Rapids, Minnesota John Richmond

What’s missing here is an appraisal of the effect our economic system has had on traditional ethics and morality. “Doing unto others” does not make for good capitalism, and for me, that’s where the trouble starts. Prof. Gardner touched on this briefly in describing the difference between the “lost” and “found” lawyers, doctors today vs. those of yesteryear, etc. The difference is that we all now work for someone else, or more precisely, something else: the corporation and its single-minded focus on ever-expanding profit. Any discussion of what form our morality and ethics should take today must also include the system in which we all earn our living.

An Ordinary American Prague

I disagree with the thesis of this column. The problem is not in our ethical standards, be they the Ten Commandments or the Golden Rule. The problem is with our imagination, our failure to identify humans at a distance as “neighbors”. The limits of our empathy are too small for the modern world. Perhaps widespread education purposely aimed to enlarge our identification with “the other” can change that. Or perhaps only the slow, incremental change in human nature can do so. But I am fairly certain that a few “professions” adopting new ethical standards is not going to change it.

Gemli Boston

God help us if we had to rely on the Bible to acquire our sense of morality. Human beings would never have evolved if we had to wait for the Ten Commandments to tell us that it was wrong to lie, steal and murder. We could never have survived as a species if we could not trust each other, or if we were all plotting our neighbor’s demise.

In general, we should be very cautious of the kind of morality that comes from religious sources. The Good Book has good advice concerning how a man should treat his slaves, and how to sell his daughter into sexual slavery, along with instructions on how to lay waste to neighboring villages, kill every man and child, while saving the virgins for, well, later.

There is an innate sense of morality that comes with being a human being. It doesn’t come from a book; it’s part of our standard equipment. It has survived for millions of years, and it will survive the age of the Internet. Possibly.

The essence of ethics comes from the Golden Rule, and each age learns how best to implement its simple imperative. It can be done person to person, or in the agora, or on Angie’s List. Compared to the rate at which we evolve, these technologies are flying by in a blur. Before we can figure out one, something else has come along. The details matter less than the simple directive to be mindful of each other’s weaknesses, and to reciprocate fairness.

Tim Bal Belle Mead, NJ

I beg to differ.

“There is nothing new under the sun.”

We do not need another Constitution. We do not need “well-curated agoras”.

What we need is more common sense, less greed, and more honesty, compassion, tolerance and patience. In other words, we don’t need anything new to support a more ethical society.

Howard Los Angeles

In the world today, I don’t observe any great obedience to the Golden Rule and to the ethical (non-ritual) parts of the Ten Commandments. So it’s kind of early to say that obeying them wouldn’t suffice.
Certainly there are technical requirements for somebody to understand what “stealing” is in computer software, or what “false witness” is in describing medical treatments. But once that definition is made, the golden rule and its equivalents (e.g., Kant: Act as though your action would become a universal law) can take you pretty far.

David Jones Rochester NY

We need to stop holding up the Athenians as models of democracy. They kept slaves and routinely sentenced people to death by popular vote!

ACW New Jersey

I agree with you about the Constitution. (Someone tell Scalia, please.) And you had me up to the graf that begins with Anthony Kronman’s lament on the lost lawyer, and the supposed loss of integrity in other professions. Do you really believe there was a time doctors were selfless, unmercenary near-saints? Read Moliere. For that matter, read history; e.g., the inventor of the forceps was as jealous and secretive of his lucrative device as Big Pharma is of any of its patents. That there was a time when lawyers were not venal, equivocating opportunists would startle, say, John Webster. Plato and Thucydides knew a bit about democracy and that the system’s more likely to spawn an Alcibiades than a Mr. Smith. And George Orwell, who wrote ‘Such, Such Were the Joys,’ is laughing somewhere at your encomium to teachers. (As is I.F. Stone, who pretty much took apart the myth of the noble martyr Socrates.)
Things ain’t what they used to be … and they never were.

The 10 Commandments and Golden Rule also won’t wash. ‘Thou shalt have no other gods before me.’ Who, exactly, is ‘me’? ‘Do unto others as you would have them do unto you’? As GBS pointed out, never do unto others as you would have them do unto you; ‘their tastes may not be the same.’ The OT/NT and general history of revealed religion prove that not only are the commandments and rule poor ethical yardsticks, there is no more mischievous creature on earth than a man convinced he is virtuous in the eyes of the lord (any lord).

Kevin Brock Waynesville, NC

We don’t need to reinvent ethics. Rather, we need to reinforce and expand to more and more “neighbors” the basic tenets of ethics we all are familiar with.

“Do unto others as you would have them do unto you” applies equally to environmental policy as it does to international relations as it does to my barking dog.

“Treat the foreigner in your land as a citizen,” often dismissed as applicable only to the nomadic culture of the desert, applies equally to the conduct of foreign affairs or commerce in a global economy.
Biblical principles like gleaning, the forgiveness of debts, being slow to anger or take offense, taking care of the widow and the fatherless, and universal hospitality, all sound like worthy fundamentals upon which to build an ethical and just society.
There is nothing new in any of that, and nothing that needs reinvention.

Marilyn J Los Angeles

When I was a very small girl, a very long time ago, my father explained the Golden Rule to me and what it means. This was his version of “religion” and he believed that striving to live by this rule would create a better life and a better world. After over sixty years of trying to live by this rule I know he was right.

KiWi Markham ON

So Judaeo-Christian morality says the Golden Rule applies only to our immediate neighbours and makes no universal claim? Which Bible is Professor Gardner referring to?

Alan Paris

Kant’s categorical imperative does not depend on “neighborly ethics”. The injunctions to treat strangers well in the Bible do not either. The “ethics of roles” was well-known to St. Thomas Aquinas. The author hardly makes the case that ethics needs to be reinvented. I confess I was relieved when I saw he was a member of neither a history nor a philosophy dept – but depressed to have my prejudices about Education departments confirmed.

David Chowes New York City


Do unto others as your greed compels you to. If it is illegal, make sure that it is done with care so you won’t get caught. If indicted, call on the most unscrupulous members of the profession which Shakespeare said to kill.

SteveH Henderson, NV

The professor’s sophist attempt to extirpate thousands of years of civilization’s wisdom in having created a system of absolute morality (the Ten Commandments) in terms of clearly defined right and wrong does not stand up to the realities of the world (yes, world, not neighborhood) which we all occupy. Jurisprudence, for example, from antiquity to modern legislation (primarily criminal, but civil as well) recognizes the principles of malum in se vis-a-vis malum prohibitum. If history has taught us anything at all, it is that there is no disinterested arbiter that can be appointed or elected as an elite to judge the rest of us, professional or hoi polloi. Witness the current as well as past practices of the United States Supreme Court or the Security Council of the United Nations. The bright-line simplicity of the Ten Commandments, tempered by the relative but humanistic precept of the Golden Rule, is a perfect guide to a moral existence, if only it would be observed by elitists such as the Professor.

Robert Racine, WI

People whose knowledge of the Bible is limited to a fuzzy misquoting and misunderstanding of the 10 commandments should probably refrain from writing about morality or really, anything.

ACW New Jersey

‘God will know his own’ was uttered during the Church’s war on the Cathar heresy, after a soldier asked what to do about a town in which some were faithful and some were heretics, but the troops didn’t know which was which. ‘Kill them all. God will know His own.’ The modern updating, popular on T-shirts and bumper stickers, is ‘kill ’em all, let God sort ’em out.’

A good example of what happens when quotes are removed from their context …

Mark Thomason Clawson, MI

The Golden Rule provides a perfectly adequate answer to those supposedly hard cases. Put yourself in the shoes of each of the other parties, and treat them as you would feel to be fair treatment of yourself. Medical research? “If it was me in that bed or my loved one, my remains being used or those of a loved one . . ..” It works just fine.

Tom Midwest

The problem with agoras in public is the lack of civility and the failure to follow the rules of debate, which have been hijacked by the extremes. Just listen to any of the programs on talk radio these days. We still have old-fashioned township meetings where I live, where many residents attend, listen quietly, respond when allowed, keep their comments civil and never interrupt another speaker. A johnny-come-lately frothing tea party type was asked politely to either shut up or leave. He was given the option to debate civilly but he declined, and appeared much happier outside the township hall bellowing at the top of his lungs and being ignored by everyone. Sort of like it was back when the anti-war protests were going on, but now it is the other side.

Amused Reader SC

The 10 Commandments are outdated according to the writer, as is the Golden Rule. I guess that prohibitions against murder, adultery, and stealing, as well as being good to your neighbor, are too old-fashioned for enlightened minds. I have always found that the KISS principle (Keep It Simple, Stupid) works pretty well. And doing the right thing is not too hard to figure out. If we need complicated agencies to tell us what to do, we already have the IRS and the Obama administration, who are ready and willing to make those moral and ethical judgments for us in language so complex we don’t understand it.

I think the writer misses the point that we are to live so that we don’t hurt anyone and respect others. It doesn’t take a new Constitution or rewriting the 10 Commandments to let us know how we need to live in relation to our fellow man. All new rules do is take away equality and replace it with some sort of caste system where others are put above the “little people”. We already have government for that; we don’t need to lose the few freedoms we have left to glorified lawyers, accountants, and middle managers.

Ethics don’t change except when some people want to take advantage of others using their superior knowledge. We see where that got the Germans with Hitler. Leave the 10 Commandments, Golden Rule, and the Constitution alone. They work fine to protect us from the enlightened minority.

Goackerman Bethesda, Maryland

What a silly — no, make that scary — essay. Our “wise” Masters will tell us what’s good and what’s bad, and what we should do and not do. Define “wise”. I hope journalists reading this essay note that Gardner advocates journalism “trustees” deciding who should be admitted to or expelled from the “profession”. Journalism is not a profession in the classic sense, e.g., there is no specific education, examination, or license required, nor should it be. In a free press, journalists can be hired and fired, not admitted or expelled. As for the interpretation of the U.S. Constitution, Gardner seems merely to be venting his spleen because of some recent decisions with which he has disagreed.

JHSM Lake Placid, NY

Trustees, huh? As I recall, Plato called them the Guardians in his Republic. And with all his evocations of Athenian democracy, Prof. Gardner can’t be unaware of the resemblances between his program and Plato’s. He must also be aware of how deeply contemptuous Plato was of the democratic assemblies of his city. In his fictional Republic, Plato evoked historical Athenian oligarchy and repackaged it as philosophically enlightened despotism.

It is important to call professionals to account, absolutely. What bothers me is the notion, implicit here, that once the professionals get their acts together, the common man or woman will not have to concern him or herself with issues that, to my mind at least, properly belong in the public democratic domain. Prof. Gardner refers to the Athenian assembly and to the New England town meeting as models for the virtual assemblies of experts he envisages. He seems to believe that these democratic institutions have seen their day, and that the issues of our era are too complex for ordinary people to resolve.

I wish I believed that experts weren’t ordinary, too. Perfectibility, even in right-minded people, is not a realistic goal. The American democratic system is based on the Venetian notion that the wicked ambitions of one individual or faction necessarily conflict with those of others. The rivalry and compromise that these conflicts produce can lead to a decent, if imperfect, result.

Oliver Jones Newburyport, MA

“Do onto others?” Shouldn’t you have written “Do unto others as you would have them do unto you?”

John T. Grand Rapids, Michigan

Professionals have an obligation to not only think about themselves and their clients, but also about their role in maintaining the stability of the system. This is what the bankers failed to do in the run up to the financial crisis. In a way, it is a Kantian question: What if all my peers did what I am doing? Would it make the system unstable? In the case of the banks it surely did. It seemed like nobody asked that question. I am disappointed that the business community is not acting like a profession and asking the ethical questions it needs to ask. They have to change their professional culture.

James Currin Stamford, Ct

When Prof. Gardner finally gets down to something concrete, he tells us that we must give up the belief that our constitution contains the solution to all legal disputes. I know of no one, living or dead, who has ever held such an absurd opinion. Our Constitution was and is a compact between the several states to create a national government of limited and defined powers. That is all it is, and it would suffice if only the courts would enforce it. What Gardner really means, but won’t say plainly, is that the national government should have vastly expanded powers that the Constitution does not permit. As to his vaporous musings about “wise” trustees, they appear to be nothing more than the NYT editorial board writ large.

Martin Weiss mexico, mo

I get what you’re saying, and I agree with your proposal, but for my own reference I consolidate and boil down these ethical guides, along with the original doctrines of Jesus, Lao Tzu, Jefferson, Teilhard de Chardin, Rachel Carson, Lincoln, Madison, Annie Dillard, Edward Abbey, Aldo Leopold, and on and on, to synthesize what they all were getting at. What use is the Bible, what use the Bhagavad-Gita, the I Ching, etc.?

Rawls and Locke and Mill would agree: they were intended to prescribe best practices for group survival and long-term prosperity. That clarifies ethical choices. “Thou Shalt Not Bear False Witness”? Unless, by a lie, a life can be saved. And so on. The ultimate value must be life. Preserving, honoring, furthering life is a standard that cuts across all doctrinal differences and unifies the field, so to speak, of ethical dilemmas. In essence, the unified field is staring us in the face, so close we overlook it. Wordsworth and Einstein would have gotten along fine with Hammurabi, Buddha, Krishna, Jesus, Maimonides, Spinoza and Lao Tse. Hammurabi’s law protecting strangers brought a manifold increase in foreign trade. Abbey’s injunction that “Grown Men Don’t Need Leaders” puts the ethical response-ability into democratic hands. The synthesis of ethical doctrines may conflict with advice like spreading one’s dogma by the sword, but that can’t ensure group survival anyway. Heresy is in the eye of the authorities. Common interest must prevail.


March 12, 2014 Oprah-Nyad

The Oprah-Nyad Affair

Dec. 20, 2013

Atheists have one thing in common: they lack a belief in any god. But that commonality tells us very little about what they do believe and what they do experience in lieu of the divine. For instance, do atheists have experiences they consider “spiritual” or awe-inspiring? And if one doesn’t believe in a god but does believe in some kind of connection between humans, animals, the sun, the earth, and its oceans, can one still claim to be an atheist? These two questions were recently raised following an interview with long-distance swimmer Diana Nyad that aired October 6 on Oprah Winfrey’s show, Super Soul Sunday.

The part of the interview that received the most attention was the exchange between Winfrey and Nyad over Nyad’s atheism:

Oprah Winfrey: You told our producers that you’re not a God person; that you’re a person who is deeply in awe.

Diana Nyad: Yeah, I’m not a God person…

OW: Do you consider yourself atheist?

DN: I am an atheist.

OW: But you’re in the awe.

DN: I don’t understand why anybody would find a contradiction in that. I can stand at the beach’s edge with the most devout Christian, Jew, Buddhist—go on down the line—and weep with the beauty of this universe and be moved by all of humanity. All of the billions of people who’ve lived before us and have loved and hurt and suffered… To me, my definition of God is humanity and is the love of humanity.

OW: Well, I don’t call you an atheist then. I think if you believe in the awe and the wonder and the mystery, then that is what God is… not a bearded guy in the sky.

1 of 6 2/12/14, 11:31 AM


Many nontheists criticized the talk-show host for this exchange because it denies Nyad’s atheist identity. Indeed, Winfrey deserves to be criticized for denying her guest the right to self-identify as an atheist. But there are two additional elements of the Oprah-Nyad affair that I believe warrant additional attention. First, while most of the public commentary targeted Winfrey’s dismissal of Nyad’s atheism, there was also some criticism of Nyad for claiming to be an atheist while indicating that she holds “spiritual” views. Second, virtually no one in the media commented on Winfrey’s unorthodox suggestion that God is not “a bearded guy in the sky.” Before we examine these two specific points further, it’s worth asking generally whether atheists and nonreligious people in the United States report experiencing wonder and awe, and if these are appropriately termed “spiritual” feelings.

Unfortunately, though perhaps not surprisingly, I was unable to find questions that had been posed to representative samples of Americans that perfectly capture what “spirituality” means. Incidentally, in her 2010 book, Science Vs. Religion: What Scientists Really Think, Elaine Howard Ecklund does provide data on atheist scientists who report experiencing wonder and awe at things like the immensity of the universe and the diversity of life, though they were reticent to call those experiences “spiritual” as they defined them as natural, not supernatural. (Personally, I’m not surprised that the majesty of scientific findings and understandings of the universe overwhelms people; it helps illustrate how insignificant humans are relative to the universe around us.)

By now we’re all familiar with the rise of the so-called “nones” in the United States—those who answer “none” when asked their religious affiliation. In 2012 the Pew Research Center’s Forum on Religion & Public Life survey revealed that, of the 46 million unaffiliated American adults surveyed, more than half reported a deep connection with nature and the earth, and 37 percent claimed they were “spiritual but not religious.” Thirty percent of nones said they believed in spiritual energy located in physical things like mountains, trees, or crystals; 25 percent claimed a belief in both astrology and reincarnation; and 28 percent regarded yoga as a spiritual practice.

But what about those who explicitly identify as atheists or agnostics? In the same 2012 Pew survey, 34 percent said they were spiritual and 38 percent said they believe in “God or a universal spirit.” However, these groups were not asked specific questions about how they define spirituality.

Despite not finding perfect polls to address this issue, I did find several questions in the 1998 General Social Survey—a nationally representative face-to-face survey of adult Americans conducted every other year by the National Opinion Research Center at the University of Chicago—that are closely related to the idea of spirituality. Two of the questions capture the idea of wonder and awe, but are somewhat problematic in that they were phrased in such a way as to be more appealing to religious people. Both were introduced with this setup: “The following questions deal with possible daily spiritual experiences. To what extent can you say you experience the following…?” Response options ranged from “many times a day” to “never or almost never.” Survey participants were first asked to what extent they “feel deep inner peace or harmony.” As noted, the wording is likely appealing to religious people—particularly Buddhists but also Christians—who are often instructed to seek inner peace and harmony. Even so, atheists and the nonreligious also answered the question. The responses were mixed, as shown in Figure 1 below. Only about a quarter of atheists said they never experienced inner peace or harmony, and just 16 percent of the nonreligious reported never experiencing such feelings. On the other end, about a quarter of both groups reported feeling inner peace most days.


The second question was even more clearly directed toward the religious: “I am spiritually touched by the beauty of creation.” However, even though spirituality and creation are often problematic concepts for atheists and the nonreligious, 75 percent reported feeling touched by the beauty of creation at least once in a while.

What these survey results indicate is that most atheists and nonreligious Americans do experience so-called spiritual feelings that are very similar to the wonder and awe expressed by Diana Nyad in her interview with Oprah Winfrey. Of course, experiencing wonder and awe—or feeling touched by things or having a sense of inner peace or harmony—doesn’t mean these individuals must believe in God or anything supernatural. To the contrary, what these data indicate is that such feelings are natural in both a “caused by nature” and a “highly common” sense. In short, a substantial portion of atheists and the nonreligious report spiritual feelings.

Returning to Nyad’s own personal spirituality, later in the interview Winfrey asked her if she was spiritual and Nyad said that she was. Here’s Nyad’s explanation:

I think you can be an atheist who doesn’t believe in an overarching being who created all of this and sees over it. But there’s spirituality because we human beings, and we animals, and maybe even we plants—but certainly the ocean and the moon and the stars—we all live with something that is cherished and we feel the treasure of it.

The swimmer reaffirmed this idea when prompted by Winfrey:

OW: Do you feel one with the ocean? At some point do you just feel like there’s no difference between your body and your stroke and the water and the surroundings around you?

DN: I do. I feel at home in the ocean. Not in a pool. Not in a river, not in a lake, but the ocean. I actually feel the tidal pull. And I feel the moon pulling the tides out there. It’s what mountain climbers feel when they get to the top of the mountain; not that they conquered it, but that they are part of it.

Note that in both of these statements, Nyad never suggests that she believes in anything explicitly supernatural. She simply suggests that there is “something” that connects her to the ocean, plants, animals, the moon, and stars. What that is she agrees to call “spirituality,” but that doesn’t mean she believes it to be supernatural or divine.

Even so, Nyad’s admissions caused consternation among some atheist commentators, who seemed to be under the impression that atheism is inimical to spirituality. Writing at his website, Why Evolution Is True, University of Chicago biologist Jerry Coyne wrote:

In truth, I think that more damage to atheism was done here by Nyad, eloquent though she was, than by Oprah. After all, Winfrey makes just one short claim about the issue, denying that Nyad is an atheist because she believes in wonder, awe, and humanity. In contrast, Nyad calls those feelings “God,” admits the existence of souls that exist after death, and says that she has no problem with believers, even those who accept the existence of ghosts. In other words, she’s an atheist who, like Oprah, accepts woo. It’s really time for us to discard the word “spirituality.” All it does is give believers a reason to say, “See, you’re really one of us after all.”

As a sociologist, I find Nyad’s spiritual admission intriguing. Sociologists have, for several decades, drawn a clear distinction between “religiosity” and “spirituality.” The former is used to describe the many ways in which institutional religion can manifest itself in the lives of people: they can attend services, pay tithing, and/or identify as a member of an organized religion, and so on. Note that I stipulate this is an “and/or” relationship, since religiosity is multi-dimensional—people can regularly attend religious services but not believe in the teachings of their religion, like the many U.S. Catholics who reject the church’s official position on birth control but still strongly identify as Catholic. Spirituality, on the other hand, is generally used in sociology to refer to beliefs and practices—typically supernatural in orientation—that are not tied to organized religion. Examples would be things like believing you have a spiritual connection with other people, believing that there is some form of higher power, or believing that there is some form of afterlife. All of these can be held without any institutional affiliation with a religion.

Just as people can be religious in different ways, people can be nonreligious in different ways. And there is reason to believe that people can be partially religious, partially spiritual, and partially nonreligious. In short, religiosity, spirituality, and nonreligiosity are “messy” precisely because they are human characteristics. And so Nyad is someone I would label a “spiritual atheist.” Yes, they exist. And atheists should simply accept that and welcome Nyad as an ambassador for atheism. If atheists and secular humanists reject her atheist credentials because she’s also spiritual, they’re committing the same logical fallacy that many moderate Muslims do when they argue that radical, fundamentalist Muslims aren’t “true Muslims” because they don’t share a moderate interpretation of Islam. If Osama bin Laden was a Muslim, Diana Nyad is an atheist. (My apologies to Nyad for the comparison.)

As I have repeatedly indicated, many atheists take issue with the term “spirituality” because “spirit” lies at the heart of the term and the most common understanding of the word is as a second component of human identity that is “other” than the body, a supernatural component of an individual. It is the “essence”—often called the soul—of the individual that is believed to live on after the mortal component—the body—dies. But there’s another definition of spirit: the qualities that form core characteristics of a person or potentially reflect one’s inner drive or motivation, as in “Diana Nyad has an indomitable spirit.” This doesn’t mean that Nyad has a supernatural soul but rather that she is dedicated and perseveres. Nyad’s comments on Super Soul Sunday seem, to me, to lean more toward this second definition, by which it would be more appropriate to describe her as “an atheist with spirit.” Regardless, it seems the majority of atheists and nonreligious people in the United States are indeed spiritual in the sense that they experience wonder and awe and are moved by what surrounds them.

A final point worth examining involves the complete lack of attention given to Winfrey’s statement about God not being “a bearded guy in the sky.” Some people watching the video might interpret Winfrey’s statement as a summary of her guest’s beliefs rather than her own. However, Winfrey has often espoused a spiritualism of self-empowerment and has championed New Age gurus like Rhonda Byrne (The Secret) and spiritual leaders such as the Dalai Lama. It seems likely then that she does reject the notion of God as a man with a long white beard living up above the clouds somewhere. If that’s the case, then her God is more akin to the God of twentieth-century theologian Paul Tillich, who believed in a nebulous, perhaps pantheistic deity; for him God was equal to “ultimate concerns” and was in everything good.

In the 2006 Portraits of American Life Study, participants were asked to indicate their agreement with the statement, “God is not a personal being, but more like an impersonal spiritual force.” Close to 30 percent of Americans strongly disagreed, while another 7 percent indicated they “somewhat disagreed.” A larger percentage of Americans agreed with the statement—29.8 percent strongly and 18.9 percent somewhat (13.3% were neutral). God as a personal being is the orthodox Christian position, but it’s clearly declining in acceptance in the United States, which likely explains why so little was said about Winfrey’s casual reference to the bearded guy in the sky.


Still, her denial of atheism is the major story in the Oprah-Nyad Affair. By denying Nyad’s atheism, Winfrey was denying the nonreligious identity of millions of Americans. As others have pointed out, had she done this with a Jew or Muslim or Mormon, there would have been widespread outrage. But given the outsider status of atheists in the United States, the denial was neither surprising nor widely condemned.

The fact that atheists experience wonder and awe is important for people like Oprah Winfrey to know. And perhaps for her and other Americans who don’t know an atheist (or more likely don’t know an “out” atheist), it may come as a surprise that atheists and the nonreligious experience spiritual feelings and emotions of wonder and awe. Atheists may be more analytical in their thinking and less likely to base decisions on emotions, but they are human after all.


February 26, 2014 Your Ancestors, Your Fate

Your Ancestors, Your Fate

By Gregory Clark
February 21, 2014, NYTimes website, Opinionator

Inequality of income and wealth has risen in America since the 1970s, yet a large-scale research study recently found that social mobility hadn’t changed much during that time. How can that be?

The study, by researchers at Harvard and Berkeley, tells only part of the story. It may be true that mobility hasn’t slowed — but, more to the point, mobility has always been slow.

When you look across centuries, and at social status broadly measured — not just income and wealth, but also occupation, education and longevity — social mobility is much slower than many of us believe, or want to believe. This is true in Sweden, a social welfare state; England, where industrial capitalism was born; the United States, one of the most heterogeneous societies in history; and India, a fairly new democracy hobbled by the legacy of caste. Capitalism has not led to pervasive, rapid mobility. Nor have democratization, mass public education, the decline of nepotism, redistributive taxation, the emancipation of women, or even, as in China, socialist revolution.

To a striking extent, your overall life chances can be predicted not just from your parents’ status but also from your great-great-great-grandparents’. The recent study suggests that 10 percent of variation in income can be predicted based on your parents’ earnings. In contrast, my colleagues and I estimate that 50 to 60 percent of variation in overall status is determined by your lineage. The fortunes of high-status families inexorably fall, and those of low-status families rise, toward the average — what social scientists call “regression to the mean” — but the process can take 10 to 15 generations (300 to 450 years), much longer than most social scientists have estimated in the past.
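The 10-to-15-generation timescale follows from simple compounding. As an illustrative sketch (my arithmetic, not the authors’ model or code), treat status as a first-order autoregressive process: if lineage accounts for 50 to 60 percent of variation, the implied generation-to-generation correlation b is roughly the square root of that, about 0.71 to 0.77, so an elite family’s expected advantage shrinks by that factor each generation.

```python
# Illustrative sketch, not the study's actual model: status as an AR(1)
# process in which a family's expected advantage over the mean shrinks
# by a persistence factor b each generation.
def generations_to_mean(b, threshold=0.05):
    """Generations until an expected advantage of 1.0 standard
    deviation above the mean falls below `threshold` SDs."""
    advantage, gens = 1.0, 0
    while advantage > threshold:
        advantage *= b
        gens += 1
    return gens

# With an assumed b = 0.75 and ~30 years per generation, a one-SD
# advantage takes 11 generations (roughly 330 years) to shrink below
# 0.05 SD, squarely in the article's 10-to-15-generation range.
print(generations_to_mean(0.75))
```

Nudging b toward 0.77, or tightening the threshold, stretches the horizon toward the upper end of Clark’s 300-to-450-year estimate.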

We came to these conclusions after examining reams of data on surnames, a surprisingly strong indicator of social status, in eight countries — Chile, China, England, India, Japan, South Korea, Sweden and the United States — going back centuries. Across all of them, rare or distinctive surnames associated with elite families many generations ago are still disproportionately represented among today’s elites.

Does this imply that individuals have no control over their life outcomes? No. In modern meritocratic societies, success still depends on individual effort. Our findings suggest, however, that the compulsion to strive, the talent to prosper and the ability to overcome failure are strongly inherited. We can’t know for certain what the mechanism of that inheritance is, though we know that genetics plays a surprisingly strong role. Alternative explanations that are in vogue — cultural traits, family economic resources, social networks — don’t hold up to scrutiny.

Because our findings run against the intuition that modernity, and in particular capitalism, has eroded the impact of ancestry on a person’s life chances, I need to explain how we arrived at them.

Let’s start with Sweden, which — like Denmark, Finland, Iceland and Norway — is one of the world’s most equal societies in terms of income. To our surprise, we found that social mobility in Sweden today was no greater than in Britain or the United States today — or even Sweden in the 18th century.

Sweden still has a nobility. Those nobles no longer hold de facto political power, but their family records are stored by the Riddarhuset (House of Nobility), a society created in 1626. We estimate that about 56,000 Swedes hold rare surnames associated with the three historic tiers of nobles. (Variations on the names of the unfortunate Rosencrantz and Guildenstern of “Hamlet” are on the list.)

Another elite group are Swedes whose ancestors — a rising educated class of clerics, scholars, merchants — Latinized their surnames in the 17th and 18th centuries (like the father of the botanist Carolus Linnaeus). Adopting elite names was limited by law in Sweden in 1901, so a vast majority of people holding them are descended from prominent families.

Given the egalitarian nature of Swedish society, one would expect that people with these elite surnames should be no better off than other Swedes. That isn’t so. In a sample of six Stockholm-area municipalities in 2008, rich and poor, we found that the average taxable income of people with noble names was 44 percent higher than that of people with the common surname Andersson. Those with Latinized names had average taxable incomes 27 percent higher than those named Andersson.

Surnames of titled nobles (counts and barons) are represented in the register of the Swedish Bar Association at six times the rate they occur in the general population (three times the rate, for untitled-noble and Latinized surnames). The same goes for Swedish doctors. Among those who completed master’s theses at Uppsala University from 2000 to 2012, Swedes with elite surnames were overrepresented by 60 to 80 percent compared with those with the common surname prefixes Lund- and Berg-.


Over centuries, there is movement toward the mean, but it is slow. In three of the Royal Academies of Sweden, half of the members from 1740 to 1769 held one of the elite surnames in our sample; by 2010, only 4 percent did — but these surnames were held by just 0.7 percent of all Swedes, so they were still strongly overrepresented. In short, nearly 100 years of social democratic policies in Sweden, while creating a very egalitarian society, have failed to accelerate social mobility.
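“Strongly overrepresented” here is just a ratio of shares; a quick check of the Royal Academies figure, using only the numbers quoted in the text (my arithmetic, not the authors’):

```python
# Relative representation = a surname group's share of an elite body
# divided by that group's share of the general population.
def relative_representation(elite_share, population_share):
    return elite_share / population_share

# In 2010, elite surnames covered 4% of academy members but only 0.7%
# of all Swedes: still nearly 6x overrepresentation, down from the 50%
# membership share of 1740-1769.
print(round(relative_representation(0.04, 0.007), 1))
```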

What if we go back even further in time — to medieval England?

We estimate that one-tenth of all surnames in contemporary England can be traced to the occupation of a medieval ancestor — names like Smith (the most common surname in the United States, England and Australia), Baker, Butler, Carter, Chamberlain, Cook, Shepherd, Stewart and Wright. Tax records suggest that most surnames became heritable by 1300.

We compared the frequency of these common surnames in the population as a whole against elite groups, as drawn from several sources, including membership rolls at Oxford and Cambridge, dating as far back as 1170, and probate records from 1384 onward.

We found that late medieval England was no less mobile than modern England — contrary to the common assumption of a static feudal order. It took just seven generations for the successful descendants of illiterate village artisans of 1300 to be incorporated fully into the educated elite of 1500 — that is, the frequency of their names in the Oxbridge rolls reached roughly the level where it stands today. By 1620, according to probate records, people with names like Butcher and Baker had nearly as much wealth as people with high-status surnames like Rochester and Radcliffe.

Take Chaucer. A commoner by birth — his name probably comes from the French word for shoemaker — he became a courtier, a diplomat and a member of Parliament, and his great-great-grandson was even briefly considered heir to the throne during the reign of Richard III.

Of course, mobility, in medieval times as now, worked both ways. Just as Chaucer’s progeny prospered, other previously well-off families declined. The medieval noble surname Cholmondeley was, by the 19th century, held by a good number of farm laborers.

In any generation, happy accidents (including extraordinary talent) will produce new high-status families. It is impossible to predict which particular families are likely to experience such boosts. What is predictable is what the path to elite status will look like, and the path back to the mean. Both happen at a very slow pace.

For all the creative destruction unleashed by capitalism, the industrial revolution did not accelerate mobility. Looking at 181 rare surnames held by the wealthiest 15 percent of English and Welsh people in the mid-19th century — to be clear, these were not the same elite surnames as in the medieval era — we found that people with these surnames who died between 1999 and 2012 were more than three times as wealthy as the average person.

If your surname is rare, and someone with that surname attended Oxford or Cambridge around 1800, your odds of being enrolled at those universities are nearly four times greater than the average person’s. This slowness of mobility has persisted despite a vast expansion in public financing for secondary and university education, and the adoption of much more open and meritocratic admissions at both schools.

We selected a sampling of high- and low-status American surnames. The elite ones were held by descendants of Ivy League alumni who graduated by 1850, exceptionally wealthy people with rare surnames in 1923-24 (when public inspection of income-tax payments was legal) and Ashkenazi Jews. The low-status names were associated with black Americans whose ancestors most likely arrived as slaves, and the descendants of French colonists in North America before 1763.

We chose only surnames closely correlated with these subgroups — for example, Rabinowitz for American Jews, and Washington for black Americans.

We used two indicators of social status: the American Medical Association’s directory of physicians and registries of licensed attorneys, along with their dates of registration, in 25 states, covering 74 percent of the population.

In the early to mid-20th century we found the expected regression toward the mean for all of these groups, except for Jews and blacks — which reflects the reality of quotas that had barred Jews from many elite schools, and of racial segregation, which was not fully outlawed until the 1960s.

Starting in the 1970s, Jews began, over all, a decline in social status, while blacks began a corresponding rise, at least as measured by the doctors’ directory. But both trends are very slow. At the current rate, for example, it will be 300 years before Ashkenazi Jews cease to be overrepresented among American doctors, and even 200 years from now the descendants of enslaved African-Americans will still be underrepresented.

Family names tell you, for better or worse, a lot: The average life span of an American with the typically Jewish surname Katz is 80.2 years, compared with 64.6 years for those with the surname Begay (or Begaye), which is strongly associated with Native Americans. Heberts, whites of New France descent, live on average three years less than Dohertys, whites of Irish descent.

But to be clear, we found no evidence that certain racial groups innately did better than others. Very high-status groups in America include Ashkenazi Jews, Egyptian Copts, Iranian Muslims, Indian Hindus and Christians, and West Africans. The descendants of French Canadian settlers don’t suffer racial discrimination, but their upward mobility, like that of blacks, has been slow.

Chen (a common Chinese surname) is of higher status than Churchill. Appiah (a Ghanaian surname) is higher than Olson (or Olsen), a common white surname of average status. Very little information about status can be surmised from the most common American surnames — the top five are Smith, Johnson, Williams, Brown and Jones, which all originated in England — because they are held by a mix of whites and blacks.

Our findings were replicated in Chile, India, Japan, South Korea and, surprisingly, China, which stands out as a demonstration of the resilience of status — even after a Communist revolution nearly unparalleled in its ferocity, class hatred and mass displacement.

Hundreds of thousands of relatively prosperous mainland Chinese fled to Taiwan with the Nationalists in the late 1940s. Under Communist agrarian reform, as much as 43 percent of all land was seized and redistributed. The Cultural Revolution of 1966-76 saw purges of scholars and other former elites and “class enemies.”

In China, there are only about 4,000 surnames; the 100 most common are held by nearly 85 percent of the population. Yet we were able to identify 13 rare surnames that were exceptionally overrepresented among successful candidates in imperial examinations in the 19th century. Remarkably, holders of these 13 surnames are disproportionately found now among professors and students at elite universities, government officials, and heads of corporate boards. Social mobility in the Communist era has accelerated, but by very little. Mao failed.

These findings may surprise two groups that are often politically opposed: those who believe that certain “cultures” are higher-achieving than others and those who attribute success to family resources and social networks.

Culture is a nebulous category and it can’t explain the constant regression of family status — from the top and the bottom. High-status social groups in America are astonishingly diverse. There are representatives from nearly every major religious and ethnic group in the world — except for the group that led to the argument for culture as the foundation of social success: white European Protestants. Muslims are low-status in much of India and Europe, but Iranian Muslims are among the most elite of all groups in America.

Family resources and social networks are not irrelevant. Evidence has been found that programs from early childhood education to socioeconomic and racial classroom integration can yield lasting benefits for poor children. But the potential of such programs to alter the overall rate of social mobility in any major way is low. The societies that invest the most in helping disadvantaged children, like the Nordic countries, have produced absolute, commendable benefits for these children, but they have not changed their relative social position.

The notion of genetic transmission of “social competence” — some mysterious mix of drive and ability — may unsettle us. But studies of adoption, in some ways the most dramatic of social interventions, support this view. A number of studies of adopted children in the United States and Nordic countries show convincingly that their life chances are more strongly predicted by their biological parents than by their adoptive families. In America, for example, the I.Q. of adopted children correlates with their adoptive parents’ when they are young, but the correlation is close to zero by adulthood. There is a low correlation between the incomes and educational attainment of adopted children and those of their adoptive parents.

These studies, along with studies of correlations across various types of siblings (identical twins, fraternal twins, half siblings) suggest that genetics is the main carrier of social status.

If we are right that nature predominates over nurture, and explains the low rate of social mobility, is that inherently a tragedy? It depends on your point of view.

The idea that low-status ancestors might keep someone down many generations later runs against most people’s notions of fairness. But at the same time, the large investments made by the super-elite in their kids — like those of the Manhattan hedge-funders who spend a fortune on preschool — are of no avail in preventing long-run downward mobility.

Our findings do suggest that intermarriage among people of different strata will raise mobility over time. India, we found, has exceptionally low mobility in part because religion and caste have barred intermarriage. As long as mating is assortative — partners are of similar social status, regardless of ethnic, national or religious background — social mobility will remain low.

As the political theorist John Rawls suggested in his landmark work “A Theory of Justice” (1971), innate differences in talent and drive mean that, to create a fair society, the disadvantages of low social status should be limited. We are not suggesting that the fact of slow mobility means that policies to lift up the lives of the disadvantaged are for naught — quite the opposite. Sweden is, for the less well off, a better place to live than the United States, and that is a good thing. And opportunities for people to flourish to the best of their abilities are essential.

Large-scale, rapid social mobility is impossible to legislate. What governments can do is ameliorate the effects of life’s inherent unfairness. Where we will fall within the social spectrum is largely fated at birth. Given that fact, we have to decide how much reward, or punishment, should be attached to what is ultimately fickle and arbitrary, the lottery of your lineage.

Gregory Clark is a professor of economics at the University of California, Davis, and the author of “The Son Also Rises: Surnames and the History of Social Mobility.”

A version of this article appears in print on 02/23/2014, on page SR1 of the New York edition with the headline: Your Ancestors, Your Fate
William J. Keith Macomb, Illinois
Although it may be slow, if our children, whether we are prosperous or poor, are likely to eventually return to the average, it seems like the most productive thing we can do to better their lot is to focus on improving the lot of the average.

Orazio New York 3 days ago
Regarding the inference of genetic transmission as measured by surnames I would like to point out that the genetic inheritance is reduced by 50% with each succeeding generation. Thus, a high status person today received 50% of his/her genes from each parent, 25% from each grandparent, 12.5% from each great grandparent, 6.25% from each great, great and 3.125% from each great, great, great-grandparent. Or in other words, each of us has 2 parents, 4 grandparents, 8 great grandparents, 16 great, greats and 32 great, great, great grandparents. However, one’s surname is transmitted intact and undivided from only one of these 32 great, great, greats to whom the individual is 96.875% genetically unrelated. Thus, it seems more plausible to me that admission committees, professional societies and society in general are influenced by surnames that are recognizable and associated with higher social status – think legacy admissions to Ivy League universities. The slow decay of social status may simply reflect the time necessary for surname recognizability to fade from memory.
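The commenter's halving arithmetic can be checked with a short Python sketch (an illustration of the figures cited, assuming the usual model in which each generation contributes half of a person's autosomal DNA on average):

```python
# Average fraction of autosomal DNA inherited from a single ancestor
# n generations back: each generation halves the contribution.
def ancestor_share(generations_back: int) -> float:
    return 0.5 ** generations_back

# n = 1 parents, 2 grandparents, ... 5 great-great-great-grandparents.
for n in range(1, 6):
    print(f"{2 ** n} ancestors at depth {n}, "
          f"{ancestor_share(n):.3%} of the genome each")

# The surname travels intact down one paternal line, but the
# great-great-great-grandparent who supplied it accounts for only
# 0.5**5 = 3.125% of the genome; the remaining 96.875% comes from
# the other 31 ancestors at that depth.
unrelated = 1 - ancestor_share(5)
assert abs(unrelated - 0.96875) < 1e-12
```

This is the commenter's point in miniature: surname continuity is a poor proxy for genetic continuity beyond a few generations.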
SP Singapore Yesterday

This article pushes an odd theory – it equates surnames with genetic identity. This makes no sense – I got only 25% of my DNA from my father’s father, and only 0.1% of my DNA from my male-lineage ancestor ten steps up the paternal line. In other words, surname continuity does not imply genetic identity – the resemblance fades very rapidly! I don’t think the author understands this point.
Also, if Gregory Clark believes that blacks in America compete on a level playing field, he is completely deluded – one wonders which planet he was on when he wrote this piece.
Lastly, the effects of poverty on brain development, metabolic health and other traits important for social success have all been very well documented. Perhaps Mr Clark got carried away by his research into surnames – he seems to have completely ignored every other factor.
volutes Switzerland
This article presents an interesting statistical/historical analysis and then jumps to the conclusion that the reason must be predominantly genetic factors.
The evasive dismissal in the article of family resources and social networks as an explanation of the situation described is extraordinarily short. The hint that racial classroom integration programs did not change the relative social position of their recipients does not explain much: millions of dollars for starting a company, intimate life and business connections with wealthy people, and acquired social patterns for fending off life’s challenges are not something that can just be taught in one classroom program.
Therefore, I wonder whether the authors are following a hidden agenda.
Doug R. New Jersey 2 days ago
The 1% in America have rigged the game by creating a tax structure that gives them all the advantages & the rest of us all the cost. The most prosperous families have advantages in education & connections. You rigged your game by selecting names by subjective criteria & looking for results to prove your point. You begin with elite names in Sweden, citing the Royal Academy membership, where most elite surnames have declined from 50% 250 years ago to 4% today. Doesn’t that prove the opposite point? Sweden has really only been a democratic society since the end of WW 1, less than a hundred years ago. That’s a pretty quick decline in status for those elite families in that time.

May I suggest you study orphans with elite names & without. No family to help, or to pay for better educational opportunities, no nepotism. That might give more objective results.
Changed and Changed Back San Francisco, CA
I am a woman. I bear my father’s family name, but I am also the child of my mother, who bore the family name of her father. Should my social status be aligned with my father and paternal grandfather? How about my son, who bears the family name of his father? Failing to address the relationship of naming to gender is egregious, although using the data available may be perfectly acceptable as a sampling technique.


Texancan Ranchotex
Great study, but I would like to add two factors: 1) the assets and connections of the upper class provide their children a superior (and unfair) advantage for generations to come; 2) the same goes for recognition and justice from the courts. In addition, children of the lower class will have to work harder.
jcb Portland, OR
Demonstrating by surname the persistence of “genetic” traits influencing social mobility is so scattershot that it confuses. For example, by taking changes in professional affiliation as a measure of social mobility (in, e.g., Sweden), it ignores the persistence of paternal models that determine the occupation of sons (but not daughters). There is an overwhelming survivorship bias toward the male, primo-genetic line. Lawyer fathers, lawyer sons.
The analysis lacks any comparative standard of low-, medium-, or high- social mobility. Or of the long-, medium-, or short- period over which it is supposed to occur. And it confuses social mobility (rate and degree of change) with persistence of inequality (difference in wealth).
It’s unclear whether, e.g., a comparison of high status surnames in Swedish Royal Academies in the 18th and 21st century revealing a decline of 92% (from 50% to 4%) signals mobility, or why “mobility” is measured by overrepresentation in current Swedish population (do they mean “inequality”?)– or whether the 92% rise in non-elite academy surnames is high upward mobility. (The same problem with the Chinese example: “disproportion” is not mobility.)
But the larger problem is simply resorting to “genes” as an explanatory catch-all after eliminating (to their satisfaction) other explanations. We’ve reached the degree of precision in genetic research where a reader is entitled to electron-microscopic images in an appendix: Which genes– on which chromosomes?
rjnyc NYC
To take just one of many examples of the inadequacy of this author’s methodology, look at how he tests mobility in the U.S. A serious test would measure the mobility of the descendants of high status Anglo Americans against that of the descendants of low status Anglo Americans, so that the results would not be corrupted by the influence of racial prejudice–a factor that surely has limited mobility for a certain minority but not for all. This author however does not perform such a comparison; rather, he compares the mobility of Anglo Americans to that of African Americans. Moreover, he treats the initial status of Ashkenazi Jews as if it were no different from the status of the highest status Anglo Americans–a laughable assumption, and also fails to consider changes to the status of American Jews resulting from the massive immigration in the late 19th Century, which brought a population of Jews unlike the one here before that. Either the author’s methodology is utterly inept, or else the relevant details have not been included, so that the author has essentially provided nothing to the reader except unsupported assertions.

A Cranky Alumna Ohio Yesterday
I was an untraditional student at an elite college in the 70s, by virtue of gender, economic status, geography, and family background. After watching the lifetime career trajectories of my peers and, eventually, our children, I’ve become increasingly convinced that elite status is maintained more by a narrowly circumscribed world view than by intelligence, work ethic, social skills, or even startup funds or family connections.
Those of us from diverse backgrounds face a nearly infinite array of life choices: we can listen to our hearts, follow our dreams, and use our intelligence and our education to pursue the life that’s right for us, unfettered, for the most part, by status issues.
The options of our high-status peers are, by contrast, tightly circumscribed: it’s simply not acceptable to be “just” a teacher or “just” a nurse or “just” a photographer or a researcher or a social worker. So only the truly rebellious make those choices. Everyone else does their part to maintain the family status, but we shouldn’t be surprised when the lawyer (who dreamed of coaching basketball) and the banker (who wanted to write children’s stories) prove again and again that they’re only in it for the money.
LR NYC Yesterday
This is completely illogical and I can’t believe the NYT published this irresponsible and pernicious argument for a genetic component to success. The author treats it as axiomatic that surnames can be seen as tracing gene inheritance. But surnames pass only from father to son to son to son (until very recently). As they are inherited, the sons’ genes are mixed with innumerable other ancestors’, equally. One’s number of ancestors doubles with each generation. You have about 1/32 of each of your great-great-great-grandparents’ genomes, regardless of whose last name you have. You have a 1 out of 32 chance of having great-great-great-grandpa Rockefeller’s “success gene,” and so do all of his descendants who DON’T have his last name (i.e., who are descended from his daughters). So the notion that we can assume that these last names reliably track the inheritance of a success gene is absurd.
The author states that genes must be responsible for his findings, because neither culture nor inherited social advantages explain those findings. Well, just because you’ve ruled out A and B doesn’t mean that C is correct. You may need to come up with a D. You may need to look again at A or B. Etc. Then, the author cites evidence of adoption studies, which only measure the relation of one generation (adopted child) to previous generation (biological parents). This has nothing to do with proving his assertion of a genetic component influencing status across 15 generations. In sum, preposterous.
Bruce Crossan Lebanon, OR
So the authors set out with an idea and then searched and searched for measurements that would support (at least in their minds) their conclusions. I realize that finding suitable data to analyze, for the question you wish to research, can be challenging; however, the wide divergence of apples-and-oranges comparisons brings the conclusions into doubt.


Did the authors look at why people have the last names that they do in any countries other than England (Smith, etc.)? How did people who lacked surnames, like emancipated slaves, choose their last names? Could Washington or Jefferson be a hint of a cultural influence? How about the Schmitts who came to America and changed their names to Smith so that they blended into the country they moved to? Doesn’t the fact that the WASPs in North America wiped out most of the indigenous population have a cultural effect on Native American surnames? Did a lot of rich Irish doctors and businessmen come to America because there was social unrest that threatened to take away their wealth? No? How about the Iranians? It seems to me that where they’re from and why they came could have a large effect on rare last names that are identified with status.

I can find alternate explanations for every surmise that the authors make, in countries/cultures I’m familiar with. So I’m sorry, the arguments presented are not convincing. bc
RTB Washington, DC

So the authors leap from the observation that social status changes over three or four generations, which is more slowly than our political myths would suggest, to the conclusion that “genetics” is the most likely cause. This is a rather curious conclusion, given that three or four generations is a nanosecond in terms of significant genetic change among humans.
The much better conclusion is that high status families are extraordinarily good at preserving their advantages regardless of the political system in which they exist.
This book appears to be one of a growing number arguing for the inherent superiority of some people over others while strenuously avoiding terms like superiority. The claim that some are born to lead and rule and others to be ruled over is as old as human civilization. This book is merely a restatement of that tiresome idea.
ADRIAN San Francisco, CA
Oh, but the article points you away from mobility altogether:
“As the political theorist John Rawls suggested in his landmark work “A Theory of Justice” (1971), innate differences in talent and drive mean that, to create a fair society, the disadvantages of low social status should be limited. We are not suggesting that the fact of slow mobility means that policies to lift up the lives of the disadvantaged are for naught — quite the opposite. Sweden is, for the less well off, a better place to live than the United States, and that is a good thing. And opportunities for people to flourish to the best of their abilities are essential.”
So what can governments do? Never mind mobility! But organically promote opportunities for people to flourish to the best of their abilities, however great or not so great such abilities may be.
Jerry Beilinson New York, NY Yesterday


How bizarre to include Ashkenazi Jews. In every other case, the author looks at a concrete marker of past elite status: noble rank, admission to Oxford, slavery. Then, he just assumes that Jews were elites in early 20th-century America without presenting evidence. In reality, the millions of Jews flooding the U.S. from Eastern Europe at that time tended to arrive in poverty, and find work in sweatshops, as pushcart peddlers, ice-delivery men, and so on. Newspaper columnists at the time complained about Jews because they were considered poor, dirty, uneducated, etc. This is my ancestry, and none of my grandparents had more than an 8th grade education. From what I’ve read and my parents tell me, that was typical. In the Jewish neighborhoods of the Lower East Side and Brooklyn, no one of their generation knew any adult with a college education, or anyone who worked in a white-collar setting of any kind. Among my parents’ generation, who became adults in the 1950s, typical professions included teaching, the civil service, medicine, and accounting: expressly areas where ancestry could be overcome by earning a degree and taking a test. (Those accomplishments were made possible by essentially free tuition through New York City’s public universities and the G.I. Bill.) Certainly, there were a few prominent Jews in the United States in the 19th century, just as there were a few prominent Irish Catholics and even African-Americans. But they weren’t at all typical.

Justice Holmes Charleston Yesterday
Nothing like an article that tells us that what we thought was bad isn’t, so we can just stop complaining and allow the rich to continue getting richer. As a product of the American dream, I can tell you that social and economic mobility was once the norm. Working class parents saw their children graduate high school, college and law school. They saw them succeed in their chosen fields and become economically comfortable and stable. Then the corporations and the really, really rich woke up. They realized that those upwardly mobile lawyers, teachers, CPAs and others were making changes, protecting unions and consumers, and holding the government agencies accountable. Well, we couldn’t have that. It was time to bring the hammer down: crush the unions that helped the middle class grow; disarm the consumers and the government agencies that regulated the corporations and banks; make sure working class and middle class kids could no longer afford college; and while you’re at it, tell them that college is just an elitist romp so they will thank you for it.
Has the deck always been stacked against the working class in this and other countries? Of course. But the US was different: working class kids had a shot. Now they don’t, and for minority kids it hasn’t gotten easier either. But thanks to articles like this we should all just be quiet because, well, it’s always been this way. I say malarkey.

