July 26, 2015 – Institutional Racism & Poverty

‘Culture of Poverty’ Makes a Comeback
By PATRICIA COHEN, Published: October 17, 2010, NYTimes website


For more than 40 years, social scientists investigating the causes of poverty have tended to treat cultural explanations like Lord Voldemort: That Which Must Not Be Named.

The reticence was a legacy of the ugly battles that erupted after Daniel Patrick Moynihan, then an assistant labor secretary in the Johnson administration, introduced the idea of a “culture of poverty” to the public in a startling 1965 report. Although Moynihan didn’t coin the phrase (that distinction belongs to the anthropologist Oscar Lewis), his description of the urban black family as caught in an inescapable “tangle of pathology” of unmarried mothers and welfare dependency was seen as attributing self-perpetuating moral deficiencies to black people, as if blaming them for their own misfortune.

Moynihan’s analysis never lost its appeal to conservative thinkers, whose arguments ultimately succeeded when President Bill Clinton signed a bill in 1996 “ending welfare as we know it.” But in the overwhelmingly liberal ranks of academic sociology and anthropology the word “culture” became a live grenade, and the idea that attitudes and behavior patterns kept people poor was shunned.

Now, after decades of silence, these scholars are speaking openly about you-know-what, conceding that culture and persistent poverty are enmeshed.

“We’ve finally reached the stage where people aren’t afraid of being politically incorrect,” said Douglas S. Massey, a sociologist at Princeton who has argued that Moynihan was unfairly maligned.

The topic has generated interest on Capitol Hill because so much of the research intersects with policy debates. Views of the cultural roots of poverty “play important roles in shaping how lawmakers choose to address poverty issues,” Representative Lynn Woolsey, Democrat of California, noted at a Congressional briefing last spring.

This surge of academic research also comes as the percentage of Americans living in poverty hit a 15-year high: one in seven, or 44 million.

With these studies come many new and varied definitions of culture, but they all differ from the ’60s-era model in these crucial respects: Today, social scientists are rejecting the notion of a monolithic and unchanging culture of poverty. And they attribute destructive attitudes and behavior not to inherent moral character but to sustained racism and isolation.

To Robert J. Sampson, a sociologist at Harvard, … the reason a neighborhood turns into a “poverty trap” is also related to a common perception of the way people in a community act and think. When people see graffiti and garbage, do they find it acceptable or see serious disorder? Do they respect the legal system or have a high level of “moral cynicism,” believing that “laws were made to be broken”?

As part of a large research project in Chicago, Professor Sampson walked through different neighborhoods this summer, dropping stamped, addressed envelopes to see how many people would pick up an apparently lost letter and mail it, a sign that looking out for others is part of the community’s culture.

In some neighborhoods, like Grand Boulevard, where the notorious Robert Taylor public housing projects once stood, almost no envelopes were mailed; in others researchers received more than half of the letters back. Income levels did not necessarily explain the difference, Professor Sampson said, but rather the community’s cultural norms, the levels of moral cynicism and disorder.

The shared perception of a neighborhood — is it on the rise or stagnant? — does a better job of predicting a community’s future than the actual level of poverty, he said.

William Julius Wilson, whose pioneering work boldly confronted ghetto life while focusing on economic explanations for persistent poverty, defines culture as the way “individuals in a community develop an understanding of how the world works and make decisions based on that understanding.”

For some young black men, Professor Wilson, a Harvard sociologist, said, the world works like this: “If you don’t develop a tough demeanor, you won’t survive. If you have access to weapons, you get them, and if you get into a fight, you have to use them.”

Sociologists have ventured into poor neighborhoods to delve deeper into the attitudes of residents. Their results have challenged some common assumptions, like the belief that poor mothers remain single because they don’t value marriage.

In Philadelphia, for example, low-income mothers told the sociologists Kathryn Edin and Maria Kefalas that they thought marriage was profoundly important, even sacred, but doubted that their partners were “marriage material.” Their results have prompted some lawmakers and poverty experts to conclude that programs that promote marriage without changing economic and social conditions are unlikely to work.

Mario Luis Small, a sociologist at the University of Chicago and an editor of The Annals’ special issue, tried to figure out why some New York City mothers with children in day care developed networks of support while others did not. As he explained in his 2009 book, “Unanticipated Gains,” the answer did not depend on income or ethnicity, but rather the rules of the day-care institution. Centers that held frequent field trips, organized parents’ associations and had pick-up and drop-off procedures created more opportunities for parents to connect.

Scholars like Professor Wilson, 74, … have … felt compelled to look more closely at culture after the publication of Charles Murray and Richard Herrnstein’s controversial 1994 book, “The Bell Curve,” which attributed African-Americans’ lower I.Q. scores to genetics.

The authors claimed to have taken family background into account, Professor Wilson said, but “they had not captured the cumulative effects of living in poor, racially segregated neighborhoods.”

He added, “I realized we needed a comprehensive measure of the environment, that we must consider structural and cultural forces.”

He mentioned a study by Professor Sampson, 54, that found that growing up in areas where violence limits socializing outside the family and where parents haven’t attended college stunts verbal ability, lowering I.Q. scores by as much as six points, the equivalent of missing more than a year in school.

Changes outside campuses have made conversation about the cultural roots of poverty easier than it was in the ’60s. Divorce, living together without marrying, and single motherhood are now commonplace. At the same time prominent African-Americans have begun to speak out on the subject. In 2004 the comedian Bill Cosby made headlines when he criticized poor blacks for “not parenting” and dropping out of school. President Obama, who was abandoned by his father, has repeatedly talked about “responsible fatherhood.”

Conservatives also deserve credit, said Kay S. Hymowitz, a fellow at the conservative Manhattan Institute, for their sustained focus on family values and marriage even when cultural explanations were disparaged.

Still, worries about blaming the victim persist. Policy makers and the public still tend to view poverty through one of two competing lenses, Michèle Lamont … said: “Are the poor poor because they are lazy, or are the poor poor because they are a victim of the markets?”

… Fuzzy definitions or not, culture is back. This prompted mock surprise from Rep. Woolsey at last spring’s Congressional briefing: “What a concept. Values, norms, beliefs play very important roles in the way people meet the challenges of poverty.”

——-

Dan P, East Village

I do not think that the writings of Moynihan were particularly “Conservative”. If anything, they highlighted the nurture side of the nature versus nurture debate.
Analyzing how poor people think is an appropriate area of investigation. This is not to blame people for how they think but rather to demonstrate the economic consequences of such thoughts.

Nick Lento, Cliffside Park, NJ

The choice of poverty being caused by either “cultural” factors or external conditions is false and even absurd.

Yes, obviously, there *are* cultural/systemic behavior patterns associated with poverty. It could also be argued that being poor and powerless and oppressed is what “bred” the cultural pathologies that some like to trumpet.

It’s cheap to blame poor people for being poor. That lets everyone else off the hook. What Veronika Larsson (#9 above) said needs to be studied closely.
People that were fortunate enough to be born in the right area, with the right parents, schools and a nice trust fund will tend to be wealthy…..and to leave *that* wealth to their own progeny…that will then amplify and perpetuate the “culture” of wealth for the fortunate. Does that make them superior human beings to those born in conditions that are opposite to their own? Or does it just make them lucky?

People who are raised in conditions of multi-generational poverty and oppression can’t just “snap out of it” because some judgmental prejudiced conservative wants to “modify their behavior”.

Yes, there are always exceptions. Rich folk who blow it, and poor folk who rise above all the odds……but, generally speaking the poor tend to get poorer and the rich get richer…….and that is *not* an indefinitely sustainable model for the maintenance of democracy.

What’s needed is a radical change in our economic policies, one that creates an environment fertile with possibilities for enhancing upward mobility.


As it stands now the police/prison/”justice” industrial complex profits mightily from oppressing and incarcerating the poor….thereby further dispiriting and degrading the poor so that they become even more cynical and hopeless…..thereby reinforcing the self sustaining feedback loop that devolves and metastasizes the cancer of poverty.

The oppression, contempt, prejudice and exploitation of the poor by the rich is also part of the “culture” of poverty *and* of wealth!


One could go on at length talking about the outrageously extreme income disparity that poisons our society and how the agenda of eliminating/reducing the minimum wage is part of the *real* “class warfare” being perpetrated on the have nots by the haves.

America has de-industrialized and virtually destroyed the union movement and the possibility of a blue collar worker being able to raise a family, own a home and send their children to college on a single decent salary.


Bottom line: To reduce the problem of poverty to solely “cultural” factors is indeed unjust and unfair and just one more way of blaming the victims of oppression for their own broken spirits and all of the cumulative deficits that generations of poverty inflicts upon the poor.

So let’s deal with the problem from *all* the angles, not just pick on the one that gives aid and comfort to those who want to blame the whole problem of poverty on the poor themselves. Such a framework can only lead us into the temptation to inflict additional insults on generations of injury.

John Fullinwider, Dallas, Texas

What would make for some interesting sociology is a study of the “culture of wealth.” Why do rich people remain so willfully blind to the injustice that benefits them? Why do landlords who defer apartment maintenance pretend that it’s the tenants who “don’t care”? Why do cops haul poor black men out of their cars during a minor traffic stop, but give prosperous-looking whites a pass? Why are there a thousand studies about unwed teenage mothers who are poor and zero about the disposition of unwanted pregnancies among Ivy League co-eds? Why don’t the sociologists at Harvard study the moral failings of their largest donors? Why doesn’t Professor Sampson study his own amoral “culture of well-funded curiosity” as he drops fake letters on the sidewalk of destitute, devastated neighborhoods to see which poor people care enough to return his mail? Why not study residents of the wealthiest census tracts in Chicago to learn why they don’t care enough to end the devastation? Instead of studying the disadvantaged, why not study the ones who put so many at a disadvantage?

elysianhome, Rosebud Sioux Indian Reservation, South Dakota

I am writing as a Nebraska raised, college educated white male and current cattle rancher. (Just clearing some things up).

I live on the Rosebud Sioux Indian Reservation in South Dakota and have 54 years of very close and personal experience living among my many Native American friends. My observations are identical to KMJ’s personal history. The only exception is that my observations are of low-income whites and low-income Native Americans. The environment and the outcome are the same. So many Native families here have succeeded when so very many others seem not even to have tried.

I really feel that the most significant element of KMJ’s experience was when she said “Both my parents cultivated an attitude of achievement and hard work . . .”. I’d like to emphasize her use of the word “BOTH”, as, in my opinion, that’s the single most important aspect of this entire discussion. Yes, it may take a village to raise a child, but the job can be so much more successful when it starts with both parents taking an active, caring lead role.

Richard Longworth, Chicago

To Cathy and others who wonder if poor whites, especially those in rural areas, share the same cultural baggage as poor inner-city blacks, the answer is yes. My book, “Caught in the Middle,” chronicles the impact of long-term economic decline on the Midwest, both rural and urban. In chapters on poverty-stricken rural areas, it draws a distinct parallel between the lives of rural whites and urban blacks caught in generational poverty — a pathology of bad education, high drop-out rates, unstable or broken families, high drug use (coke in the cities, meth in the countryside), no access to good jobs, dependence on government support, bad housing and bad health care, down-home religions and general hopelessness — people in both places who want a good life for themselves and their children but have no idea how to get it. I stress that young people, both black and white, continue to escape from this culture to get an education and good jobs. But the culture itself is inhibiting and offers next to nothing to those who stay. Odd, isn’t it — two tribes who have so much in common yet are, for the most part, totally unaware that the other exists?

Seen this way, it is possible to talk about the pathology of generational poverty without falling into misleading racial stereotypes. As William Julius Wilson wrote, inner-city residents may be in the ghetto because they’re black, but they stay there because they’re poor. Similarly, rural whites don’t stay in rural ghettoes because they’re white. They stay because they’re poor — and have no idea how to get out.

Anon, Chicago

From the article:
“As part of a large research project in Chicago, Professor Sampson walked through different neighborhoods this summer, dropping stamped, addressed envelopes to see how many people would pick up an apparently lost letter and mail it, a sign that looking out for others is part of the community’s culture.

In some neighborhoods, like Grand Boulevard, where the notorious Robert Taylor public housing projects once stood, almost no envelopes were mailed; in others researchers received more than half of the letters back. Income levels did not necessarily explain the difference, Professor Sampson said, but rather the community’s cultural norms, the levels of moral cynicism and disorder.”

I have significant experience in this particular neighborhood, and what differentiates it from other, more mail-forwarding neighborhoods is the distinct lack of postal service. Mail carriers will not enter the towers to collect or deliver mail, and street drop boxes are rare. Few people will pick up a dropped letter in this neighborhood because there is no way to mail it.

Basic ignorance of local conditions like this makes me suspect that this bit of research is another Trojan horse for the reactionary right, manufacturing “data” to be used in blaming the poor for being poor.

Lcfredc, New York, NY

Today’s epidemic levels of underclass unemployment and crime are not rooted in the oft-noted “families without men” – but rather in the never-mentioned “men without families.”

After all, when 80 percent of the mothers in a community are self-supported or welfare-supported, 80 percent of the men are also unmarried. These men have not disappeared. In their neighborhoods they are history’s first majority of never-married men.

Black unmarried births are now at 72% (2008), up from 67% in 1990 and 24% in 1965. White unmarried births have similarly “progressed,” now at 29% (2008), up from 17% in 1990 and 3% in 1965.*

Hard labor for low wages requires intense motivation. Before the welfare entitlements of the 1960s and better jobs for women, a father’s paycheck was necessary for survival. Now millions of unmarried men need not work to feed, clothe or shelter families. They need never face their hungry child or suffer tender emotions. They need not be deterred by a prison term, nor fear the drug lifestyle – nor cling to a job.

These men impose an outlaw culture. They teach boys coming of age that irresponsibility and crime are viable ways of life, forcing them to dress like convicts, harden their hearts and prove their own brutality to earn protective “respect.”

So then, what will happen if we succeed in eliminating racism and improving schools – even if we get unmarried mothers off welfare? How will that change the day-to-day motivations of the mass of men who never marry? The root cause of the crime epidemic will continue, as each unmarried woman’s first pregnancy creates an invisible man – a man with nothing to lose.

If low-wage fathers remain unessential, we will continue routing whole communities of women and children into poverty – and great masses of unmarried, unmotivated men into rootless, antisocial, violent criminality.

Bill, Redlands

Teaching in an inner-city elementary school for the last 20 years, I have observed the following: working parents are held in contempt by children of generational welfare families, most students are waiting to turn 16 so they can stop having to attend school, having a man is more important to women than having an education, babies are the way a woman shows a man that she loves him, and going to prison is a good thing because it is an excuse to get away from the responsibilities of being a parent. Finally, too many kids do not know about fathers or marriage. They think men in families are just boyfriends or uncles.

JTM, DC

Capitalism IS a zero sum game. Our wealth depends on people serving us and doing most of the physical labor. The system requires a certain number of “losers” to perform these tasks for wages that keep them in poverty. No one understands this better than African Americans. It’s not stupidity that makes them reject laws but a clear-eyed understanding of the system, which requires huge numbers of working poor to keep the machine running. Unfortunately for our law and order, they seem to be getting tired of this game. Worst of all, they may not mail your envelope.

A, New York, NY

Perhaps I am oversimplifying, but it seems the article’s analysis (and perhaps the academic analysis) is stuck in dichotomized thinking: it’s no one’s fault (just the market’s) or it’s everyone’s fault (the poor are lazy, etc.). As a former social worker in the Bronx, I think it’s inarguable that having a baby as a single teenage mother who drops out of school is generally a recipe for poverty and reduced life chances for the entire family. If, instead of moralizing the issue – “bad, bad poor people!” – we treated many of the behavioral contributors to poverty for what they are – public health issues, like smoking (inadvisable, not good for you, but not requiring the wholesale condemnation of the individual) – and provided a campaign of advertising and public service messaging, I suspect we might see reductions in the behaviors that statistically are shown to drive poverty – just as we have driven down smoking rates: not overnight, but steadily, with both education and financial disincentives.

But the most important first step would be to reduce the noise from those on both sides of the issue who speak of this debate in moralistic, quasi-religious terms, and substitute sustained public education campaigns (among many other empirically proven initiatives) that target the most obviously self-destructive, self-defeating behaviors one finds in many poor communities. [a moral sentiment]

ACK, Boston

I am baffled by this article’s characterization of the Moynihan Report. I was taught in Afro Am studies in college in the 90s that the reason it was so controversial was because it blamed society for locking minorities away and denying them opportunities for progress, thereby creating a separate “culture.” A stated purpose of the Report was to identify positive attributes of the African American community from which “mainstream” society could learn, and that would give greater insight into the values actually present. Among these was the notion of “fictive kin,” whereby families group together to form larger “family” units to help survive under such difficult circumstances when the deck is so stacked against them. The Moynihan Report has been cited by Afro Am scholars for decades in support of the notion that you must provide hope, opportunity and a vision for a way out before you can expect people to find their way out. I am happy to see social dialogue returning to the principles that the unabashedly liberal Moynihan elucidated so long ago: Until we provide opportunities, no hope can be garnered and no progress made. How can you ask people to climb when you’ve removed all the ladders?

MT, NJ

As a liberal Black woman with an advanced degree, I have long wondered why we’re so scared to talk about a culture of poverty. Acknowledging that cultural norms can continue the cycle of poverty is not the same as criticizing an entire race or denying the impacts of other factors.

Culture plays a huge role in perpetuating poverty! People tend to make choices and pursue goals that fit in with the culture that surrounds them. No one else in my family went to college. They are mostly chronically underemployed and have children out of wedlock. They are not bad or lazy people; they simply lack context for another lifestyle. The middle class American Dream is fiction to them. I, on the other hand, had the luck to fall in with a group of highly motivated students in junior high and high school. This particular group of friends made it seem normal to work hard for good grades. Their parents helped me out with school projects when my parents could not or would not. They brought me along on college tours and showed me what was attainable. They introduced me to a different culture! Four years of college (thanks to copious financial aid) further enhanced my understanding of that culture.

I personally think it is a grave disservice to poor children and their families to not show them ways that their cultural norms put them at a disadvantage. I do not advocate paternalism but people deserve to know what actions they can take to create a better life. I shudder to think how different my life would be today if not for the cultural exposure I received from my friends’ families.

LarryG, Alabama

We don’t have poverty; it’s pseudo-poverty. Count the number of televisions, phones and cell phones, autos, etc., in the typical “under poverty level” home, and think again.
Poverty here is a political tool created by liberals to pander for votes, nothing more. Disingenuous and sleazy. Ask the real victims of poverty around the world if our lower class is poor.

Shelby Steele’s Thankless Task

By Joseph Epstein, March 20, 2015, WSJ

Shame

By Shelby Steele
Basic, 198 pages, $29.95

‘You,’ a character in Ossie Davis’s 1961 play “Purlie Victorious” says to another, “are a disgrace to the Negro profession.” The line recurs to me whenever I see Al Sharpton or Jesse Jackson making perfunctory rabble-rousing remarks in Ferguson, Mo., Madison, Wis., current-day Selma, Ala., or any other protest scene where their appearance, like Toni Morrison on a list of honorary-degree recipients, has become de rigueur. I wonder if Shelby Steele has also been called “a disgrace to the Negro profession,” and this for diametrically opposite reasons. Had he been, it could only have been by people who, despite their endless cries for social justice, in one way or another have a deep emotional if not financial investment in keeping black Americans in the sad conditions in which so many of them continue to find themselves.

Shelby Steele is one of the very few writers able to tell home truths about the plight of black Americans. Telling truth to power used to be a sign of intellectual courage, but today, when the Internet has made this no great feat, what takes courage is telling truth to listeners who have grown accustomed to thinking themselves victims, have accepted the ultimately inadequate benefits of victimhood and, touchier than a fresh burn, take offense at the least criticism. Mr. Steele has taken on this thankless job with, as I suspect he would agree, less than happy results. Still, he shows no sign of letting up. In “Shame,” an essay on the political polarization of our country and on the want of progress among black Americans, he has produced his most complex and challenging work.

His authority for writing derives in part from his intellectual cogency, in part from his birth. His white mother married his black father in 1944, a time when a more radical act than miscegenation is not easily imagined. A mixed marriage in those days meant that a couple lived in black neighborhoods. Shelby Steele, born in 1946, grew up in Harvey, Ill., a predominantly working-class town just south of Chicago. He has described his biracial birth as “an absolute gift, the greatest source of insight and understanding. . . . [because] race was demystified for me. I could never see white people as just some unified group who hated blacks.” Although he doesn’t say so, being biracial has also allowed him insight into the hypocrisy of both blacks and whites on the subject of race.

The author has a fierce racial pride, and his writing about blacks in America is without condescension and imbued with deep sympathy. He is a brother, make no mistake, but a brother quite unlike any other. What distinguishes him is his openly stated belief that blacks in America have been sold out by the very liberals who ardently claim to wish them most good. He regrets that affirmative action, multiculturalism and most welfare programs purportedly put in place to show racial preference, far from liberating black Americans, have failed to advance their fortunes. Judging from high crime, divorce and unemployment rates, as well as relatively low rates of high-school and college completion, a case can be made that liberal policies have harmed them. To cite a single statistic: In 1965, the year after passage of the Civil Rights Act, 23.6% of black births in America were to single women; today that number is 72%.

“Shame” does not portray the United States as the promised land in which all promises have been made good. Even our liberal royal family, the Kennedys, were, when in power, wobbly on civil rights. Mr. Steele’s father was a truck driver who, owing to racism, was kept out of the Teamsters union, and hence out of making a good living, until late in his working life. Mr. Steele recounts a heartbreaking story of his own high-school days in the early 1960s when he learned that the school swimming team, of which he was a key member, was invited to the coach’s mother’s summer house and that he was excluded because the woman disliked blacks. Pockets of racism of course still exist in the country, and doubtless always will. But legal freedom has long been established, owing in part to the physical courage of civil-rights activists in the South, and opportunities for blacks to rise are now in place. “Shame” takes up the question of why for the most part they haven’t.

Mr. Steele graduated from college in that annus horribilis 1968, at the height of protest tumult and before affirmative action kicked in. An Afro-wearing, James Baldwin-reading young man, he worked in an anti-poverty program in East St. Louis, Ill., and was sufficiently swept up in what in those days was called “the movement” to have spent time with members of the Black Panthers exiled in Algiers. He did not attend any of the name-brand, or what today might be called designer, colleges. He went to Coe College in Iowa, Southern Illinois for a master’s degree and the University of Utah for a doctorate in English literature. This relieves him of doubt about his having been given a free pass on his education by affirmative action.

In a few dispiriting pages, Mr. Steele takes up the dubiety that Supreme Court Justice Clarence Thomas has always felt about his own entry, via affirmative action, into Yale Law School. After law school, Justice Thomas applied for jobs with several firms, but to no avail. “His interrogators did not believe that he was as good as his own grades indicated,” Mr. Steele writes. “They assumed his presence before them was explained by racial preferences, not by talent. It was as if they were saying the pretense was over: Yale could afford tokenism, but they could not.” Every black student in the affirmative-action era must feel similar doubt. One wonders if the Obamas, who between them were admitted to Columbia, Princeton and Harvard Law School, ever do.

“Liberalism in the twenty-first century,” Mr. Steele writes, “is, for the most part, a moral manipulation that exaggerates inequity and unfairness in American life in order to justify overreaching public policies and programs.” This liberalism, which is not your Aunt Bessie’s liberalism but the liberalism that came into play at the 1972 Democratic convention that nominated George McGovern, “is invested in an overstatement of America’s present sinfulness based on the nation’s past sins.” Mr. Steele argues that liberalism’s efforts to alleviate the past injustices done to blacks in America have amounted to another botched project of that famously failed political construction firm, the Good Intentions Paving Co. “Liberalism,” Mr. Steele writes, “expresses its inborn racism in the way it overlooks the full human complexity of blacks—the fact that they are more than mere victims—in order to distill and harden the idea of their victimization into a currency of liberal power.”

Liberals, Mr. Steele holds, deal in what he calls “poetic truth.” This is a kind of truth “conceived in reaction to the great shames of America’s past—racism, sexism, territorial conquest (manifest destiny), corporate greed, militarism, and so on.” In poetic truth, the world is reduced to victims and victimizers, with liberals alone innocent of evil and thus excluded, by self-dissociation, from the role of victimizers. Under the realm of poetic truth, Mr. Steele explains, the race riots of the late 1960s could find justification and the feminist slogan “woman as nigger” could be taken seriously, while “fifty years of real moral evolution in America” can be entirely ignored.

After the 1960s, in Mr. Steele’s reading, authority was undermined and “authenticity” put in its place. Authenticity, he writes, “meant the embrace of new idealisms and new identities that explicitly untethered you from America’s notorious hypocrisies.” Through rebellion, antiwar activity, dissent, civil and uncivil disobedience, and dropping out before selling out, authenticity rendered one innocent of all the old evils associated with American power, domestic and international; authenticity also gave one the right to view “traditional America as a fundamentally hypocritical society.”

Mr. Steele does not use the word, but authenticity also conferred virtue on those who chose it. Self-virtue is the ultimate consolation to be found in the poetic truth of the new politics that came into being in the 1960s, and millions of Americans, rich white liberals prominent among them—recall Leonard Bernstein’s famous party for the Black Panthers—gloried in it. These politics changed the nature of liberalism from a reform-minded, character-forming set of political ideas into “a broad, guilt-driven, moralistic liberalism in which at least a vague anti-Americanism was decency itself.” America, in this interpretation, is essentially evil, and those who oppose it from within are thereby good. Hence the claim to moral superiority of the protest groups—blacks, women, gays et al.—of our day. For black Americans, the claim to moral superiority took the form of grievance, boisterous, unrelenting and willfully blind to any evidence of progress.

The new liberalism, eager to bring about The Good (Mr. Steele’s capital letters), went in for social engineering to accomplish its missionary work. For Mr. Steele not The Good but true good “would include an incentive to minorities to in fact become equal with all others by talent and merit . . . [and] would ask minorities to assimilate into modernity even if that felt like self-betrayal. . . . And it would discourage them from building a group identity singularly focused on protest. . . . Instead, all would be focused on their becoming competitive.” Blacks, Mr. Steele argues, ran into serious discrimination in sports and music, and yet in these competitive fields “their excellence and merit ultimately prevailed over all else.”

As things now stand in American political life, the desire for equality has trumped freedom; self-virtue, honesty; and preferential programs, the development of character. The effect of these liberal victories has been to lessen the quality of American life. Consider the contemporary university, where the goal of diversity, enforced by the whip hand of political correctness, has brought in various minority studies, women’s and gay studies, and other intellectual vulgarities in the name of redressing old injustices and mollifying grievances. The humanities and the social sciences have become hopelessly tendentious, the ideal of truth besmirched and higher education itself turned sadly comic.

Through the pages of “Shame,” Mr. Steele fills in a few of the details behind his own conversion from angry young black man to chronicler of the dead end that anger and moral indignation, supported by white guilt, have brought to American blacks. Strongly implicated in this conversion was his father, who had seen much darker days than his son ever would and who, as long ago as the late 1960s, assured him that “you shouldn’t underestimate America. . . . It’s strong enough to change.” After visiting the Black Panthers in North Africa and witnessing their self-destructive hatred for their own country, which left them placeless and bereft, Shelby Steele began to recognize that “the American mainstream would be my fate.”

The author’s conclusion is that black America sold itself out, entered “a Faustian pact,” as he puts it, by placing its destiny in the “hands of contrite white people.” Doing so, he writes, “left us pleading with government, not for freedom, which we had already won, but for ‘programs’ and ‘preferences’ that would be a ladder to full equality. The chilling result is that now, fifty years later, we remain—by most important measures—in the position of inferiors and dependents.” The liberalism that has come into prominence since the 1960s, Mr. Steele believes, “has done little more than toy with blacks.”

Mr. Steele has himself become a conservative. He is a conservative who believes less in the mysticism of the invisible hand of the market than in the force of strong character as the main element propelling social change. He is certain that there will never be a government program that builds such character. Speaking out about the false bargain that blacks have made with the new liberalism will doubtless earn him, if it hasn’t already done so, the old opprobrious title of Uncle Tom. The irony here is that Shelby Steele might just be a Tom of a different kind—a black Tom Paine, whose 21st-century common sense could go a long way toward bringing his people out of their by now historical doldrums.

Slavery’s Long Shadow

The Opinion Pages | Op-Ed Columnist

JUNE 22, 2015, NYTimes website

Paul Krugman

America is a much less racist nation than it used to be, and I’m not just talking about the still remarkable fact that an African-American occupies the White House. The raw institutional racism that prevailed before the civil rights movement ended Jim Crow is gone, although subtler discrimination persists. Individual attitudes have changed, too, dramatically in some cases. For example, as recently as the 1980s half of Americans opposed interracial marriage, a position now held by only a tiny minority.

Yet racial hatred is still a potent force in our society, as we’ve just been reminded to our horror. … The racial divide is still a defining feature of our political economy, the reason America is unique among advanced nations in its harsh treatment of the less fortunate and its willingness to tolerate unnecessary suffering among its citizens.

My own understanding of the role of race in U.S. exceptionalism was largely shaped by two academic papers.

The first, by the political scientist Larry Bartels, analyzed the move of the white working class away from Democrats, a move made famous in Thomas Frank’s “What’s the Matter With Kansas?” Mr. Frank argued that working-class whites were being induced to vote against their own interests by the right’s exploitation of cultural issues. But Mr. Bartels showed that the working-class turn against Democrats wasn’t a national phenomenon — it was entirely restricted to the South, where whites turned overwhelmingly Republican after the passage of the Civil Rights Act and Richard Nixon’s adoption of the so-called Southern strategy.

And this party-switching, in turn, was what drove the rightward swing of American politics after 1980. Race made Reaganism possible. And to this day Southern whites overwhelmingly vote Republican, to the tune of 85 or even 90 percent in the deep South.

The second paper, by the economists Alberto Alesina, Edward Glaeser, and Bruce Sacerdote, was titled “Why Doesn’t the United States Have a European-style Welfare State?” Its authors — who are not, by the way, especially liberal — … eventually concluded that race is central, because in America programs that help the needy are all too often seen as programs that help Those People: “Within the United States, race is the single most important predictor of support for welfare. America’s troubled race relations are clearly a major reason for the absence of an American welfare state.”

Now, that paper was published in 2001, and you might wonder if things have changed since then. Unfortunately, they haven’t, as you can see by looking at how states are implementing — or refusing to implement — Obamacare. In 2012 the Supreme Court gave individual states the option of blocking the Affordable Care Act’s expansion of Medicaid, [which] provide[s] health insurance to lower-income Americans. … [Expansion would bring] a federally funded program [with] major benefits to millions of their citizens, pour billions into their economies, and help support their health-care providers. Who would turn down such an offer?

The answer is, 22 states at this point, although some may eventually change their minds. And what do these states have in common? Mainly, a history of slaveholding: Only one former member of the Confederacy has expanded Medicaid, and while a few Northern states are also part of the movement, more than 80 percent of the population in Medicaid-refusing America lives in states that practiced slavery before the Civil War.

And it’s not just health reform: a history of slavery is a strong predictor of everything from gun control (or rather its absence), to low minimum wages and hostility to unions, to tax policy.

So will it always be thus? … I’d like to think not. … Our country is growing more ethnically diverse, and the old black-white polarity is slowly becoming outdated. … We really have become much less racist, and in general a much more tolerant society on many fronts. Over time, we should expect to see the influence of dog-whistle politics decline.

But that hasn’t happened yet. Every once in a while you hear a chorus of voices declaring that race is no longer a problem in America. That’s wishful thinking; we are still haunted by our nation’s original sin.

A version of this op-ed appears in print on June 22, 2015, on page A19 of the New York edition

—————

Rico Greenville, SC

I am a baby boomer (born in ’54) in the South. … I am old enough to remember the drinking fountain and the separate restroom (singular) in a given public building. Those institutional things are gone but among my white peers a lot of resentment remains. I do not think America will truly be in a post racial state until my children’s generation is in the ground beside me. They are more tolerant and accepting than their parents but have been marked by us and so I think they too will need to be out of the picture.

This should not be seen as a message of despair, though, but of hope. Today, because none of us knew slavery, we find the idea unacceptable; because my children and grandchildren never knew Jim Crow, they find that unacceptable. Time and the generations march on. The very young today have a black President. No, he is not the cure for racism; he is the next step for a new generation, the ones who will look back and scratch their heads in wonder at us, just as we do at our ‘greats’ from the 19th century.

Karen Garcia, New Paltz, NY

We may have more anti-racist laws on the books, and surveys might show that white attitudes have changed, but Jim Crow is alive and well in the land of the free (defined in GOP-speak as freedom to slash the social safety net to shreds and along with it, millions of “disposable people.”)

Black people have taken the brunt of the economic pain since the great 2008 meltdown. They are at least three times as likely to be poor, they earn at least 40% less than whites and their average net worth is about an eighth that of whites. This is true in all the states. In Blue New York, for example, Blacks are twice as likely to be unemployed as whites, and Black infant mortality rates are more than double those of whites.

There are currently more Blacks imprisoned in America than there were enslaved in the decade before the Civil War. A study by the Malcolm X Grassroots Movement reveals that one Black person is killed by a security officer or a vigilante every single day in this country. A less racist nation?

If anything, “we” are a more racist nation. I hope that the “tear down this Confederate flag” community spirit catches fire. I hope that revelations that the same white supremacist hate group which inspired Dylann Roof also funds certain GOP candidates result in more than the usual “national conversation.”

Lectures by well-meaning experts to be patient, that things will improve “over time” are wearing pretty thin. The time for change is now. It’s getting desperate out there.

Paul Cohen, Hartford, CT

Karen, … Your comment on “being patient, that things will improve over time” harkens back to Dr. Martin Luther King’s immortal speech at the reflecting pool:

“We have also come to this hallowed spot to remind America of the fierce urgency of Now. This is no time to engage in the luxury of cooling off or to take the tranquilizing drug of gradualism. Now is the time to make real the promises of democracy. Now is the time to rise from the dark and desolate valley of segregation to the sunlit path of racial justice. Now is the time to lift our nation from the quicksands of racial injustice to the solid rock of brotherhood. Now is the time to make justice a reality for all of God’s children.”

Rima Regas, Mission Viejo, CA

“America is a much less racist nation than it used to be, … The raw institutional racism that prevailed before the civil rights movement ended Jim Crow is gone…”

America is about as racist as it ever was, just in a different way. I am sorry to disagree with you, Professor Krugman…

Education is one huge factor at play in subconscious bias, which is now as prevalent among millennials as it is in the older generation. As education has declined and the humanities have been de-emphasized, understanding of the historical and sociological roots of racism has declined. Add to that the inequality and increasing disenfranchisement of young whites in a job market that isn’t promising, and we have a recipe for disaster.

Racism as it is practiced in red states, in the South and Midwest, especially, is driven by those who control the GOP apparatus. The oligarchs forged alliances with extreme organizations in order to take control.

Institutional racism is alive and well, whether it is in the for-profit prison system, out-of-control state prison systems, housing, the courts… You name the institution and its racist apparatus is there. In California, there is an initiative to stop the same kind of fining system that was discovered in Ferguson.

Race isn’t the problem. Racism is. It is this nation’s most burning issue….

Tom Cuddy, Texas

One thing that is not well understood is that racism is a system which does not require racists. It only requires that people go along with a racist system. That is why we always hear ‘he doesn’t have a racist bone in his body’. Someone can be quite racist, in the sense of supporting a racist system, without being bigoted or prejudiced. This also reflects why Blacks and Whites see our racial situation so differently.

Walter Rhett, Charleston, SC

Au contraire: Reaganism made race a formidable political force! Here’s how: earlier, race/slavery/the bondage at the source of our political economy was debated in theological terms. Every sermon, abolition speech, pamphlet, novel, court decision, Declaration of Secession referenced God’s will and divine order and plan–with both sides speaking fervently in God’s name!

Texas declared: “that the African race had no agency in their establishment [is] the revealed will of the Almighty Creator, as recognized by all Christian nations.” The Dred Scott decision declared white supremacy to be the supreme law of the land, as the Chief Justice wrote, “it was God’s will.” Those who disobeyed the fugitive slave law were “disobeying God’s will.”

The pushback was also definitive: Henry Highland Garnet preached: “But others–their fellow men, equal before the Almighty and made by Him of the same blood, and glowing with immortality–they doom to lifelong servitude and chains. Yes, they stand in the most sacred places on earth, and beneath the gaze of the piercing eye of Jehovah, the universal Father of all men, and declare that ‘the best possible condition of the Negro is slavery.’”

Until Dr. King, race was a moral/theological issue. Reagan secularized it by shifting the debate to personal virtues and social justice within a conservative social order. The welfare queen, the dole cheat reframed race and the poor! He opened the door for the arguments of the balance sheets and freed hate.

Dave, NYC

I would suggest one important correction. American treatment of those living here before the European invasion, that is, the many tribes once thriving, is our original sin. Not that our kidnapping and enslavement of Africans wasn’t a horror beyond imagination; it just wasn’t the first one. Both, skipped over lightly in our schools and largely in the media, are shames remaining unanswered in full. Healing means full disclosure from our leaders and reparations when necessary. Our failure to do so thus far goes a long way toward explaining why the divisions remain.

July 8, 2015 – Empathy and Morality

Opinionator

The Stone

By Paul Bloom

June 6, 2015 NYTimes website

What could be more exhilarating than experiencing the world through the perspective of another person? In “Remembrance of Things Past,” Marcel Proust’s narrator says that the only true voyage of discovery is not to visit other lands but “to possess other eyes, to behold the universe through the eyes of another, of a hundred others, to behold the hundred universes that each of them beholds.” This is one of the central projects of the humanities; it’s certainly part of the pleasure we get from art and literature.

Many believe that this psychological connection is also essential for political change. They may argue, for instance, that in order for white Americans to adequately respond to the events in Baltimore, Ferguson, Mo., and elsewhere, they need to put themselves in the shoes of those in minority communities. After the death of Eric Garner at the hands of New York City police officers, Hillary Rodham Clinton called for changing police tactics, and then added: “The most important thing each of us can do is to try even harder to see the world through our neighbors’ eyes, to imagine what it is like to walk in their shoes, to share their pain and their hopes and their dreams.”

This is a moral claim, but it raises a psychological question. Can we do what Mrs. Clinton asks of us? Just how successful are we at seeing the world as others see it?

Apparently, we are nowhere near as good as we think we are. In his book “Mindwise,” the psychologist Nicholas Epley discusses experiments in which people were asked to judge the thoughts of strangers. These included asking speed daters to identify others who wanted to date them, asking job candidates how impressed their interviewers were with them and asking a range of people whether or not someone was lying to them.

People are often highly confident in their ability to see things as others do, but their attempts are typically barely better than chance. Other studies find that people who are instructed to take the perspectives of others tend to do worse, not better, at judging their thoughts and emotions.

So we are often bad at the project Clinton recommends. But a fan of perspective-taking would say that we just have to get better at it; we should try harder.

There are certain limits, however, to how far we can go. The philosopher Laurie Paul, in her book “Transformative Experience,” argues that it’s impossible to actually imagine what it would be like to have certain deeply significant experiences, such as becoming a parent, changing your religion or fighting a war. The same lack of access applies to our understanding of others. If I can’t know what it would be like for me to fight in a war, how can I expect to understand what it was like for someone else to have fought in a war? If I can’t understand what it would be like to become poor, how can I know what it’s like for someone else to be poor?

One approach is to go ahead and actually have the experience. Some have chronicled their attempts to take on other identities, like Norah Vincent in her 2006 book “Self-Made Man,” a memoir of a woman posing as a man, or John Howard Griffin in “Black Like Me,” which recounts his experience living disguised as a black man.

These acts of immersion are fascinating, but they have their limits. In the aftermath of torture revelations during the Iraq war, some journalists, like Christopher Hitchens, decided to get themselves waterboarded so that they would know what it was like. I don’t doubt that they learned something from the experience, but what they didn’t experience — what they couldn’t experience — was the lack of control. Surely part of the terror of waterboarding is that it is done to you when you don’t want it and you have no way to make it stop.

This point was missed by Donald H. Rumsfeld, who, when told that prisoners had to stand for many hours a day, responded that he himself had a standing desk and was also standing for many hours a day. But of course he could sit down whenever he wanted.

There is also the issue of duration. I can imagine what it’s like to deal with a crying baby for a few minutes, or spend time by myself in a small room, or have a stranger recognize me on the street. But it’s much harder to imagine — impossible, I think — what it would be like to be a single parent, suffer a year of solitary confinement or become a famous movie star.

These failures should motivate a certain humility when it comes to dealing with the lives of others. Instead of assuming that we can know what it is like to be them, we should focus more on listening to what they have to say. This isn’t perfect — people sometimes lie, or are confused, or deluded — but it’s by far the best method of figuring out the needs, desires and histories of people who are different from us. It also shows more respect than a clumsy attempt to get into their skins; I agree with the essayist Leslie Jamison, who describes empathy as “perched precariously between gift and invasion.”

Also, Mrs. Clinton might be mistaken in her claims about the moral importance of perspective-taking. Scholars ranging from Adam Smith to the contemporary literary critic Elaine Scarry have pointed out that when we try to act morally toward strangers based on empathic projection, we typically fail. This is in part because we’re not good at it, and in part because, when we allow ourselves to be guided by our feelings, our emotional investment in ourselves and those we love is overwhelming relative to our weak attachment to strangers. We become better people and better policy makers if we rely instead on more abstract principles of justice and fairness, along with a more diffuse compassion.

None of this is to say that the project of experiencing the lives of others should be abandoned. Under the right circumstances, we might have some limited success — I’d like to believe that novels and memoirs have given me some appreciation of what it’s like to be an autistic teenager, a geisha or a black boy growing up in the South. And even if they haven’t, most of us are still intensely curious about the lives of other people, and find the act of trying to simulate these lives to be an engaging and transformative endeavor. We’re not going to stop.

But we’re not good at it, particularly when the stakes are high, and empathic engagement is far too fragile a foundation to ground public policy. To make the world better, we shouldn’t try to put ourselves in the shoes of Eric Garner or anyone else. Our efforts should instead be put toward cultivating the ability to step back and apply an objective and fair morality.

Paul Bloom is a professor of psychology and cognitive science at Yale and the author of “Just Babies: The Origins of Good and Evil.”

A version of this article appears in print on 06/07/2015, on page SR8 of the National edition with the headline: Imagining the Lives of Others

————–

N.G. Krishnan, Bangalore, India

“You never really know a man until you understand things from his point of view, until you climb into his skin and walk around in it,” wrote Harper Lee in her book “To Kill a Mockingbird.”

This of course is relative to one’s capacity to identify, feel and understand one’s own feelings, so as to be able to project those feelings onto others. This is not as simple as it sounds. The hardest part is that it requires a person to understand what another is undergoing without actually having undergone it himself. It is easy to rant about “the lazy unemployed” when you have grown up in riches. But once you experience for yourself what it feels like to be unemployed for a stretch, your point of view might change drastically, and so might how you feel about those who are facing a similar situation.

Empathy usually depends on having actually experienced the emotions in question. To empathize with someone without that first-hand experience requires the exceptional feat of mentally projecting oneself into the emotional state of mind of another person, allowing identification of their feelings.

Timothy C, Queens, New York

I respectfully disagree. Consider the historical Buddha. In his early life he was a wealthy prince with three palaces, whose father sought to shield him from knowledge of misfortune. But, one day while out alone, Gautama saw an old man, a beggar, and a corpse. From that one moment of understanding and empathy, the whole Buddhist enterprise followed.

My point is that it is very difficult to generate the diffuse morality that you rightfully champion without first having a visceral moment of empathy to ignite it.

For example, I was first shocked into giving to charity by seeing a young child in southeast Asia begging on the streets. From that experience I can look at a naked statistic, say, “6,000 killed in the latest aftershock”, and donate to quake aid. I don’t know any of the victims, and may not even see images of them, but I want to help because I know that if roles were reversed, I might want help too.

Don’t downplay empathy. It is the tiny spark from which effective and widespread action can begin to take root.

Raul Campos San Francisco

Empathy is not about mind reading or mentally simulating someone else’s circumstances in an egocentric attempt at emotional voyeurism. Empathy is the ability to share in the suffering of others. At its core is a deep compassion and a willingness to open yourself up to the pain of others. It is not for the faint of heart or for self-centered individuals. We all have empathy, but we are constrained from using it by our ego, which protects us from feeling the very suffering that empathy allows us to share. Our rational mind builds walls of logic that separate “us” from “them” and allow us to detach ourselves from those in need. Eventually, we begin to believe that empathy is a delusion, that we cannot truly understand others, and that we are islands of individuality connected not by compassion for each other but by laws and contracts and mutual self-interest.

Madeline Conant Midwest

I disagree. Even if we can’t do a terribly good job of imagining ourselves in another person’s circumstance, I believe it is essential that we try. Although I do agree we should set a goal of defining morality, or justice, in objective terms, that task is impossible without compassion and human understanding.

I think this ability and willingness to imagine people in circumstances other than our own is the defining difference between liberals and conservatives.

Solomon New Haven, CT

The question that we are all charged with in this world of suffering — and more specifically, in response to the pain of oppressed communities in our very country — is not whether or not we can fully ‘imagine,’ ‘completely’ adopt, or ‘successfully’ instantiate the perspective of another. What we face is the question of critically assessing our own perspective. It is saying “I don’t understand. So tell me your story.” This is what I believe Bloom is directing us towards practically.

But this is where I think Bloom misses the point: in understanding the impossibility of totally imagining the Other’s pain, we do not retreat to objectivity. When I concede the unfathomable nature of another’s pain, I do not cease to understand. I understand that I do not understand, and that changes us. That enables and enlivens us. We go forth with the experiential knowledge that our context is not others’, that our conventions are not reality. And that is empathy.

As Bloom has outlined, there are indeed limits to our capacity to care for, understand and realize the situation of another. However, this is not a problem. As I understand it, it is the solution. We are different. And only through difference is there dialogue. And only through dialogue is there effective change. Thus the way we move toward a holistic and healthier being in the world is not through “stepping back”; it is through listening to others and constantly revisioning our horizon through encounters with difference.

Marilyn louisville

I think we actually can share the pain of others, not necessarily specifically identified pain such as poverty or disease or the loss of a loved one, but we can know our own pain and its impact on our lives, and from this knowingness can perceive the effects of pain in others. Our own pain then can have relevance beyond ourselves if we choose to use this gift of personal suffering to go deeper into the connectedness of the universe, the deepest bottom of awareness where all is one. Sometimes I become aware of a person who simply listens to a heartbroken person’s story, one who never interrupts the grieving one, who neither waits for the end of the storyteller’s words to jump in with a painful remembrance of his own, nor feels it necessary to make patronizing or judgmental remarks at the end. When I see this, I think, “That is what mysticism is all about, that shared identification of humanity in the oneness of all things, where the recipient of the story understood the gift given and felt no need to add his or her personal stamp.”

Slooch Staten Island

The subject is important, but a lot of the assertions seem trivial. We’re to judge the possibility of empathy by a study of speed daters? Really? And the other “psychological studies” (not cited) are just that: studies that result in statistical information: many people (probably undergraduates) “instructed to take the perspective of others” (how?) do worse at judging their thoughts.

That means “people so instructed did worse, to a level of statistical significance.” Were the people that did really well (those instructed, as well as those not instructed) studied for commonalities? This seems not just soft science, but thin science as well.

Of course we can’t fully experience someone else’s experience, because even if we were taking in the same sense data, we don’t have the same history, and even if we learn the history, that’s not the same as living it. That is, we can be other than who we are (change, grow, impersonate), but we can’t be somebody else. But isn’t this another trivial point? Who would argue differently?

And it may also be impossible to “actually imagine” (whatever that means–hold on a second: actually “imagine?”) significant experiences that we haven’t actually had.

But it also seems likely that education, depth of life experience, immersion in mimetic art, and so on will increase empathy.

March 25, 2015 – The Death of Privacy

The Death of Privacy

pdf copy this item  

Alex Preston, The Guardian, 8-3-14

We have come to the end of privacy; our private lives, as our grandparents would have recognised them, have been winnowed away to the realm of the shameful and secret. … Insidiously, through small concessions that only mounted up over time, we have signed away rights and privileges that other generations fought for, undermining the very cornerstones of our personalities in the process. … (We) have come to accept that the majority of our social, financial and even sexual interactions take place over the internet and that someone, somewhere, whether state, press or corporation, is watching.

The past few years have brought an avalanche of news about the extent to which our communications are being monitored: WikiLeaks, the phone-hacking scandal, the Snowden files. Uproar greeted revelations about Facebook’s “emotional contagion” experiment (where it tweaked mathematical formulae driving the news feeds of 700,000 of its members in order to prompt different emotional responses). Cesar A Hidalgo of the Massachusetts Institute of Technology described the Facebook news feed as “like a sausage… Everyone eats it, even though nobody knows how it is made”.

Sitting behind the outrage was a particularly modern form of disquiet – the knowledge that we are being manipulated, surveyed, rendered and that the intelligence behind this is artificial as well as human. Everything we do on the web, from our social media interactions to our shopping on Amazon, to our Netflix selections, is driven by complex mathematical formulae that are invisible and arcane.

Most recently, campaigners’ anger has turned upon the so-called Drip (Data Retention and Investigatory Powers) bill in the UK, which will see internet and telephone companies forced to retain and store their customers’ communications (and provide access to this data to police, government and up to 600 public bodies). Every week, it seems, brings a new furore over corporations – Apple, Google, Facebook – sidling into the private sphere. Often, it’s unclear whether the companies act brazenly because our governments play so fast and loose with their citizens’ privacy… (“If you have nothing to hide, you’ve nothing to fear,” William Hague famously intoned); or if governments see corporations feasting upon the private lives of their users and have taken this as a licence to snoop, pry, survey.

…Novels have long been the province of the great What If?, allowing us to see the ramifications from present events extending into the murky future. As long ago as 1921, Yevgeny Zamyatin imagined One State, the transparent society of his dystopian novel, We. For …many others, the loss of privacy was one of the establishing nightmares of the totalitarian future. Dave Eggers’s 2013 novel The Circle …: “Secrets are lies, sharing is caring, and privacy is theft.”  …

Our age has seen an almost complete conflation of the previously separate spheres of the private and the secret. A taint of shame has crept over from the secret into the private so that anything that is kept from the public gaze is perceived as suspect. This, I think, is why defecation is so often used as an example of the private sphere. …It is to the bathroom that Max Mosley turns when we speak about his own campaign for privacy. …

Here we have a clear example of the blurred lines between secrecy and privacy. Mosley believed that what he chose to do in his private life, even if it included whips and nipple-clamps, should remain just that – private. The News of the World, on the other hand, thought it had uncovered a shameful secret that, given Mosley’s professional position, justified publication. There is a momentary tremor in Mosley’s otherwise fluid delivery as he speaks about the sense of invasion. “Your privacy or your private life belongs to you. Some of it you may choose to make available, some of it should be made available, because it’s in the public interest to make it known. The rest should be yours alone. And if anyone takes it from you, that’s theft and it’s the same as the theft of property.”

Mosley … has fallen victim to what is known as the Streisand Effect, where his very attempt to hide information about himself has led to its proliferation (in 2003 Barbra Streisand tried to stop people taking pictures of her Malibu home, ensuring photos were posted far and wide). Despite this, he continues to battle – both in court, in the media and by directly confronting the websites that continue to display the pictures. It is as if he is using that initial stab of shame, turning it against those who sought to humiliate him. “…there isn’t a huge difference between the state watching everything you do and Google watching everything you do. Except that, in most European countries, the state tends to be an elected body, whereas Google isn’t. There’s not a lot of difference between the actions of the government of East Germany and the actions of Google.”

All this brings us to some fundamental questions about the role of search engines. Is Google the de facto librarian of the internet, given that it is estimated to handle 40% of all traffic? Is it something more than a librarian, since its algorithms carefully (and with increasing use of your personal data) select the sites it wants you to view? To what extent can Google be held responsible for the content it puts before us?

In 2009, Mario Costeja González found that a Google search of his name brought up a 36-word document concerning a case from the late 90s in which banks threatened to seize his home. The information was factually incorrect – he’d actually paid off the debts in question. More than this, it was, he argued, irrelevant. He is now a lawyer with a successful practice and any former money worries ought not to feature on the internet record of his life. Google fought the case within Spain and then all the way to the European Court of Justice. Costeja González won, the article was taken down, the victory labelled “the right to be forgotten”.

Google’s response to the ruling has been swift and sweeping, with 70,000 requests to remove information processed in the weeks following the judgment. A message now appears at the bottom of every search carried out on Google, warning: “Some results may have been removed under data protection law in Europe.” It seems that Google has not been judging the quality of these requests, relying instead on others to highlight content that has been taken down erroneously. Search results for newspaper articles on Dougie McDonald, a Scottish football referee accused of lying, were taken down. And a Robert Peston piece for the BBC about Merrill Lynch CEO Stan O’Neal’s role in the 2008 financial crisis was removed; as of August 2014, it is still missing from Google searches.

To understand how much protection the law offers those wishing to defend their privacy against the triumvirate of state, press and data-harvesting corporations, I turn to one of the country’s top privacy lawyers, Ruth Collard at Carter Ruck. … I ask her about the Costeja González case.

“I think it’s a very surprising decision, I really do,” she says, “but it was Google’s collection of data, the arrangement and prioritisation of it that influenced the judgment, as the court found they were a ‘controller’ of the information.”

I ask about the freedom of speech implications of the judgment – surely every politician will be applying to Google, trying to scrub away the traces of the sordid affair, the expenses scandal? She nods emphatically. “Almost immediately after the judgment, there were reports of a politician trying to clean up his past. We have been contacted by clients who have read about the judgment and are interested. So far, the ones that have contacted me, I have not thought they had a case to make. It was clear from the judgment that a balance has to be struck – between the interest of the subject in keeping information private and the interest of internet users in having access to it. The effect will be removing information that may have been once completely justified in being there, but is now outdated or irrelevant.”

We go on to discuss the changing face of privacy in the internet age. She firstly notes how relatively young privacy law is in this country – only since 1998 have there been laws in place to protect privacy. Cases brought before this, such as that of ‘Allo ‘Allo star Gordon Kaye, who in 1990 was photographed in his hospital bed badly battered following a car crash, had to prove that a breach of confidence had taken place. Collard points me to the recent Paul Weller case, in which the Modfather sued after being pictured in the Daily Mail out walking with his 16-year-old daughter and 10-month-old twins.

“A lot of these cases are Mail cases,” she says with a twinkle. “It was the fact that the photos showed the children’s faces that was found to be significant by the court. Weller brought the claim on behalf of all three children and it succeeded [although the Mail is appealing].” In his judgment, Justice Dingemans argued that the children’s faces represented “one of the chief attributes of their respective personalities… These were photographs showing the expressions on faces of children, on a family afternoon out with their father. Publishing photographs of the children’s faces, and the range of emotions that were displayed, and identifying them by surname, was an important engagement of their Article 8 rights.”

Collard tells me: “The Mail argued very hard on a number of grounds, one of which was that the children didn’t have a reasonable expectation of privacy, given that the teenager a couple of years earlier had done some modelling for Teen Vogue and the babies had had photos of them tweeted by their mother. However, she had been careful when tweeting not to show their faces. This thing about facial expressions is new and it will be interesting to see where it goes.”

… “The distinction between secret and private has been the guiding philosophical principle. We’re not looking to get your secrets. We’re asking you to test this thing which is less tangible and less transactable, which is your privacy.”

A few days after our meeting, Rourke puts me in touch with Michal Kosinski, the Cambridge academic who, with David Stillwell, has designed youarewhatyoulike.com, the psychometric algorithm that produces from your Facebook “likes” a map of your soul. I think of Ruskin, who in 1864 said: “Tell me what you like and I’ll tell you what you are.” I also think of Orwell’s thought police and Philip K Dick’s The Minority Report, where criminals are identified and arrested before they commit crimes. This is one of the problems with advances in technology: we are preconditioned to view them through a dystopian lens, with Orwell, Ballard, Burgess and others staring over our shoulders, marvelling, but fearful.

While I wait for my results, I ask Kosinski whether the potential for misuse worries him. “Most technologies have their bright and dark side,” he replies, buoyantly. “My personal opinion is that a machine’s ability to better understand us would lead to improved consumer experience, products, etc… But imagine that we published a clone of youarewhatyoulike.com that simply predicted which of your friends was gay (or Christian or liberal or HIV-positive, etc); lynches are not unlikely to follow…”

I’m left baffled by my results. According to the algorithm, I’m 26 (I’m actually 35). The program is unsure whether I’m male or female (I’m male). I’m borderline gay or, as Kosinski puts it in his analysis: “You are not yet gay, but very close.” I’m most likely to be single, extremely unlikely to be married (I’m not sure what my wife will say to all this). The algorithm correctly predicts my professional life (art, journalism, psychology) and my politics (liberal) but claims that I exhibit low neuroticism. It should sit next to me on a turbulent flight. I realise that I’m viewing the results of a kind of double of myself, the public persona I present through social media (and over which I presume some sort of control), nothing like the real me. For that, I need a psychiatrist.

… academic and psychoanalyst Josh Cohen. Cohen’s book, The Private Life (2013), is an intelligent and highly literary exploration of the changing nature of privacy in the age of Facebook and Celebrity Big Brother. Skipping from Katie Price to Freud to Booker-winning author Lydia Davis, Cohen paints a convincing picture of a culture fighting a desperate psychological battle over the private self. He argues that both our ravenous hunger for celebrity gossip and the relentless attempts by the wealthy to protect their privacy have recast the private life as “a source of shame and disgust”. The tabloid exposé and the superinjunction both “tacitly accede to the reduction of private life to the dirty secrets hidden behind the door”.

And yet what neither the press nor the lawyers recognise when they treat privacy as they would secrecy – as something that can be revealed, possessed, passed on – is that the truly private has a habit of staying that way. Cohen argues that the private self is by definition unknowable, what George Eliot calls “the unmapped country within us”. In an email conversation, Cohen gives me a condensation of this thesis: “When we seek to intrude on the other’s privacy, whether with a telephoto lens, a hacking device or our own two eyes, we’re gripped by the fantasy of seeing their most concealed, invisible self. But the frustration and disappointment is that we only ever get a photograph of the other, an image of their visible self – a mere shadow of the true substance we really wanted to see. The most private self is like the photographic negative that’s erased when exposed to the light.”

There is something strangely uplifting in this idea – that no matter how deep they delve, the organs of surveillance will never know my true self, for it is hidden even from me.

I ask Cohen about the differences between our “real” selves and those we project online. I think of the younger, gayer, less neurotic incarnation of myself that appears on Facebook. “I agree that the online persona has become a kind of double,” he says. “But where in Dostoevsky or Poe the protagonist experiences his double as a terrifying embodiment of his own otherness (and especially his own voraciousness and destructiveness), we barely notice the difference between ourselves and our online double. I think most users of social media and YouTube would simply see themselves as creating a partial, perhaps preferred version of themselves.”

…This is the horror of social media – that it gives us the impression we are in control of our virtual identities, putting out messages that chime with our “real” selves (or some idealised version of them). In fact, there is always slippage and leakage, the subconscious asserting its obscure power. The internet can, as Cohen tells me, “provide a way of exploring and playing the multiplicity and complexity of the self”. It can also prove to us just how little control we have over how we appear. As William Boyd put it in Brazzaville Beach: “The last thing we discover in life is our effect.”

There is, of course, a flipside to the dystopian view of profit-hungry corporations and totalitarian governments relentlessly reaping our private selves. Josh Cohen describes the lifelogging movement as bearing “an overriding tone of utopian enthusiasm”. Lifelogging involves the minute-by-minute transmission of data about one’s life, whether by photographs, web-journals or the sort of Quantified Self technologies – wearable watches, data-gathering smartphone apps – developed by German firm Datarella (and many others, Google Glass not least among them). …

… a wormhole, a place in which it is possible to lose yourself in the beautiful but useless ephemera of a single existence. …

Josh Cohen told me about the psychic risks of lifelogging. For some, he said, “shadowing your transient, irretrievable life is a permanent digital life, and the really frightening spectre here is that the digital recording becomes more ‘real’, more authoritative than your memory.” …

Your privacy has a value. There are even companies, such as RapLeaf.com, that will tell you what your personal information is worth. The basic facts? Very little. More detailed information – for example, that you own a smartphone, are trying to lose weight or are planning a baby – is worth much more. Big life changes – marriage, moving home, divorce – bring with them fundamental changes in our buying patterns as we seek, through the brands with which we associate ourselves, to recast the narratives of our lives. Through analysis of buying patterns, US retailer Target predicted one of its customers was pregnant (and sent her coupons for maternity wear) before the teenager had broken the news to her disapproving parents.

Perhaps the reason people don’t seem to mind that so much of their information is leaking from the private to the public sphere is not, as some would have it, that we are blind and docile, unable to see the complex web of commercial interests that surround us. Maybe it’s that we understand very clearly the transaction. The internet is free and we wish to keep it that way, so corporations have worked out how to make money out of something we are willing to give them in return – our privacy. We have traded our privacy for the wealth of information the web delivers to us, the convenience of online shopping, the global village of social media.

Let me take you back to August 2006, the Chesterfield Hotel, Mayfair. My little brother (Preston from the Ordinary Boys) was marrying a girl he’d met on the telly (Chantelle Houghton). Celebrity Big Brother was still in its Channel 4 pomp, attracting 6 million viewers and upwards, the focus of water-cooler debate and gossip mag intrigue. The wedding was, briefly, an event. I remember a moment between the ceremony and the reception when we were queuing up in our gladrags to have our pictures taken for the OK! magazine spread. I felt a sudden, instinctive lurch – the thought of my phiz besmirching every hairdresser’s salon and dentist’s waiting room. I wasn’t on Facebook, Twitter hadn’t been invented, Friends Reunited? No thanks. I ran – to the bemusement of my family and the photographer.

Now, though, I post pictures …  I write articles on subjects I’d previously kept secret from my nearest and dearest. I let a Sunday newspaper take a (relatively tasteful) picture of me and my children when I was promoting my last novel. We have all – to a greater or lesser extent – made this same transaction and made it willingly (although my children didn’t have much say in the matter).

We weren’t private creatures in centuries past, either. In a 1968 talk on privacy in the electronic age, sociologist Marshall McLuhan argued that it was the coming of a new technology – books – and the “closed-off architecture” needed to read and study that had forged the sense of the private self. It may be that another new technology – the internet – is radically altering our sense of what (if anything) should remain private. We live in a liberal democracy, but, with recent lurches to the right, here and abroad, you don’t need to be Philip K Dick to imagine the information you gave up so glibly being used against you by a Farage-led dictatorship.

More immediately, there is the normalising effect of surveillance. There is a barrier or check on our behaviour when we know we are being watched: deviancy needs privacy. This was the thinking behind Jeremy Bentham’s Panopticon, a model for a jail where a single watching guard could survey a whole prison of inmates (the model, by the way, for Zamyatin’s One State). Soon, it didn’t matter whether the guard was on duty or not, the mere possibility of surveillance was enough to ensure compliance. This is where we find ourselves now, under surveillance that may seem benign enough but which nonetheless asserts a dark, controlling power over us, the watched.

The message seems to be that if you really want to keep something private, treat it as a secret, and in the age of algorithmic analysis and big data, perhaps best to follow Winston Smith’s bitter lesson from Nineteen Eighty-Four: “If you want to keep a secret, you must also hide it from yourself.”

Here lies our greatest risk, one insufficiently appreciated by those who so blithely accept the tentacles of corporation, press and state insinuating their way into the private sphere. As Don DeLillo says in Point Omega: “You need to know things the others don’t know. It’s what no one knows about you that allows you to know yourself.” By denying ourselves access to our own inner worlds, we are stopping up the well of our imagination, that which raises us above the drudge and grind of mere survival, that which makes us human.

I asked Josh Cohen why we needed private lives. His answer was a rallying cry and a warning. “Privacy,” he said, “precisely because it ensures we’re never fully known to others or to ourselves, provides a shelter for imaginative freedom, curiosity and self-reflection. So to defend the private self is to defend the very possibility of creative and meaningful life.”

Alex Preston’s most recent novel is In Love and War, published by Faber

 

April 9, 2015 – Israel-Palestine

The Two Israels  

pdf copy this item

NYTimes Website 2-28-15 Sunday Review | Op-Ed Columnist

NEGEV DESERT, Israel — FOR generations, Americans and others have been donating trees to Israel through the Jewish National Fund.

“Planting a tree in Israel is the perfect way to show you care,” the fund says on its website. An $18 donation buys a tree, turns the desert green, protects the environment and supports an embattled Jewish state. The fund says it has planted more than 250 million trees in Israel so far.

Yet here in the Negev Desert in southern Israel, it looks more complicated. The Bedouin Arabs, the indigenous inhabitants, say that they are being pushed out of their lands by these trees donated by well-meaning contributors.

“Each of those trees is a soldier causing the destruction of our communities, our lives,” Sheikh Sayakh al-Turi, a Bedouin leader, told me. “All those trees are planted on lands of Bedouin who are still living here.”

His son, Aziz, says that the Jewish National Fund destroyed hundreds of his own fruit and olive trees and then replanted the area with new trees to push out the Bedouin. “They want to delete our history and plant Jewish history,” Aziz said.

Rabbi Arik Ascherman, the president of Rabbis for Human Rights, a group that is helping the Bedouin, backs up these claims.

“J.N.F. does many good things, but this is the dark side,” he said. “Almost anywhere you go in this country where there is a J.N.F. forest, you will find, at its heart, the ruins of an Arab village.”

“I, as a Zionist, believe I have a place here,” Rabbi Ascherman added, “but I don’t want to be here by displacing Aziz.”

The J.N.F. sees it differently, saying that the fundamental problem isn’t trees but Bedouin poverty. Russell Robinson, the chief executive of the fund, says that the J.N.F. follows Israeli law and forestry plans and that it has some programs that directly combat Bedouin poverty.

That’s true, and those anti-poverty efforts seem admirable. Still, I don’t think Americans who have donated trees would feel too good after meeting some of the displaced Bedouin, and the tree-planting raises larger questions: Particularly at a time of hard-line Israeli leadership, how can foreigners support Israel without inadvertently oppressing Arabs?

The Negev Desert is part of Israel itself, not the West Bank, and these Bedouin are Israeli citizens. Yet Israel is pushing the Bedouin off their lands and destroying their homes in ways that would never happen if they were Jewish.

Aziz al-Turi says the Jewish National Fund has planted trees to help push his fellow Bedouins off their lands in the Negev Desert in Israel. Credit Nicholas D. Kristof/The New York Times

“We have, in many ways, an incredibly strong democracy,” notes Rabbi Ascherman, singling out the vibrancy and range of political debate. Yet it doesn’t always work well for the Arab minority.

It’s also a democracy with contradictions. West Bank Jews vote, but not West Bank Palestinians. A Jewish kid in Chicago has a birthright to Israel, but not a Palestinian child next door whose roots are in Haifa.

The roughly 200,000 Bedouin Arabs reflect the ways in which Israeli democracy falls short. The government doesn’t recognize their land claims; it has bulldozed Bedouin villages and herded their residents into bleak modern townships that are basically the Israeli equivalent of American Indian reservations.

The Turis’ village, Al Araqib, was bulldozed several years ago. When I spoke to the Bedouin, they were huddled in temporary shacks. And the day after I spoke to them, the authorities knocked those shacks down as well.

On my visit here to the Negev, I faced two Israels. One is the thriving democracy that many of us admire, the one that gives disgruntled Arab citizens free speech and ballots, that treats the wounded Syrians brought across the border, that nurtures a civil society that stands up for the Bedouin. This is the Israel that anyone can support without risking harm to Arabs. Any of us would plant a tree in this Israel. (Indeed, Rabbis for Human Rights has its own tree-planting program.)

Yet the other Israel has been gaining ground. It’s more nationalistic, more militaristic, more determined to push Palestinians off land in the West Bank, more eager to dispatch the United States to bomb Iranian nuclear sites. This is the Israel that Prime Minister Benjamin Netanyahu will represent in his address to Congress scheduled for this week.

This is also the Israel that antagonizes many Europeans and Americans. Hard-line policies under Netanyahu are turning support for Israel’s government from a bipartisan issue to a Republican one. A poll of Americans published in December found that 51 percent of Republicans wanted the United States to lean toward Israel, but only 17 percent of Democrats agreed (most didn’t want to lean either way). Increasingly, the constituency in America that most reliably backs the Israeli government may be not Jews but Evangelical Christians.

With the Netanyahu speech coming up, American politicians will be strutting and jostling to prove their “pro-Israel” credentials. So this is a moment to remember that the better question is which Israel to support.

————-

SundayReview | Op-Ed Columnist

Winds of War in Gaza, NYTimes Week in Review 3-7-15

Nicholas Kristof

GAZA — IT is winter in Gaza, in every wretched sense of the word. Six months after the latest war, the world has moved on, but tens of thousands remain homeless — sometimes crammed into the rubble of bombed-out buildings. Children are dying of the cold, according to the United Nations.

Rabah, an 8-year-old boy who dreams of being a doctor, walked barefoot in near-freezing temperatures with his friends through the rubble of one neighborhood. The United Nations handed out shoes, but he saves them for school. For the first time in his life, he said, he and several friends have no shoes for daily life. Nearly everyone I spoke to said conditions in Gaza are more miserable than they have ever been — exacerbated by pessimism that yet another war may be looming.

Lacking other toys, boys like Rabah sometimes play with the remains of Israeli rockets that destroyed their homes.

Gaza has been compared to an open-air prison, and, in the years I’ve been coming here, that has never felt more true, partly because so many Gazans are now literally left in the open air. But people joke wryly that at least prisons have reliable electricity.

Rubble and bombed-out buildings dot Gaza six months after the latest war.

The suffering here has multiple causes. Israel sustains a siege that amounts to economic warfare on an entire population. Hamas provokes Israel, squanders resources and is brutal and oppressive in its own right. Egypt has closed smuggling tunnels that used to relieve the stranglehold, and it mostly keeps its border with Gaza closed. The 1.8 million Gazans are on their own, and one step forward should be international pressure on Israel and Egypt to ease the blockade.

Yet I have to acknowledge that Israel’s strategy of collective punishment may be succeeding with a sector of the population. Gazans aren’t monolithic in their views any more than Americans, but many said that they were sick of war and of Hamas and don’t want rockets fired at Israel for fear of terrible retribution.

“I don’t want resistance,” said Khadra Abed, a 50-year-old woman living with her family in the remains of her home. “We’ve had enough suffering.”

Halima Jundiya, a 65-year-old matriarch who says the children in her family are still traumatized by war, was blunt: “We don’t want Hamas to fire rockets. We don’t want another war.”

One bearded young man said he worked for Hamas but had turned against it, because government salaries were no longer being paid. “I hate Hamas,” he said, which seemed an odd thing for a Hamas officer to say.

Yet Israel should understand clearly that its bombings also put some on the path to becoming fighters. A 14-year-old boy, Ahmed Jundiya, is part of the same clan as Halima, but he draws the opposite conclusion: He aspires to grow up and massacre Israelis.

“War made us feel we will die anyway, so why not die with dignity,” Ahmed told me. “I want to be a fighter.”

Ahmed keeps a poster of a family friend who was killed while firing rockets at Israel, and he says he yearns to do the same. I asked him how he could possibly favor more warfare after all the bloodshed Gaza had endured, and he shrugged.

“Maybe we can kill all of them, and then it will get better,” he said. I asked him if he really wanted to wipe out all of Israel, and he nodded. “I will give my soul to kill all Israelis,” he said.

Some of that is teen bravado, and some may reflect the unfortunate reality that, if you’re a teenage boy, one of the few career paths available is as a fighter. Ahmed’s father is an unemployed construction worker, the boy explains with a hint of distaste.

Over all, my sense is that the suffering has left some Gazans more disenchanted with fighting, and others yearning for violent revenge. It’s difficult to be sure how those forces balance out.

Israel and Egypt both have legitimate security concerns in Gaza (an Egyptian court recently declared Hamas a terrorist organization), but the Israeli human rights organization Gisha notes that it’s ridiculous for Israel to insist that the ongoing economic stranglehold is essential for security. Gisha introduced me to Aya Abit, 24, who is married to a Palestinian man in the West Bank. They have a 5-month-old baby whom the father has never seen because Israel won’t allow Abit to leave Gaza.

“I cry every day,” she says. “I don’t know what to do.”

Likewise, Israel prevents some Gazan students accepted at American or other foreign universities from leaving to study. That’s counterproductive: More Western-educated Gazans might be a moderating presence, but the point seems to be to make all Gazans suffer.

Senior Israeli officials understand that the economic blockade has undermined the independent business community that could counter Hamas. So Israeli officials have been saying the right things recently about easing the blockade, but not much has changed.

On a visit to Gaza in 2010, I toured a cookie factory run by Mohammed Telbani, a prominent businessman. I returned on this visit and found that Israel had bombed Telbani’s factory repeatedly during the war.

Telbani has restored part of the factory, but Israel won’t allow three European technicians into Gaza to set up Danish machinery that is sitting idle. And a packaging machine has been out of operation for months because he needs a spare part that Israel won’t allow in. Israel pretty much seals off Gaza — journalists are a rare exception — and isolation and despair mark Gaza today.

This blockade isn’t as dramatic as the bombings, but, in the long run, it’s soul-destroying. Businesses can’t sell their goods; students can’t go to West Bank universities; a wife can’t join her husband. True, Hamas’s misrule is central to the problem, but we don’t have influence over Hamas; we do have influence over Israel. The U.S. and other global powers should call more forcefully on both Israel and Egypt to ease this siege of Gaza.

Telbani is a pragmatic businessman, a fluent Hebrew speaker whose aim is to sell cookies, hire workers and make money. He sounded far more bitter toward Israelis on this visit than before, and I told him so.

“They burned $22 million for no reason,” he replied indignantly. “What I created in 45 years, they destroyed in less than two hours. What should I tell them? ‘Thank you’?

“This is the worst time ever,” he added. “People have nothing to lose. So I expect another war.”

 

March 11, 2015 – Raising a Moral Child

Raising a Moral Child
By Adam Grant  APRIL 11, 2014

PDF copy this item:

What does it take to be a good parent? We know some of the tricks for teaching kids to become high achievers. For example, research suggests that when parents praise effort rather than ability, children develop a stronger work ethic and become more motivated.

Yet although some parents live vicariously through their children’s accomplishments, success is not the No. 1 priority for most parents. We’re much more concerned about our children becoming kind, compassionate and helpful. Surveys reveal that in the United States, parents from European, Asian, Hispanic and African ethnic groups all place far greater importance on caring than achievement. These patterns hold around the world: When people in 50 countries were asked to report their guiding principles in life, the value that mattered most was not achievement, but caring.

Despite the significance that it holds in our lives, teaching children to care about others is no simple task. In an Israeli study of nearly 600 families, parents who valued kindness and compassion frequently failed to raise children who shared those values.

Are some children simply good-natured — or not? For the past decade, I’ve been studying the surprising success of people who frequently help others without any strings attached. As the father of two daughters and a son, I’ve become increasingly curious about how these generous tendencies develop.

Genetic twin studies suggest that anywhere from a quarter to more than half of our propensity to be giving and caring is inherited. That leaves a lot of room for nurture, and the evidence on how parents raise kind and compassionate children flies in the face of what many of even the most well-intentioned parents do in praising good behavior, responding to bad behavior, and communicating their values.

By age 2, children experience some moral emotions — feelings triggered by right and wrong. To reinforce caring as the right behavior, research indicates, praise is more effective than rewards. Rewards run the risk of leading children to be kind only when a carrot is offered, whereas praise communicates that sharing is worthwhile for its own sake. But what kind of praise should we give when our children show early signs of generosity?

Many parents believe it’s important to compliment the behavior, not the child — that way, the child learns to repeat the behavior. Indeed, I know one couple who are careful to say, “That was such a helpful thing to do,” instead of, “You’re a helpful person.”

But is that the right approach? In a clever experiment, the researchers Joan E. Grusec and Erica Redler set out to investigate what happens when we commend generous behavior versus generous character. After 7- and 8-year-olds won marbles and donated some to poor children, the experimenter remarked, “Gee, you shared quite a bit.”

The researchers randomly assigned the children to receive different types of praise. For some of the children, they praised the action: “It was good that you gave some of your marbles to those poor children. Yes, that was a nice and helpful thing to do.” For others, they praised the character behind the action: “I guess you’re the kind of person who likes to help others whenever you can. Yes, you are a very nice and helpful person.”

A couple of weeks later, when faced with more opportunities to give and share, the children were much more generous after their character had been praised than after their actions had been. Praising their character helped them internalize it as part of their identities. The children learned who they were from observing their own actions: I am a helpful person. This dovetails with new research led by the psychologist Christopher J. Bryan, who finds that for moral behaviors, nouns work better than verbs. To get 3- to 6-year-olds to help with a task, rather than inviting them “to help,” it was 22 to 29 percent more effective to encourage them to “be a helper.” Cheating was cut in half when instead of, “Please don’t cheat,” participants were told, “Please don’t be a cheater.” When our actions become a reflection of our character we lean more heavily toward the moral and generous choices. Over time it can become part of us.

Praise appears to be particularly influential in the critical periods when children develop a stronger sense of identity. When the researchers Joan E. Grusec and Erica Redler praised the character of 5-year-olds, any benefits that may have emerged didn’t have a lasting impact: They may have been too young to internalize moral character as part of a stable sense of self. And by the time children turned 10, the differences between praising character and praising actions vanished: Both were effective. Tying generosity to character appears to matter most around age 8, when children may be starting to crystallize notions of identity.

Praise in response to good behavior may be half the battle, but our responses to bad behavior have consequences, too. When children cause harm, they typically feel one of two moral emotions: shame or guilt. Despite the common belief that these emotions are interchangeable, research led by the psychologist June Price Tangney reveals that they have very different causes and consequences.

Shame is the feeling that I am a bad person, whereas guilt is the feeling that I have done a bad thing. Shame is a negative judgment about the core self, which is devastating: Shame makes children feel small and worthless, and they respond either by lashing out at the target or escaping the situation altogether. In contrast, guilt is a negative judgment about an action, which can be repaired by good behavior. When children feel guilt, they tend to experience remorse and regret, empathize with the person they have harmed, and aim to make it right.

In one study spearheaded by the psychologist Karen Caplovitz Barrett, parents rated their toddlers’ tendencies to experience shame and guilt at home. The toddlers received a rag doll, and the leg fell off while they were playing with it alone. The shame-prone toddlers avoided the researcher and did not volunteer that they broke the doll. The guilt-prone toddlers were more likely to fix the doll, approach the experimenter, and explain what happened. The ashamed toddlers were avoiders; the guilty toddlers were amenders.

If we want our children to care about others, we need to teach them to feel guilt rather than shame when they misbehave. In a review of research on emotions and moral development, the psychologist Nancy Eisenberg suggests that shame emerges when parents express anger, withdraw their love, or try to assert their power through threats of punishment: Children may begin to believe that they are bad people. Fearing this effect, some parents fail to exercise discipline at all, which can hinder the development of strong moral standards.

The most effective response to bad behavior is to express disappointment. According to independent reviews by Professor Eisenberg and David R. Shaffer, parents raise caring children by expressing disappointment and explaining why the behavior was wrong, how it affected others, and how they can rectify the situation. This enables children to develop standards for judging their actions, feelings of empathy and responsibility for others, and a sense of moral identity, which are conducive to becoming a helpful person. The beauty of expressing disappointment is that it communicates disapproval of the bad behavior, coupled with high expectations and the potential for improvement: “You’re a good person, even if you did a bad thing, and I know you can do better.”

As powerful as it is to criticize bad behavior and praise good character, raising a generous child involves more than waiting for opportunities to react to the actions of our children. As parents, we want to be proactive in communicating our values to our children. Yet many of us do this the wrong way.

In a classic experiment, the psychologist J. Philippe Rushton gave 140 elementary- and middle-school-age children tokens for winning a game, which they could keep entirely or donate some to a child in poverty. They first watched a teacher figure play the game either selfishly or generously, and then preach to them the value of taking, giving or neither. The adult’s influence was significant: Actions spoke louder than words. When the adult behaved selfishly, children followed suit. The words didn’t make much difference — children gave fewer tokens after observing the adult’s selfish actions, regardless of whether the adult verbally advocated selfishness or generosity. When the adult acted generously, students gave the same amount whether generosity was preached or not — they donated 85 percent more than the norm in both cases. When the adult preached selfishness, even after the adult acted generously, the students still gave 49 percent more than the norm. Children learn generosity not by listening to what their role models say, but by observing what they do.

To test whether these role-modeling effects persisted over time, two months later researchers observed the children playing the game again. Would the modeling or the preaching influence whether the children gave — and would they even remember it from two months earlier?

The most generous children were those who watched the teacher give but not say anything. Two months later, these children were 31 percent more generous than those who observed the same behavior but also heard it preached. The message from this research is loud and clear: If you don’t model generosity, preaching it may not help in the short run, and in the long run, preaching is less effective than giving while saying nothing at all.

People often believe that character causes action, but when it comes to producing moral children, we need to remember that action also shapes character. As the psychologist Karl Weick is fond of asking, “How can I know who I am until I see what I do? How can I know what I value until I see where I walk?”

Adam Grant is a professor of management and psychology at the Wharton School of the University of Pennsylvania and the author of “Give and Take: Why Helping Others Drives Our Success.”

— ————————————————-

Praise for intelligence can undermine children’s motivation and performance.
Mueller, Claudia M.; Dweck, Carol S.
Journal of Personality and Social Psychology, Vol 75(1), Jul 1998, 33-52. doi: 10.1037/0022-3514.75.1.33
Abstract
Praise for ability is commonly considered to have beneficial effects on motivation. Contrary to this popular belief, six studies demonstrated that praise for intelligence had more negative consequences for students’ achievement motivation than praise for effort. Fifth graders praised for intelligence were found to care more about performance goals relative to learning goals than children praised for effort. After failure, they also displayed less task persistence, less task enjoyment, more low-ability attributions, and worse task performance than children praised for effort. Finally, children praised for intelligence described it as a fixed trait more than children praised for hard work, who believed it to be subject to improvement. These findings have important implications for how achievement is best encouraged, as well as for more theoretical issues, such as the potential cost of performance goals and the socialization of contingent self-worth.

Value Hierarchies Across Cultures: Taking a Similarities Perspective
Shalom H. Schwartz, Anat Bardi
Abstract
Beyond the striking differences in the value priorities of groups is a surprisingly widespread consensus regarding the hierarchical order of values. Average value hierarchies of representative and near representative samples from 13 nations exhibit a similar pattern that replicates with school teachers in 56 nations and college students in 54 nations. Benevolence, self-direction, and universalism values are consistently most important; power, tradition, and stimulation values are least important; and security, conformity, achievement, and hedonism are in between. Value hierarchies of 83% of samples correlate at least .80 with this pan-cultural hierarchy. To explain the pan-cultural hierarchy, the authors discuss its adaptive functions in meeting the requirements of successful societal functioning. The authors demonstrate, with data from Singapore and the United States, that correctly interpreting the value hierarchies of groups requires comparison with the pan-cultural normative baseline.

Cultural Bases for Self-Evaluation: Seeing Oneself Positively in Different Cultural Contexts
Maja Becker, Vivian L. Vignoles, Ellinor Owe, Matthew J. Easterbrook, Rupert Brown, Peter B. Smith, Michael Harris Bond, Camillo Regalia, Claudia Manzi, Maria Brambilla, Said Aldhafri, Roberto González, Diego Carrasco, Maria Paz Cadena, Siugmin Lay, Inge Schweiger Gallo, Ana Torres, Leoncio Camino, Emre Özgen, Ülkü E. Güner, Nil Yamakoğlu, Flávia Cristina Silveira Lemos, Elvia Vargas Trujillo, Paola Balanta, Ma. Elizabeth J. Macapagal, M. Cristina Ferreira, Ginette Herman, Isabelle de Sauvage, David Bourguignon, Qian Wang, Márta Fülöp, Charles Harb, Aneta Chybicka, Kassahun Habtamu Mekonnen, Mariana Martin, George Nizharadze, Alin Gavreliuc, Johanna Buitendach, Aune Valk, Silvia H. Koller
Personality and Social Psychology Bulletin May 1, 2014 40: 657-675
Abstract
Several theories propose that self-esteem, or positive self-regard, results from fulfilling the value priorities of one’s surrounding culture. Yet, surprisingly little evidence exists for this assertion, and theories differ about whether individuals must personally endorse the value priorities involved. We compared the influence of four bases for self-evaluation (controlling one’s life, doing one’s duty, benefitting others, achieving social status) among 4,852 adolescents across 20 cultural samples, using an implicit, within-person measurement technique to avoid cultural response biases. Cross-sectional and longitudinal analyses showed that participants generally derived feelings of self-esteem from all four bases, but especially from those that were most consistent with the value priorities of others in their cultural context. Multilevel analyses confirmed that the bases of positive self-regard are sustained collectively: They are predictably moderated by culturally normative values but show little systematic variation with personally endorsed values.

Social Forces June 1, 2013 91: 1499-1528
Position and Disposition: The Contextual Development of Human Values
Kyle C. Longest, Steven Hitlin, Stephen Vaisey
Abstract
Research on the importance of values often focuses primarily on one domain of social predictors (e.g., economic) or limits its scope to a single dimension of values. We conduct a simultaneous analysis of a wide range of theoretically important social influences and a more complete range of individuals’ value orientations, focusing both on value ratings and rankings. Results indicate that traditional institutions such as religion and parenthood are associated with more concern for the welfare of others and maintaining the status quo, whereas more individually oriented occupational factors like higher income and self-employment are linked to achievement and change-related values. Yet several factors, such as education and gender, have complex associations when individual values are examined as part of a coherent system rather than in isolation.

Journal of Cross-Cultural Psychology May 1, 2007 38: 333-360
What Defines the Good Person? Cross-Cultural Comparisons of Experts’ Models With Lay Prototypes
Kyle D. Smith, Seyda Türk Smith, John Chambers Christopher
Abstract
“Good” is a fundamental concept present in all cultures, and experts in values and positive psychology have mapped good’s many aspects in human beings. Which aspects do laypersons typically access and consider as they make everyday judgments of goodness? Does the answer vary with culture? To address these questions, the authors compiled prototypes of the good person from laypersons’ free-listings in seven cultures and used experts’ classifications to content-analyze and compare the prototypes. Benevolence, conformity, and traditionalism dominated the features that laypersons frequently attributed to good people. Other features—competence in particular—varied widely in their accessibility across cultures. These findings depart from those obtained in research using expert-designed self-report inventories, highlighting the need to consider everyday accessibility when comparing cultures’ definitions of the good person.

February 25, 2015 – The American Way of Equality

The American Way of Equality
By DAVID BROOKS

PDF of this item:

Income inequality is on the rise. The rich are getting better at passing their advantages on to their kids. Lifestyle and values gaps are widening between the educated and uneducated. So the big issue is: Will Americans demand new policies to reverse these trends — to redistribute wealth, to provide greater economic security? Are we about to see a mass populist movement in this country?

Nobody was smarter on this subject than Seymour Martin Lipset, the eminent sociologist who died at 84 on New Year’s Eve. Lipset had been a socialist in the hothouse atmosphere of City College during the 1940s, and though he later became a moderate Democrat, he continued to wonder, with some regret, why America never had a serious socialist movement, why America never adopted a European-style welfare state.

Lipset was aware of the structural and demographic answers to such questions. For example, racially diverse nations tend to have lower levels of social support than homogeneous ones. People don’t feel as bound together when they are divided on ethnic lines and are less likely to embrace mutual support programs. You can have diversity or a big welfare state. It’s hard to have both.

But as he studied these matters, Lipset moved away from structural or demographic explanations (too many counterexamples). He drifted, as Tocqueville and Werner Sombart had before him, to values.

America never had a feudal past, so nobody has a sense of social place or class-consciousness, Lipset observed. Meanwhile, Americans have inherited from their Puritan forebears a sense that they have a spiritual obligation to rise and succeed.

Two great themes run through American history, Lipset wrote in his 1963 book “The First New Nation”: achievement and equality. These are often in tension because when you leave unequally endowed people free to achieve, you get unequal results.

Though Lipset never quite put it this way, the clear message from his writings is that when achievement and equality clash in America, achievement wins. Or to be more precise, the achievement ethos reshapes the definition of equality. When Americans use the word “equality,” they really mean “fair opportunity.” When Americans use the word “freedom,” they really mean “opportunity.”

Lipset was relentlessly empirical, and rested his conclusions on data as well as history and philosophy. He found that Americans have for centuries embraced individualistic, meritocratic, antistatist values, even at times when income inequality was greater than it is today.

Large majorities of Americans have always believed that individuals are responsible for their own success, Lipset reported, while people in other countries are much more likely to point to forces beyond individual control. Sixty-five percent of Americans believe hard work is the key to success; only 12 percent think luck plays a major role.

In his “American Exceptionalism” (1996), Lipset pointed out that 78 percent of Americans endorse the view that “the strength of this country today is mostly based on the success of American business.” Fewer than a third of all Americans believe the state has a responsibility to reduce income disparities, compared with 82 percent of Italians. Over 70 percent of Americans believe “individuals should take more responsibility for providing for themselves” whereas most Japanese believe “the state should take more responsibility to ensure everyone is provided for.”

America, he concluded, is an outlier, an exceptional nation. And though his patriotism pervaded his writing, he emphasized that American exceptionalism is “a double-edged sword.”

Political movements that run afoul of these individualistic, achievement-oriented values rarely prosper. The Democratic Party is now divided between moderates — who emphasize individual responsibility and education to ameliorate inequality — and progressive populists, who advocate an activist state that will protect people from forces beyond their control. Given the deep forces in American history, the centrists will almost certainly win out.

Indeed, the most amazing thing about the past week is how modest the Democratic agenda has been. Democrats have been out of power in Congress for 12 years. They finally get a chance to legislate and they push through a series of small proposals that are little pebbles compared to the vast economic problems they described during the campaign.

They grasp the realities Marty Lipset described. They understand that in the face of inequality, Americans have usually opted for policies that offer more opportunity, not those emphasizing security or redistribution. American domestic policy is drifting leftward, but there are sharp limits on how far it will go.

January 28, 2015 – Drones and the Democracy Disconnect

Drones and the Democracy Disconnect
By Firmin DeBrabander

PDF copy this item:

With President Obama’s announcement that we will open a new battlefront in yet another Middle Eastern country — in Syria, against ISIS (the Islamic State in Iraq and Syria) — there is widespread acknowledgement that it will be a protracted, complex, perhaps even messy campaign, with many unforeseeable consequences along the way. The president has said we will put “no boots on the ground” in Syria; he is wary of simply flooding allies on the ground with arms, for fear that they will fall into the wrong hands — as they already have. Obama wants to strike against ISIS in a part of Syria that is currently outside the authority of the Syrian government, which the president has accused of war crimes, and is thus, in our eyes, a legal no-man’s land. He has also made clear that he is ready to go it alone in directing attacks on ISIS — he has asked for Congress’s support, but is not seeking their authorization. All these signs point to drones playing a prominent role in this new war in Syria.

Increasingly, this is how the United States chooses to fight its wars. Drones lead the way and dominate the fight against the several non-state actors we now engage — Al Qaeda, the Shabab in Somalia and now ISIS. Drones have their benefits: They enable us to fight ISIS without getting mired on the ground or suffering casualties, making them politically powerful and appealing. For the moment, the American public favors striking ISIS; that would likely change if our own ground forces were involved.

If any group deserves drone strikes, it may well be ISIS.

This fundamentalist Muslim group, so brutal that even Al Qaeda shunned it, has taken to forcibly converting and exterminating Christians and other minority religious groups; one such minority, the Yazidis, may have narrowly escaped genocide at ISIS’ hands. The West has received vivid proof of the group’s ferocity. On Saturday it released its third video of a beheading — this time of a British aid worker — after two other such videos of the beheadings of American journalists within the past month.

The use of drones raises not just strategic and political problems, but ethical questions as well. In particular, what does our use of and reliance on drones say about us? How do drones affect the nation that endorses them — overtly, or, as is more often the case, tacitly? Are drones compatible with patriotism? With democracy? Honor? Glory? Or do they, as I fear, represent — and exacerbate — a troubling, even obscene disconnect between the American people and the wars waged in our name?

Writing in The Guardian in 2012, George Monbiot declared the United States’ drone strikes in Pakistan cowardly. He echoed the howls of many Pakistanis on the ground, who suffer the drone onslaught firsthand, while those who carry it out are safely removed thousands of miles away. The new breed of warriors is strange indeed: They are safely ensconced here in the United States, often commuting to work like ordinary citizens, and after a day spent monitoring and perhaps striking enemy targets, they return home to kids, homework and dinner.

Drone apologists, and many defense experts, claim drones are a reasonable development in warfare technology. The Slate commentator Jamie Holmes argues that extreme complaints about military innovations are hardly new. To people like Monbiot, or the BBC commentator Jeremy Clarkson, who scoffs that medals for drone pilots “should feature an armchair and a Coke machine or two crossed burgers,” Holmes says, “the hyperventilating about heroism being killed by machines misses the point. For one, the list of weapons once considered ‘cowardly’ … include[s] not only the submarines of World War I but also the bow and arrow and the gun. The point of each of these technologies was the same: to gain an asymmetrical advantage against adversaries and reduce risk.”

There are few philosophers more clear-eyed, frank, even cynical when it comes to war than Niccolò Machiavelli. In “The Prince,” he asserts that war is inescapable, inevitable. He praises the Romans for understanding the danger in putting it off. To the simple question of when you should go to war, Machiavelli’s simple answer is, “When you can” — not when it is just, or “right.” And yet, in another work, “The Art of War,” Machiavelli reveals that how a nation goes to war, how a nation chooses to fight is just as critical, perhaps even more so. At this point, the issue of military technology is pertinent, and Machiavelli’s discussion of the topic is highly reminiscent of our current debate about drones and character.

In “The Art of War,” Machiavelli again praises the ancient Romans, for their battlefield exploits, and states his worry that newly introduced artillery “prevents men from employing and displaying their virtue as they used to do of old.” Machiavelli ultimately dismisses such fears — though he was only contemplating cannons at the time. But elsewhere he declares that “whoever engages in war must use every means to put himself in a position of facing his enemy in the field and beating him there,” since a field war “is the most necessary and honorable of all wars.” Why is this? Because on the battlefield, military discipline and courage are exhibited and forged, and your opponent gets a true taste of what he’s up against — not only the army, but the nation he is up against.

For Machiavelli, military conduct is a reflection, indeed an extension — better yet, the root and foundation of a nation’s character, the bravery and boldness of its leaders, the devotion and determination of its citizens. Military conduct is indelibly linked to civic virtue, which is why he argues that nations should reject a professional army, much less a mercenary one, in favor of a citizen militia. Every citizen must get a taste of military discipline — and sacrifice. Every citizen must have a stake, an intimate investment, in the wars his nation fights.

Machiavelli was highly sensitive to the role military glory plays in inspiring the public and uniting, even defining, a nation. Great battles and military campaigns forged the identity, cohesion and indomitable pride of the Roman Republic, Machiavelli maintains — across the different social classes — and stoked the democratic energy of the people. Haven’t they served a similar purpose in our own republic? War has offered iconic images of our national identity: George Washington crossing the Delaware with his ragtag soldiers; marines hoisting the flag at Iwo Jima. These images are inherently democratic — they offer no king on his steed, lording over kneeling troops. To that extent, they nourish and reinforce our democratic identity and sensibilities.

This is no longer the case in the age of drones. I have strained to imagine the great battles drones might fight, which the public might rally around and solemnly commemorate. But this is a silly proposition — which cuts to the heart of the matter.

Never have the American people been more removed from their wars, even while we are the most martial nation on earth, and drones are symptoms, and drivers, of this troubling alienation. The United States has been engaged in two expensive and protracted wars in the past decade, as well as the seemingly endless war on terror spread the world over. The war in Afghanistan — where drones have made their mark as never before — is the longest in the nation’s history, and we have spent more money rebuilding Afghanistan than we did on Europe after World War II. Through all our recent wars in the region, however, most Americans have hardly felt a thing. Given the extent of our military engagement, unparalleled in the world, that is astounding, shameful even, and politically treacherous.

Critics have long warned that drones put too much war making power in the hands of few government actors, who increasingly operate on their own, or in the shadows. Many felt we saw a preview of political abuses to come when President Obama unilaterally ordered a drone strike against an American citizen in Yemen. This new technology has already emboldened our government to openly wage war in countries against which we have not officially declared war. We operate there with the tacit, and dubious, assent of a few ruling interests.

Perhaps it is not inevitable that drones are linked to arbitrary, centralized government; perhaps drone warfare can be waged transparently, democratically, legally, though it is admittedly hard to imagine what that would look like. What is certain, however, is that drone technology offers manifold temptations to those who would expand the borders of our wars, or wage war according to their own agenda, independent of the will, or interest, or attention of the American public.

Most American citizens are quick to let someone or something else bear the brunt of our wars, and take up the fight. Hence there is less worry about whether a given incursion is necessary, justified, logical or humane. Drones point to a new and terrible kind of cruelty — violence far removed from the perpetrator, and easier to inflict in that regard. With less skin in the game — literally — we can be less vigilant about the darker tendencies of our leaders and the unintended consequences of their actions, and more content to indulge in private matters.

The United States is gradually becoming a warring nation with fewer and fewer warriors, and few who know the sacrifices of war. Drones represent the new normal, and are an easy invitation to enter into and wage war — indefinitely. This is a state of affairs Machiavelli could not abide, and neither should we. It is antithetical to a democracy for its voting public to be so aloof from the wars it fights. It is a feature, I fear, of a democracy destined to lose that title.

Firmin DeBrabander, an associate professor of philosophy at the Maryland Institute College of Art, Baltimore, is the author of “Spinoza and the Stoics” and a forthcoming book critiquing the gun rights movement.

—————————-

Gemli, boston
People have been killing each other for a long time. The grim history of warfare is summed up succinctly in Kubrick’s “2001: A Space Odyssey,” when the film jumps in a single frame from a proto-human realizing that a thigh bone can crush a skull to an orbiting nuclear weapon. The intervening millennia are skipped because the details are unimportant. People will kill each other with whatever is handy. Thigh bones, drones, it’s all the same if you’re on the receiving end.

It’s hard to place a value on words like honor, glory and patriotism since I’m sure the Nazis used these as well. These words are rhetorical recruitment tools, employed whether we’re actually defending ourselves from evil or merely taking other people’s stuff. Sometimes it’s hard to tell.

We allowed the “darker tendencies of our leaders” to get us into two wars. These wars were terrible and cruel, and the consequences were unintended but not unpredictable. Drones didn’t lull us into these pointless and ruinously costly conflicts. We were drawn in the old-fashioned way, with tales of WMDs and promises of a quick and easy victory. We ultimately left, but not before more soldiers were dying by suicide from moral injury than were being killed in battle.

ISIS believes in beheadings, female circumcision and stonings for trivial offenses. Honor is off the table. Machiavelli isn’t here. I don’t think he’ll mind if we send in the drones.

Steve Fankuchen     Oakland, Calif.
There is absolutely no relationship one way or the other between the use of drones and democracy. Democracy is about the way decisions get made. Drones are simply a weapon with which some decisions, right or wrong, are carried out. One would expect DeBrabander, a professor of philosophy, to be more precise in the use of language.

Calling the use of drones “cowardly” is absurd! War and sanctioned killing is not about a “fair fight;” it is about winning, however that may be defined.

As to being removed from intimate contact with the effects of one’s weapons, drones are nothing new. Two thousand years ago catapults were wreaking mayhem within walled villages, the results not immediately apparent to those who launched projectiles with them. Guns allowed individual soldiers not to see the close-up damage, and artillery, whether on land or on boats, carried the removal further. With the advent of planes and bombs, especially the B-52s and high-altitude bombers, you could look in a gadget, push a button, and be done with it. Drones merely go a little step farther in a well-established continuum.

The author claims drones are cruel. How in the world is getting blown up by a drone any worse than getting shot by a gun, turned to hamburger by a mine or a V-2 rocket, or having one’s head cut off?

A fair fight? That is called sports unless, of course, you think Alexander Hamilton and Aaron Burr had the right idea, and we should bring back American honor and glory with duels.

MLP     Pittsburgh
If Machiavelli was right, if military conduct is indeed a reflection of a nation’s character, then drone warfare may well be the quintessential expression of contemporary American character: a nation of couch potatoes playing “video games” where real flesh and blood human beings are the targets but who are too insensitive and clueless to comprehend the inevitable consequences of their actions. Sooner or later, others will have drone technology capable of inflicting harm upon the United States, and when that happens the couch potatoes will call it “terrorism” and moan “why do they hate us?”

Bob Garcia     Miami
Drones are an example of our self-blindness, which is an important element of our Exceptionalism. For example, why aren’t drones considered weapons of terror? Look at the parallels between the use of hijacked planes on 9/11 and our now routine use of drones. The main difference is that by definition nothing we do is considered terrorism, whether with drones, torture, or kicking in doors of peasants at midnight. Or the widespread use of contractors who are outside all legal accountability, subject only to possible recall by their employers.

And have we legitimized the use of drones? For example, if the Chinese unilaterally decided to use drones to kill alleged Uighur or Tibetan terrorists in the United States, would it be reasonable for them to carry out such strikes in the United States — limited only by the practical matter of avoiding being shot out of the sky? Would it be accepted that sometimes they’d make a mistake and blow up a wedding party? Would it be OK for them to negotiate with Mexico or Honduras to site drone bases?

Harold V. McCoy       Pinehurst, NC, USA
If Mr. DeBrabander is looking for glory, honor and national character in war, he has obviously never been in one.

Chuck     S C
I can’t help but think of what Robert E. Lee said: “It is well that war is so terrible, otherwise we would grow too fond of it.”

The further our technology removes us from the God-awful stench of war, the more likely we are to grow fond of it. That will lead to greater hubris and the nemesis that will inevitably follow will be devastating.

PogoWasRight     Melbourne Florida
It appears that the President, as many others before him, is continuing to weave a tangled web, one in which we ourselves could become entangled. As an old retired career military person, I have never been able to answer the question: if we drop a bomb on a home or factory or office in some foreign country, will we create a friend or will we create an enemy? All I am sure of is what I would become if it were done to me. A situation most times called “a no-brainer”. And that is not the end of all possibilities. Consider what is in store for us in the future when missile-armed drones become affordable and available to terrorists around the world, as they certainly will. Instead of flying from “here” to “there”, those drones will be flying from “there” to “here”. Not a pleasant thought. But an inevitable outcome. As Eliza Doolittle said: “Just You Wait!”

Hoff     Philadelphia, PA
The author neglects the reality of the modern battlefield when bemoaning our lack of intimacy with the opposition. The enemy we face doesn’t form ranks and march across the desert. He hides in civilian homes in areas removed from global society and conspires to release his own versions of remote weapons: theology-addled recruits on conscienceless suicidal missions to achieve a glorious afterlife. And he does not hesitate to use his tools to their maximum range and effect. This is a fundamental asymmetry to ‘honorable’ war-making, and to match the threat we spend multitudes of fortunes trying to stay on the high road.
To say that we feel nothing in the West is absurd in the extreme. We reap daily harvests of horrors through the press, our economy struggles to overcome the trillions of dollars we spend on these conflicts, and our politics have devolved to fighting over the bloody scraps from administration after administration of strategic losses to nearly invisible enemies. You’ll have to forgive me for my lack of guilt over finding a way to score a win out of range of the stench of my adversaries’ death.

Bruce Balfe     Valparaiso
I fail to see the distinction between drones and ships lobbing ordnance from 30 miles out at sea or planes dropping bombs from 35,000 feet. It is all long-distance warfare. If we can conduct our wars, however distasteful they are in the first place, in a way that minimizes putting our troops in danger’s way, then why not?

Chris Koz     Portland, OR

There needs to be a distinction made between the use of words like Patriotism, Honor, and Glory by civilian leadership and by military personnel. Civilians often use these words, even with the best of intentions, as a recruitment tool, a source of rhetoric, and to beat the drums of nationalism. The jingoism sold by politicians, the military industrial complex, and armchair warriors is largely self-serving.

Conversely, these words are the very vehicle by which the military chain of command survives. Those who argue, as some commentators have, that honor does not matter when fighting an honorless enemy simply do not understand this code; the “disconnect” the writer speaks of grows out of the former.

January 14, 2015 – Rice, Psychology, and Innovation

Rice, Psychology, and Innovation

Rice Wheat and People.pdf

Joseph Henrich

Related Resources

Large-Scale Psychological Differences Within China Explained by Rice Versus Wheat Agriculture T. Talhelm et al. Science 9 May 2014: 603-608.

By the late 18th century, the earliest tremors of the industrial revolution were beginning to shake England. Fueled by a stream of innovations related to textiles, transportation, and steel manufacturing, this eruption of economic growth would soon engulf northern Europe, spread to Britain’s former colonies, and eventually transform the globe. For the first time, humanity would be sprung from the Malthusian trap. The question of why this revolution first emerged in northern Europe remains one of history’s great questions. If you stood overlooking the globe in 1000 CE, the most obvious candidates for igniting this engine were perhaps in China or the Middle East, but certainly not in Europe. Addressing this question, researchers have pointed to differences in geography, institutions, religions, and even genes (1, 2). On page 603 of this issue, Talhelm et al. (3) take an important step forward by fingering psychological differences in analytical thinking and individualism as an explanation for differences in innovation, and then linking these differences to culturally transmitted institutions, and ultimately to environmental differences that influence the feasibility of rice agriculture.

Decades of experimental research show that, compared to most populations in the world, people from societies that are Western, Educated, Industrialized, Rich, and Democratic (WEIRD) (4) are psychologically unusual, being both highly individualistic and analytically minded. High levels of individualism mean that people see themselves as independent from others and as characterized by a set of largely positive attributes. They willingly invest in new relationships even outside their kin, tribal, or religious groups. By contrast, in most other societies, people are enmeshed in dense, enduring networks of kith and kin on which they depend for cooperation, security, and personal identity. In such collectivistic societies, property is often corporately owned by kinship units such as clans; inherited relationships are enduring and people invest heavily in them, often at the expense of outsiders, strangers, or abstract principles (4).

Measuring analytical thinking and individualism.

To investigate individualism and analytical thinking in participants from different agricultural regions in China, Talhelm et al. used three tests. They measured analytical thinking with a series of triads. Participants were given a target object, such as a rabbit, and asked which of two other objects it goes with. Analytic thinkers tend to match on categories, so rabbits and dogs go together. Holistic thinkers tend to match on relationships, so rabbits eat carrots. The authors also measured individualism in two ways. First, they asked participants to draw a sociogram, with labeled circles representing themselves and their friends. In this test, individualism is measured implicitly by how much bigger the “self” circle is relative to the average “friends” circle. Second, they assessed the nepotism (in-group loyalty) of participants by asking them about hypothetical scenarios in which they could reward or punish friends and strangers for helpful or harmful actions.

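The three measures reduce to simple scoring rules. As a purely illustrative sketch (the function names and sample data below are invented for this reader, not taken from the study’s materials), the scoring might look like this in Python:

```python
# Hypothetical scoring of the three tasks described above. All names
# and data are invented for illustration, not from Talhelm et al.

def triad_holism(choices):
    """Share of triads matched on relationships (holistic) rather
    than on abstract categories (analytic)."""
    return sum(1 for c in choices if c == "relational") / len(choices)

def sociogram_self_inflation(self_mm, friend_mms):
    """Implicit individualism: how much larger the 'self' circle is
    drawn than the average 'friend' circle, in millimeters."""
    return self_mm - sum(friend_mms) / len(friend_mms)

def loyalty_nepotism(reward_honest_friend, punish_dishonest_friend):
    """In-group loyalty: reward given to an honest friend minus
    punishment given to a dishonest friend."""
    return reward_honest_friend - punish_dishonest_friend

# One invented participant:
print(triad_holism(["relational", "categorical", "relational"]))  # ~0.67
print(sociogram_self_inflation(22.0, [20.0, 21.0, 19.0]))         # 2.0 mm
print(loyalty_nepotism(5.0, 1.5))                                 # 3.5
```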

Psychologically, growing up in an individualistic social world biases one toward the use of analytical reasoning, whereas exposure to more collectivistic environments favors holistic approaches. Thinking analytically means breaking things down into their constituent parts and assigning properties to those parts. Similarities are judged according to rule-based categories, and current trends are expected to continue. Holistic thinking, by contrast, focuses on relationships between objects or people anchored in their concrete contexts. Similarity is judged overall, not on the basis of logical rules. Trends are expected to be cyclical.

Various lines of evidence suggest that greater individualism and more analytical thinking are linked to innovation, novelty, and creativity (5). But why would northern Europe have had greater individualism and more analytical thinking in the first place? China, for example, was technologically advanced, institutionally complex, and relatively educated by the end of the first millennium. Why would Europe have been more individualist and analytically oriented than China?

Talhelm et al. hypothesized that different combinations of environments and technologies influence the cultural evolution of different forms of social organization. Under some techno-environmental conditions, only intensely cooperative social groups can endure, prosper, and spread. Although potentially applicable to many situations, including territorial defense and whale hunting, Talhelm et al. focus on the different labor requirements of paddy rice and wheat cultivation. By demanding intense cooperation, paddy rice cultivation fosters and reinforces the social norms that govern patrilineal clans. Growing up in strong clans creates a particular kind of collectivistic psychology. In contrast, wheat cultivation permits independent nuclear households and fosters more individualistic psychologies.

To test these ideas, Talhelm et al. used standard psychological tools (see the figure) to measure analytical thinking and individualism among university students sampled from Chinese provinces that vary in wheat versus rice cultivation. Focusing on China removes many of the confounding variables such as religion, heritage, and government that would bedevil any direct comparison between Europe and East Asia. The prediction is straightforward: Han Chinese from provinces cultivating relatively more wheat should tend to be more individualistic and analytically oriented.

Sure enough, participants from provinces more dependent on paddy rice cultivation were less analytically minded. The effects were big: The average number of analytical matches increased by about 56% in going from all-rice to no-rice cultivation. The results hold both nationwide and for the counties in the central provinces along the rice-wheat (north-south) border, where other differences are minimized.

Participants from rice-growing provinces were also less individualistic, drawing themselves roughly the same size as their friends, whereas those from wheat provinces drew themselves 1.5 mm larger. [This moves them only part of the way toward WEIRD people: Americans draw themselves 6 mm bigger than they draw others, and Europeans draw themselves 3.5 mm bigger (6).] People from rice provinces were also more likely to reward their friends and less likely to punish them, showing the in-group favoritism characteristic of collectivistic populations.

So, patterns of crop cultivation appear linked to psychological differences, but can these patterns really explain differences in innovation? Talhelm et al. provide some evidence for this by showing that less dependence on rice is associated with more successful patents for new inventions. This doesn’t nail it, but it is consistent with the broader idea and will no doubt drive much future inquiry. For example, these insights may help explain why the embers of an 11th-century industrial revolution in China were smothered, first when northern invasions and climate change drove people into the southern rice paddy regions, where clans had an ecological edge, and later by the emergence of state-level political and legal institutions that reinforced the power of clans (7).

Cultural evolution arises from a rich interplay of ecology, social learning, institutions, and psychology. Environmental factors favor some types of family structures or forms of social organization over others. Honed and refined over generations, these institutions create the conditions to which children adapt developmentally, shaping their psychologies and brains. Long after their ecological causes have become irrelevant, these cultural psychologies and institutions continue to influence rates of innovation, the formation of new institutions, and the success of immigrants in new lands. As such, wheat farming may contribute to explaining the origins of WEIRD psychology and the industrial revolution.

References

  1. G. Clark, A Farewell to Alms: A Brief Economic History of the World (Princeton Univ. Press, Princeton, NJ, 2007).
  2. J. Mokyr, The Lever of Riches (Oxford Univ. Press, New York, 1990).
  3. T. Talhelm et al., Science 344, 603 (2014).
  4. J. Henrich, S. J. Heine, A. Norenzayan, Behav. Brain Sci. 33, 61 (2010).
  5. Y. Gorodnichenko, G. Roland, Proc. Natl. Acad. Sci. U.S.A. 108 (suppl. 4), 21316 (2011).
  6. S. Kitayama, H. Park, A. T. Sevincer, M. Karasawa, A. K. Uskul, J. Pers. Soc. Psychol. 97, 236 (2009).
  7. A. Greif, G. Tabellini, Am. Econ. Rev. 100, 135 (2010).

OPTIONAL: ADDITIONAL READING / BACKGROUND MATERIAL

Science 9 May 2014: Vol. 344 no. 6184 pp. 603-608
Large-Scale Psychological Differences Within China Explained by Rice Versus Wheat Agriculture T. Talhelm, X. Zhang, S. Oishi, C. Shimin, D. Duan, X. Lan, S. Kitayama

Cross-cultural psychologists have mostly contrasted East Asia with the West. However, this study shows that there are major psychological differences within China. We propose that a history of farming rice makes cultures more interdependent, whereas farming wheat makes cultures more independent, and these agricultural legacies continue to affect people in the modern world. We tested 1162 Han Chinese participants in six sites and found that rice-growing southern China is more interdependent and holistic-thinking than the wheat-growing north. To control for confounds like climate, we tested people from neighboring counties along the rice-wheat border and found differences that were just as large. We also find that modernization and pathogen prevalence theories do not fit the data.

— ——

Individualism Rules?

On a diverse and large set of cognitive tests, subjects in East Asian countries are more inclined to display collectivist choices, whereas subjects in the United States are more inclined to score as individualists. Talhelm et al. (p. 603; see the Perspective by Henrich) suggest that one historical source of influence was societal patterns of farming rice versus wheat, based on three cognitive measures of individualism and collectivism in 1000 subjects from rice- and wheat-growing regions in China.

Over the past 20 years, psychologists have cataloged a long list of differences between East and West (1–3). Western culture is more individualistic and analytic-thinking, whereas East Asian culture is more interdependent and holistic-thinking. Analytic thought uses abstract categories and formal reasoning, such as logical laws of noncontradiction—if A is true, then “not A” is false. Holistic thought is more intuitive and sometimes even embraces contradiction—both A and “not A” can be true.

Even though psychology has cataloged a long list of East-West differences, it still lacks an accepted explanation of what causes these differences. Building on subsistence style theory (1, 4), we offer the rice theory of culture and compare it with the modernization hypothesis (5) and the more recent pathogen prevalence theory (6).

The modernization hypothesis argues that, as societies become wealthier, more educated, and capitalistic, they become more individualistic and analytical. World Values Surveys (7) and studies on indigenous Mayans’ transition to a market economy (5) have given some support to the modernization hypothesis. But this theory has difficulty explaining why Japan, Korea, and Hong Kong are persistently collectivistic despite per-capita gross domestic products (GDPs) higher than that of the European Union.

The pathogen prevalence theory argues that a high prevalence of communicable diseases in some countries made it more dangerous to deal with strangers, making those cultures more insular and collectivistic (6). Studies have found that historical pathogen prevalence correlates with collectivism and lower openness to experience (6). However, pathogens are strongly correlated with heat (8). Because rice grows in hot areas, pathogens may be confounded with rice—a possibility that prior research did not control for.

The Rice Theory

The rice theory is an extension of subsistence style theory, which argues that some forms of subsistence (such as farming) require more functional interdependence than other forms (such as herding). At the same time, ecology narrows the types of subsistence that are possible. For example, paddy rice requires a significant amount of water. Over time, societies that have to cooperate intensely become more interdependent, whereas societies that do not have to depend on each other as much become more individualistic.

In the past, most subsistence research has compared herders and farmers, arguing that the independence and mobility of herding make herding cultures individualistic and that the stability and high labor demands of farming make farming cultures collectivistic (1). We argue that subsistence theory is incomplete because it lumps all farming together. Two of the most common subsistence crops—rice and wheat—are very different, and we argue that they lead to different cultures.

The two biggest differences between farming rice and wheat are irrigation and labor. Because rice paddies need standing water, people in rice regions build elaborate irrigation systems that require farmers to cooperate. In irrigation networks, one family’s water use can affect their neighbors, so rice farmers have to coordinate their water use. Irrigation networks also require many hours each year to build, dredge, and drain—a burden that often falls on villages, not isolated individuals.

Paddy rice also requires an extraordinary amount of work. Agricultural anthropologists visiting premodern China observed the number of hours farmers worked and found that growing paddy rice required at least twice as many hours as wheat (9). The difference in man-hours was not noticeable only to scientists. Medieval Chinese people grew both wheat and rice, and they were aware of the huge labor difference between the two. A Chinese farming guide in the 1600s advised people, “If one is short of labor power, it is best to grow wheat” [quoted in (10)]. A Chinese anthropologist in the 1930s concluded that a husband and wife would not be able to farm a large enough plot of rice to support the family if they relied on only their own labor (11). Strict self-reliance might have meant starvation.

To deal with the massive labor requirements, farmers in rice villages from India to Malaysia and Japan form cooperative labor exchanges (12). Farmers also coordinate their planting dates so that different families harvest at different times, allowing them to help in each others’ fields (12). These labor exchanges are most common during transplanting and harvesting, which need to be done in a short window of time, creating an urgent need for labor. In economic terms, paddy rice makes cooperation more valuable. This encourages rice farmers to cooperate intensely, form tight relationships based on reciprocity, and avoid behaviors that create conflict.

In comparison, wheat is easier to grow. Wheat does not need to be irrigated, so wheat farmers can rely on rainfall, which they do not coordinate with their neighbors. Planting and harvesting wheat certainly takes work, but only half as much as rice (9). The lighter burden means farmers can look after their own plots without relying as much on their neighbors.

One point of clarification about the rice theory is that it applies to rice regions, not just the people farming rice. It is a safe bet that none of our thousand participants have actually farmed rice or wheat for a living. Instead, the theory is that cultures that farm rice and wheat over thousands of years pass on rice or wheat cultures, even after most people put down their plows. Simply put, you do not need to farm rice yourself to inherit rice culture.

We propose that the rice theory can partly explain East-West differences. Prior subsistence theory cannot fully explain East-West differences because it focuses on herding versus farming (1), which is not the main East-West difference. Several Western regions herd, such as parts of Scotland and Switzerland, but the bulk of Europe historically farmed wheat (and similarly grown crops, such as barley). Instead, rice-wheat is the main East-West difference, and psychologists have not studied it.

The easiest way to test whether rice and wheat lead to different cultures is to show that rice areas (East Asia) are interdependent and that wheat areas (the West) are independent. But that logic is obviously flawed. We cannot just compare East and West because they differ on many factors besides rice and wheat—religion, politics, and technology, to name a few. A more convincing test case would be a country that has a shared history, government, language, and religion, but farms rice in some areas and wheat in other areas.

China as a Natural Test Case

Han China is a fitting natural test case because it has traditionally grown both rice and wheat but is more ethnically and politically unified than, say, Europe or sub-Saharan Africa. China is over 90% Han Chinese, and the same dynasties have ruled over the wheat and rice cores for most of the past few thousand years, which controls for some of the major variables that confound East-West comparisons.

Within China, the Yangtze River splits the wheat-growing north from the rice-growing south (Fig. 1). For generations, northern China has grown wheat, and southern China has grown rice. Of course, two regions can never be 100% equivalent. There are differences such as climate and spoken dialect between north and south. To rule out these smaller differences, we report additional analyses that compare people from neighboring counties along the rice-wheat border.


Fig. 1 Percent of cultivated land devoted to rice paddies in 1996.

Three major herding provinces are not shaded: Tibet, Xinjiang, and Inner Mongolia. Along the rice-wheat border (highlighted), people from the rice counties thought more holistically than their neighbors in wheat counties.

Three Predictions

The three theories make different predictions about which parts of China should be the most interdependent. First, the modernization hypothesis predicts that the least-developed provinces should be the most interdependent. Development has been uneven in China partly because in the late 1970s Deng Xiaoping made several areas along the southeast coast “special economic zones” open to foreign trade. This policy has given southeastern provinces like Guangdong a GDP per capita about 3.5 times that of interior provinces like Guizhou (13). That is roughly the ratio difference between the United States and Kazakhstan. Thus, modernization would predict the highest collectivism in China’s least-developed interior provinces.

Second, pathogen prevalence theory predicts a gradual rise in interdependence from north to south because pathogens rise gradually along with temperatures (8). Among Chinese provinces, overall pathogen rates and latitude are correlated: r(20) = –0.49, P = 0.02 (14). Furthermore, pathogen theory would predict the highest interdependence in the southwest, which has the highest rates of infectious disease death.

Third, the rice theory predicts the highest interdependence in the south and east. Unlike pathogens, rice is not the highest in the southernmost provinces. Instead, rice is concentrated in the east around Shanghai, which has flat floodplains ideal for growing rice. The rice theory also predicts a sharp divide along the rice-wheat border, which is different from the gradual rise of pathogens with climate.

To measure the prevalence of rice farming, we used statistical yearbook data on the percentage of cultivated land in each province devoted to rice paddies (13). Because some rice is grown with less labor on dry land (without paddies), we used statistics on rice paddies, rather than rice output. Because we wanted to assess the crop that different regions farmed traditionally, rather than figures affected by recent advances in irrigation and mechanization, we used rice statistics from 1996, the earliest available on the Bureau of Statistics Web site.

To test the modernization hypothesis, we collected GDP per capita for each province from the same year. To measure precontemporary disease prevalence, we used the earliest study we could find with disease rates in different provinces, from 1976 (15). Because the 1976 study did not cover 10 provinces, we also collected recent statistics (13). This increased the sample by four provinces. Both sources gave similar pictures: higher disease in the south and the highest in the southwest.

Samples

We tested 1162 Han Chinese students from six sites: Beijing (north), Fujian (southeast), Guangdong (south), Yunnan (southwest), Sichuan (west central), and Liaoning (northeast). We used three measures: a measure of cultural thought, implicit individualism, and loyalty/nepotism (described below). We chose these tasks because they are not self-report scales, avoiding the documented problems with use of self-report scales to measure cultural differences (16).

Results from these different sites show that rice-wheat differences held regardless of testing site (14). For all tasks, we analyzed only ethnic Han Chinese and excluded Han participants from the provinces of Tibet, Inner Mongolia, and Xinjiang. These areas are historically herding areas and have different ethnicities, cultures, languages, and religions that would confound our comparisons of rice and wheat.

We tested the main hypotheses with multilevel models because participants (level 1) were nested within provinces (level 2). We report correlations as an effect size at the province level that can be compared across variables. We calculated this by comparing the province-level variance of the models with and without the key predictor (Tables 1 to 3 report regression output).
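As a rough illustration of that procedure, the sketch below fits a random-intercept model with and without the province-level predictor and converts the drop in between-province variance into a correlation-like r. It is a minimal sketch assuming a pandas DataFrame with invented column names, not the authors’ actual code:

```python
# Participants (level 1) nested in provinces (level 2); the effect
# size is the share of province-level variance explained by the
# predictor. Column names ('holism', 'gender', 'rice', 'province')
# are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def province_level_r(df: pd.DataFrame) -> float:
    # Null model: random province intercepts, individual-level
    # control only (gender coded 0 = male, 1 = female, as in Table 1).
    null = smf.mixedlm("holism ~ gender", df, groups=df["province"]).fit()
    # Full model: add the province-level rice predictor.
    full = smf.mixedlm("holism ~ gender + rice", df, groups=df["province"]).fit()
    # Random-intercept (between-province) variance in each model.
    var_null = float(null.cov_re.iloc[0, 0])
    var_full = float(full.cov_re.iloc[0, 0])
    # Proportional reduction in between-province variance, as an r.
    return float(np.sqrt(max(var_null - var_full, 0.0) / var_null))
```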


Table 1 Holistic thought hierarchical linear models for rice (28 provinces, 1019 participants), GDP per capita (28 provinces, 1019 participants), and pathogens (21 provinces, 725 participants).

See supplementary materials for detailed information on site effects and regressions with GDP, rice, and pathogens in a single model. Gender is coded as 0 = male and 1 = female.


Table 2 Implicit individualism and loyalty/nepotism hierarchical linear models for rice, GDP per capita, and pathogens.

Implicit individualism: N = 28 provinces, 515 participants for rice and GDP; N = 21 provinces, 452 participants for pathogens. Loyalty/nepotism: N = 27 provinces, 166 participants for rice and GDP; N = 21 provinces, 146 participants for pathogens.


Table 3 Divorce and invention regression models for rice, GDP per capita, and pathogens.

Divorces are calculated as divorces per marriage, with 27 provinces for the rice and per-capita GDP models and 21 provinces for pathogens. Inventions are the log number of successful patents per capita; N = 27 provinces for rice and GDP, N = 21 for pathogens.

Our main dependent variable was a common measure of cultural thought, the triad task (17). The triad task shows participants lists of three items, such as train, bus, and tracks. Participants decide which two items should be paired together. Two of the items can be paired because they belong to the same abstract category (train and bus belong to the category vehicles), and two because they share a functional relationship (trains run on tracks). People from Western and individualistic cultures choose more abstract (analytic) pairings, whereas East Asians and people from other collectivistic cultures choose more relational (holistic) pairings (1, 17). We report scores as a percentage of holistic choices, where 100% is completely holistic and 0% is completely analytic.

We first tested the modernization hypothesis by testing whether people from provinces with lower GDP per capita thought more holistically. People from richer provinces actually thought more holistically: γ(25) = 0.52, P = 0.03, r = 0.46. (γ represents province-level HLM regression coefficients.)

We then tested the pathogen prevalence theory by testing whether provinces with higher rates of disease thought more holistically. Provinces with higher disease rates actually thought less holistically: γ(18) = –0.22, P = 0.04, r = –0.44.

The large-scale disease study from 1976 included statistics for 31 counties across China (15), which let us test the pathogen theory more precisely. Thus, we tested whether the 198 people in our sample who came from these 31 counties had different thought styles based on the historical disease prevalence in their county. Even with this finer precision, pathogen prevalence predicted thought style marginally in the wrong direction: γ(28) = –0.43, P = 0.08, r = –0.33.

The rice theory was the only model that fit the data (Fig. 2). People from provinces with a higher percentage of farmland devoted to rice paddies thought more holistically: γ(25) = 0.56, P = 0.007, r = 0.51. [Controlling for GDP per capita made little difference (table S1).]


Fig. 2 Cultural thought style by percentage of cultivated area devoted to rice paddies.

Each circle represents a province. Circle size represents the divorce rate, controlling for the effect of GDP. To illustrate cultural differences along the rice-wheat border, separate circles represent the rice and wheat border counties.

Northern and southern China also differ in several factors other than rice, such as climate, dialect, and contact with herding cultures. Therefore, we analyzed differences among neighboring counties in the five central provinces along the rice-wheat border (Sichuan, Chongqing, Hubei, Anhui, and Jiangsu). Differences between neighboring counties are less likely to be due to climate or other third variables.

We gathered the rice cultivation statistics for each county in these provinces and split counties into rice and wheat counties. We defined rice counties as those with more than 50% of farmland devoted to rice paddies. Figure 1 depicts an example of the county split in the province of Anhui. The rice-wheat difference between neighboring counties can be stark. For example, in Anhui, Bozhou county farms only 2% rice, whereas neighboring Huainan county farms 67%. We tested for differences in cultural thought style, which had the largest sample, including 224 participants from the rice-wheat border.

People from the rice side of the border thought more holistically than people from the wheat side of the border: B(221) = 0.54, P < 0.001 (table S5). To compare the border effect size with the effect size for rice and wheat in all of China, we compared the effect of a categorical rice-wheat variable in each sample. The effect sizes were similar (rice-wheat border, B = 0.53; all China, B = 0.43). (For group comparisons, wheat provinces are defined throughout as <50% farmland devoted to rice paddies; rice provinces as >50%.)
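A minimal sketch of that border split, assuming a participant-level pandas DataFrame with invented column names (not the authors’ code):

```python
# Split border counties at the 50% paddy-land threshold given in the
# text, then estimate the categorical rice-wheat contrast. Columns
# 'county_rice_pct' and 'holism' are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

def border_contrast(border: pd.DataFrame) -> float:
    border = border.assign(
        rice_county=(border["county_rice_pct"] > 50).astype(int)
    )
    # The coefficient on 'rice_county' is the border effect, comparable
    # to the same categorical contrast fit on the all-China sample.
    model = smf.ols("holism ~ rice_county", data=border).fit()
    return float(model.params["rice_county"])
```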

To test whether the findings generalize beyond thought style, we tested subsamples on two measures previously used for East-West cultural differences. The first was the sociogram task (n = 515), which has participants draw a diagram of their social network, with circles to represent the self and friends (18). Researchers measure how large participants draw the self versus how large they draw their friends to get an implicit measure of individualism (or self-inflation). A prior study found that Americans draw themselves about 6 mm bigger than they draw others, Europeans draw themselves 3.5 mm bigger, and Japanese draw themselves slightly smaller (18).

People from rice provinces were more likely than people from wheat provinces to draw themselves smaller than they drew their friends: γ(24) = –0.20, P = 0.03, r = 0.81 (fig. S2). On average, people from wheat provinces self-inflated 1.5 mm (closer to Europeans), and people from rice provinces self-inflated –0.03 mm (similar to Japanese).

Pathogen prevalence did not predict self-inflation on the sociogram task: γ(17) = 0.003, P = 0.95, r = 0. GDP per capita also failed to predict self-inflation: γ(24) = 0.04, P = 0.81, r = 0.

The second measure was the loyalty and nepotism task, which measures whether people draw a sharp distinction between how they treat friends versus strangers (n = 166). One defining feature of collectivistic cultures is that they draw a sharp distinction between friends and strangers (3). A previous study measured this by having people imagine going into a business deal with (i) an honest friend, (ii) a dishonest friend, (iii) an honest stranger, and (iv) a dishonest stranger (19). In the stories, the friend or stranger’s lies cause the participant to lose money in a business deal, and the honesty causes the participant to make more money. In each case, the participants have a chance to use their own money to reward or punish the other person.

The original study found that Singaporeans rewarded their friends much more than they punished them, which could be seen positively as loyalty or negatively as nepotism (19). Americans were much more likely than Singaporeans to punish their friends for bad behavior. We predicted that people from rice areas would be less likely to punish their friends than people from wheat areas.

We computed loyalty/nepotism as the amount they rewarded their friend minus the amount they punished their friend. People from rice provinces were more likely to show loyalty/nepotism: γ(25) = 2.45, P = 0.04, r = 0.49. In their treatment of strangers, people from rice and wheat provinces did not differ: γ(24) = –0.09, P = 0.90, r = 0.

Pathogen prevalence was not related to loyalty/nepotism: γ(19) = –0.13, P = 0.84, r = –0.08. GDP per capita did not predict loyalty/nepotism: γ(25) = 1.66, P = 0.36, r = 0.33.

In short, the results consistently showed that participants from rice provinces are more holistic-thinking, interdependent, and loyal/nepotistic than participants from the wheat provinces. However, one weakness of these studies is that the participants were all college students. To test whether the cultural differences extend beyond college students, we gathered provincial statistics on variables that have been linked to collectivism and analytic thought: divorce rates and patents for new inventions.

A prior study showed that individualistic countries have higher divorce rates, even controlling for gross national product per capita (20). Rice culture’s emphasis on avoiding conflict and preserving relationships may make people from rice cultures less willing to get divorced. We collected divorce statistics from the same statistical yearbook as the farming statistics, 1996. We also collected statistics from the 2000 and the 2010 yearbooks to track the differences over the past 15 years.

In China, modernization did predict divorce, with wealthier provinces having more divorce: B(26) = 0.10, P = 0.01, β = 0.48. Adding rice to the model explained even more variation in divorce rates, with rice provinces having lower divorce rates: B(26) = –0.11, P = 0.005, β = –0.49. Pathogen prevalence did not predict divorce: B(20) = –0.01, P = 0.80, β = –0.07 (controlling for GDP). In 1996, wheat provinces had a 50% higher divorce rate than rice provinces. Although divorce rates have almost doubled in the past 15 years, the raw divorce-rate gap between the wheat and rice provinces remained the same in the 2000 and 2010 statistics.

We also analyzed the number of successful patents for new inventions in each province because research has shown that analytic thinkers are better at measures of creativity and thinking of novel uses for ordinary objects (21). Within the United States, immigrants from individualistic cultures hold more patents for inventions (22).

We controlled for GDP per capita because wealthier provinces had more patents: B(27) = 2.19, P < 0.001, β = 0.75. Rice provinces had fewer successful patents for new inventions than wheat provinces: B(26) = –1.42, P < 0.001, β = –0.46. Pathogen prevalence did not predict patents: B(19) = –0.34, P = 0.24, β = –0.22. Wheat provinces had 30% more patents for inventions than rice provinces. This difference persisted through the 2000 statistics but not the 2010 statistics.
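The divorce and patent analyses are province-level regressions with a GDP control. A minimal sketch of the patent model, again assuming invented column names rather than the authors’ code:

```python
# Province-level OLS: log patents per capita on GDP per capita, then
# with rice added. Columns 'patents_pc', 'gdp_pc', and 'rice' are
# invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def patent_models(prov: pd.DataFrame):
    prov = prov.assign(log_patents=np.log(prov["patents_pc"]))
    # Baseline: wealthier provinces hold more patents.
    gdp_only = smf.ols("log_patents ~ gdp_pc", data=prov).fit()
    # A negative coefficient on 'rice' with GDP held constant would
    # mirror the reported pattern (B = -1.42 in the paper).
    with_rice = smf.ols("log_patents ~ gdp_pc + rice", data=prov).fit()
    return gdp_only.params, with_rice.params
```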

This study shows that China’s wheat and rice regions have different cultures. China’s rice regions have several markers of East Asian culture: more holistic thought, more interdependent self-construals, and lower divorce rates. The wheat-growing north looked more culturally similar to the West, with more analytic thought, individualism, and divorce. Furthermore, Table 4 presents an instrumental variable regression showing that climatic suitability for rice significantly predicts all of the cultural variables in this study, which suggests that reverse causality is unlikely.


Table 4 Instrumental variable regressions.

In the topmost regression, “rice suitability” is a z score of the environmental suitability of each province for growing wetland rice based on the United Nations Food and Agriculture Organization’s Global Agro-ecological Zones database (27). In the five other regressions, “rice suitability” is the predicted rice from the topmost regression with rice suitability. Dash entries indicate not applicable.
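The logic of Table 4 is two-stage least squares: environmental suitability for rice serves as an instrument, on the assumption that climate can shape culture only through actual rice cultivation. A hedged sketch of the two stages (invented column names; a production analysis would use a dedicated IV routine so the second-stage standard errors are correct):

```python
# Manual 2SLS over provinces. Columns 'rice' (% paddy farmland),
# 'suitability' (z score from the FAO GAEZ database), and 'holism'
# (mean triad score) are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

def iv_rice_effect(prov: pd.DataFrame) -> float:
    # Stage 1: predict rice cultivation from climatic suitability.
    stage1 = smf.ols("rice ~ suitability", data=prov).fit()
    prov = prov.assign(rice_hat=stage1.fittedvalues)
    # Stage 2: regress the cultural outcome on instrumented rice.
    # A significant coefficient argues against reverse causality,
    # since thought style cannot change a province's climate.
    stage2 = smf.ols("holism ~ rice_hat", data=prov).fit()
    return float(stage2.params["rice_hat"])
```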

How large are these differences compared with East-West differences? We compared results on our main task (cultural thought style) in our China sample to a prior U.S. sample. An East-West categorical variable had an effect of B = 0.78. In the China data, a categorical rice-wheat variable had an effect of B = 0.38 (table S2). This suggests that rice versus wheat can explain a portion of the variance in thought style between East and West but not all of it. It should also be noted that psychologists have found holistic thought in parts of the world beyond East Asia, which suggests holistic thought is not just an East-West difference (23).

Modernization predicted divorce and patents, but why did it fail to predict the other differences? In China, modernization seems to have changed customs such as divorce, but perhaps the parts of culture and thought style we measured are more resistant to change. Or perhaps modernization simply takes more generations to change cultural interdependence and thought style. However, most of our participants were born after China’s reform and opening, which started in 1978. Furthermore, Japan, South Korea, and Hong Kong modernized much earlier than China, but they still score less individualistic on international studies of culture than their wealth would predict (fig. S2).

The rice theory can explain wealthy East Asia’s strangely persistent interdependence. China has a rice-wheat split, but Japan and South Korea are complete rice cultures. Most of China’s wheat provinces devote less than 20% of farmland to rice paddies. None of Japan’s 9 regions or South Korea’s 16 regions has that little rice (except for two outlying islands). Japan and Korea’s rice legacies could explain why they are still much less individualistic than similarly wealthy countries.

This study focuses on East Asia, but the rice theory also makes predictions about other parts of the world. For example, India has a large rice-wheat split. Indonesia and parts of West Africa have also traditionally farmed rice. If the rice theory is correct, we should find similar cultural differences there.

There are still unresolved questions with the rice theory. For example, studies can test whether irrigation is central to the effect of rice by comparing paddy rice with dryland rice cultures, which grow rice without irrigation. Studies can also explore how rice differences persist in the modern world, whether through values, institutions, or other mechanisms.

There is also the question of how long rice culture will persist after the majority of people stop farming rice. There is evidence that U.S. regions settled by Scottish and Irish herders have higher rates of violence, even though most locals stopped herding long ago (24). This is one example of how subsistence style can shape culture long after people have stopped relying on that subsistence style. In the case of China, only time will tell.

Psychologists, economists, and anthropologists have studied the effects of subsistence style and irrigation (1, 4, 25, 26). This study extends that work by using psychological measures to test differences resulting from rice and wheat agriculture. The rice theory provides a theoretical framework that might explain why East Asia is so much less individualistic than it “should be” based on its wealth. Finally, the rice theory can explain the large cultural differences within China, advancing a more nuanced picture of East Asian cultural diversity.

Supplementary Materials www.sciencemag.org/content/344/6184/603/suppl/DC1

Materials and Methods; Figs. S1 and S2; Tables S1 to S12

References (28–49)

References and Notes

1. R. E. Nisbett, K. Peng, I. Choi, A. Norenzayan, Culture and systems of thought: Holistic versus analytic cognition. Psychol. Rev. 108, 291–310 (2001). doi:10.1037/0033-295X.108.2.291 pmid:11381831
2. H. R. Markus, S. Kitayama, Culture and the self: Implications for cognition, emotion, and motivation. Psychol. Rev. 98, 224–253 (1991). doi:10.1037/0033-295X.98.2.224
3. H. C. Triandis, Individualism and Collectivism (Westview, Boulder, CO, 1995).
4. J. W. Berry, Independence and conformity in subsistence-level societies. J. Pers. Soc. Psychol. 7, 415–418 (1967). doi:10.1037/h0025231 pmid:6065870
5. P. M. Greenfield, Linking social change and developmental change: Shifting pathways of human development. Dev. Psychol. 45, 401–418 (2009). doi:10.1037/a0014726 pmid:19271827
6. C. L. Fincher, R. Thornhill, D. R. Murray, M. Schaller, Pathogen prevalence predicts human cross-cultural variability in individualism/collectivism. Proc. Biol. Sci. 275, 1279–1285 (2008). doi:10.1098/rspb.2008.0094 pmid:18302996
7. R. Inglehart, Globalization and postmodern values. Wash. Q. 23, 215–228 (2000). doi:10.1162/016366000560665
8. V. Guernier, M. E. Hochberg, J.-F. Guégan, Ecology drives the worldwide distribution of human diseases. PLOS Biol. 2, e141 (2004). doi:10.1371/journal.pbio.0020141 pmid:15208708
9. J. L. Buck, Land Utilization in China (Univ. Chicago Press, Chicago, IL, 1935).
10. M. Elvin, in The Chinese Agricultural Economy, R. Barker, R. Sinha, B. Rose, Eds. (Westview, Boulder, CO, 1982), pp. 13–35.
11. F. Xiaotong, Earthbound China: A Study of Rural Economy in Yunnan (Univ. Chicago Press, Chicago, IL, 1945).
12. F. Bray, The Rice Economies: Technology and Development in Asian Societies (Blackwell, New York, 1986).
13. State Statistical Bureau of the People’s Republic of China, China Statistical Yearbook (1996 and 2005).
14. Materials and methods are available on Science Online.
15. C. Junshi, T. C. Campbell, J. Li, R. Peto, Diet, Life-Style, and Mortality in China: A Study of the Characteristics of 65 Chinese Counties (Oxford Univ. Press, Oxford, 1990).
16. K. Peng, R. E. Nisbett, N. Y. C. Wong, Validity problems comparing values across cultures and possible solutions. Psychol. Methods 2, 329–344 (1997). doi:10.1037/1082-989X.2.4.329
17. L. J. Ji, Z. Zhang, R. E. Nisbett, Is it culture or is it language? Examination of language effects in cross-cultural research on categorization. J. Pers. Soc. Psychol. 87, 57–65 (2004). doi:10.1037/0022-3514.87.1.57 pmid:15250792
18. S. Kitayama, H. Park, A. T. Sevincer, M. Karasawa, A. K. Uskul, A cultural task analysis of implicit independence: Comparing North America, Western Europe, and East Asia. J. Pers. Soc. Psychol. 97, 236–255 (2009). doi:10.1037/a0015999 pmid:19634973
19. C. S. Wang, A. K.-Y. Leung, Y. H. M. See, X. Y. Gao, The effects of culture and friendship on rewarding honesty and punishing deception. J. Exp. Psychol. 47, 1295–1299 (2011).
20. D. Lester, Individualism and divorce. Psychol. Rep. 76, 258 (1995). doi:10.2466/pr0.1995.76.1.258
21. H. A. Witkin, C. A. Moore, D. R. Goodenough, P. W. Cox, Field-dependent and field-independent cognitive styles and their educational implications. Rev. Educ. Res. 47, 1–64 (1977). doi:10.3102/00346543047001001
22. S. A. Shane, Why do some societies invent more than others? J. Bus. Venturing 7, 29–46 (1992). doi:10.1016/0883-9026(92)90033-N
23. J. Henrich, S. J. Heine, A. Norenzayan, The weirdest people in the world? Behav. Brain Sci. 33, 61–83, discussion 83–135 (2010). doi:10.1017/S0140525X0999152X pmid:20550733
24. D. Cohen, R. E. Nisbett, Culture of Honor (Westview, Boulder, CO, 1997).
25. M. Harris, Cannibals and Kings (Random House, New York, 1977).
26. M. Aoki, Toward a Comparative Institutional Analysis (MIT Press, Cambridge, MA, 2001).
27. Food and Agriculture Organization (FAO)/International Institute for Applied Systems Analysis (IIASA), Global Agro-ecological Zones (GAEZ v3.0) (2010).
28. G. Hofstede, Culture’s Consequences: Comparing Values, Behaviors, Institutions, and Organizations Across Nations (Sage, Thousand Oaks, CA, ed. 2, 2001).
29. E. Suh, E. Diener, S. Oishi, H. C. Triandis, The shifting basis of life satisfaction judgments across cultures: Emotions versus norms. J. Pers. Soc. Psychol. 74, 482–493 (1998). doi:10.1037/0022-3514.74.2.482
30. M. J. Gelfand, D. P. S. Bhawuk, L. H. Nishii, D. J. Bechtold, “Individualism and collectivism,” in Culture, Leadership, and Organizations: The GLOBE Study of 62 Societies, R. J. House, P. J. Hanges, M. Javidan, P. W. Dorfman, V. Gupta, Eds. (Sage, Thousand Oaks, CA, 2004), pp. 437–512.
31. A. Maddison, “Historical statistics for the world economy: 1–2003 AD,” www.ggdc.net/maddison/historical_statistics/horizontal-file_03-2007.xls
32. A. F. Alesina, P. Giuliano, N. Nunn, On the origins of gender roles: Women and the plough. Q. J. Econ. 128, 469–530 (2013). doi:10.1093/qje/qjt005
33. R. Thornhill, C. L. Fincher, D. R. Murray, M. Schaller, Zoonotic and non-zoonotic diseases in relation to human personality and societal values: Support for the parasite-stress model. Evol. Psychol. 8, 151–169 (2010). pmid:22947787
34. R. Inglehart, P. Norris, Rising Tide: Gender Equality and Cultural Change Around the World (Cambridge Univ. Press, Cambridge, 2003).
35. Korea Statistical Information Service, Agricultural Census (Seoul, Korea, 1995).
36. Japanese Statistics and Information Department, 7-8 Area by Type of Cultivated Land and District (1950–2000) (Tokyo, 2009).
37. C. Holden, R. Mace, Phylogenetic analysis of the evolution of lactose digestion in adults. Hum. Biol. 81, 597–619 (2009). doi:10.3378/027.081.0609 pmid:20504185
38. C. J. Holden, R. Mace, Spread of cattle led to the loss of matrilineal descent in Africa: A coevolutionary analysis. Proc. R. Soc. London Ser. B 270, 2425–2433 (2003). doi:10.1098/rspb.2003.2535 pmid:14667331
39. M. Ruhlen, On the Origin of Language: Studies in Linguistic Taxonomy (Stanford Univ. Press, Stanford, CA, 1994).
40. L. L. Cavalli-Sforza, P. Menozzi, A. Piazza, The History and Geography of Human Genes (Princeton Univ. Press, Princeton, NJ, 1994).
41. J. Y. Chu, W. Huang, S. Q. Kuang, J. M. Wang, J. J. Xu, Z. T. Chu, Z. Q. Yang, K. Q. Lin, P. Li, M. Wu, Z. C. Geng, C. C. Tan, R. F. Du, L. Jin, Genetic relationship of populations in China. Proc. Natl. Acad. Sci. U.S.A. 95, 11763–11768 (1998). doi:10.1073/pnas.95.20.11763 pmid:9751739
42. B. Su, J. Xiao, P. Underhill, R. Deka, W. Zhang, J. Akey, W. Huang, D. Shen, D. Lu, J. Luo, J. Chu, J. Tan, P. Shen, R. Davis, L. Cavalli-Sforza, R. Chakraborty, M. Xiong, R. Du, P. Oefner, Z. Chen, L. Jin, Y-chromosome evidence for a northward migration of modern humans into Eastern Asia during the last Ice Age. Am. J. Hum. Genet. 65, 1718–1724 (1999). doi:10.1086/302680 pmid:10577926
43. Y. C. Ding, S. Wooding, H. C. Harpending, H. C. Chi, H. P. Li, Y. X. Fu, J. F. Pang, Y. G. Yao, J. G. Yu, R. Moyzis, Y. Zhang, Population structure and history in East Asia. Proc. Natl. Acad. Sci. U.S.A. 97, 14003–14006 (2000). doi:10.1073/pnas.240441297 pmid:11095712
44. B. J. Calder, L. W. Phillips, A. M. Tybout, Designing research for application. J. Consum. Res. 8, 197–207 (1981). doi:10.1086/208856
45. M. E. W. Varnum, I. Grossmann, S. Kitayama, R. E. Nisbett, The origin of cultural differences in cognition: The social orientation hypothesis. Curr. Dir. Psychol. Sci. 19, 9–13 (2010). doi:10.1177/0963721409359301 pmid:20234850
46. D. Oyserman, S. W. Lee, Does culture influence what and how we think? Effects of priming individualism and collectivism. Psychol. Bull. 134, 311–342 (2008). doi:10.1037/0033-2909.134.2.311 pmid:18298274
47. Y. Kashima, E. S. Kashima, Individualism, GNP, climate, and pronoun drop: Is individualism determined by affluence and climate, or does language use play a role? J. Cross Cult. Psychol. 34, 125–134 (2003). doi:10.1177/0022022102239159
48. This classification is slightly different from the 50% rice criterion that we use elsewhere. We excluded two southwestern wheat provinces (Yunnan and Guizhou, rice > 25%) because they are so far south that including them produces a subsample in which rice and temperature are correlated: r(16) = 0.48, P = 0.058. Thus, to get a subsample in which rice and temperature are uncorrelated, we chose the contiguous northern wheat area.
49. E. Van de Vliert, Climate, Affluence, and Culture (Cambridge Univ. Press, New York, 2009).

Acknowledgments: We thank Z. Xia, N. Qingyun, Y. Wu, Y. Wang, Y. Ma, and A. Jiao for collecting data; A. Leung and C. Wang for making the loyalty/nepotism task available; L. Jun Ji for the Chinese version of the triad task; M. Hunter for statistical guidance; and J. P. Seder, A. Putnam, Y. Wang, T. Wilson, and E. Gilbert for comments on earlier versions of this paper. The data are available at the Inter-University Consortium for Political and Social Research (ICPSR no. 35027) or by request to the first author. This research was supported by an NSF Graduate Research Fellowship and an NSF East Asian and Pacific Summer Institute Fellowship. The Beijing Key Lab of Applied Experimental Psychology supplied laboratory space for the study.

May 28, 2014 Millennials

My So-Called Opinions

April 6, 2014, By ZACHARY FINE, The Stone

I.
Critics of the millennial generation, of which I am a member, consistently use terms like “apathetic,” “lazy” and “narcissistic” to explain our tendency to be less civically and politically engaged. But what these critics seem to be missing is that many millennials are plagued not so much by apathy as by indecision. And it’s not surprising: Pluralism has been a large influence on our upbringing. While we applaud pluralism’s benefits, widespread enthusiasm has overwhelmed desperately needed criticism of its side effects.
By “pluralism,” I mean a cultural recognition of difference: individuals of varying race, gender, religious affiliation, politics and sexual preference, all exalted as equal. In recent decades, pluralism has come to be an ethical injunction, one that calls for people to peacefully accept and embrace, not simply tolerate, differences among individuals. Distinct from the free-for-all of relativism, pluralism encourages us (in concept) to support our own convictions while also upholding an “energetic engagement with diversity,” as Harvard’s Pluralism Project suggested in 1991. Today, paeans to pluralism continue to sound throughout the halls of American universities, private institutions, left-leaning households and influential political circles.
Those of us born after the mid-1980s grew up amid a new orthodoxy of multiculturalist ethics and ‘political correctness.’
However, pluralism has had unforeseen consequences. The art critic Craig Owens once wrote that pluralism is not a “recognition, but a reduction of difference to absolute indifference, equivalence, interchangeability.” Some millennials who were greeted by pluralism in this battered state are still feeling its effects. Unlike those adults who encountered pluralism with their beliefs close at hand, we entered the world when truth-claims and qualitative judgments were already on trial and seemingly interchangeable. As a result, we continue to struggle when it comes to decisively avowing our most basic convictions.
Those of us born after the mid-1980s whose upbringing included a liberal arts education and the fruits of a fledgling World Wide Web have grown up (and are still growing up) with an endlessly accessible stream of texts, images and sounds from far-reaching times and places, much of which was unavailable to humans for all of history. Our most formative years include not just the birth of the Internet and the ensuing accelerated global exchange of information, but a new orthodoxy of multiculturalist ethics and “political correctness.”
These ideas were reinforced in many humanities departments in Western universities during the 1980s, where facts and claims to objectivity were eagerly jettisoned. Even “the canon” was dislodged from its historically privileged perch, and since then, many liberal-minded professors have avoided opining about “good” literature or “high art” to avoid reinstating an old hegemony. In college today, we continue to learn about the byproducts of absolute truths and intractable forms of ideology, which historically seem inextricably linked to bigotry and prejudice.
For instance, a student in one of my English classes was chastised for preferring Shakespeare to the Haitian-American writer Edwidge Danticat. The professor challenged the student to apply a more “disinterested” analysis to his reading so as to avoid entangling himself in a misinformed gesture of “postcolonial oppression.” That student stopped raising his hand in class.
I am not trying to tackle the challenge as a whole or indict contemporary pedagogies, but I have to ask: How does the ethos of pluralism inside universities impinge on each student’s ability to make qualitative judgments outside of the classroom, in spaces of work, play, politics or even love?
II.
In 2004, the French sociologist of science Bruno Latour intimated that the skeptical attitude which rebuffs claims to absolute knowledge might have had a deleterious effect on the younger generation: “Good American kids are learning the hard way that facts are made up, that there is no such thing as natural, unmediated, unbiased access to truth, that we are always prisoners of language, that we always speak from a particular standpoint, and so on.” Latour identified a condition that resonates: Our tenuous claims to truth have not simply been learned in university classrooms or in reading theoretical texts but reinforced by the decentralized authority of the Internet. While trying to form our fundamental convictions in this dizzying digital and intellectual global landscape, some of us are finding it increasingly difficult to embrace qualitative judgments.
Matters of taste in music, art and fashion, for example, can become a source of anxiety and hesitation. While clickable ways of “liking” abound on the Internet, personalized avowals of taste often seem treacherous today. Admittedly, many millennials (and nonmillennials) might feel comfortable simply saying, “I like what I like,” but some of us find ourselves reeling in the face of choice. To affirm a preference for rap over classical music, for instance, implicates the well-meaning millennial in a web of judgments far beyond his control. For the millennial generation, as a result, confident expressions of taste have become more challenging, as aesthetic preference is subjected to relentless scrutiny.
Philosophers and social theorists have long weighed in on this issue of taste. Pierre Bourdieu claimed that an “encounter with a work of art is not ‘love at first sight’ as is generally supposed.” Rather, he thought “tastes” function as “markers of ‘class.’ ” Theodor Adorno and Max Horkheimer argued that aesthetic preference could be traced along socioeconomic lines and reinforce class divisions. To dislike cauliflower is one thing. But elevating the work of one writer or artist over another has become contested territory.
This assured expression of “I like what I like,” when strained through pluralist-inspired critical inquiry, deteriorates: “I like what I like” becomes “But why do I like what I like? Should I like what I like? Do I like it because someone else wants me to like it? If so, who profits and who suffers from my liking what I like?” and finally, “I am not sure I like what I like anymore.” For a number of us millennials, commitment to even seemingly simple aesthetic judgments has become shot through with indecision.
It seems especially odd because in our “postcritical” age, as the critic Hal Foster termed it, a diffusion of critical authority has elevated voices across a multitude of Internet platforms. With Facebook, Twitter and the blogosphere, everyone can be a critic. But for all the strident young voices heard across social media, there are so many more of us who abstain from being openly critical: Every judgment or critique has its weakness, making criticism seem dangerous at worst and impotent at best.
This narrative runs counter to the one that has been popularized in the press about the indefatigable verbiage of blog-hungry millennials, but it is a crucial one. The proliferation of voices has made most of them seem valueless and wholly interchangeable, even for important topics. To use social media to publicly weigh in on polarized debates, from the death of Trayvon Martin to the Supreme Court’s striking down of the Defense of Marriage Act, seems to do nothing more than provide fodder for those who would attack us. This haunts many of us when we are eager to spill ink on an issue of personal importance but find the page to be always already oversaturated.
III.
Perhaps most crucially, the pluralistic climate has confused stances on moral judgment. Even though “difference” has historically been used, according to the philosopher Cornel West, as a “justification for degradation and a justification for subordination,” we millennials labor to relish those differences and distances separating individuals, exalting difference at all costs.
We anxiously avoid casting moral judgment. Because with absolute truths elusive, what claims do we have to insist that our moral positions are better than those of someone from a different nation or culture?
Consider the challenge we might face when confronted with videos from the popular youth-oriented news outlet Vice. Here, viewers can watch videos of communities, from across the globe, participating in a host of culturally specific activities, ranging from excessive forms of eating to ritual violence to bestiality. While the greater Western culture may denounce these acts, a substantial millennial constituency would hesitate to condemn them, in the interest of embracing “difference.”
We millennials often seek refuge from the pluralist storm in that crawlspace provided by the expression “I don’t know.” It shelters the speaking-subject, whose utterances are magically made protean and porous. But this fancy footwork will buy us only so much time. We most certainly do not wish to remain crippled by indecision and hope to one day boldly stake out our own claims, without trepidation.
Zachary Fine is a junior at the New York University Gallatin School of Individualized Study.

Northstar5     Los Angeles
It’s interesting how many commentators say that the whole point of a college education is to learn other ways of seeing things. That’s just not true. That is only one part of the purpose of a college education.
Another huge and fundamental reason is to learn the great canon of human knowledge—to learn it, to carry it forward, so that the legacy of human civilization is learned and passed on generation to generation. It is to be part of the continuum of learning, and learning involves absorbing known information. Science, philosophy, history, literature. Facts. Ways of thinking. Sound reasoning.
Learning to open your mind is important, but we must safeguard and treasure actual facts and knowledge that people toiled intensely to discover in the first place.
Don’t forsake truth and actual facts and knowledge in the name of political correctness and being ‘open.’
In the words of the great Richard Dawkins: Don’t open your mind so much that your brain falls out.

John Ombelets   Boston, MA
I’d recommend “Zen and the Art of Motorcycle Maintenance.” It puts questions of personal taste and the validity of “just what I like” in excellent perspective.
Mike in Colorado      Denver
Of all that you can say about “Millennials” (I am on the younger edge of the Baby Boom), this entire essay only addresses a single attribute – what I call the “whatever” generation – where any idea can be diminished by implying that nothing really matters. I never thought it a result of indecision, but a relativism that says that nothing is important except personal wants. It is a world view where peripheral vision is unimportant, usually “Boring!” The attribute of Millennials most distasteful to me is their implicit desire to be immediately recognized at a young age and adored for their abilities, and how little they value the maturity gained with experience. There used to be a general acceptance that regardless of education, skills, and abilities, many necessary soft skills took time and experience – the ability to manage and lead people, for example. What I see from Millennials is, “I’m great, I’m talented, and I want it now” – the result, I think, of a generation of parents saying “You’re great, you’re talented, you deserve it now.” My own parents were more likely to say, “You’re smart enough, but really, you haven’t got a clue.” Now in my 50s I realize that they were right.

David Gutting     St. Louis
I have been personally responsible for overseeing an enormous amount of research on millennials, and I am convinced that this entire discussion is overblown.
Millennials are not that different from any younger generation that came before, especially around the issue Mr. Fine brings to light. When people are young they usually haven’t developed deeply informed opinions, and this often leads to either indecisiveness or to narrow-minded views. (Advancing deeper into adulthood often doesn’t improve this situation.)
In the last 20 years, there has been a significant change in the way society as a whole weighs in on certain social issues, notably on marriage equality. The liberal tendency in that time has less to do with millennials leading the charge and more to do with the fact that more and more people have personal experience with gay people in their own friend and family circles–and this, in turn, has been a by-product of increasing openness in our culture. A much more open gay culture drove this with a hard-fought effort for many years.
Millennials, at best, are followers on social issues. While marriage equality has seen a leftward movement, reproductive choice has definitely turned in the other direction–with right wing interests succeeding in winning more and more restrictions. On paper and in the polls, millennials are more “pro-choice.” But they have done little to personally take up a pro-choice cause and push back against this trend.

T Cecil      Silicon Valley
As I was reading this I had a thought running through my head similar to your statement that “I like what I like” becomes “But why do I like what I like?”
If the results that follow from this self-inquiry lead to pluralism or whatever, so be it, but I think the ability and instinct to question and analyze one’s likes and dislikes is a great evolutionary step for any generation.

Steven    NYC
So you’re suggesting that if there’s an imperfection in the current “system” then we should lunge back to the old straight-white-male hegemony of the past? I think we can work with pluralism. With any of its imperfections, it’s still better than what we had in the past. (In reply to RG)

Jason      St Louis, MO
Here’s an idea: grow a spine and get some actual scruples. If you feel strongly about something: gather evidence, form an informed opinion, and then make your voice heard. To me, this article doesn’t say that Millennials don’t have opinions or are afraid to voice them, it’s that they lack the fortitude to actually disagree with someone. There is a big difference between having a strong opinion and being culturally insensitive.
Also, maybe read some science or learn some math. Find out what it’s like to interact with a field of study that does have some objective truths…

RG      Chicago
Excellent analysis. The consequences of pluralism, multiculturalism, and moral relativism are now being seen. Some commenters just aren’t willing to accept the obvious: that pluralism / moral relativism has unintended consequences. The classical teachings in religion, literature, and culture are ridiculed and persecuted by the leftists. Universities have been taken over by social activists committed to their utopian ideal of “social justice” over the original ideal of truth. Open inquiry has been squashed by the politically correct. The humanities have devolved into deconstructed nonsense. The question “what is a good life?”, asked by philosophers throughout the ages, is now unanswerable according to the moral relativists. There is no truth, only your biased version of the truth. In all prior countries where the quest for social justice replaces the ideal of truth, a predictable outcome has occurred: tyranny and dystopia.

David Wiles      Northfield, MN
I’m struck by how often a discussion of a Liberal Arts education descends into or begins with a discussion of literature courses as if everyone majors in English. I teach in a small liberal arts college that attracts a “progressive” student body taught by a faculty that is widely thought of as being “liberal.” We offer degrees in forty or so subjects only one of which is English. A look at our course catalogue (easy to see, it’s online just like everyone’s is) and our requirements shows that our students are required to take no more than one lit course unless they major in English. A sample look at our English department offerings shows courses in Medieval and Renaissance lit, Chaucer, Milton, Victorian Lit, 19th Century American Lit, Marlowe (Chris not Phil), American Transcendentalism and at least three courses in Shakespeare among many other things that critics seem to think are no longer taught.
My point is that most critiques I read of the Liberal Arts don’t seem to be informed by much if any knowledge of what actually goes on. Instead we get anecdotes about what some teacher said to some student, and they become the model. We get recycled debates from 35 years ago as if everything changed then and nothing has changed since. Let me suggest looking at the readily available evidence of what’s taught. Looking at evidence. There’s a name for that: critical thinking.

ARP   Northeastern US
This article is a protracted hasty generalization. The author takes it as self-explanatory that millions of people born in an arbitrary time period are “indecisive” and chalks it up to an ethos of “pluralism.” In doing so he makes a move that is not unfamiliar in mass media writing about generations: take a larger cultural anxiety and project it onto an imagined monolithic group. The “millennial” stereotype is often coded white, for instance. On the other hand, George Zimmerman is a millennial who doesn’t seem tied up by “pluralism.” This problem with the article comes back when Fine uncritically invokes the myth of political correctness, which is really just a moral panic propagated by people unhappy with the expansion of the curriculum.
Fine namechecks Adorno and Horkheimer, but obscures the argument. He and anyone else considering writing sweeping claims about “generations” might wish to revisit them. It is not that taste breaks down along class lines, but that the culture industry creates groups by reducing all humans to knowable categories. The reality of concepts like “Boomer,” “Gen X,” “millennial,” etc. comes from somewhere. That reality is produced partly by mass media writers opining about “generations.” But it also comes from marketers who create the “millennial” as a category some people can identify with, in order to produce value more effectively.

Siobhan    New York
I’m surprised at all the criticisms of this piece. I thought it was great.
Mr. Fine has done a wonderful job of explaining the impossible task put before him and the rest of his generation. Every opinion or judgment must be weighed for its greater cultural and moral meaning, for its implications.
Yet supposedly, no one thing is better than another. And simultaneous with that, his generation is asked why they do not take a stand on things.
When they live simply by their own values, they are called narcissists. But a preference for Shakespeare over Haitian literature – well, that’s about as loaded as it gets.
Pluralism has turned itself inside out. Respect for all has become equated with all are equal, now make a choice.
I thank him for this interesting and illuminating piece of writing.

W. Freen      New York City
Zach: Thank you for your column. I would add that, because Millennials live so much of their lives online, they are constantly confronted by commenters who pick things apart and are eager to tell them that everything they think, write, say and do is wrong. The first batch of comments here in response to your column are perfect examples. I can’t imagine what it must be like to have lived one’s young life in the face of so much manufactured negativity. Try to spend more time offline and ignore the boo-birds.

gemli      Boston
You poor dears. In an effort to provide anchors for your wobbly worldviews, here are some absolutes:
Good things, in no particular order: Coca Cola; Roger Ebert; Torvill and Dean; Science; Scientists; Northern Exposure, season 3, episode 10, “Seoul Mates”; Christopher Hitchens; Popular music before 1975; Director Mike Leigh; The Roches; David Copperfield’s Flying illusion; Jim Jarmusch; Eraserhead; 2001: A Space Odyssey; Playing an instrument; Books made of paper and ink; Good sound systems; HDTV; Grave of the Fireflies; Randy Newman; New Orleans food; Ricky Gervais; The (British) Office; Deadwood; Battlestar Galactica (2002); most things by Pixar.
Bad things: Social media; Religion; 95% of the musical guests on Saturday Night Live; All -phobes and most -isms; Republicans since Reagan; Things Republicans think; Things Republicans believe; low-information voters; new-age anything; products sold by infomercials; pseudoscience; the weather; local TV news; cheap ear buds; pretentious art; How I Met Your Mother; Generations of kids who can’t tell good from bad; e-anything (except Amazon.com and YouTube); teachers who tell kids all things are relative.
Make your own list. All of these are just suggestions (with the possible exception of Randy Newman).

Masaccio     Chicago
The point of going to college is to learn other ways to think about things. Maybe it was easier for me; I studied existentialism, and learned that no matter how hard the situation appeared, the responsibility for making decisions was on me and me alone.
I’m familiar with the arguments you hear in your classes; I’ve read a bunch of that stuff myself, and occasionally rail against it for the same reasons. It doesn’t matter. I’m still responsible, and I still have to act.
It’s not that hard.

Jack     NYC
In spite of the many valid criticisms below, I think this essay raises issues that should be considered seriously. Even here in the NY Times comment section, I have written non-offensive observations of cultural groups which were not published. These groups seem to have blanket protection from any form of criticism. This type of oppression, in the name of diversity or political correctness or whatever you want to call it, is preventing serious people from making observations about what is wrong with our society.
This bias against saying anything that might be considered offensive on the surface is strangling our ability to speak freely. I’m tired of having to try to figure out what’s going on by looking at the subtext beneath writing that is full of polite euphemisms.

David Underwood    Citrus Heights
Oh my, it appears to me that there is a need to study logic, rhetoric, and how to research history.
Although I went back to college in the 1970s, I do not recall any professors making such blatant contradictory claims.
“we entered the world when truth-claims and qualitative judgments were already on trial and seemingly interchangeable.” How would one know what truth meant in such a case? Unless you can define it, you cannot make any statements about it.
Or this one:
“Good American kids are learning the hard way that facts are made up, that there is no such thing as natural, unmediated, unbiased access to truth, that we are always prisoners of language, that we always speak from a particular standpoint, and so on.”
Is that a fact I ask?
“We anxiously avoid casting moral judgment. Because with absolute truths elusive, what claims do we have to insist that our moral positions are better than those of someone from a different nation or culture?”
Is this a truth? If it is not, then it is a contradiction in terms.
The whole article is full of claims that contradict each other. There are no truths, is that a truth?
You can peruse the article and find several instances of using a concept to refute itself; that, as I recall, is known as fallacious or circular reasoning.
Since I was a bit older than most of my classmates, I had no compunction challenging such statements. The question, is that a fact, usually made an impression.

David Wiles     Northfield, MN
Has the author considered that the group that consists of “(t)hose of us born after the mid-1980s whose upbringing included a liberal arts education” excludes the overwhelming majority of people born after the mid-1980s?
If he has considered this, why has he gone on to generalize about how “his” generation thinks?

SDK      Boston, MA
I sincerely doubt that anyone majoring in Math, Business, Engineering or Biology can sympathize with or even understand this article. With my undergraduate degree in cultural studies and 10 years in academia, I understood it very well.
I agree with the writer that there is an issue — but it is limited to the humanities and particularly that discipline at elite schools. In my degree (’93), I learned how to deconstruct — and that was it. Pointing out how power imbalances are re-inscribed in everything that seems good at first glance is the only skill one needs. That, and the ability to hyphenate most of your nouns and adjectives.
That’s a pity, because the work of tearing down is much easier than the work of building up, taking a position, and taking action. This is not taught in the humanities but it can be learned elsewhere.
My advice to Mr. Fine is to take a few business classes, some sociology, and some chemistry. Spend some time working in your community. In these places, you can start to use your critical eye in more creative and productive ways.

May 15, 2014 Diversity

 

Sotomayor’s Race Dissent

The most complete explanation of Barack Obama’s and Eric Holder’s reasoning on race.
By Daniel Henninger
April 30, 2014, The Wall Street Journal

Attorney General Eric Holder, in a speech to Justice Department employees, praised Justice Sonia Sotomayor’s dissent in last week’s Supreme Court decision upholding Michigan’s ban on race-based admissions to its state universities. He called it “courageous and very personal.”

It was personal. Toward the end of her 58-page dissent, she said this about the six Justices who formed the plurality:

“More fundamentally,” Justice Sotomayor wrote, the plurality “ignores the importance of diversity in institutions of higher education and reveals how little my colleagues understand about the reality of race in America.” Those colleagues are Chief Justice Roberts and Justices Kennedy, Alito, Scalia, Breyer and Thomas.

Justice Sotomayor’s dissent in Schuette v. BAMN provides the most complete explanation I’ve seen of the reasoning behind the views on race of President Obama and Attorney General Holder. Over five years, the administration has repeatedly challenged various states on their voting practices, intervened to alter the racial composition of public-school populations and racial patterns in housing. Disagreement between Democrats and Republicans over voter ID laws has been particularly contentious.

Some of this is politics. But some of it is belief about the status of race in America a half century after passage of landmark civil-rights legislation in 1964.

“Race matters,” Justice Sotomayor wrote. It matters “because of persistent racial inequality that cannot be ignored and that has produced stark socioeconomic disparities.”

In 2006, Michigan voters by 58% approved a constitutional amendment that forbids the use of race-based preferences for admissions to the state’s universities. Eight other states have similar bans, including California.

Michigan’s ban on race-based admissions, says Justice Sotomayor, is not the result of “invidious intent” to discriminate as in the past. Instead the Michigan “majority” resorted to something that she calls “the last chapter of discrimination.” Its admissions amendment unfairly “changed the rules” of the political process. Prior to the amendment, she says, minorities persuaded Michigan’s elected Board of Regents to use “race-sensitive” university admissions policies. The voters’ ban eliminated the Regents’ policy and therefore “burdened racial minorities.”

Some, including the Court’s majority, would say the amendment was a proper exercise of the democratic political process. Justice Sotomayor replies: “While our Constitution does not guarantee minority groups victory in the political process . . . [i]t guarantees the majority may not win by stacking the political process against minority groups permanently, forcing the minority alone to surmount unique obstacles in pursuit of its goals.” In Michigan, that goal was the value of “racial diversity” in the student body.

Equal protection, she adds, is about groups, not mere individuals: “Discrimination against an individual occurs because of that individual’s membership in a particular group.”

And so believes the Obama administration.

Last year, the Justice Department sued to stop Louisiana’s school voucher program, arguing that when black parents took their kids out of public schools to attend, say, a Catholic school, this increased “the racial identifiability” of the schools. That is, the abandoned public schools had too many white students and so were no longer diverse and had become unequal.

This presumably would also be the rationale for the Justice Department’s interventions against voter ID laws, most famously its lawsuit last year against North Carolina. By requiring an ID, the majority is “changing the rules” in a way that disadvantages black voters. In Justice Sotomayor’s words: “This means vigilantly policing the political process to ensure that the majority does not use other methods to prevent minority groups from partaking in that process on equal footing.” These are what she calls “third-generation barriers.”

In the last line—a footnote—of his concurring opinion, Justice Scalia (joined by Justice Thomas) says that Justice Sotomayor is likening the “majority” in Michigan to the same “majority” who created the Jim Crow laws. She denies that. So what is an average voter supposed to believe?

The Sotomayor dissent in Schuette, as its supporters say, is an important statement of progressive belief about race. Let’s assume that they, Justice Sotomayor, President Obama and Mr. Holder, wish most Americans would agree with their point of view on race and so support it. If only all could read the Sotomayor dissent and render a national opinion about their racial views.

We can guess. I think it’s fair to say that many who read her reasoning on how Michigan’s voters or other “majorities” are using the political process to harm minorities and produce inequality in every aspect of American life would say: I just don’t get the argument. They might, for instance, ask her about the four-decade catastrophe of urban public schools.

The intricate case she is making about “third-generation barriers” to equality and such—arguments developed by liberal law professors over the past 25 years—is not persuasive. I doubt an open-minded majority would agree with it. It could, of course, be imposed anyway by court mandate.

One is left to conclude from the Sotomayor dissent that no matter how much progress people think has been made toward fulfilling the mandate of the 14th Amendment, an argument of some sort will be fashioned to say that equality is forever disappearing toward the horizon, and unattainable. After 50 years, where does that leave us? Polarized.

Letters May 7, 2014
Racial Diversity Seems to Have Become an End in Itself
I’ve thought of racial diversity initiatives as a means to an end, but Justice Sotomayor seems to view racial diversity as the desired end of public policy.

Regarding Daniel Henninger’s “Sotomayor’s Race Dissent” (Wonder Land, May 1): Justice Sonia Sotomayor’s dissent in Schuette v. BAMN brings into better focus the Eric Holder-President Obama understanding of race and race relations in America. I’ve thought of racial diversity initiatives as a means to an end, a way to break down the separation that resulted from de jure and de facto segregation, and enable people to experience one another as fellow human beings. In that light I think it has been useful and has enjoyed a degree of success. But Justice Sotomayor’s opinion in the Michigan case and the Justice Department’s position in the Louisiana school-voucher case and voter-ID disputes seem to view racial diversity as the desired end of public policy. Anything that has the potential of reducing racial diversity, even in the smallest way, is suspect.
This is bizarre. Doesn’t such a policy assume that there are inherent differences in people based on race? Doesn’t it assume that a majority race cannot be expected to treat a minority race fairly on its own? Doesn’t it assume racial conflict as a permanent human condition? Aren’t these points of view, in fact, racist? We’ll never free ourselves completely from conflict, but we have made significant and important progress in accepting one another as equals. Attempting to maintain diversity in appearance through fiat doesn’t help. The Court’s decision allows the social process to play itself out with due regard to the facts and circumstances of time and place.
John D. Hatch
Tarpon Springs, Fla.

That racial disparity is still a fact in America is undeniable. The Supreme Court’s ruling reminds us of our division on the question of how much government should do to try to rectify it. Let’s say we land on one side or the other of that divide based purely on whether we believe government action can be effective. Those among us who believe it cannot be are boosted by the Court’s decision. But there’s a bigger question for all of us here. If down the road we find racial disparity continues to persist with or without government action, is that OK?
Michael Young
Port Hueneme, Calif.

Supreme Court Justice Sonia Sotomayor’s dissenting opinion in Schuette v. BAMN stands in stark reaction to the revolutionary idea that created the U.S. Constitution as a restraint upon the government. Justice Sotomayor envisions the Constitution as a restraint upon the American people.
Morgan Foster
Indianapolis

If, as Justice Sotomayor insists, equal protection of “individuals” must be understood in the context of “membership in a particular group,” when can her version of racism be deemed defeated except when a plurality of oppressed “groups” achieves victory over the apparently monolithic “group” that does the oppressing? As the demographic trends of the so-called “minority-majority” accelerate America’s coalescence into a country with no racial majority, will the oppressing “group” still be deemed too powerful by progressive elites and their allies in federal government?

By emphasizing individual experience viewed in light of race, the Obama-Holder-Sotomayor race camp may effect a future outcome that is the opposite of what they claim to intend. Whether the Balkans in the 1990s, Czechoslovakia in the late 1930s or Ukraine in the present day, history provides stark examples of individuals acting in the name of “groups.”
Kurt Hofer
Altadena, Calif.

A black attorney speaking to my high-school law class more than 30 years ago said that “when minorities start winning the game, those in power change the rules.” That is the constitutional wrong that Justice Sotomayor addressed in her dissent and that Daniel Henninger fails to adequately address.
But it was Justice Sotomayor who got it right. Her passionate, perceptive and well-reasoned dissent reminds us that what that attorney said to my class so long ago is still a reality, but one not permitted by our Constitution.
Rick Nagel
Mercer Island, Wash.

TOM PAINTER Wrote:
Sotomayor has been mis-educated – thanks to “progressives” in academia and “living constitution” judges – and has herself come to accept an erroneous understanding of the “equal protection” clause.
It has NOTHING to do with any constitutional mandate on the use of federal authority to “make us equal”.
Example: A man applies for a liquor license, having observed that he meets the qualifications for one and has fulfilled all the requirements of applying for one. His license is granted. A second man follows the same path and his license is denied. He sees that nothing other than his “race or national origin” distinguishes him from the man whose license application was approved. The “equal protection” clause can be the second man’s basis for a suit – the law was not applied equally, due only to his race or national origin. With some technical caveats, that very simple example represents the form of circumstances that the equal protection clause was written to help prevent – a law as written not being applied as written on equal terms, no matter someone’s race or national origin. That’s all.
The Michigan state constitutional amendment is in complete accord with the equal protection clause, while Sotomayor actually seeks to deny “equal protection” of the laws and have the law discriminate on the basis of race or national origin. Hers is not belief in or respect for the Constitution but belief in and respect for only a political agenda NOT supported by the equal protection clause.

Charles Frederick Wrote:
Sotomayor describes herself, and these are her own words, as “a wise Latina woman with the richness of her experiences (who) would more often than not reach a better conclusion than a white male who hasn’t lived that life.”
What hubris, and what a sorry explanation to rationalize her radical leftist ideology.

Jonathan Murray Replied:
Diversity is a false front for affirmative action.

Tom Painter
The “diversity” meme is inherently racist. It assumes you MUST think differently because of your race. From that erroneous meme, Liberals have come to loathe “minority” Conservatives who have broken out of the Liberals’ intellectual plantation and don’t fit the mold they have tried to CREATE for “minorities”.

Richard Davidson Replied:
I can tell you only this. When I started at Motorola in 1967, located in Chicago back then, minorities in the professional jobs were almost non-existent. When a friend gave me a tour a few years ago at their Schaumburg, IL site, the technical and management people were from every conceivable race, creed, nationality, and color. It was like a tour of the UN.
Something happened in those 40 years. Diversity is the corporate mantra. You explain it.
The alleged benefits of diversity that you assert are unproven. There is evidence that diversity of thought is productive in places like businesses, but there is no evidence that diversity based on skin color confers any benefit to anybody except for affirmative action candidates and the industry that feeds off of them.

DAVID KNUDSEN Wrote:
What is the evidence that diversity benefits minorities?

Michael Love Wrote:
Unfortunately, the hour is late and I have miles to go before I sleep, but please ponder the entire context of the racial progress you allege. In 1960, a dark-skinned American driving most anywhere in the USA would have a significantly higher likelihood of being pulled over by the police. That hasn’t changed today. What has changed over the years is the rate of incarceration in our nation, and especially of minorities. From 1980 to 2008 the rate of incarceration quadrupled in the United States, from roughly 500,000 people to 2.3 million people. African Americans constitute nearly 1 million of this 2.3 million population. African Americans are incarcerated at a rate six times that of Whites.
Examine the following from NACDL: Together, African Americans and Hispanics comprised 58% of all prisoners in 2008, even though African Americans and Hispanics make up approximately one quarter of the US population. According to Unlocking America, if African Americans and Hispanics were incarcerated at the same rates as whites, today’s prison and jail populations would decline by approximately 50%. One in six black men had been incarcerated as of 2001. If current trends continue, one in three black males born today can expect to spend time in prison during his lifetime. Five times as many Whites are using drugs as African Americans, yet African Americans are sent to prison for drug offenses at ten times the rate of Whites. It’s widely reported that 35% of black children in grades 7-12 have been suspended or expelled at some point in their school careers, compared to 20% of Hispanics and 15% of Whites.
So what? Well, that’s the actual progress in establishing racial fairness that has been made in America during the last several decades. The majority that Justice Sotomayor calls out is insulated or detached, apparently unable to see the broad picture in terms of our history, or past their living room walls and further than their television sets. Those six justices and the author of the WSJ opinion piece also fail to recall that protecting minorities from oppression by oppressive majorities is part of the fabric of our founding ideals. Racism is alive and well. Bigots are not just occasional cranks – it’s not just the moocher rancher in Nevada or the twisted NBA franchise owner in Los Angeles. Racial bias still permeates everyday America, and when you look at the criminal justice system, it’s like mounting a slice of America on a wet slide to put under a microscope. Look objectively. Shuck your defensiveness and ideological agendas, the fashionable labels of liberal and conservative; once unencumbered, you too will see the effects of racism in action as played out in our schools and our courts.
The biggest change in racism from 1960 to today is that we give equality lip service and racial bias has been camouflaged. The majority via Schuette v. Coalition to Defend Affirmative Action simply yields to ochlocracy because, in their myopia, they see racism as a problem of the past and now solved. What’s next? Perhaps allowing other states to reestablish segregated schools by ballot initiative? It’s a slippery slope when the Court abandons principle.
On ochlocracy, see John Adams, A Defence of the Constitutions of Government of the United States of America, Vol. 3 (London: 1788), p. 291.

WILLIAM L. JOHNSON Replied:
Michael,
You have no clue. The incarceration rate among blacks is because they COMMIT CRIMES! Have you ever been in the hood… at night? In Chicago, there are 20-30 shootings PER NIGHT on the weekends and 5-15 during the week. And that’s only the ones that hit someone. These black youth that you believe have been “railroaded” into jail have rap sheets a mile long… literally, 4 to 8 PAGES of crimes that they have been convicted of before they are 21. I have 2 good friends who work in the system: a DA and a PO. I get to hear these stories all the time. How about the one with the liberal, lenient judge who lets off the 18-year-old defendant with only probation even in the face of a long rap sheet, and he gets rearrested within an hour because HE BROKE INTO THE JUDGE’S CAR TO STEAL HIS LAPTOP AND RADIO!
And I live in a retirement city, not an urban center.

Stephen Carroll Wrote:
Sotomayor should step down. She has shown a level of bigotry that can no longer be tolerated in our country. We must end the hate in our country that this woman of the past has exposed in her thoughts. Obama is right: if you want to see just how ignorant a person is, just let them talk. Racism at any level is wrong, and coming from the uneducated it is regrettable. Coming from the educated it is inexcusable. Her sin is worse because she knows better. A step backward not only for whites but for blacks. All men are created equal. You cannot punish the sons and daughters for the sins of their parents. Obama, Holder and now Sotomayor have added mightily to racism. Are there any liberals of good heart, or are you all blinded by your hate?

Douglas Oglesby Wrote:
The Wise Latina was a leader in the National Council of La Raza. Ginsburg, the other dissenter, was general counsel of the ACLU. Ideology and, at least in Sotomayor’s case, outright racism trump objective legal analysis every time.
“More fundamentally,” Justice Sotomayor wrote, the plurality “ignores the importance of diversity in institutions of higher education and reveals how little my colleagues understand about the reality of race in America.”
Is there any evidence that diversity as practiced by elite schools such as the University of Michigan, i.e. admitting minority students with lesser academic qualifications in the interest of “diversity,” actually benefits minorities who would not have gained admission on their merits? Is there any evidence that such students would not have been better served (e.g., higher graduation rate, success in STEM programs) in different universities that would have accepted them without racial preferences? Won’t the accomplishments of minorities admitted to elite universities on their merits be diminished because such universities also admit minorities with lesser qualifications?
How do Sotomayor and Ginsburg explain the academic success of Asian students, recalling that Asians were the victims of rampant racism (e.g., Chinese laborers in California goldmines and railroad construction in the mid-to-late 1800s, the internment of the Japanese in WWII)? Prior to the passage of California’s Prop 209, the number of Asians admitted to the UC system was cut back from what it would have been under a race-neutral admission policy, because their merit-based admission rate would have been vastly disproportionate to their representation in the overall CA population. Even today, whites are third in representation in the UC schools (36% Asian, 29% Hispanic, 27% white). Would the dissenters agree that whites should be entitled to preferential admission?

Steve Haynes Wrote:
We are fortunate that in this case the Supreme Court made the right decision; however, we all know it seems to be hit or miss.
The bigger point is that our court is being used to make moral decisions and impose a social direction on our country. You can see how upset Sotomayor was at losing this opportunity. These people were not elected, yet our country waits with bated breath for every decision they make. The framers never intended the courts to be used in this manner.
Obama and the Dems are continuing to pack the courts, so this avenue will no longer be an outlet of justice. Soon the Scalias and Thomases, the last bastions of our Constitution, will be gone. The lower courts are already filled with Sotomayors and Kagans; they legislate through emotion and agenda, and there is no rule of law anymore. This is why Harry Reid went nuclear: no more objection to judges.
Our country is in dire need of citizens who are involved and active. Federal involvement is not enough; acting locally is critical, because the leaders of tomorrow come from smaller government at the state and city level.
We are in a dangerous place, but through the citizenry things can change if we work together.

John Kelly Wrote:
Explaining Sotomayor’s opinion, Henninger said, “The voters’ ban eliminated the Regents’ policy and therefore [quoting Sotomayor] ‘burdened racial minorities.’ ” Any elimination of a governmental privilege, whether that privilege is earned or unearned, “burdens” the formerly privileged. In using this argument, Justice Sotomayor sides with former slave owners, who felt themselves unfairly burdened when the slaves were freed.

Scott Horsburgh Wrote:
Here’s a radical idea. Instead of “affirmative action” based on race, why not give an additional boost to students who overcame adversity (poverty) and achieved anyway? Isn’t that what we should be trying to accomplish: giving the benefit of the doubt to a kid who just missed the cut at an elite university but didn’t have the advantage of living in an upper-middle or high-income household? Is a poor white kid less deserving of a hand up than a minority kid from a high-income family?

Anthony Brunsvold Replied:
Why is diversity merely skin deep?
Wouldn’t many of these same institutions benefit if they reached out to poor rural whites, for example? Based on my experience, most of the people at institutions of higher learning have not had much contact with people like them, nor do they come from that background. Such whites could bring insights and experiences that are rather new to many on a college campus. Yet, for all the discussion of diversity, one never hears about this kind of outreach.
My guess as to why (and it is a guess, since I don’t know you personally) is that the Paula Dowlings of the world don’t mind their sons and daughters hanging out with a black liberal or a Hispanic liberal… but to hang out with (much less marry) a hick would be too much.
Liberal bigotry just takes on different forms, but make no mistake, it still exists.

Richard Davidson Replied:
It benefits society when fairness in hiring, housing, education, and everything else is the norm.
During the enactment of the Civil Rights Act, it was said that you could not legislate morality. We can, and we did. That is what George F. Will, the conservative commentator, said on an ABC This Week roundtable.
