May 28, 2014 Millennials

My So-Called Opinions

April 6, 2014 | By ZACHARY FINE | The Stone

Critics of the millennial generation, of which I am a member, consistently use terms like “apathetic,” “lazy” and “narcissistic” to explain our tendency to be less civically and politically engaged. But what these critics seem to be missing is that many millennials are plagued not so much by apathy as by indecision. And it’s not surprising: Pluralism has been a large influence on our upbringing. While we applaud pluralism’s benefits, widespread enthusiasm has overwhelmed desperately needed criticism of its side effects.
By “pluralism,” I mean a cultural recognition of difference: individuals of varying race, gender, religious affiliation, politics and sexual preference, all exalted as equal. In recent decades, pluralism has come to be an ethical injunction, one that calls for people to peacefully accept and embrace, not simply tolerate, differences among individuals. Distinct from the free-for-all of relativism, pluralism encourages us (in concept) to support our own convictions while also upholding an “energetic engagement with diversity,” as Harvard’s Pluralism Project suggested in 1991. Today, paeans to pluralism continue to sound throughout the halls of American universities, private institutions, left-leaning households and influential political circles.
However, pluralism has had unforeseen consequences. The art critic Craig Owens once wrote that pluralism is not a “recognition, but a reduction of difference to absolute indifference, equivalence, interchangeability.” Some millennials who were greeted by pluralism in this battered state are still feeling its effects. Unlike those adults who encountered pluralism with their beliefs close at hand, we entered the world when truth-claims and qualitative judgments were already on trial and seemingly interchangeable. As a result, we continue to struggle when it comes to decisively avowing our most basic convictions.
Those of us born after the mid-1980s whose upbringing included a liberal arts education and the fruits of a fledgling World Wide Web have grown up (and are still growing up) with an endlessly accessible stream of texts, images and sounds from far-reaching times and places, much of which was unavailable to humans for all of history. Our most formative years include not just the birth of the Internet and the ensuing accelerated global exchange of information, but a new orthodoxy of multiculturalist ethics and “political correctness.”
These ideas were reinforced in many humanities departments in Western universities during the 1980s, where facts and claims to objectivity were eagerly jettisoned. Even “the canon” was dislodged from its historically privileged perch, and since then, many liberal-minded professors have avoided opining about “good” literature or “high art” to avoid reinstating an old hegemony. In college today, we continue to learn about the byproducts of absolute truths and intractable forms of ideology, which historically seem inextricably linked to bigotry and prejudice.
For instance, a student in one of my English classes was chastised for preferring Shakespeare over the Haitian-American writer Edwidge Danticat. The professor challenged the student to apply a more “disinterested” analysis to his reading so as to avoid entangling himself in a misinformed gesture of “postcolonial oppression.” That student stopped raising his hand in class.
I am not trying to tackle the challenge as a whole or indict contemporary pedagogies, but I have to ask: How does the ethos of pluralism inside universities impinge on each student’s ability to make qualitative judgments outside of the classroom, in spaces of work, play, politics or even love?
In 2004, the French sociologist of science Bruno Latour intimated that the skeptical attitude which rebuffs claims to absolute knowledge might have had a deleterious effect on the younger generation: “Good American kids are learning the hard way that facts are made up, that there is no such thing as natural, unmediated, unbiased access to truth, that we are always prisoners of language, that we always speak from a particular standpoint, and so on.” Latour identified a condition that resonates: Our tenuous claims to truth have not simply been learned in university classrooms or in reading theoretical texts but reinforced by the decentralized authority of the Internet. While trying to form our fundamental convictions in this dizzying digital and intellectual global landscape, some of us are finding it increasingly difficult to embrace qualitative judgments.
Matters of taste in music, art and fashion, for example, can become a source of anxiety and hesitation. While clickable ways of “liking” abound on the Internet, personalized avowals of taste often seem treacherous today. Admittedly, many millennials (and nonmillennials) might feel comfortable simply saying, “I like what I like,” but some of us find ourselves reeling in the face of choice. To affirm a preference for rap over classical music, for instance, implicates the well-meaning millennial in a web of judgments far beyond his control. For the millennial generation, as a result, confident expressions of taste have become more challenging, as aesthetic preference is subjected to relentless scrutiny.
Philosophers and social theorists have long weighed in on this issue of taste. Pierre Bourdieu claimed that an “encounter with a work of art is not ‘love at first sight’ as is generally supposed.” Rather, he thought “tastes” function as “markers of ‘class.’ ” Theodor Adorno and Max Horkheimer argued that aesthetic preference could be traced along socioeconomic lines and reinforce class divisions. To dislike cauliflower is one thing. But elevating the work of one writer or artist over another has become contested territory.
This assured expression of “I like what I like,” when strained through pluralist-inspired critical inquiry, deteriorates: “I like what I like” becomes “But why do I like what I like? Should I like what I like? Do I like it because someone else wants me to like it? If so, who profits and who suffers from my liking what I like?” and finally, “I am not sure I like what I like anymore.” For a number of us millennials, commitment to even seemingly simple aesthetic judgments has become shot through with indecision.
It seems especially odd because in our “postcritical” age, as the critic Hal Foster termed it, a diffusion of critical authority has elevated voices across a multitude of Internet platforms. With Facebook, Twitter and the blogosphere, everyone can be a critic. But for all the strident young voices heard across social media, there are so many more of us who abstain from being openly critical: Every judgment or critique has its weakness, making criticism seem dangerous at worst and impotent at best.
This narrative runs counter to the one that has been popularized in the press about the indefatigable verbiage of blog-hungry millennials, but it is a crucial one. The proliferation of voices has made most of them seem valueless and wholly interchangeable, even for important topics. To use social media to publicly weigh in on polarized debates, from the death of Trayvon Martin to the Supreme Court’s striking down of the Defense of Marriage Act, seems to do nothing more than provide fodder for those who would attack us. This haunts many of us when we are eager to spill ink on an issue of personal importance but find the page to be always already oversaturated.
Perhaps most crucially, the pluralistic climate has confused stances on moral judgment. Even though “difference” has historically been used, according to the philosopher Cornel West, as a “justification for degradation and a justification for subordination,” we millennials labor to relish those differences and distances separating individuals, exalting difference at all costs.
We anxiously avoid casting moral judgment. Because with absolute truths elusive, what claims do we have to insist that our moral positions are better than those of someone from a different nation or culture?
Consider the challenge we might face when confronted with videos from the popular youth-oriented news outlet Vice. Here, viewers can watch videos of communities, from across the globe, participating in a host of culturally specific activities, ranging from excessive forms of eating to ritual violence to bestiality. While the greater Western culture may denounce these acts, a substantial millennial constituency would hesitate to condemn them, in the interest of embracing “difference.”
We millennials often seek refuge from the pluralist storm in that crawlspace provided by the expression “I don’t know.” It shelters the speaking-subject, whose utterances are magically made protean and porous. But this fancy footwork will buy us only so much time. We most certainly do not wish to remain crippled by indecision and hope to one day boldly stake out our own claims, without trepidation.
Zachary Fine is a junior at the New York University Gallatin School of Individualized Study.

Northstar5     Los Angeles
It’s interesting how many commentators say that the whole point of a college education is to learn other ways of seeing things. That’s just not true. That is only one part of the purpose of a college education.
Another huge and fundamental reason is to learn the great canon of human knowledge—to learn it, to carry it forward, so that the legacy of human civilization is learned and passed on generation to generation. It is to be part of the continuum of learning, and learning involves absorbing known information. Science, philosophy, history, literature. Facts. Ways of thinking. Sound reasoning.
Learning to open your mind is important, but we must safeguard and treasure actual facts and knowledge that people toiled intensely to discover in the first place.
Don’t forsake truth and actual facts and knowledge in the name of political correctness and being ‘open.’
In the words of the great Richard Dawkins: Don’t open your mind so much that your brain falls out.

John Ombelets   Boston, MA
I’d recommend “Zen and the Art of Motorcycle Maintenance.” It puts questions of personal taste and the validity of “just what I like” in excellent perspective.
Mike in Colorado      Denver
Of all that you can say about millennials (I am on the younger edge of the Baby Boom), this entire essay addresses only a single attribute of what I call the "whatever" generation, where any idea can be diminished by implying that nothing really matters. I never thought it a result of indecision, but of a relativism that says nothing is important except personal wants. It is a worldview in which peripheral vision is unimportant, usually "Boring!" The attribute of millennials most distasteful to me is their implicit desire to be immediately recognized at a young age and adored for their abilities, and how little they value the maturity gained with experience. There used to be a general acceptance that regardless of education, skills and abilities, many necessary soft skills took time and experience: the ability to manage and lead people, for example. What I see from millennials is, "I'm great, I'm talented, and I want it now," the result, I think, of a generation of parents saying, "You're great, you're talented, you deserve it now." My own parents were more likely to say, "You're smart enough, but really, you haven't got a clue." Now in my 50s, I realize that they were right.

David Gutting     St. Louis
I have been personally responsible for overseeing an enormous amount of research on millennials, and I am convinced that this entire discussion is overblown.
Millennials are not that different from any younger generation that came before, especially around the issue Mr. Fine brings to light. When people are young they usually haven’t developed deeply informed opinions, and this often leads to either indecisiveness or to narrow minded views. (Advancing deeper into adulthood often doesn’t improve this situation.)
In the last 20 years, there has been a significant change in the way society as a whole weighs in on certain social issues, notably on marriage equality. The liberal shift in that time has less to do with millennials leading the charge and more to do with the fact that more and more people have personal experience with gay people in their own friend and family circles, which in turn has been a byproduct of increasing openness in our culture. A much more open gay culture drove this through a hard-fought effort over many years.
Millennials, at best, are followers on social issues. While marriage equality has seen a leftward movement, reproductive choice has definitely turned in the other direction, with right-wing interests succeeding in winning more and more restrictions. On paper and in the polls, millennials are more "pro-choice." But they have done little to personally take up a pro-choice cause and push back against this trend.

T Cecil      Silicon Valley
As I was reading this I had a thought running through my head similar to your statement: “‘I like what I like’ becomes ‘But why do I like what I like?’”
If the results that follow from this self-inquiry lead to pluralism or whatever, so be it, but I think the ability and instinct to question and analyze one’s likes and dislikes is a great evolutionary step for any generation.

Steven    NYC
So you’re suggesting that if there’s an imperfection in the current “system” then we should lunge back to the old straight-white-male hegemony of the past? I think we can work with pluralism. For all its imperfections, it’s still better than what we had before.      In reply to RG

Jason      St Louis, MO
Here’s an idea: grow a spine and get some actual scruples. If you feel strongly about something, gather evidence, form an informed opinion, and then make your voice heard. To me, this article doesn’t say that millennials don’t have opinions or are afraid to voice them; it’s that they lack the fortitude to actually disagree with someone. There is a big difference between having a strong opinion and being culturally insensitive.
Also, maybe read some science or learn some math. Find out what it’s like to interact with a field of study that does have some objective truths…

RG      Chicago
Excellent analysis. The consequences of pluralism, multiculturalism, and moral relativism are now being seen. Some commenters just aren’t willing to accept the obvious: that pluralism and moral relativism have unintended consequences. The classical teachings in religion, literature, and culture are ridiculed and persecuted by the leftists. Universities have been taken over by social activists committed to their utopian ideal of “social justice” over the original ideal of truth. Open inquiry has been squashed by the politically correct. The humanities have devolved into deconstructed nonsense. The question “What is a good life?,” asked by philosophers throughout the ages, is now unanswerable according to the moral relativists. There is no truth, only your biased version of the truth. In every prior country where the quest for social justice replaced the ideal of truth, a predictable outcome has occurred: tyranny and dystopia.

David Wiles      Northfield, MN
I’m struck by how often a discussion of a Liberal Arts education descends into or begins with a discussion of literature courses as if everyone majors in English. I teach in a small liberal arts college that attracts a “progressive” student body taught by a faculty that is widely thought of as being “liberal.” We offer degrees in forty or so subjects only one of which is English. A look at our course catalogue (easy to see, it’s online just like everyone’s is) and our requirements shows that our students are required to take no more than one lit course unless they major in English. A sample look at our English department offerings shows courses in Medieval and Renaissance lit, Chaucer, Milton, Victorian Lit, 19th Century American Lit, Marlowe (Chris not Phil), American Transcendentalism and at least three courses in Shakespeare among many other things that critics seem to think are no longer taught.
My point is that most critiques I read of the Liberal Arts don’t seem to be informed by much if any knowledge of what actually goes on. Instead we get anecdotes about what some teacher said to some student, and they become the model. We get recycled debates from 35 years ago as if everything changed then and nothing has changed since. Let me suggest looking at the readily available evidence of what’s taught. Looking at evidence. There’s a name for that: critical thinking.

ARP   Northeastern US
This article is a protracted hasty generalization. The author takes it as self-explanatory that millions of people born in an arbitrary time period are “indecisive” and chalks it up to an ethos of “pluralism.” In doing so he makes a move that is not unfamiliar in mass media writing about generations: take a larger cultural anxiety and project it onto an imagined monolithic group. The “millennial” stereotype is often coded white, for instance. On the other hand, George Zimmerman is a millennial who doesn’t seem tied up by “pluralism.” This problem with the article returns when Fine uncritically invokes the myth of political correctness, which is really just a moral panic propagated by people unhappy with the expansion of the curriculum.
Fine namechecks Adorno and Horkheimer, but obscures the argument. He and anyone else considering writing sweeping claims about “generations” might wish to revisit them. It is not that taste breaks down along class lines, but that the culture industry creates groups by reducing all humans to knowable categories. The reality of concepts like “Boomer” “Gen X,” “millennial,” etc. comes from somewhere. That reality is produced partly by mass media writers opining about “generations.” But it also comes from marketers who create the “millennial” as a category some people can identify with, in order to produce value more effectively.

Siobhan    New York
I’m surprised at all the criticisms of this piece. I thought it was great.
Mr. Fine has done a wonderful job of explaining the impossible task put before him and the rest of his generation. Every opinion or judgment must be weighed for its greater cultural and moral meaning, for its implications.
Yet supposedly, no one thing is better than another. And simultaneous with that, his generation is asked why they do not take a stand on things.
When they live simply by their own values, they are called narcissists. But a preference for Shakespeare over Haitian literature–well, that’s about as loaded as it gets.
Pluralism has turned itself inside out. Respect for all has become equated with all are equal, now make a choice.
I thank him for this interesting and illuminating piece of writing.

W. Freen      New York City
Zach: Thank you for your column. I would add that, because millennials live so much of their lives online, they are constantly confronted by commenters who pick things apart and are eager to tell them that everything they think, write, say and do is wrong. The first batch of comments here in response to your column are perfect examples. I can’t imagine what it must be like to have lived one’s young life in the face of so much manufactured negativity. Try to spend more time offline and ignore the boo-birds.

gemli      Boston
You poor dears. In an effort to provide anchors for your wobbly worldviews, here are some absolutes:
Good things, in no particular order: Coca Cola; Roger Ebert; Torvill and Dean; Science; Scientists; Northern Exposure, season 3, episode 10, “Seoul Mates”; Christopher Hitchens; Popular music before 1975; Director Mike Leigh; The Roches; David Copperfield’s Flying illusion; Jim Jarmusch; Eraserhead; 2001: A Space Odyssey; Playing an instrument; Books made of paper and ink; Good sound systems; HDTV; Grave of the Fireflies; Randy Newman; New Orleans food; Ricky Gervais; The (British) Office; Deadwood; Battlestar Galactica (2003); most things by Pixar.
Bad things: Social media; Religion; 95% of the musical guests on Saturday Night Live; All -phobes and most -isms; Republicans since Reagan; Things Republicans think; Things Republicans believe; low-information voters; new-age anything; products sold by infomercials; pseudoscience; the weather; local TV news; cheap ear buds; pretentious art; How I Met Your Mother; Generations of kids who can’t tell good from bad; e-anything (except YouTube); teachers who tell kids all things are relative.
Make your own list. All of these are just suggestions (with the possible exception of Randy Newman).

Masaccio     Chicago
The point of going to college is to learn other ways to think about things. Maybe it was easier for me; I studied existentialism, and learned that no matter how hard the situation appeared, the responsibility for making decisions was on me and me alone.
I’m familiar with the arguments you hear in your classes; I’ve read a bunch of that stuff myself, and occasionally rail against it for the same reasons. It doesn’t matter. I’m still responsible, and I still have to act.
It’s not that hard.

Jack     NYC
In spite of the many valid criticisms below, I think this essay raises issues that should be considered seriously. Even here in the NY Times comment section, I have written non-offensive observations about cultural groups that were not published. These groups seem to have blanket protection from any form of criticism. This type of oppression, in the name of diversity or political correctness or whatever you want to call it, is preventing serious people from making observations about what is wrong with our society.
This bias against saying anything that might be considered offensive on the surface is strangling our ability to speak freely. I’m tired of having to try to figure out what’s going on by looking at the subtext beneath writing that is full of polite euphemisms.

David Underwood    Citrus Heights
Oh my, it appears to me that there is a need to study logic, rhetoric, and how to research history.
Although I went back to college in the 1970s, I do not recall any professors making such blatantly contradictory claims.
“we entered the world when truth-claims and qualitative judgments were already on trial and seemingly interchangeable.” How would one know what truth meant in such a case? Unless you can define it, you can not make any statements about it.
Or this one:
“Good American kids are learning the hard way that facts are made up, that there is no such thing as natural, unmediated, unbiased access to truth, that we are always prisoners of language, that we always speak from a particular standpoint, and so on.”
Is that a fact I ask?
“We anxiously avoid casting moral judgment. Because with absolute truths elusive, what claims do we have to insist that our moral positions are better than those of someone from a different nation or culture?”
Is this a truth? If it is not, then it is a contradiction in terms.
The whole article is full of claims that contradict each other. There are no truths, is that a truth?
You can peruse the article and find several instances of a concept being used to refute itself, which, as I recall, is known as fallacious or circular reasoning.
Since I was a bit older than most of my classmates, I had no compunction about challenging such statements. The question “Is that a fact?” usually made an impression.

David Wiles     Northfield, MN
Has the author considered that the group that consists of “(t)hose of us born after the mid-1980s whose upbringing included a liberal arts education” excludes the overwhelming majority of people born after the mid-1980s?
If he has considered this, why has he gone on to generalize about how “his” generation thinks?

SDK      Boston, MA
I sincerely doubt that anyone majoring in Math, Business, Engineering or Biology can sympathize with or even understand this article. With my undergraduate degree in cultural studies and 10 years in academia, I understood it very well.
I agree with the writer that there is an issue — but it is limited to the humanities and particularly that discipline at elite schools. In my degree (’93), I learned how to deconstruct — and that was it. Pointing out how power imbalances are re-inscribed in everything that seems good at first glance is the only skill one needs. That, and the ability to hyphenate most of your nouns and adjectives.
That’s a pity, because the work of tearing down is much easier than the work of building up, taking a position, and taking action. This is not taught in the humanities but it can be learned elsewhere.
My advice to Mr. Fine is to take a few business classes, some sociology, and some chemistry. Spend some time working in your community. In these places, you can start to use your critical eye in more creative and productive ways.
