Friday, May 30, 2008

Summer Reading...

Like every other academic, I have quite a lot of writing to get done this summer, but I do have a few books I want to make sure to read before September:

1. Richard Iton, In Search of the Black Fantastic: Politics and Popular Culture in the Post-Civil Rights Era
I've been waiting for this one for a while, and I know it will be useful for my own work. Iton's insights always are.

2. David Edwards, ARTSCIENCE: Creativity in the post-Google Generation
How can I be a PIK Professor at Penn and not get excited about this book?

3. Drew Gilpin Faust, This Republic of Suffering: Death and the American Civil War
I am trying to finish a piece on a Black sex magician who was in full swing at about this time. Plus, I just read Mothers of Invention and was completely blown away by its rigor.

4. David Levering Lewis, God's Crucible: Islam and the Making of Europe, 570-1215
He has to be considered something like the scholarly gold standard at this point.

5. Noah Feldman, The Fall and Rise of the Islamic State
I am already 50+ pages into it. A razor-sharp intellect who makes it all look far too easy.

6. Anita L. Allen, The New Ethics: A Guided Tour of the Twenty-First Century Moral Landscape
I saw a review of it that has me totally intrigued.

7. James Baldwin, No Name in The Street
I want to make it through all of Jimmy B by the end of the decade. Just four more to go.

8. Ta-Nehisi Coates, The Beautiful Struggle: A Father, Two Sons and an Unlikely Road
Everybody I know seems to be talking about this one.

9. Chuck Klosterman IV, A Decade of Curious People and Dangerous Ideas
I like his writerly voice whenever I catch it in Esquire or GQ or something.

10. Naomi Klein, The Shock Doctrine: The Rise of Disaster Capitalism
I read up to chapter 10 when it first dropped and then got swamped.

11. William Jelani Cobb, The Devil & Dave Chappelle & Other Essays
His cultural critiques always seem to turn the analytical screw one more rotation than almost anybody else.

Just my list. Let me know if you know of something I should add--and why.

Talking Ethnography (even briefly)

(By John L. Jackson, Jr. First posted for The Chronicle Review's Brainstorm Blog on May 16, 2008)

The term “ethnography” describes a literary genre (writings that attempt to capture people’s cultural beliefs and practices) as well as a qualitative research methodology (a way of collecting social scientific data based on long-term, face-to-face interactions). But we are living in a hyperscientific moment now, a time when ethnographic analysis seems to have lost some of its authority, especially since human genomics and the statistical analysis of massive datasets are privileged as holy grails in the search for contemporary solutions to social problems. Ethnography is still alive and well. It just ends up packaged for the public in ways that look vastly different from how other social sciences get framed.

Anthropology and sociology are the two academic disciplines that traditionally cornered the market on ethnographic research, but other social sciences have become more interested in the kinds of nuanced information that gets gathered during intimate and ongoing interactions between qualitative researchers and their research subjects, interactions euphemized as “deep hanging out.” Ethnographers spend time drinking beers with the folks they study, eating meals at their dinner tables, and shadowing them on the job — all in an effort to figure out what people’s everyday lives actually look like and to determine how people make sense of those lives.

When they first start conducting research in a particular community, ethnographers stand out like sore thumbs, drawing attention to themselves and making their research subjects decidedly self-conscious, which means that they run the risk of witnessing things that probably wouldn’t have taken place at all without the conspicuous seductions of an outside audience. But as ethnographers spend more and more time observing and participating in the same community, among the same community members, they eventually begin to lose some of their distracting influence on people’s behaviors. They transform into proverbial “flies on the wall,” or at least that’s what we tell our graduate students. The ethnographer is still there, asking questions and watching people’s daily reactions, but eventually they are hardly noticed anymore, at least not in ways that might compromise the reliability of what they see or hear.

Ethnography’s value is based on the kinds of intimate and unguarded data that researchers gain from extended contact with one particular social group. When the discipline first emerged, this meant relatively small-scale and remote societies. “Father of ethnography” Bronislaw Malinowski’s early 20th century work with Trobrianders is taken as a powerful marker for the birth of full-fledged ethnographic research within anthropology. He crossed the seas, pitched his lonely tent, and found a way to live among people whose cultural world seemed radically different from his own. Part of the point, of course, was about making it clear to the European audience back home that those foreign practices could be understood only with the fullest knowledge of how people’s entire belief systems fit together — even and especially when those cultural systems seemed spectacularly exotic to the Western eye.

Anthropology was traditionally about studying societies unsullied by the advances of modernity. From the romantic attempts at “salvage ethnography” among Native American tribes in the early 19th century (archiving cultural practices before they disappeared forever) to the constructions of primitive societies as examples of the modern Western world’s hypothetical pasts, anthropologists used ethnographic methods to study those populations most removed from the taint of modern living.

Sociologists also embraced ethnographic methods in the early 20th century, and people like Robert Park at the University of Chicago helped to institutionalize “the ethnographic imagination” as a method for studying not just faraway villages but modern urban life in a teeming American city. That dividing line (between the anthropological ethnographer who studies some distant community and the sociological ethnographer who focuses her eyes on the modern Western metropolis) still defines most people’s assumptions about how those two fields carve up the social landscape for qualitative examination (even though there are certainly sociologists who study small-scale societies and anthropologists who have been working in urban America for a very long time).

One thing that both fields seem to emphasize and value amounts to a premium placed on the scientific equivalent of roughing it. They each have the highest regard for the “gonzo” ethnographer, the kind of heroic or mythical figure willing to put his very life at risk for the sake of ethnographic access. The more remote, removed, and potentially dangerous the location of the fieldwork experience, the more explicit and awestruck are the kudos offered up to any ethnographer bold enough to go where few have gone before. This search for dangerous exoticism can lead you halfway around the world, or just to the other side of the tracks, the other end of town. But in either case, an added value is placed on access to the everyday lives of human beings and cultural perspectives that most middle-class Western readers know little about.

During the 1960s, anthropologists and sociologists in the United States wrote classic ethnographic offerings on the urban poor — specifically, the Black poor — who were struggling to make ends meet in America’s ghettos. Ethnographers were trying to explain the hidden realities of urban poverty, a tradition that continues today. Anthropologists and sociologists working in American cities still disproportionately study poor minority communities. That’s because it is harder to sweeten the deal enough for wealthier Americans to accept such scholarly intrusions. A crisp $20 bill might suffice as incentive for an unemployed urbanite to answer some open-ended questions about her life history, but it is hardly enough to compel more decidedly middle-class citizens into exposing their raw lives to an ethnographic gaze. Middle-class and wealthier Americans also sometimes live in actual gated communities or attend the kinds of restricted social clubs that can keep prying anthropological eyes at bay.

Of course, most ethnographers will tell you that any valuable ethnographic work must be based on respect for the people researched, and that the most powerful studies of the poor decidedly humanize them, fending off insensitive attempts at reducing poverty to the cold numerical instances of this or that pathology. These ethnographic researchers carry a double burden, however. Besides making sense of people’s daily lives and future life chances, they are also asked to continue impressing readers with their fearlessness, with their courageous forays into the heart of darkness, offering first-hand “thick descriptions” of poor and dilapidated minority communities, the kinds of places that terrorize mainstream America’s imagination. And that is part of the trap. This formula usually means that even as urban ethnographers try to challenge middle-class cultural chauvinism, arguing that ostensible “cultures of poverty” are sometimes little different from more mainstream cultural groups, they are also asked to justify the value of their work by claiming to provide access to otherwise inaccessible locations.

This is exactly what the authority of some memoirs (about former gangbangers or drug dealers or welfare queens) traffics in: proffering the learned public first-person renditions of the kinds of social existence that they have never experienced — and would not want to. But, unlike the memoirist, a social scientist is supposed to be objective, politically and personally disinterested, and so it is supposed to help the case for scientific legitimacy that you are not conducting research in your own backyard. You have no axe to grind, no biases linked to prior investments in that community.

In some of our most recent and highly popular versions of ethnographic research, we have suburbanites from southern California studying gangs in Chicago (Sudhir Venkatesh conducting research on the South Side), middle-class journalists working side-by-side with low-wage service-sector employees (Barbara Ehrenreich modeling her undercover reporting on something akin to what George Orwell attempted in Paris), and white researchers studying the ins and outs of Latino drug culture (Philippe Bourgois moving his family into one of the most crack-infested parts of Spanish Harlem). These authors offer examples of careful and holistic ethnographic research that is both rigorous and politically committed. The problem has to do with the way we sometimes read them, how we unconsciously tap into the expectation that what makes ethnographies valuable is not the scientific rigor and meticulousness that all three of these ethnographers (and so many more) duly demonstrate, but instead any ethnography’s singular ability to take us into what we still imagine as that “heart of darkness,” whether the jungles of the Congo or the sidewalks of South Central Los Angeles.

It isn’t enough for ethnographers to study the world. For anyone to care nowadays, they have to titillate us, too. Give us a window into “the other” that will blow our minds. To accept this double standard is to enter into a Faustian pact with a sensationalist contemporary ethos that demeans the true significance of careful, honest listening as a powerful methodological tool for social analysis, whether it is chronicling the experiences of nerdy video gamers or the most violent of gang members. Ethnography works in either case, even if only the latter fully satisfies our collective predilections for sensationalist storytelling. Number crunchers are asked to stick to the facts. If ethnographers want anyone else to care, it seems that they had better shock us with them.

Friday, May 23, 2008

The Racial Impasse...

From The Chronicle Review's Brainstorm Blog
May 16, 2008

The Racial Impasse
By John L. Jackson, Jr.

I remember speaking to an auditorium full of 7th and 8th graders at a junior high school in Central Florida a couple of years ago. I was attempting to explain to them just what anthropologists do for a living, and I was having the hardest time.

I’d been brought down to lecture at a local university only a few miles away, and the person who invited me, a minister and activist in the community, wanted to make sure that I got a chance to learn about the local area, especially from residents of the all-black town a stone’s throw from campus. I’m used to speaking to academic audiences about my work (undergraduates, graduate students, and colleagues), but addressing an auditorium full of pre-teens and teens was a major challenge.

I decided to start off with an invocation of Indiana Jones, which felt a little bit like cheating (or just pandering), but I thought that they’d at least have heard of the pop-culture icon. Of course, that might have already been a bad jumping-off point, since some of the kids weren’t even born when the third installment of the motion-picture franchise was released — and there wasn’t buzz in the air yet about Harrison Ford and Steven Spielberg gearing up for this summer’s fourth installment. Even so, they seemed to listen, probably because any “special assembly” in the auditorium beats class-as-usual for most teenagers.

I wanted to give them their money’s worth and to make sure that they didn’t feel like I was speaking above their heads. I bounced around the stage like a prize fighter, making eye contact with kids, telling the worst jokes you could imagine, even singing a bar or two from some popular songs to help punctuate points — just hoping that my physical energy and inadequate grasp of contemporary teen culture might translate into something exciting and engaging to a few of them.

I then moved from a discussion about all of anthropology (in about five minutes) to a longer description of my work, ethnographic research on race and class in America. I asked them what they knew about “race” and how they’d define it. Some students defined it as “what you look like” or “people that look the same.” Others blurted out that it had to do with “your family being related to other families from before.” Still other kids declared that it had to do with “blood and stuff.” But the talk of “blood” was met with a series of hoots, with vocal objections and dismissals.

“It isn’t blood,” one girl said. “We all have the same blood.” Her friend, a chubby-cheeked young boy, nodded his head in agreement. Several rows behind them, another student added, “race is just made up.” Many of the students shook their heads to second that conclusion, even as some of those same students seemed unconvinced of their own position, scrunching their eyebrows quizzically and pursing their lips in disbelief at their own arguments about how race should be defined.

Most students, even junior high school students, have gotten the memo about “race” not being simplistically biological. They know to say that race is at least partially about culture, not just biology, even if they can’t necessarily marshal the specific evidence scholars use to challenge assumptions about race’s biological grounding. To say that race is “cultural” is usually to say that it doesn’t have the inflexible imprimatur of nature behind it. But the stamp of culture can feel just as intractable.

There are at least two racial camps in the academy right now. One consists of biologists, geneticists, sociologists, psychologists, anthropologists, and medical doctors who declare that “race” is not biologically based at all — and never has been. These are the social and cultural constructionists. They say that race is hardly reducible to biology. It is about power and exploitation, a way we fool ourselves into thinking there are natural justifications for the kinds of inequalities that plague us.

There is another group of biologists, geneticists, sociologists, psychologists, anthropologists, and medical doctors who dispute that claim. They think that there is a kind of academic conspiracy afoot to pretend that race isn’t real when it is — a move to treat the biology of race as a taboo subject, too politically sensitive to analyze. Instead of taking it seriously, they say, the folks in the former camp use dogma and charges of racism to keep contrary scientists (still interested in thinking about the physical realities of race) in line.

Each camp sets the other one up as being more powerful — and dangerous. They make accusations about one another’s intentions and morality. And they both imagine the other to be a serious problem to the future of scholarship.

There is even a growing third category of scholars, mostly nonscientists, who argue that race isn’t real (the first group is right) but that even the people in that camp use race in ways that are similar to their rivals. In one of its strongest versions, scholars claim that the “culture” of cultural constructionism is really just a smoke-and-mirror trick that allows academics to have their racial cake and eat it, too — by just renaming that race “culture.” (Literary critic Walter Benn Michaels has made this claim most forcefully.)

Race’s complicated relationship to reality (real vs. unreal, there vs. not there) is exactly what has everyone so preoccupied. Of course, just as those junior high school kids in Florida could imagine that race is and isn’t biological at the same time, a little of both and a little of neither, racial experts are caught in the same quicksand — of accusation and innuendo, of charges and countercharges. Race is a social construction, but it is also more than that, and this complicated, contradictory notion of race is exactly what makes racism so tenacious, perched right atop the electrified fence between those racial camps.

Saturday, May 10, 2008

Et Tu, TV?: Television’s Role-Reversal on Racial Equality

Watching the most recent snippets of Jeremiah Wright espousing race-based conspiracy theories while defiantly saluting journalists at his controversial press conference, I couldn’t help but lament television’s currently pathological role in discussions of race relations, especially given its newfangled subservience to an Internet that allows people to replay TV clips ad nauseam.

The civil rights movement’s successes in the 1950s and 60s were spurred on by the television set and its powerful nightly newscasts. Martin Luther King, Jr., tested America’s religious conscience, but none of that would have worked without the creation of broadcast television as a ready-made national stage for vividly showcasing nonviolent resistance and racial discontent. Television came of age in the 1950s, just in time to contribute to those civil rights victories.

It was one thing to read about water hoses or vicious dogs unleashed on women and children and another to see those graphic images in your own living room. On screen, the aggressors (police officers and violent white citizens) came off as the bad guys in these nightly narratives, with the stoic black protestors cast in the undeniable role of innocent victims.

Television became a window into America’s soul, and the view wasn’t pretty. For blacks, these images were frightening but politically galvanizing. For whites, they offered renditions of a whiteness that most Americans would just as soon distance themselves from.

That was then. Today is much different: it has become much harder for television to play a productive role in discussions of race. In fact, it may be hurting more than helping in the fight against all forms of racism.

As a society, we bristle when Michael Richards spews racial epithets on stage at a comedy club. We seethe when Mel Gibson shouts anti-Semitic insults at police officers. And we shake our collective fingers with a “tsk-tsk” when a black presidential candidate’s pastor voices conspiracy theories at a national press conference. But these are mere simulations of old-fashioned racism, and not representative of how race most powerfully circulates today, which is nothing like the way race ever functioned in American society in the past.

Politicians once ran campaigns on explicitly racist platforms, committed to discriminatory practices like racial segregation. Displays of such hard-line investments were easy for television to capture. Nowadays, the mere hint of support for such positions, even if not stated explicitly, is a major political blunder. Nobody wants to be called a racist, so people have to be caught unawares.

Today, America has moved on from a past era of de jure racism (unapologetic discrimination codified in law) to one of de facto racism (conspicuous public displays of racial differentiation, even if not backed by the judiciary) to our current moment of what I’d call de cardio racism (hatreds potentially hidden in people’s darkest hearts even if they never see the light of day). This is the new reality of racism in contemporary American society, a racism that has lost much of its traditional swagger.

The media have a harder time figuring out how to depict this humbled and dissimulating racism. They might not even have the capacity to do it at all. Television is not particularly good at representing subtlety. To carry meaning quickly and clearly in a sound-bite culture like our own, visual images have to be blatant, like those water hoses and dogs wielded against peaceful marchers. With 24-hour news cycles to fill and few prominent heels-dug-in racists to beam into people’s living rooms, the medium today often succumbs to depicting mere racial spectacles, all sound and little substance—Michael Richards breaking down on stage, mock nooses hung on college campuses, a black presidential candidate’s pastor voicing conspiracy theories at a national press conference. These are simulations of old-fashioned racism that distract us from the subtleties that define how racism actually operates today.

This is part of the reason why most mass-media caricatures of racial issues do more harm than good. With enough cinematic and televisual priming, we start to expect or demand unmasked racial explicitness in our real-world encounters. When we don’t get it, we either dismiss cries of racism as unjustified hypersensitivity or pretend that the offending actions are self-evidently racist even when they aren’t, when they are restrained and almost impossible to prove.

Thankfully, bold and obvious versions of unmasked racism are more media cliché than anything else these days, not all the time, but usually—at least in mixed racial company. These clichés still suffice for melodramatic tales of heroes and villains, but they are hardly up to the challenge of representing how convoluted and messy racism is in our politically correct, post–civil rights present.

Pre-civil rights sensibilities won’t help us. They are too black and white. Ironically, though, television was more useful when its images still were.