Monday, November 28, 2011

Blogging like a Beast?!

Martha Marcy May Marlene is a film about a young woman trying desperately (and unsuccessfully) to recover from her traumatic stint as a member of a rural cult and sexual concubine of its charismatic spiritual leader. It is one of those “art house” movies that end with surprisingly little warning. When the closing credits began, audience members gasped. “What?” “You’re kidding me!” “Is that really it?” That’s only what I heard in the theater seats nearest my own. And I laughed, because I knew exactly what they were reacting to.

The writer had taken us on a complex and nerve-racking journey with the film’s female protagonist (who, at different moments in the story, answers to each one of the names that make up the film’s title). By the “end” of the story, our filmmaker hasn’t really provided us with any resolution. There is no simple (artificial?) closure to the narrative, just a final tension-filled scene rife with unanswered questions and uncertain outcomes.

It isn’t necessarily the way we’re taught to write screenplays, but it was a valuable reminder of what some good storytellers try to accomplish. And how.

Many people have made the claim, but it bears repeating: Good writers write like beasts. They don’t worry about the “audience” in any simplistic and condescending sense. And they certainly don’t care to placate them. They write without self-consciousness (at least, without the kind of self-conscious anxiety that allows for any too-precious preoccupations with making readers happy).

As I craft one of my final few Brainstorm blog postings this week, my final week as a Brainstorm blogger, I can’t help but think about all the interesting and energizing exchanges I’ve had with Chronicle readers over these past few years. From posts about the potentially racist underpinnings of “Obama-as-anti-Christ” rhetoric back in 2008 to a knock-down blog-brawl (played out over several postings) sparked by some comments I made about attending an academic conference, from defenders of Michelle Malkin keen on rebutting my (passing) characterization of her work to more recent debates about Herman Cain’s theories of race and class, one of the most conspicuous features of the blog (as platform and/or genre) is its almost immediately dialogical linkage of readers and writers.

Anthropologist Johannes Fabian has discussed the possibility that on-line exchanges between researchers and research subjects, exchanges modeled on the back-and-forth interactions between bloggers and blog readers, might be the beginning of the end for traditional forms of ethnographic writing, reconfiguring those conventional relationships in radically new ways.

At its best, the interactivity of the blog format clears space for the rehearsal of real debates and differences of opinion, especially when the anonymity of the Web doesn’t help foment the worst forms of unproductive incivility. I really have enjoyed my time on the blog, and I feel like it taught me to think about writing in a very meaning-filled way, especially when readers made good-faith efforts to challenge some of my positions. (Even the responses that I’d describe as more like “bad faith” offerings were sometimes useful.)

At the same time, however, I realize that I used to feel as though I wrote (or, at least, tried to write) like a beast, with a cultivated indifference committed to getting my point across as honestly as possible (come what may). Although I have always tried to be honest in my Brainstorm posts, I certainly didn’t feel the freedom to write without self-consciousness, without having to worry about which readerly eyes stood in my path. On the contrary, I feel like I have gotten increasingly self-conscious over the course of my blogging stint, which I know isn’t necessarily how everyone responds to this platform and its many possibilities (and may not, ultimately, be the worst thing). Sometimes that heightened self-consciousness found me pandering to the gods of sensationalist punditry, trying to be purposefully provocative (even unnecessarily harsh and fairly mean-spirited) as a way to drum up more explicit comments from readers. Any response seemed better than none.

At other times, I started (or conceived of) many posts that I never finished/published, all too mindful of how many colleagues actually look at the Chronicle’s blogs. If some of these same sentences were tucked into a book that few people ever read, I wouldn’t have to worry about getting a series of emails and phone calls after the writing, responses that also became a kind of immediate gratification that was, at times, far too intoxicating.

All of this meant blogging less and less like a “beast” every single day. It would be the equivalent of being a filmmaker forced to sit in the theater and listen to audiences complain about the inexplicable ending to your latest movie. You experience that immediate feedback often enough (the audible gasps, the palpable disappointments), and you start to hear those same complaints at your computer screen as you work on the next project. Even positive responses, received as the film’s final credits roll, over-determine a self-conscious writer’s subsequent decisions. Such hauntings can be the kiss of death to any would-be author, and they are incredibly hard to exorcise.

Friday, November 4, 2011

Sacred Bundle 2.0: Anthropology Online

Oxford University Press has launched a new and ambitious on-line project, Oxford Bibliographies Online, which attempts to provide scholars, students, and other interested readers with introductions to important topics and themes from many academic fields/disciplines. Atlantic History, Criminology, Communication, Philosophy and Sociology are among the modules already available. Later this month, Political Science and Psychology go live, with Education soon to follow.

Anthropology is slated for release early in 2012, and I have agreed to help edit that particular module. Oxford was able to put together a strong editorial board for the project, which included scholars from all four of American anthropology's major sub-fields: archaeology, linguistic anthropology, physical/biological anthropology and cultural anthropology. These nine scholars helped to select and vet the entries on various topics (including Applied Anthropology, Cultural Evolution, Public Archaeology, Language Ideology, and Globalization). All in all, OBO's Anthropology site will launch with 50 entries penned by scholars from across the country and the world, including Michael Herzfeld on "Nationalism," Vernon J. Williams on "Franz Boas," Jeremy Sabloff on "Public Archaeology," Neni Panourgia on "Interpretive Anthropology," Kudzo Gavua on "Ethnoarchaeology," John Trumper on "Ethnoscience," and Christina Campbell on "Primatology" (just to name a random few).

Once the site launches, four anthropologists (Marcus Banks, Maria Franklin, Jonathan Marks, and Bambi Schieffelin) have signed on to help read new entries (about 25 or so will be added every year), and our authors and editors will all update entries as necessary (when new titles merit inclusion or emergent debates in specialties demand discussion). The idea is to make these entries living, breathing documents that morph with ongoing reconfigurations of our discipline.

I only agreed to assist in this effort because I was intrigued by the idea of re-familiarizing myself with the so-called "four fields of anthropology" mentioned above. As a graduate student at Columbia in the 1990s, I was trained in a four-field department, even though I could get away with doing coursework in only two of those sub-fields. And after teaching for four years in Duke University's Department of Cultural Anthropology (where we all seemed to be in the same scholarly conversations), I am back in a four-field department that demands grad students pass exams in all of the sub-fields, one of the few programs in the country with such a stipulation.

Although I don't consider anthropology's four fields a "sacred bundle" never to be dis-assembled under any circumstances, I am intrigued by the idea of forcing myself to learn more about the four farthest corners of this sprawling and hubris-filled discipline that imagines itself to cut across the humanities, the social sciences and the natural sciences.

Oxford's new initiative will allow anthropologists to think about how much (or how little?) we might really gain from conversations across the intradisciplinary domains that often divide us. OBO's intervention will help us to see how physical anthropologists and cultural anthropologists might differently approach topics such as "race" or "gender." Or we can determine what kind of reviewer an urban anthropologist working in contemporary Latin America would make for a piece on the histories of cities crafted by an archaeologist.

I'm intrigued to see what (hopefully productive) sparks might fly from such contact, and I've already learned so much about those other anthropological spheres during the build-up to next year's OBO launch. So, if you are an anthropologist gearing up for this month's AAA meeting in Montreal, please know that I might be asking you to contribute to this attempt at a somewhat experimental four-field rendering of our discipline's scholarly world. And please consider taking part.

Friday, October 21, 2011

LA Screening of my new documentary this Friday...

Bad Friday: Rastafari after Coral Gardens (63 mins.)
A documentary film
Directed by Deborah A. Thomas and John L. Jackson, Jr.
Produced by John L. Jackson, Jr., Deborah A. Thomas, and Junior "Gabu" Wedderburn

When: Screening in Los Angeles at 2pm on October 28th, 2011

Where: Laemmle’s Sunset 5
8000 Sunset Boulevard
West Hollywood, California 90046
Theater #5: Seating capacity is 198

Online ticket sales begin October 7, 2011. For complete details visit our website at

Ticket Prices
$15 Screening Tickets*
$11 Children under 12/Seniors/Military/Students w/ID
$11 Matinees before 4:00 p.m.

Bad Friday chronicles the history of violence in Jamaica through the eyes of its most iconic community – Rastafari – and shows how people use their recollections of the Coral Gardens “incident” in 1963 to imagine new possibilities for the future.

For many around the world, Jamaica conjures up images of pristine beach vacations with a pulsating reggae soundtrack. The country, however, also has one of the highest per capita murder rates in the world, and the population is actively grappling with legacies of Western imperialism, racial slavery, and political nationalism – the historical foundations of contemporary violence in Jamaica and throughout the Americas. Bad Friday focuses on a community of Rastafarians in western Jamaica who annually commemorate the 1963 Coral Gardens “incident,” a moment just after independence when the Jamaican government rounded up, jailed and tortured hundreds of Rastafarians. It chronicles the history of violence in Jamaica through the eyes of its most iconic community, and shows how people use their recollections of past traumas to imagine new possibilities for a collective future.

“Bad Friday is live evidence for reparations from the Government of Jamaica for the Coral Gardens atrocity of 11 April 1963. The Prime Minister Sir Alexander Bustamante’s order to “Bring in all Rastas, dead or alive!” is a crime against humanity that should not be forgotten.” - Ras Iyah V and Ras Flako, Rastafari Coral Gardens Committee

“Amidst the proliferation of films on Rasta, none have managed to fathom the Rastafari experience of their Jamaican Babylon like Bad Friday. Now that Rasta is an increasingly co-opted global culture, this is as close as the untutored will get to understanding the meanings of being ‘Dread’ during the pre-reggae period when adherents were viewed as a ‘cult of outcasts’ and routinely victimized. A powerful and timely historical document that speaks to the ways that remembering-and-forgetting continue to shape Jamaica’s post-colonial identity.” - Jake Homiak, Curator of ‘Discovering Rastafari’, Smithsonian Institution

“By bringing to us the poignant testimony of the men and women who witnessed and whose lives were forever scarred by these events, Bad Friday obliges us to confront the shocking level of state violence that was unleashed against not only the individuals involved, but also against the entire Rastafarian community of Jamaica. Now, thanks to this evocative film, we are able to appreciate the full horror of the events from that distant time and what they portended. I salute and congratulate everyone involved in the making of this redemptive and truly valuable work of historical memory.” - Robert A. Hill, University of California, Los Angeles

Tuesday, October 18, 2011

Is Herman Cain Racist?

cross-posted @ The Chronicle.

I’m not following the lead-up to 2012’s presidential election the way I hung on every Democratic and Republican candidate’s words in 2006 and 2007. Even still, it is hard to miss the major headlines, no matter how much one might try: Obama’s plunging poll numbers, critiques of Romney’s religious persuasion, Rick Perry’s n-worded family home, and the conspicuously growing journalistic indifference to anything at all Bachmann-related.

Herman Cain’s 9-9-9 tax plan might have gotten panned by several of his fellow candidates during last night’s Republican debate (including Bachmann’s likening it to a version of the Biblical “mark of the beast”), but it is the continued public “controversy” around Cain’s take on racism in America that seems to have everyone up in arms right now.

This all started when Cain dismissed racism as a significant cause for African-American marginalization. “I don’t believe that racism today holds anybody back in a big way.” That was the way he put it last week on Fox News.

Add to this the fact that just yesterday Cain characterized his black detractors as “more racist than the white people that they’re claiming to be racist,” a rehashing of chicken-egg counter-accusations of racism:

“You’re a racist.”

“Oh yeah, well, then you’re a racist for calling me a racist.”

“See what I mean. Only a racist would believe that.”

Of course, Cain’s comments about racism are unintelligible if disconnected from his consistent slamming of the “Occupy Wall Street” movement as a form of seemingly anti-American class warfare, a conspiracy between “unions and Obama supporters to distract the American people from the real problem, which is the failed policies of the Obama administration.”

Just this Monday, Cain spoke with conservative commentator Sean Hannity about the Occupy Wall Street protesters. “They’re trying to legitimize themselves by comparing themselves to the Tea Party movement,” he claimed. “There is absolutely no comparison.” According to Cain, the Tea Party represents a legitimate form of political opposition and civic engagement, but the Occupy Wall Streeters are rabble-rousing thugs, merely an apolitical mob.

It is a position shared by the black conservative pundit (can’t remember his name) who was on MSNBC last night complaining about all the public sex acts, violence, and other illegalities taking place under the banner of the Wall Street crowd. These exact same characterizations were mobilized by many conservative pundits during the Civil Rights Era to dismiss that movement as insincere and degenerate, as full of little more than sex freaks and criminals. The response to such dismissals is often the same: How can an African-American such as Cain (or that aforementioned and unnamed MSNBC pundit) celebrate an "anti-Obama" Tea Party over and against other forms of social protest against social inequality and corporate greed?

Do figures like Cain represent a form of internalized anti-black racism, which is the critique that Cain seems most keen on challenging? Or is it simply (as if any of this were simple) a case of class interests trumping racial solidarity for a wealthy former CEO of a national fast food chain? The party/candidate that wins this ongoing debate about the relative significance of race vs. class for contemporary American society might just end up with the next set of keys to the White House.

Sunday, September 25, 2011

Doing Diversity Differently

Many universities operationalize their commitments to “diversity” in curious and conflicted ways.

In some academic quarters, the term has traditionally meant diversifying standing faculties and student bodies on college campuses by increasing the number of underrepresented minorities. But diversity is one of those ideals that people sometimes accept much more readily in theory than in practice, a principle supported in the abstract but harder to justify as a hard-and-fast campus policy with any real teeth to it, especially with legal threats of “reverse discrimination” lurking in the shadows.

Campuses debate the very meaning of diversity these days, some seeing calls for, say, “internationalization” as a calculated attempt to replace ongoing university commitments to U.S.-based minority recruitment, others asking specifically for “ideological diversity” to address the low number of self-described “conservatives” teaching at many elite colleges and universities.

Some of these same schools end up crafting faculty diversity initiatives that might seem to go against the grain of their institutions’ regular hiring practices, counting and categorizing bodies and subsequently creating search committees to lure accomplished scholars (especially scholars of color) from other places.

The next trick, upon locating interesting candidates, entails convincing relevant departments to consider hiring them. If the names are big enough, the CVs impressive enough, a department might acquiesce. But there are several reasons why such a strategy, though arguably laudable for its directness and relative simplicity, may be a bit short-sighted.

We could think of central administrations as the academic equivalent of our federal government, which is an analogy that would make specific departments akin to individual states. And in such a strained metaphorical context, just about every single faculty member we’d ever meet would be a staunch advocate for states’ rights. So any extra-departmental recommendations about future hires are usually treated as infringements on departmental autonomy (which they are): as academically unconstitutional, even unconscionable. If the departments didn’t come up with those names on their own, there will almost certainly be a contingent of faculty members with a vested interest in thwarting the hires in question, sometimes just on general principle. Of course, part of the problem is that some of those very same departments have a hard time coming up with any “diverse” job candidates on their own, often chalking it up to deficiencies in the pipeline.

But if universities are trying to identify “diverse” bodies in such ways, they may be setting themselves up for failure in the long-term, and not just because of any willfully obstructionist faculty. (Of course, at a time of ever-shrinking resources, departments might take new faculty lines any way they can get them.)

The very process of having special hires for minority candidates makes the entire thing look like a “political” move and not an “intellectual” one, regardless of the caliber of the candidates in question (and even though all faculty hires are political and intellectual at the same time). Such a targeted process can feel qualitatively different from normal hiring practices, reinforcing a kind of ghettoized mentality about the entire endeavor.

Indeed, once universities have gotten to the point of looking for bodies qua bodies, they may have already lost the diversity battle.

For example, one of the reasons why departments aren’t as diverse as they could be might pivot on the intellectual projects and objectives that those departments privilege (or not), the way they prioritize their needs and intellectual goals for the future. It is an empirical question, but I wouldn’t be surprised, for example, to find out that there might be a kind of mismatch in many social science fields between what scholars of color are interested in studying and what departments are interested in hiring. If nothing else, it might make sense to have more conversations within academic departments about how five-year plans and statements of departmental goals may frame certain intellectual questions (and the scholars who pose them) out of serious consideration in terms of future departmental growth.

x-listed @ The Chronicle of Higher Education

Friday, May 13, 2011

The End of the World: Next Week?


Some people invoke the ancient Mayans to contend that the world’s expiration date is quickly approaching, but there are other arguments afoot these days about just how imminent such an end might be. You may have seen the billboards for one of them: Family Radio’s declaration that Judgment Day is “guaranteed” to begin (and quite conspicuously) on May 21st, 2011.

I have been completely fascinated by this proclamation ever since I first heard Harold Camping’s matter-of-fact declaration earlier this year (while surfing the FM dial on a drive up to NYC). Camping is a long-time Christian radio broadcaster who has been a mainstay at Family Radio since the early 1960s. Shuffling through broadcast options in my Saturn, I knew Camping’s voice as soon as I heard it, mostly because I grew up on it. Extended family members always seemed to have his program on in their homes, so much so that it served as part of the soundtrack to my childhood (explaining, I’m sure, many of my subsequent scholarly interests).

Hearing Camping’s distinctive voice earlier this year (some 20 years or so after the last time) made me immediately nostalgic. And then I heard his claim, which wasn’t being argued back in the 1980s and 90s (as far as I can remember), about the earth being slated for destruction next week, and that was it. I was riveted, spending the next hour listening to his contentious debates with listeners about the veracity of that spectacular prediction.

Camping claims the Bible provides “absolute proof” that the Rapture will take place a week from this Saturday, and that God will then destroy the entire planet (and even the universe) on October 21st, five months later. This is God’s plan for the end of the world, Camping argues, and it is a function of our planet’s “wickedness.” Even still, his theological position is quite clear about our agency in the matter. God chose the elect (in a predestinationist sense) at the beginning of time, about 200 million people in all, and his choices are a function of his own opaque will, not any of our actions, good ones or bad.

Camping is considered something of a heretic in certain Christian circles (for allegedly claiming that the Holy Spirit has no connection to most institutionalized Christian churches/denominations today). And he supposedly set a previous date for the Lord’s return, in 1994.

Family Radio is a religious radio network that spans over 100 markets in the United States, so I’m sure that hundreds of thousands (maybe even millions) of people have gotten a whiff of Camping’s new date.

I should also say that I am particularly primed for such premillennialist talk. I grew up Seventh-day Adventist, which emerged out of the Millerite movement’s inaccurate 19th-century prediction of God’s return. They picked a date and organized their entire lives around preparing for it.

I’m currently working on a book about a group of African-Americans who made similar predictions in the late 20th century, a group that has built a vibrant transnational community on top of those earlier pronouncements. So you can see why I am particularly drawn to such contemporary contentions.

I recently relayed Camping’s claim to a friend of mine, someone who isn’t a practicing Christian and doesn’t have much patience for predictions about the future based on interpretations of religious texts. Even still, he laughed, shook his head, and added, “but given how crazy things have been lately [by which, I think, he meant the chain-reaction of uprisings in the Middle East, the Tsunami in Japan (and its nuclear aftermath), and the global financial crisis, amongst other things], I’ll probably wake up on May 22nd with a slight sigh of relief.”

Saturday, April 2, 2011

I hate Jazz Music

I’ve always wanted to start a piece of writing with that one provocation—maybe a bit of creative nonfiction, maybe a would-be short story.

My purposefully dismissive declaration is meant to mark a two-fold resentment. First, not being a musician myself, I privilege jazz’s vocalists over its virtuosic drummers, saxophonists, and trumpeters, and for many jazz purists, that is my initial mistake: I want to hear Louis Armstrong sing more than blow his horn. Nina Simone, Arthur Prysock, Ella Fitzgerald, and Billie Holiday are towering figures, no doubt, with huge and loyal followings, but the Miles Davises and John Coltranes and Thelonious Monks undeniably define the music’s canonical core, especially for many would-be connoisseurs. And there begins my second complaint.

At its most pretentious, jazz music sometimes gets mobilized (by a few of those aforementioned connoisseurs) to justify pompous brands of social sifting, a snobby elitism that functions as the class-coded policing of authentic African American cultural production. No other music, the claim goes, can hold a candle to its essential (and even existential) distillation of African American angst and aspiration. If the blues demands respect for its straightforward and vernacular profundities, jazz adds a learned and well-heeled dose of proficiency to the mix. And neither one is hip-hop, still occasionally invoked as “the anti-jazz.”

In the not-so-distant past, musician Wynton Marsalis and cultural critic Stanley Crouch were the most vocal proponents of jazz’s qualitative difference from (and superiority over) hip-hop. Crouch, also an avid fan of the blues, has mused publicly about the “retarding effect” of hip-hop, a genre that, according to Crouch, takes relatively little talent and profits from the denigration of black culture. Marsalis has gone on record dismissing hip-hop as little more than “a safari for people who get their thrills from watching African-American people debase themselves, men dressing in gold, calling themselves stupid names like Ludacris or 50 Cent, spending money on expensive fluff.”

“Jazz vs. hip-hop” is just one instantiation of a slow-burning intra-racial class warfare played out on the boneless (and, therefore, flexible) back of popular culture, pivoting on the politics of respectability in mixed-race company. Jazz is one black middle-class response to the threat of racial inauthenticity, its trump-card rejoinder to the equally problematic assumption that urban poverty is singularly constitutive of legitimate African-American subjectivity. And this is true even if the black middle class is deemed unable or unwilling to sustain jazz music, which leads to discussions (at least in Spike Lee joints) about the extent to which jazz has become “white music,” i.e., supported by mostly white audiences.

Hip-hop artists Jazzy Jeff and M1 joined academics Jesse Shipley and Wilfredo Gomez at Haverford College last night to talk about hip-hop’s image, history, and technological innovations. Not only that, they placed their thoughts about hip-hop into conversation with other cultural practices and musical genres (though we didn’t hear that much about jazz).

I only bring this all up because the journal Transforming Anthropology has just published a special issue on New Orleans that, amongst other things, contextualizes “popular culture” (jazz, hip-hop, and Mardi Gras) with recourse to larger questions about post-Katrina life in that region. (The opening of this post is an excerpt from my own piece in that volume.)

This new issue of TA is worth reading, especially since it includes a moving series of articles on the scholarship of Antonio Lauria. I’m not sure what Lauria knows about hip-hop (or if he shares any of my concerns about how jazz gets deployed in the “class wars”), but his research certainly has had a lasting impact on Caribbean anthropology. And beyond.