Friday, November 19, 2010

To Conference or Not to Conference

The last time I blogged about attending an academic conference, I found myself mercilessly pummeled by several very upset Chronicle of Higher Education readers (the other place where I blog--a little more consistently). They used my post as an excuse to rail against professors who (irresponsibly!) skip out on their classes in order to attend such "conferences," a practice dismissed as little more than a scam, the kind of racket that allows academics to go off gallivanting in exotic locales under the trumped-up auspices of professional development and research dissemination.

“How many classes did you have to cancel to attend your little conference?” It started something like that. One reader asked me the equivalent of that very question several different times, trying to determine if my conference attendance came at the expense of my teaching obligations. Even after I explained that the conference didn’t require me to miss a single scheduled class session, said reader refused to register my response, asking that selfsame question (about how many of my classes I canceled for the conference) at least two more times.

Several more unhappy readers decided that they were going to use the opportunity to make a larger argument about the complete uselessness (and pseudo-intellectualism) of academia’s self-indulgent tradition of conferencing. Some of them argued that scholars should exclusively tele-conference or deploy other new-media options in their would-be efforts to forge and maintain potentially powerful inter-institutional links with peers. Why, they asked, make a fetish of the face-to-face?

Both those who anonymously posted their anti-conference comments on-line (and the many more who emailed me or called my office phone to express their displeasure over my uncritical celebration of academic conferences) seemed to get particularly upset about the post's characterization of conference-attendance as a mixture of informal chats with other academics in packed conference lobbies and laughter-laced drinking atop cushy stools at fancy hotel bars.

I only ponder that previous debate now because I am currently in New Orleans at the 109th Annual Meeting of the American Anthropological Association. It is my first trip back to New Orleans since Katrina, which almost seems like a scandalous thing to admit. And coming to hang out in such a mystical town was clearly an added bonus of attending this year’s conference.

I just got in Wednesday night, but I’ve already gone to several panels, one of which included an absolutely fantastic presentation by one of Penn’s anthropology graduate students. And I even checked out the first half of a rather hypnotic ethnographic film, Movement (R)evolution Africa, which examines the evocative links between contemporary African choreography and newfangled understandings of African subjectivity and embodiment.

Still, most of my day consisted of hallway-talk with colleagues I haven’t seen in a while and getting the word out about some new scholarly initiatives that I am helping to launch: a book series on the intersections between race and religion and an ambitious and expansive on-line bibliography for the discipline of anthropology. So, I’ll spend a lot of time in New Orleans leaving panels early, getting to panels late, and sipping cocktails well into the night. (Well, maybe not so late. Even as an undergrad, I got tired by about 10pm.) But I don’t buy the claim that any of this isn’t a legitimate way to make sure that I stay tied to disciplinary conversations.

I realize that many of those aforementioned anti-conference readers will scoff at my claim, but at least I didn’t have to cancel class. Again, maybe that's some consolation.

Not that that would have been a huge issue, either. Several students from my graduate class this semester arrived in New Orleans even before I did, which means that we could have engineered an impromptu seminar discussion in the hotel lobby if we absolutely had to. Drinks optional.

Monday, July 19, 2010

The R Word

The R-word in question is racism. Everyone's throwing it around these days, but very few people seem to agree on what it means.

The NAACP recently asked Tea Party leaders to repudiate the movement's racist members, to stop displaying "continued tolerance for bigotry and bigoted statements."

Mark Williams of the Tea Party Express responded by describing the NAACP's antiquated use of the word "colored" (in its name) as racist and declaring that the storied Civil Rights organization makes "more money off of race than any slave trader" ever did. (Just over the weekend, Williams was expelled from his own group because of a satirical letter he penned that has been described as racist.)

Other right-wingers simply dismiss the NAACP's accusation of racism as racist, the socio-political equivalent of saying "I'm rubber; you're glue. Everything you say bounces off of me and sticks to you."

Via tweet, Sarah Palin called the NAACP's very charge "appalling."

In other racial news, Jesse Jackson is still being clowned and condemned for claiming that Cleveland Cavaliers owner Dan Gilbert can only see Lebron James as a high-priced "runaway slave," and Whoopi Goldberg has been defending her defense of Mel Gibson all week. For the last few days, we've been getting new tape-recorded snippets of a voice that sounds a lot like Gibson's (granted, a demonically possessed version) raging against the mother of his youngest child with a barrage of sexist expletives: c-words, b-words, f-bombs and just about every other letter in the alphabet. That same tape-recorded voice matter-of-factly deploys terms like "wetback" and the n-word to color its apoplectic attacks.

"I have had a long relationship with Mel," Goldberg declared. "You can say he's being a bonehead, but I can't sit [here] and say that he's a racist, having spent time with him in my house with my kids."

Detractors dismiss Whoopi as an apologist with a long history of defending the indefensibly racist, Ted Danson's blackface Friars Club performance being their prime example. Whoopi's position is instructive though, and reminiscent of when African-American comic Paul Mooney took some heat for not demonizing Michael Richards after the latter's 2006 "meltdown," when Richards peppered his comedy club audience with a string of n-words and lynching imagery (in response to some black hecklers).

But there are at least two important things to remember in any discussion about the facts or fictions of racism (and counter/accusations thereof).

First, racism is almost never a smoking gun. It explains very little all by itself. Social causality is much more complicated than that.

Historians of early America have been unpacking and debating a version of this point for years. Our country's history of chattel slavery wasn't caused (in any simplistic and straightforward sense) by racism. Was Trans-Atlantic slavery a clear-cut example of racism? Yes. Did racism (as ideology) facilitate, justify, and rationalize the dehumanization of African people? It did. But racism alone doesn't provide us with the system's motives and raison d'être. At the very least, we'd need to add economic arguments to that mix.

All of that is simply to say that racists are never just racists. Racism is not a mysterious island somewhere in the middle of the ocean. Eighteenth- and nineteenth-century slavemasters were racists, but they weren't only racist. They were also revolutionaries and humanitarians, adventurers and religionists. To call someone racist isn't about explanatory exclusivity. Racism is one important ingredient in the recipe for American apple pie, but there are still other details to be worked out about how much it adds, about when in the process it gets added, and about what else goes into the mixing bowl.

Second, racism is less about what someone is (absolutely and forever) than about what a person does (in specific moments). Racism is at least as much about opportunity as ontology (to butcher a proper philosophical term).

We often imagine ourselves to be looking for racists who are racist 365 days out of the year. To chronicle the several days each week or month or lifetime when they are not demonstrably racist is either (i) to dismiss such fallow periods as exceptions (or mere performance) or (ii) to offer them up as proof that said accusations are false. But it doesn't make sense to think of racism the way we think of, say, racial identity (as something we conspicuously carry around with us all the time, everywhere we go). That's one of the most powerful points demonstrated by Officer John Ryan, the disturbing character played by Matt Dillon in the award-winning 2004 film Crash.

In one scene, Ryan, a working-class cop, mercilessly harasses a middle-class black couple during a traffic stop, clearly relishing his racial privilege and lording it over his intimidated victims. In another scene, he risks his own life to pry that same black woman from a burning car before it explodes.

Critics knock the film for ignoring the lopsided specifics of America's racial history, making every example of racial prejudice (black on white, white on black, white on Latino, black on Latino, black on Asian...) equivalent to every other.

Dillon's character was often singled out as a pathetic attempt to humanize and redeem white racism. But that's only one interpretation. The film also argues that a racial monster in one moment can be a self-sacrificing hero in the next. Very few people organize their every breath around racial animus. We often slip in and out of racism's seductive logic: sometimes rising to meet the better angels of our nature, sometimes falling victim to the easy lure of social scapegoating. That's what's so complicated about how racism animates our social lives today, helping to explain why Whoopi is right and wrong about Mel Gibson. Gibson might be a child-friendly, politically correct dinner guest one night and a maniacal phone caller spitting out the n-word in the morning.

(Cross-posted at the Chronicle of Higher Education)

Thursday, June 10, 2010

'White Guilt' and the Revolution

Is "white guilt" really real? Slavoj Žižek thinks so.

The Slovenian political philosopher (once dubbed "the most dangerous philosopher in the West" by the New Republic and "the Elvis of cultural theory" by The Chronicle of Higher Education) has written a communist manifesto, First as Tragedy, Then as Farce, challenging contemporary interpretations of 9/11 and of the global financial meltdown of 2008. I won't try to capture all the nuances of that ambitious and provocative work, but I will give you my version of its punch line: that only what Žižek calls "a dictatorship of the proletariat" can make up for the limitations and constitutive exclusions that inescapably define capitalism (and liberalism and socialism) in all of their various guises.

Far from being a threat to capitalism's undeniable ubiquity and unchallenged global hegemony (as some Leftists attempt to interpret things), Žižek sees the current global recession as potentially clearing the way for even more ramped up capitalist hysteria/utopianism. He also frames it as the context/pretext for intensified tensions between "democracy" (as a political system) and "capitalism" (as an economic formation). What if "capitalism with Asian values" (i.e., the invisible hand of the free market tightly clasped with an iron fist of totalitarianism) proves to be a more efficient and effective way to capitalize on the fundamental logic of capitalism?

French historian Pierre Rosanvallon claims that Scottish Enlightenment thinker Adam Smith was, in effect, arguing for "the withering away of politics," theorizing the emergence of a free market system that could potentially govern all of social life (rationally and fairly) without recourse to merely political concerns and considerations. Žižek's critique of the complicities between and among liberalism, socialism, and capitalism similarly asks what we might gain from thinking long and hard about how particular understandings of the relationship between politics and economics get naturalized.

As part of his argument, Žižek rails against the pathetic hubris of "white guilt," what he labels "an inverted form of clinging to one's superiority." Quoting from a section of Frantz Fanon's Black Skin, White Masks, a passage that Žižek describes as demonstrating Fanon's "refusal to capitalize on the guilt of the colonizers," Žižek demands that his readers inoculate themselves from the seductive sickness of "identity politics" in all of its "private" and non-universal forms (race, gender, sexuality, religion, and so on).

To be fair, this last point is little more than an aside for Žižek, a drive-by theoretical shooting along a tiny stretch of the much longer highway that eventually leads home to the Communist idea, but any real discussion of "white guilt" (and the ostensible implications thereof) would have to reference the work of Shelby Steele. For Steele, white guilt isn't an aside. It is one of America's central dilemmas. His book on the subject, White Guilt: How Blacks and Whites Together Destroyed the Promise of the Civil Rights Era, argues that "white guilt is quite literally the same thing as black power," the reduction of moral authority to a zero-sum game between blacks and whites wherein what was once the stigma of race becomes the neo-stigma of racism. The more guilty whites feel about race/racism, the more empowered blacks are to use accusations of racism (and invocations of America's racist history) as a disciplining rod. Steele cautions against the lure of white guilt: for blacks, as a form of political capital; for whites, as a performance of social penance.

To hear Steele describe it, white guilt sounds like a metaphysical totality that overdetermines contemporary American life (and maybe not just the parts that have anything to do with racial issues). White guilt gets cast as the overarching organizing principle for race relations, but is that really true? Does white guilt explain the central dynamics of contemporary inter-racial exchanges and interactions?

In this version of things, "playing the race card" is political slang for attempting to exploit forms of white guilt. Affirmative Action gets dismissed as a policy predicated on a misguided effort to manage and minimize white guilt. But is "white guilt" really real? I mean, any more so than, say, what we might call middle-class guilt (vis-à-vis poor people)? Or heterosexual guilt (vis-à-vis homosexuals)? Or even, say, Christian guilt (vis-à-vis Muslims)? What manner of "guilt" is this? And does it make sense to offer it up as the analytical framework for our contemporary socio-racial moment?

Indeed, white guilt doesn't seem to define the ethos behind the Tea Party push. Janeane Garofalo isn't the only one who wants to characterize them as reactionary and racist, as anti-Obama simply because they're anti-black. Self-professed Tea Partyers take offense at such accusations, and they also seem to display a decided lack of guilt about America's racial history, a guilt-freedom that serves as one of the engines powering their political efforts. I would think that even though most Americans (and most white Americans) aren't card-carrying Tea Party types, they aren't particularly angst-filled about America's racial history either. Few Americans are.

Maybe a powerful film or book can provoke a pang of sadness, humanizing the past in ways that are poignant and real. And I wouldn't argue that white Americans never reflect on how or why under-represented minorities are so under-represented in elite spheres. But is it really accurate to claim that "white guilt" haunts the American psyche? Can we use that to explain anti-racist efforts anymore than we can use that aforementioned "heterosexual guilt" as the fundamental psychological drive behind the push to get rid of "don't ask, don't tell" in the military? In fact, people are increasingly willing to invoke bad genes or the "culture of poverty" (over and against America's sordid racial history) to explain contemporary racial disparities in education and employment. That seems like a powerful anti-guilt move.

Most of Obama's detractors might be extra careful about deploying their political rhetoric so that they don't find themselves described as racist, bowing to some of the mandates of a politically corrected public sphere, but they have no qualms at all about attacking America's first black president with all the gusto they can muster. They are trying to foment a revolution, and they don't feel guilty about that, not one bit.

Of course, Steele was really talking about liberals, not conservatives. Žižek was too. But I'm not sure that "white guilt" is as big a problem as these cultural critics make it out to be. Moreover, the election of President Obama might be ushering in an era of "white rage" that is more than giving "white guilt" a run for its money.

Thursday, May 6, 2010

Race, Genetics, and Harvard Law School

Is it reasonable to simply ponder the "possibility," ever so idly and hypothetically, that bad genes might explain African American underachievement? It is an old and many-told tale, I know, but it just got a fresh re-telling at Harvard Law School this month.

A Harvard Law student recently apologized for comments she emailed to friends and colleagues following what sounds like an intriguing and heated dinner-time discussion about Affirmative Action. After first expressing concern that some of her earlier comments during that aforementioned dinner were misconstrued as politically correct, the student attempted to clarify her take on the matter.

"I absolutely do not rule out the possibility," she wrote, "that African-Americans are, on average, genetically predisposed to be less intelligent."

Claiming that sound research could convince her otherwise, she seemed intent on dispelling any lingering sense among her friends that she might be too timid about the notion of considering potential linkages between race and intelligence.

She went on: "I don't think it is that controversial of an opinion to say that I think it is at least possible that African-Americans are less intelligent on a genetic level. And I didn't mean to shy away from that position at dinner."

The student then ended her email with a joke: "Please don't pull a Larry Summers on me," a reference to the firestorm that Harvard's former president caused by broaching the idea that the under-representation of women in math and science might be predicated on their genetic endowment. Summers was eventually forced to resign his post.

After a public reprimand from the law school's dean, Martha Minow, the student apologized for her email and took back her claim about being open to considering possible genetic links between race and intelligence.

"I emphatically do not believe that African-Americans are inferior in any way," she said. "I understand why my words expressing even a doubt in that regard were and are offensive."

But what is she apologizing for? The very thought? Is this an example of "politics" trumping science by deeming certain research questions impossible to ask?

Ironically, the law student appears to have been reprimanded (during that earlier dinner conversation) for a form of political correctness, for not clearly accepting the premise that genetics might explain race-based differences in intelligence (and, by extension, social achievement), a premise that her friends appear to have chastised her for "shy[ing] away from."

This Harvard student's email has been overshadowed by Harvard Professor Skip Gates's recent New York Times op-ed, which is equally controversial in terms of contemporary racial politics. The Gates essay emphasizes African complicities in the Trans-Atlantic slave trade as a way to problematize calls for reparations here in the United States. He asks, somewhat rhetorically, if African nations should be asked to fork over some cash, too.

One reading of the Gates essay (and its critics abound) castigates him for "blaming the victim" and letting Europe and America off the hook, for pretending that every link in chattel slavery's horrible chain carried equal weight.

Of course, it is easy enough to read genetic explanations for racial achievement gaps as another way of blaming victims (and, in that case, their biological makeup), of letting real (social and political) culprits off the hook. If racial thinking is "bad biology" (as social constructionists and many physical anthropologists currently proclaim), we should be suspicious of any too-easy and essentialist invocation of racial groups as "natural" hooks on which to hang causal claims about inequality.

Gates isn't going to apologize for his (post-racial?) reading of history, and some people won't accept this law student's attempt at an apology. But, again, why is this student apologizing at all? That's one of the most important questions we can ask. Is it simply for offending African-Americans? For invoking race as nature rather than nurture? For racial insensitivity? For fear of being labeled a racist? And why do we often invoke genetics as some kind of holy grail that can reduce the messy machinations of everyday life to ostensible irrelevance? What kind of irrationality might that represent?

Monday, April 12, 2010

Academia on Somebody Else's Terms?

Jay Ruby cautions anthropologists against deploying film and video equipment on terms that are completely determined by an institutionalized media industry with its own assumptions about how stories are supposed to be told and circulated. He argues that anthropologists might need to organize their narratives (and distribute their films) in ways that run counter to industry (and even audience) expectations. There is a danger in approaching film making the way others do, he says, a danger that includes potentially betraying anthropology's intellectual mission.

Philosopher Lewis Gordon has recently penned a powerful piece that asks academics to reconsider current tendencies to perform intellectual authority in ways that traffic in neoliberal logics of financial accumulation and brand-name fetishization, logics that may similarly betray our basic intellectual mission. There is a danger, he argues, in performing scholastic subjectivity on terms that seem foreign (even antithetical) to academia's traditional considerations and methods of appraisal.

Gordon's thoughtful and provocative piece, "The Market Colonization of Intellectuals," reads something like a manifesto, and it made me think about my own too-easy acceptance of academia's hypermarketization. He argues that academics can't serve two masters, can't occupy two separate spheres at the selfsame time: the life of the mind and the mandates of the marketplace. Moreover, he claims that we are increasingly getting used to just such a bifurcated and contradictory existence. Gordon describes a "managerial academic class" of professional administrators charged with aligning academia's values and self-assessments with the organizing principles and measuring modalities of the market. "Market potentiality," he says, "governs everything [that many academics] produce." Gordon designates this "the market colonization of knowledge."

Gordon also questions the branding of analytical concepts such that they are flattened out for public consumption and magically fused with their intellectual creators: deconstruction and Derrida being one of his prime examples. This isn't a critique of Derrida or a dismissal of deconstruction's epistemological purchase. It is a plea for, amongst other things, an academic model of productivity that doesn't reproduce and reinforce the ubiquitous cult of celebrity, one of the most powerful points of entry into a mass mediated public sphere and the overflowing bank accounts of its most recognizable occupants.

The piece even takes on academia's impoverished commitment to (and operationalization of) what it means to be "smart." "In the academy," Gordon writes, "nothing is more marketable than the reputation of being smart. This makes sense: No one wants dumb intellectuals. The problem, of course, is how ‘smart' is defined. In a market-oriented society, that means knowing how to play the game of making oneself marketable. The problem here is evident if we make a comparison with ethics. I once asked an environmental activist, who argued that a more ethical ecological position is the key against looming disaster, which would bother her more: to be considered unethical or stupid? She admitted the latter."

The piece asks what kind of academic world we've created if the universality of a certain apotheosis of smartness becomes our highest (maybe our only) moral value. Gordon demands of academics a more rigorous reflexivity, a critical self-consciousness that challenges what's become orthodoxy in contemporary academic life.

Gordon and Ruby both demand such a critical self-reflexivity from their colleagues. Gordon argues that anything less than that compromises our scholarly significance. Ruby claims that ethnographic films, as one instantiation of intellectual projects, might need to look very different from other motion pictures.

I'm teaching a graduate film course this semester that attempts to take up some of Ruby's challenge, asking students to de-familiarize mechanically reproduced audiovisual products just enough for them to start seeing such offerings in slightly newfangled ways. We are reading critical histories of early cinema (for example, Peter Decherney's analysis of early Hollywood's ties to academia; Jacqueline Stewart's evocative theorization of the links between popular cinema and the lives of African Americans during the Great Migration; Hannah Landecker on the central role of early medical films to any discussion about the creation/popularization of cinema) along with ethnographies of media/mediation (from folks like Roxanne Varzi, Alan Klima, Michael Taussig, Diane Nelson, and John Caldwell), and differently pitched philosophical treatments of film/video/digital products/processes (by Roland Barthes, Walter Benjamin, Kara Keeling, Susan Buck-Morss, Kirsten Ostherr, D.N. Rodowick, and others). We are also watching films/videos that challenge traditional ways of seeing (including Bill Morrison's Decasia, Cheryl Dunye's The Watermelon Woman, William Greaves's Symbiopsychotaxiplasm, and Charlie Kaufman's Synecdoche, New York).

The course hopes to trouble some of the taken-for-granted presuppositions that we all have about ways of approaching the ubiquity of televisual, filmic, and digital representations. If the course works, students may not be quite as prone to unproductively normalized assumptions about how we interface with such technology.

The film/video market and its logics can also colonize and cannibalize the minds and methods of anthropological filmmakers/film critics who can find themselves seduced in ways that mirror some of the criticisms delineated by Gordon's challenging essay. You don't have to agree with every facet of Gordon's piece to imagine it as a wonderfully productive starting point for a spirited conversation about what academia ought to be.

Saturday, April 10, 2010

Michael Steele's Race Card?

The RNC's Michael Steele has recently made national headlines for "playing the race card" by agreeing with the claim that African-Americans like himself, in positions of power, have "a slimmer margin of error" in America. Steele included President Obama in that calculation, which was met by a swift dismissal from the White House press secretary.

Critics always find it ironic (even pathetic) when proponents of purported color blindness frame their own problems in terms of "racial victimization." The "Left" is assumed to traffic in such sophistries. The "Right," however, is supposed to know better. Clarence Thomas calling his confirmation hearing a "high-tech lynching" stands as the quintessential example of such racial irony. Even the people who claim obliviousness to racial reasoning seem susceptible to its rhetorical seductiveness.

But who really doesn't see race? When is it ever invisible? Immaterial? Irrelevant?

I just talked to a small group in Philadelphia about my most recent book, Racial Paranoia, and one of the listeners, an elderly white man, responded with a plea for the insignificance of race and racism as rubrics for understanding everyday life, especially his everyday life. He claimed that race had no impact on his daily activities. He wasn't a racist, he said. And he simply didn't see race. He had spent that very day teaching students, judging a science fair, and debating a group of university scholars. Race and racism, he assured me, had nothing to do with any of these experiences. And he made his case without anger, in clear and confident tones.

I responded by basically telling him that he was wrong, which wasn't the best route to take. I admitted to him that I am always a tad suspicious of people who claim not to see race at all. Indeed, I think that the very aspiration of postracialism (in most of its guises) is misplaced and romantic, repression passing itself off as transcendence. He listened to my response and then restated his point, very matter-of-factly. Several audience members tried to push back against his claim, arguing that even when race isn't explicitly thematized in, say, a classroom setting (one of the locations that the man had invoked), it might still be a valuable analytical lens, a real social fact. It might still be there, even if we don't see it. Not because it is biologically real, but because culture is most powerful when we can't clearly see it.

According to some theories on the matter, the only real racists left in America are the people unwilling to stop obsessing about race and racism, the folks who seem to see race behind every corner. If they just let race go, racism would wither and die away. The invocations of race and racism are incantations that keep bringing this beast back to life.

But Steele is just the most recent example of how easily self-serving calls for color blindness can morph into equally self-serving color cognizance. And it might not be useful to imagine that we only have two options: fetishizing race or ignoring it altogether.

Thursday, April 1, 2010

It's Not Just HBO. It's TV.

Did Congress ever pass health-care reform? Seriously. Lately, I've been trying to cultivate my own ignorance of all things "political." The news stories are just getting too bizarre: ongoing sagas in the wake of major earthquakes in Haiti and Chile; racial epithets that serve as soundtracks for Tea Parties; sex scandals that allegedly implicate, quite directly, a sitting Pope; Sarah Palin telling protesters to "re-load" in the context of actual violence linked to congressional votes and Tweets calling for Obama's assassination. With that as the backdrop, I've decided to impose my own moratorium on watching CNN, FOX and the evening news programs.

Instead, I'm using my television for more otherworldly fare. And TV has never been better. Although it is the quintessential site for sensationalized news-mongering, it is also the best place to spy complicated fictional tales about human life.

When (and why) did TV become so much better than motion picture film? I feel like that undeniable fact just kind of snuck up on the nation's couch potatoes. One minute we were awash in nothing but schlock melodramas and uninspired derivatives of Friends; the next, The Wire, Chappelle's Show, Mad Men, and The Sopranos drastically raised our televisual expectations.

In Production Culture: Industrial Reflexivity and Critical Practice in Film and Television, John Thornton Caldwell argues that a show like 24 radically altered the way television shows get made and further nuanced/complicated the narratives they deployed, a claim that anticipated part of the argument made in Steven Johnson's Everything Bad Is Good for You.

Last year, my colleague Elihu Katz used The ANNALS of the American Academy of Political and Social Science to wonder aloud about TV's potential demise. His query: are we currently witnessing "The End of Television?" Katz's point is hardly reducible to the "repertoire of output (call it content)" that one can watch today. That was just one element in a much more nuanced discussion he facilitated about the place of the "old" medium in a changing (new) media landscape. But if we were to go by content alone, we'd probably have to say that TV is far from dead. It has probably never been more alive.

Indeed, most people consider 2009 one of Hollywood's better years with respect to the quality of movies produced, from big-budget fare (Avatar) to more independent/low-budget films (District 9 and The Hurt Locker). But I'd argue that the best of TV in 2009 was still better, by leaps and bounds, than Hollywood's most celebrated offerings.

Of course, TV is a mixed bag, but at its best, it can sometimes best Hollywood, even the latter's most impressive stuff. And I say this as a filmmaker and an enthusiastic film watcher.

For one thing, the complexities of character development that one can witness over a TV show's entire season dwarf the best 2-hour attempts at cramming specificity into a protagonist's portrayal.

TV (even network TV) also allows for taking more chances than Hollywood filmmaking currently affords. Precious is a "controversial" and "daring" little film by Hollywood standards, but it would just be another HBO gem, and an even more impressive adaptation if we had gotten a chance to see Lee Daniels actually unfurl the other nuances of the book (over several weeks and months) that were bracketed out of the powerful film. (The irony, of course, is that TV adaptations of motion pictures are usually uninspired and short-lived, sometimes even unwatchable. But that's because the TV-makers with the most nerve and talent are more interested in bringing their own projects to the air.)

In fact, you know how people say that movies are never as good as the books on which they are based? I'd go so far as to claim that TV series (at least the very good ones) have the potential to seriously rival novels in terms of nuance and artistic virtuosity, even to upstage them.

It is probably reasonable to say that TV is no longer simply Hollywood's mistreated step-child. More and more Hollywood actors, directors and producers are using TV as a venue for their wares. That only makes a good situation better. One potential downside, I think, is what I'll call the Mad Men effect: a too-short commitment to the slow-burn that weekly serials provide (maybe, in part, because it is hard to serve two masters, film and TV, at the same time).

My concerns about its treatment of race notwithstanding, AMC's Mad Men rewards "close reading." It is a well-crafted show. But it also seems to air something like eight episodes a "season." That isn't a season. That's a fairly long movie broken up into a few pieces.

Even the shows with more episodes a year tend to broadcast them in ways that destroy the continuity of their narratives and frustrate fans: two-, three-, even four-week breaks (sometimes more) between new installments. Now FOX's 24, which has just announced that it will not have a ninth season next year, is TV's gold standard: a weekly unfolding of 24 episodes. That's a season! In fact, it spans two.

HBO's How To Make It In America feels like it just started yesterday, and this coming weekend is already its season finale? Did I hear that right? If so, give me a break! The producers might as well have just made a movie. (Of course, the networks sometimes only order a certain number of episodes, fewer rather than more, because they don't want to over-commit to a bust. But HTMA just started.) I say, bring back True Blood already, and make it last. If not, I might be forced to watch more of contemporary TV at its worst: those dreaded "news" shows. That's one thing that theatrical film clearly has over TV. It got rid of newsreels long ago.

Monday, March 15, 2010

The Politicization of Everything (that the other side is doing)

Frank Rich wrote a NYT op-ed this weekend that began by criticizing former White House Press Secretary Dana Perino and former NYC Mayor Rudolph Giuliani for their ideological readings of 9/11. Giuliani was appearing on ABC's Good Morning America in January; Perino, on FOX's Hannity last November.

"We had no domestic attacks under Bush," Giuliani declared (though he probably meant after 9/11).

"We did not have a terrorist attack on our country during President Bush's term," Perino stated. "I hope they [the Obama administration and the liberal wing of the press] are not looking at this politically. I do think we owe it to the American people to call it [the Ft. Hood shooting] what it is [a terrorist attack]."

The Rich piece is really about the extent to which Karl Rove (in his recent memoir) and Keep America Safe (a new foreign policy advocacy group founded by Liz Cheney and Bill Kristol) engage in ideologically heavy-handed historical revisionism.

"To hear them tell it," Rich writes, "9/11 was so completely Bill Clinton’s fault that it retroactively happened while he was still in office. The Bush White House is equally blameless for the post-9/11 resurgence of the Taliban, Al Qaeda and Iran. Instead it’s President Obama who is endangering America by coddling terrorists and stopping torture."

But I'm most intrigued by Perino's request that others not mislabel last year's horrific Texas tragedy for politically motivated reasons. It is the hollowness of such a call that moves me. And so many people make it. These days, the opening salvo of just about any debate is usually grounded in the charge that the other side's position is over-determined by mere politics and extremist ideology (as opposed to the speaker's own relatively neutral, fact-based analysis). Admittedly, Rich's essay implicitly pivots on something close to that same move. As does my own posting. But it is a question of degree and kind. And of what one imagines to be the categorical difference between competing sides of any social issue.

For instance, the claim that only left-leaning justices might be described as "activist judges" is silliness. Pure balderdash. Just this week, we find out that Virginia Thomas, wife of Supreme Court Justice Clarence Thomas, is starting a conservative lobbying organization (with links to Tea Party groups). It isn't the Justice himself, but her activist efforts will probably reflect the ideological assumptions behind the kinds of Supreme Court decisions that her husband has been making since the early 1990s. Why don't conservative pundits consider his opinions instantiations of judicial activism? Will that be harder to deny with his wife literally functioning as a political activist? (For those who want to imagine "originalism" as some kind of inoculation against petty politicking, read Matthew Engelke's A Problem of Presence. He's talking about Christian Scripture, not the Constitution, but he unpacks the "semiotic ideologies" that anchor claims about written words that are imagined to speak for themselves, or even to speak at all.)

An invocation of "the political" (to describe "the other side" and its self-serving motivations) is probably one of the most political moves (by that very same definition of self-servingness) in our current rhetorical arsenal. It is also a catchall term, ubiquitous in its squishy polyvocality.

For example, I can't tell you how many queries I get from Chronicle readers who want the inside scoop on the weekly's coverage of events: Why haven't they run an article on the racial angle of that Amy Bishop shooting? Do you know that the Chronicle you write for engages in some unethical censoring of its readership vis-a-vis their comments to articles, especially posts left by "conservative" readers? Just today, somebody was concerned that they hadn't found any coverage of the recent deaths at Cornell University. Is the Chronicle being pressured not to cover the story? The person asked this last question with implications hovering close to a more non-partisan invocation of the political (to describe "backstage" machinations with a conspiratorial tinge, an example of the political's amazing elasticity).

Moreover, the political is cannibalistic. It feeds off other things, making it more difficult to disentangle political posturing from meaningful political practice. Political incentives can compel people to, say, pounce on Rep. Eric Massa. But that doesn't mean that Massa's actions should be defended, because his attackers smell political blood. (Of course, the logic of our current political/partisan system usually means that we defend our teammates almost no matter what, even to the point of hypocrisy and egregious double-standardism.)

Everything is political. And you don't have to be a card-carrying Foucauldian to think so. Even still, two things seem worth mentioning (as ways to organize and ground such an ostensible truism).

1) Claiming some kind of non-political Archimedean vantage point from which to survey the ideological landscape is unhelpful. And a lie. We can aspire toward greater degrees of objectivity without matter-of-factly declaring that our team (unlike the other side) has already achieved it.

2) Attempts to dismiss other positions as merely political distract us from the point. The option isn't apolitical vs. political. And the folks who most adamantly declaim that the other guys have cornered the market on political motivations have drunk their own Kool-Aid. Or they are betting on the fact that they can get some of us to drink it for them.

Wednesday, February 17, 2010

Academia in the Age of "Reactionary Foucauldianism"

I'm taking part in a faculty discussion today on "teaching controversial issues." In preparation for that meeting, I started to jot down some thoughts on the matter. (I'll be responsible for saying a few words.)

There is a hyper-politicization of higher education today, a hyper-politicization that I want to call "reactionary Foucauldianism." If Foucault's nothing-is-innocent post-structuralism gets marshaled to make arguments about knowledge production as a "power play," the same "metaphysics of power" informs reactionary critiques of academic culture. While Foucault is deployed to challenge "the state" and what he labels "governmentality," reactionary Foucauldianism is a critique of those critics (on similar knowledge/power grounds).

To discuss, say, America's history of imperialism is to practice "communist indoctrination." (Of course, some of this is about the logic and language of punditry. Hyperbolic sound-bites are the coin of our realm, but that seems like very little consolation for a targeted faculty member.)

Everything in academia has become controversial (or potentially controversial) as academics are consistently being asked to defend their ostensibly "liberal" leanings. I know of scholars who don't want to put their syllabi on-line for fear that "others" will troll the Internet, find the document, and use their required reading list to castigate them as ideologues. (And one gets very little traction by pointing out that, ironically enough, unabashed ideologues tend to be the folks most interested in such ideological witch-hunting.)

The increasing hegemony of a think-tank counter-academy is also part of the discussion, especially when its powerful publishing arms produce best-selling books by circumventing the so-called "leftist" mainstream.

I teach quite a bit about race and religion, both of which are hot-button topics, growing more and more controversial by the semester. Any discussion of "religion" as something that is social, cultural and political (invariably how anthropologists frame their takes on the sacred) bleeds quite easily into the traps of partisan electoral politics vis-a-vis questions about the "war on terror," "Islamic Fundamentalism," and the "Christian Right" (just to name three of the most obvious ones).

For many people, any talk about race at all is an example of racism. Period. According to some, it is the only contemporary manifestation of racism worth noting. This idea that race-talk is an instantiation of racism (nothing more) can mean that a curricular offering on the topic is only ever a venue for preaching to the choir and supposedly damning the unbelievers. Defensiveness (about being dismissed as a "liberal") meets defensiveness (about being labeled a "racist"), which doesn't make for particularly constructive conversations.

Monday, February 15, 2010

In the Wake of Haiti: Jay Leno and Amy Bishop

It feels callous, even pathetic, to go on with business-as-usual while Haiti continues to reel from such a singular catastrophe. Not that it is really a viable alternative to stand still, catatonic and mouth ajar, wallowing in all the graphic (sometimes gratuitous) images offered up all day, every day, by news outlets.

Those same media outlets have toned down their coverage of Haiti considerably these past two weeks, which seems welcomingly merciful, I have to selfishly admit, even as it also shocks me how quickly the 24-hour news cycle can chew up and spit out any story, including one as massive as the Haitian disaster. A nor'easter hardly seems to merit displacing it at the top of anybody's news hour.

Even after we've sent our checks (contra Rush Limbaugh's suggestions) and commiserated with friends about the tragedy (the injustice of the event itself, the high-profile mean-spiritedness of certain religious explanations for it, the frustrating tales about the difficulty of relief efforts and the plight of those "kidnapped" orphans), we still have to go on with the rest of our day, the rest of our lives, right? Anything else seems almost like courting psychosis, dancing with the devil of existential despair.

And I have certainly taken that advice. I spent the end of January and the beginning of February staying up late at night to watch Conan O'Brien and David Letterman hurl insults (in the guise of "jokes") at Jay Leno. I guzzled down the season premiere of 24, and I made it my business to YouTube Mo'Nique's acceptance speech from the Golden Globes (just because everybody seemed to be talking about it). I did all of this with the Haitian earthquake's aftermath punctuating the noisy pauses between these silly vices.

For me, it feels almost schizophrenic to be following the events in Haiti while, say, preparing for weekly sessions of my graduate course (on the noeme of film). I also have a few grad students and recent grads on the academic job market, so I am writing recommendations and helping them to deal with the inevitable anxieties that such professional hurdles produce. This week, I'll spend much more time on that stuff than I will watching the news coverage from Haiti.

All academic eyes are now focused on the shooting in Huntsville. A faculty member at the University of Alabama in Huntsville is denied tenure (another one of those anxiety-filled professional hurdles), and she makes some of her colleagues pay for it with their lives. The entire affair feels like academia's version of a natural disaster: what can happen when the tectonic plates beneath the ever-secretive tenure process shift just enough for others to really feel it. Of course, the person denied tenure already feels the devastation, but it is a decidedly individualized experience, mostly dealt with off-stage and out of public view.

At the same time, very little really feels "natural" about this Law and Order-type murder story, its headlines coming fast and furious. The Chronicle of Higher Education has descended on the scene like academia's version of the New York Times, and we are continuing to get details about the shooter's quirkiness and interpersonal oddities.

An academic friend of mine claims to find it "strange" that such post-tenure shootings don't happen more often, especially given how "nutty" academics can be. And she readily includes herself in that unflattering characterization, which I respected, even as I demanded that she make an exception for my own self-avowed normalcy. But is she right? Given how fraught the climb up academia's ladder can be, is it shocking how infrequently such violent retaliation takes place? What, if anything, does this shooting really tell us about a "life of the mind" or about the way academics adjudicate it? How long before the Amy Bishop story gets bumped from the headlines, and is there something faculty members should actually learn from the entire thing before it does?

Friday, February 12, 2010

What to do with GREs?

Four Theories of the GRE's evaluative significance:

1. The Primacy of Quantitative Scores: This position holds that high quant scores are a good indication of how crisply someone thinks, regardless of whether or not the discipline they are applying to demands any robust use of mathematics at all. The rebuttal maintains that unless someone is going to be working with numbers, the quantitative score can be completely discounted if the other two scores are high enough.

2. All or Nothing: Some reviewers of grad applications maintain that unless the GRE scores are quite high in all three domains, the student should be considered a bit of a risk. I've even been privy to a theory that links high-math/low-verbal scores to anti-social behavior. The high-verbal/low-math applicant sometimes gets dismissed by others as someone who talks a good game but doesn't have anything substantive to say. All style and no substance.

3. The Declining Significance of GREs: Some reviewers don't even look at GRE scores. They dismiss them out of hand. If the statement is strong and the letters are convincing/supportive, they don't need any other information. This position usually gets justified with recourse to discussions about the relative underperformance of minority candidates on standardized tests (whether that's chalked up to cultural biases written into those tests or to "stereotype threats" priming said students for failure). Of course, this anti-GRE position doesn't mesh well with university-wide attempts to clearly demonstrate the exceptionalism of incoming classes.

4. Writing, Writing, Writing: I've heard at least a couple of qualitative social scientists and humanists wax eloquent about the singular significance of the writing component of the test. (I don't even think we had a writing section when I took the test.) A high writing score means that whatever the prospective students know/learn will get translated into the coins of the realm in academia: the written conference talk, the term/final paper, the publishable article, and the Dissertation. This, they argue, is where the rubber hits the road for graduate studentry. And bad writers with good ideas have a more difficult time thriving in the academy. Good writers can survive even if their ideas aren't always instantiations of incomparable genius.

When I'm on a selection committee, I tend to go through all of the other materials in the files before I look at GREs. Starting with GRE scores can sometimes bias one's reading of the rest of a prospective student's application. Once I go through the written materials, I then compare my assessment of the student with his or her GRE scores, usually just to see how hard a case I'll have to make to my colleagues (if those scores are particularly low). Of course, I almost never win those low-GRE cases.

Wednesday, February 10, 2010

Snow Days

What's the best way to spend a snow day?

A nor'easter decided to add an exclamation point to the massive winter storm that pummeled Philadelphia (and the entire mid-Atlantic region) this past weekend, which means that schools famous for almost never closing due to weather concerns have cancelled their classes today. I'll have to pay for this later (trying to re-schedule campus meetings that were difficult to schedule the first time around), but there is one major upside. I can slash through a chunk of my growing To-Do List.

First things first. I sent out 33 emails in an hour, emails churned out with a reckless disregard for grammar or even comprehensibility, which probably means that I'll have to spend more time sending follow-ups for clarification.

I've already had three very useful phone calls with colleagues (related to my administrative roles on campus), and I am now all set for a big committee meeting this Friday. Check. Check.

And then I slipped down a bit of a rabbit hole. Snow days are great for such wonderlandesque expeditions.

I have a few doctoral students on the job market right now, and they keep telling me about those infamous "wiki" sites where applicants can get unofficial updates on the status of current academic job searches. This is madness! I am so glad that such sites didn't exist ten years ago, during my first real stint on the academic job market. It reeks of neurotic possibility.

I actually went through some old text messages this morning (from and about undergrads applying to doctoral programs), and I can't believe that the same kinds of cyber-sites are available for them, spaces where other prospective graduate students anonymously post any information they know (or have heard) about the results of departmental decisions about incoming cohorts. So, I have been meandering through these virtual wastelands and fretting over how much discipline it takes for graduate students and would-be graduate students to avoid the gravitational pull of such sparkling baubles.

If I can spend a chunk of my morning shaking my head at the phenomenon, I wouldn't be surprised if snow days give students license to get swallowed whole in these bastions of high-end gossip-mongering. I can see the mesmerizing draw, even if I can't spend all of the time-out-of-time that is my snow day on these addictive sites.

Thursday, February 4, 2010

Why co-teaching is not a scam

I recently had someone tell me that co-teaching was one of the biggest academic scams going. "The biggest, in fact," he corrected. According to him, this was insult to injury in the context of a larger academic universe that was itself, by his estimation, one gigantic institutionalized racket of Mafioso (and "governmental") proportions. (A side note about his "governmental" critique: I should probably add that this person is a libertarian, and something of a conspiracy theorist.)

And he wasn't just talking in the abstract. He was offering me a bit of a browbeating for the amount of co-teaching that I have done over the course of my professorial career.

To hear him tell it, co-teaching is just a way for faculty members to get full credit for half the work. They conspire with their colleagues to split a semester or quarter in two so that they don't have to prepare for (or attend) all of the sessions. With this illicitly gained free time, they can then selfishly work on their own projects, which was at least a better option, he admitted, than what he suspected was the usual alternative: doing absolutely nothing productive at all, like the closeted slackers all academics seemingly want to be.

I have heard this critique of co-teaching many times, and I've seen examples of co-teaching that do seem to merit the cynicism, "collaborations" structured such that students experience them as little more than two distinct pedagogical ships passing one another in the dark curricular night. (Of course, these same students tend not to enjoy such courses, or to consider them valuable educational experiences.)

To complicate matters even more, there is also the question of how much co-teaching should really count toward faculty teaching loads: as a full course (like any other)? Half a course? (Even less than that, my interlocutor might argue, given his aforementioned assessment of things.)

If done well, co-teaching with a colleague could, I would argue, even count as two courses. Or at least a course and a half. That's because to really do it right, to do it well, means many more hours of preparation beforehand: debating the foundational structure of the course, comparing notes/takes on the material, and doing justice to two distinct perspectives on the subject matter. It can take as long as a year (even longer) for colleagues to effectively collaborate (over coffees, lunches and late-night bull sessions) on the conceptualization and organization of a substantive (and reasonably coherent) co-taught syllabus.

I've actually only ever co-taught courses where both of us attended all of the sessions, read all of the materials and prepared lectures/comments/questions for one another and the students every single week, but I realize that that isn't always possible, especially if an institution asks that such co-teaching be conducted as an overloaded add-on to a person's regular teaching schedule (which is how some academics have described the policies of their schools to me).

In a course on "Film and Reality" that I co-taught with a Kantian philosopher at Duke, every class session was a learning experience for everyone involved. Some sessions he'd lead, and my role was to respond/rebut (from an anthropological perspective). When I led, he'd do the same (providing philosophical/analytical counterpoints/extensions to my positions). In a lecture on semiotics (and the ostensible differences between Ferdinand de Saussure's binaries and Charles Sanders Peirce's tripartitism), my co-instructor pushed back with a challenge to the distinctiveness of iconicity and indexicality vis-a-vis what I had described as the more arbitrary and un-motivated sign. It was a great discussion. Not because we got lost in our own debate (another minefield to avoid on team-taught terrain), but because we were able to use that discussion as a way to structure a series of student questions/comments about the contemporary utility of semiotic approaches to social analysis (and discrepancies between them).

Since coming to Penn, I've co-taught graduate courses and undergraduate courses, small seminars and large lecture offerings. In all of these instances, my collaborators and I met each week, before the actual class sessions, discussing our divergent takes on the readings, sharing our thoughts on the specifics of the week's agenda, and making sure that we had a detailed set of expectations (of ourselves and our students) before we stepped into the classroom. When it works, this is an enriching experience for everyone, which makes the extra preparation worth it.

In an academic world where interdisciplinarity is offered up as constitutive of the intellectual air we all breathe, co-teaching should become an increasingly valued way of training students to think across conventional disciplinary (and even methodological) dividing lines.

(crossposted at The Chronicle of Higher Education)

Wednesday, January 27, 2010

Anna Deavere Smith's Craft

(crossposted at The Chronicle of Higher Education)

Anna Deavere Smith describes her life-long project as an attempt to theorize the links between language and identity. She came to this realization about the fundamental nature of her actorly goals while still studying her craft (several decades ago) at the American Conservatory Theater in San Francisco. Last night, Smith presented excerpts from her most recent one-woman show, Let Me Down Easy, at the University of Pennsylvania's Annenberg Center for the Performing Arts, and she tried to explain to a packed house just how her creative process works.

For those who don't know Anna Deavere Smith, she is famous for what has been called "documentary theater," a genre that, for her, entails interviewing people from various walks of life (interviews organized around a particular theme or event) and staging those juxtaposed interviews as monologues in critical conversation with one another.

Fires in the Mirror dealt with 1991's Crown Heights riots (between Afro-Caribbeans and Orthodox Jews in that small section of Brooklyn) and included interviews with rioters, African-American activists (such as Al Sharpton), rabbis, city officials, local residents, and other interested parties with a spin on the conflagration. Twilight: Los Angeles dealt with that 1992 riot/uprising, bringing excerpts from her interviews to life on stage as a way to demonstrate the many angles from which Angelenos and others made sense of that public tragedy.

Let Me Down Easy is a commentary on death and dying in America, on the state of health care and on how the actions of health care providers are over-determined by cultural assumptions that get powerfully exposed when Smith places them on conspicuous theatrical display. Given the extent to which our current political conversation pivots on the "health care debate" and its political fallout (including the election of a Republican senator in MA), Smith's material is amazing, even uncanny, for its timeliness.

Smith's power stems from the fact that her performative skills allow her to conjure up her interviewees in all of their demographic and idiosyncratic specificity, seemingly out of thin air, using their words, speaking styles, and bodily gestures to plop these beings onto the stage with an almost occult-like immediacy. She also does a commendable job giving voice to many different swaths of the political spectrum, placing opposing viewpoints in conversation such that each side of the debate is rendered with nuanced humanity. Alas, if only our everyday political discourse followed a similar organizing principle. Indeed, one of her projects as a scholar-artist (she is, after all, an academic: University Professor at NYU) is to promote robust conversations across ideological divides. (She is the founding director of Harvard University's Institute on the Arts and Civic Dialogue.)

As someone who spent the last few months of 2009 beginning my own attempt to think about staging ethnographic data for theatrical presentation (first, this year, at academic conferences and then, much later down the line, in a full-fledged one-man show), I found it encouraging and instructive to hear Smith describe her approach to such work. "Documentary theater" is a valuable example of what "ethnographic theater" could look like--and even of what anthropological theatricality might usefully define itself against. Several ethnographers have already begun to dabble in a version of what might be called "ethnographic theater," which is yet another way to continue ongoing discussions within anthropology about the political and poetic implications of ethnographic representation and cultural critique. It is also a different way to think about questions of observation, embodiment and intersubjectivity.

Anna Deavere Smith was an inspiration last night, and not just for scholars interested in harnessing the electrical powers of theatrical space for their own scholastic purposes.

Smith juggles her "documentary theater" work with stints on shows like NBC's The West Wing and Showtime's Nurse Jackie. That stuff pays the bills, she says, but documentary theater is really her passion. It is also a way for her to show that social identities only emerge as fully meaningful and culturally intelligible once we are willing to slip our feet into other people's shoes, to wrap our mouths and minds around other people's words.

Thursday, January 14, 2010

What is Pat Robertson Really Saying About Haiti?

There are many reasonable people (and even some otherwise unreasonable ones) who would maintain that Pat Robertson's take on the recent earthquake in Haiti need not be dignified with a response. I understand that point, and I see where its adherents are coming from. But we are fooling ourselves if we think that Robertson represents an isolated quack. We ignore him at our own peril, especially since there are many people who accept his basic premises without question. So, I do feel like a few words are in order about the significance of his supernatural claims about divine justice.

One thing to note is that the political "fringe" is no longer as fringe as it might once have seemed. I got about 10 messages (via Twitter, email, and Facebook) regarding Robertson's comments within a few hours of his making them. I've also seen his thoughts discussed more than a few times on several cable news programs across several different channels in the last day and a half. His comments have gone viral, which means that, "dignified" or not, they are already circulating quite widely.

If you are still one of the few people who haven't heard it, Robertson argues that 18th and early 19th century Haitians were able to throw off the chains of race-based slavery and colonial dependency by (literally!) making a pact with the devil. As a function of that Faustian bargain, they have been cursed by God, which explains their history of violence and their contemporary degree of poverty.

I got the surreal news (via text message) about the Haitian disaster on an Amtrak train from Washington DC to Philadelphia Tuesday evening (after attending the AAA symposium on race that I blogged about on Monday). And it just so happens that I was reading, in an almost eerie kind of irony, a small new book by Susan Buck-Morss during that ride, Hegel, Haiti, and Universal History.

The book is an extrapolation on her Critical Inquiry article (from 2000) where she tried to argue that Hegel got his master-slave metaphor from the Haitian revolution, and that such a seemingly clear and self-evident historical fact has been sorely under-appreciated (in fact, missed just about entirely) by the best and brightest philosophers and historians who have worked on Hegel. She chalks these omissions up to a series of factors, including the narrowcast biases of disciplinization and academic specialization. Buck-Morss argues that the early Hegel was clearly influenced and inspired by the Haitian revolt (championing the psychic need for slaves to forcibly reclaim their full humanity by asserting it in the face of brutal reprisals), even if the later Hegel (of The Philosophy of History) ends up dismissing all of Africa as radically ahistorical, uncivilized and unprepared for full sovereignty.

In many ways, Robertson's pseudo-religious reading of the Haitian tragedy is a sensationalized version of the very logics that Buck-Morss critiques.

I call it "pseudo-religious" because I think of Robertson's comments as self-serving political claims hiding behind the cloak of religiosity. Of course, religion is inescapably political, but Robertson's own religious texts don't provide evidence for such wildly specific and offensive claims of satanic collusion. On what evidence, from what sacred book, does Robertson base his theory of Haitian history (or any of his past pronouncements, including the "argument" that 9/11 was divine retribution for America's legalization of abortion)? Is he merely performing a xenophobic reading of Voodoo's spiritual difference from his particular version of Christianity?

Instead of seeing 18th and 19th century Haitian freedom fighters as subjects of history, agents capable of throwing off the shackles of foreign oppression (in a manner similar to America's 18th-century revolutionists, a group that I've never heard him call lapdogs of Satan), Robertson removes them from the political and geopolitical playing field altogether, dismissing their post-revolutionary plight as comeuppance for a bad deal with the devil. About that theory, two last things:

First, I would recommend that Robertson read Randall Robinson's An Unbroken Agony: Haiti, from Revolution to the Kidnapping of a President, which shows, quite compellingly, that Haiti's current politico-economic predicament is a direct result of how Europe and the United States responded to the country's 1804 assertion of autonomy: by very purposefully isolating and exploiting Haiti (politically and economically) for the next two hundred years. Therein lies much of the answer, Robinson demonstrates, to Haiti's current woes. (The details he provides, mostly uncontested and unhidden facts of history, will be shocking to many readers).

Second, if the Satan-theory is accurate, I would just ask that Robertson finally let them out of their contract with him. As a function of the kinds of horrible and inhumane ideas he spews, Robertson must be the other contractual party of which he speaks. It would explain how he knows the details of such a secret compact.

Thursday, January 7, 2010


The career pipeline: Not leaking but pouring
By Katherine Sender

At a recent meeting of Penn faculty members from across the University, the provost spoke with concern about “the leaky pipeline,” where large numbers of women and minority faculty drop out of the career track as they move towards senior positions. Then followed our president announcing that Penn was moving from a position of Excellence to Eminence—in the twenty-first century university even Excellence isn’t good enough anymore. I was struck by the juxtaposition. Was there a relationship between this constant push to greater levels of distinction and the leaky pipeline?

What does this leaky pipeline look like at Penn? A Gender Equity Report in 2007 found that women made up 28 percent of all faculty. How this plays out across rank is striking: women made up 42 percent of assistant professors, 30 percent of associate professors, and only 18 percent of full professors. This is not a case of more women coming up through the ranks, because the proportion of standing women faculty had increased by only four percentage points since 1999.

The leaky pipeline for racial minorities is just as dramatic. A Minority Equity Report of 2007 found that minorities made up 17 percent of Penn's faculty. People of color made up 27 percent of assistant professors, 17 percent of associate professors, and only 9 percent of full professors. We may take heart that the proportion of minority faculty has almost doubled since 1999, but of that 17 percent, 11 percentage points are Asian faculty, meaning that the proportions of African American and Latino/a faculty are very small indeed.

Reliable career track information on gay, lesbian, bisexual, and transgender faculty is impossible to come by, but my sense is that the tenure and promotion process isn’t especially kind to this group either. Expressly queer faculty—politically irascible, non-heteronormative and even non-homonormative academics—are likely to have an especially hard time.

I’m using Penn’s figures as an example, but Penn isn’t especially bad—or good—compared with its peers. I also know that some people are leaving academic careers for good, self-chosen, life-affirming reasons. But it’s worrisome that these departures are differentially distributed across gender, race, and probably sexuality. The pipeline isn’t leaking, it’s pouring.

At a recent Gender Studies conference here at Penn the leaky pipeline was addressed as a family issue: the tenure clock is hostile to women who want to have children. Indeed, nationally, women with children are half as likely to get tenure as women without. But this is only part of the problem. If it were only a fertility issue, minority men would be doing just fine.

The tenure and promotion process isn't only inhuman for women who want and have children, it's inhuman for everyone. Jerry Jacobs, a sociologist here at Penn, found in 2004 that both women and men faculty work more than 50 hours per week irrespective of rank, and about a third of them work more than 60 hours per week. The expectation of increased working hours is only likely to grow. The MLA found in 2006 that academic institutions of all kinds, not only research universities, have greatly increased the number of articles and books they expect tenure-track faculty to publish for their tenure cases, without reducing their teaching hours.

While expectations of productivity have increased, so has the reliance on part-time faculty: in the US only a third of faculty are now full-time tenured or tenure track, down from 55 percent in 1970. This puts increasing pressure on those full-timers to do additional service work, work that more often falls to women and that gets little credit in terms of promotions and merit pay. As we are increasingly asked to account for our productivity, I wonder how much of the intellectual and pastoral labor more often done by female and minority faculty is recognized as productive.

These increased pressures are on everybody, but they are experienced unequally by women and minority faculty because of how resources are differently distributed:

Pay: In the US, women faculty earn 85 cents to every male dollar, and the gap widens at the higher ranks. [Couldn't find comparable figures for minority faculty.]

Time: Women faculty are much more likely to be partnered with another full-time worker and are more likely to be partnered with another academic—i.e. someone also working long hours. In heterosexual couples, women are much more likely to carry more responsibilities for childcare and domestic duties.

Emotional resources: Women and minority faculty are less likely to feel confident about their performance. Educational research suggests that girls consistently rank their own abilities much lower than boys do, even though they perform better in assessments. Students of color constantly have to work against teachers' expectations of low achievement.

Recognition: Who has a voice in the university and what are they allowed to say? Mark Anthony Neal has mentioned the chastisement of faculty who dare to “think while Black.” Tenure and promotion discourage speaking while Black, female, and gay.

The demands on all academics escalate, but different groups have varying access to resources that make those demands bearable. This is not only an issue of pressures on junior faculty to produce for their tenure file. Even those at the top of the ladder continue to work extraordinarily hard.

Senior faculty and administrators need to recognize that few of their group would have met the standards currently set for tenure and promotion. They need to publicly scale back on expectations of quantity and focus more on quality. This is not only for the wellbeing of their junior colleagues, it is also likely to foster more careful, intellectually rigorous research. They also need to think imaginatively about different kinds of productivity than written scholarship in a changing multimedia world where monograph contracts are harder to score.

But we also need to consider our own complicity. In my research I read a lot of scholarly concern about how reality television shows cultivate the ideal self-governing neoliberal citizen—someone who is adaptable, mobile, always a bit anxious, self-monitoring, and willing to work harder not only to get ahead but to stay in place. While we communication scholars worry about the effects of reality TV on its audiences, we need to look for the beam in our own eye: academics are the most obligingly self-governing citizens of all. We can work whenever we want as long as we work all the time.

Like many universities, corporations, and governments, Penn has adopted a strategy of "Sustainability." I agree that huge communities like universities have a responsibility to environmental issues. But sustainability can't only be a matter for nations and institutions; we also have to think about sustainability at a human level. The demand for constant growth means that we extract more and more energy from a limited resource. How do the developing nations in the university world--women, men of color, and part-timers--unequally bear the brunt of overtaxed resources? And looking forward, what kind of labor legacy are we leaving for the generation of scholars we are nurturing into the profession?

Don’t get me wrong, I love my job. But I don’t want to do only my job. We need to model livable lives for our students. We need to do more than just work, and not only if we want a family. We need to consider the law of diminishing returns and the possibility that creativity comes from working less. We need to make space for political and community engagements that feed our intellectual work in other ways. We need to think about why universities matter not only for the world but for the people working within them.

Katherine Sender is the associate dean for graduate studies and an associate professor at the Annenberg School for Communication, University of Pennsylvania. She is the author of Business, not Politics: The Making of the Gay Market and the forthcoming Makeover Television and its Audiences.

Monday, January 4, 2010

An Academic Recap of 2009

Given the media's current fixation on one golfer's rampant infidelities, it is hard to remember that anything else happened in 2009, especially before the failed suicide attack on a Detroit-bound airplane on Christmas morning took over the headlines this holiday season.

Of course, much did happen last year, and most of the mass-mediated, end-of-year lists captured the big stories, including those angry town hall meetings, the concomitant dulling of a "post-racial" president's post-election luster, our ongoing economic crisis, the passing of a Kennedy, America's war efforts in Afghanistan and Iraq, protests in Iranian streets, the King of Pop's unexpected death, and the panic about H1N1.

But academia also had its own big stories this year. Here's my top ten list (in no particular order):

1. Protests against cuts in the University of California system. New Yorker magazine just published a fascinating glimpse into Berkeley's branch of that movement, which has students, staff, faculty, and administrators waging a war over the future of public education in that state (with implications for the rest of us). There are even controversial proposals (published in places like the Washington Post) that pivot on a decoupling of Berkeley from the other UC campuses, of saving top-tier public universities across the country through selective privatization. For now, there are strikes (and threats of more strikes) on Berkeley's campus, and faculty must decide whether or not to cross those picket lines and teach their classes.

2. That bizarre and surreal "story" about an African-American professor at Columbia who allegedly got so upset about a white colleague's indifference/insensitivity to contemporary racism that he punched her in the face at a pub near campus. The story went viral in a day (back in early November) and disappeared just as fast. I can only hope (against hope) that it was all some kind of sick joke or hoax. Indeed, if it wasn't, the dropped coverage of this confounding tale is troubling in and of itself.

3. The long wait for the National Research Council's national ranking of doctoral programs. They released a detailed guide to their methodology this past Fall, but not the actual rankings. This non-story is clearly a big story in its own right. And I'm sure that the plot will only thicken in 2010.

4. Lincoln University's attempt to impose a body-mass index requirement on its graduating seniors. The initiative was met with cheers from some (for addressing rampant obesity) and jeers from others (who labelled it a form of discrimination). The 'nays' won, and Lincoln rescinded the requirement.

5. The stimulus money that funneled into university-based research projects as part of the government's economic recovery package. I know quite a few colleagues (in several different fields) who were able to take advantage of this initiative, stimulating their own research projects, even and especially those that had already run out of funding.

6. Media stories about how the economic downturn potentially made a bad situation worse at Harvard University. Vanity Fair's exposé on the matter is still one of the most startling, attempting to blame at least some of Harvard's current financial predicament on its previous investment strategies and the people who made them.

7. University responses to H1N1. Duke University took a particularly pro-active approach to thwarting the threat. We may not be out of the woods yet, but this summer's media coverage now seems somewhat overblown.

8. Ongoing stories about how universities across the nation are tightening their belts to weather the economic downturn. I first heard about massive budget cuts at the University of Washington in Seattle. Other institutions have followed suit. What university initiatives get put off and de-prioritized when annual budgets are slashed by 15% or more?

9. That controversial New York Times op-ed in early April (from Columbia University Professor Mark Taylor) pleading for us to "end academia as we know it." The piece began by describing graduate education as "the Detroit of higher education," a provocative opening salvo. There were many academics who quite publicly disagreed with Taylor's remedies, including his call to end tenure.

10. An anemic academic job market. Newly minted PhDs continue to lament the slim pickings. 2009 was probably a little bit better than 2008 (at least in some fields), but there are even bigger questions to debate about academia's increasing reliance on adjunct labor and its implications for the future of doctoral education.