tag:blogger.com,1999:blog-31712917932646590962024-03-05T04:51:45.616-05:00From the Annals of AnthromanJohn L. Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.comBlogger114125tag:blogger.com,1999:blog-3171291793264659096.post-79944018003402118572011-11-28T17:46:00.001-05:002011-11-28T17:48:16.883-05:00Blogging like a Beast?!<span style="font-style:italic;">Martha Marcy May Marlene</span> is a film about a young woman trying desperately (and unsuccessfully) to recover from her traumatic stint as a member of a rural cult and sexual concubine of its charismatic spiritual leader. It is one of those “art house” movies that ended with surprisingly little warning. When the closing credits began, audience members gasped. “What?” “You’re kidding me!” “Is that really it?” That’s only what I heard in the theater seats nearest my own. And I laughed, because I knew exactly what they were reacting to.<br /><br />The writer had taken us on a complex and nerve-racking journey with the film’s female protagonist (who, at different moments in the story, answers to each one of the names that make up the film’s title). By the “end” of the story, our filmmaker hasn’t really provided us with any resolution. There is no simple (artificial?) closure to the narrative, just a final tension-filled scene rife with unanswered questions and uncertain outcomes.<br /><br />It isn’t necessarily the way we’re taught to write screenplays, but it was a valuable reminder of what some good storytellers try to accomplish. And how.<br /><br />Many people have made the claim, but it bears repeating: Good writers write like beasts. They don’t worry about the “audience” in any simplistic and condescending sense. And they certainly don’t care to placate them. 
They write without self-consciousness (at least, without the kind of self-conscious anxiety that allows for any too-precious preoccupations with making readers happy).<br /><br />As I craft one of my final few Brainstorm blog postings this week, my final week as a <a href="http://chronicle.com/blogs/brainstorm/blogging-like-a-beast/41587">Brainstorm blogger</a>, I can’t help but think about all the interesting and energizing exchanges I’ve had with <span style="font-style:italic;">Chronicle</span> readers over these past few years. From posts about the potentially racist underpinnings of “Obama-as-anti-Christ” rhetoric back in 2008 to a knock-down blog-brawl (played out over several postings) sparked by some comments I made about attending an academic conference, from defenders of Michelle Malkin keen on rebutting my (passing) characterization of her work to more recent debates about Herman Cain’s theories of race and class, one of the most conspicuous features of the blog (as platform and/or genre) is its almost immediately dialogical linkage of readers and writers.<br /><br />Anthropologist Johannes Fabian has discussed the possibility that on-line exchanges between researchers and research subjects, exchanges modeled on the back-and-forth interactions between bloggers and blog readers, might be the beginning of the end for traditional forms of ethnographic writing, reconfiguring those conventional relationships in radically new ways.<br /><br />At its best, the interactivity of the blog format clears space for the rehearsal of real debates and differences of opinion, especially when the anonymity of the Web doesn’t help foment the worst forms of unproductive incivility. I really have enjoyed my time on the blog, and I feel like it taught me to think about writing in a very meaning-filled way, especially when readers made good-faith efforts to challenge some of my positions. 
(Even the responses that I’d describe as more like “bad faith” offerings were sometimes useful.)<br /><br />At the same time, however, I realize that I used to feel as though I wrote (or, at least, tried to write) like a beast, with a cultivated indifference committed to getting my point across as honestly as possible (come what may). Although I have always tried to be honest in my Brainstorm posts, I certainly didn’t feel the freedom to write without self-consciousness, without having to worry about which readerly eyes stood in my path. On the contrary, I feel like I have grown increasingly self-conscious over the course of my blogging stint, which I know isn’t necessarily how everyone responds to this platform and its many possibilities (and may not, ultimately, be the worst thing). Sometimes that heightened self-consciousness found me pandering to the gods of sensationalist punditry, trying to be purposefully provocative (even unnecessarily harsh and fairly mean-spirited) as a way to drum up more explicit comments from readers. Any response seemed better than none.<br /><br />At other times, I started (or conceived of) many posts that I never finished/published, all too mindful of how many colleagues actually look at the <span style="font-style:italic;">Chronicle</span>’s blogs. If some of these same sentences were tucked into a book that few people ever read, I wouldn’t have to worry about getting a series of emails and phone calls after the writing, which also became a kind of immediate gratification that was, at times, far too intoxicating.<br /><br />All of this meant blogging less and less like a “beast” every single day. It would be the equivalent of being a filmmaker forced to sit in the theater and listen to audiences complain about the inexplicable ending to your latest movie. You experience that immediate feedback often enough (the audible gasps, the palpable disappointments), and you start to hear those same complaints at your computer screen as you work on the next project. 
Even positive responses, received as the film’s final credits roll, over-determine a self-conscious writer’s subsequent decisions. Such hauntings can be the kiss of death to any would-be author, and they are incredibly hard to exorcise.John L. Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.com6tag:blogger.com,1999:blog-3171291793264659096.post-15690832801460027612011-11-04T15:35:00.001-04:002011-11-04T15:38:41.812-04:00Sacred Bundle 2.0: Anthropology OnlineOxford University Press has launched a new and ambitious on-line project, <a href="http://oxfordbibliographiesonline.com/">Oxford Bibliographies Online</a>, which attempts to provide scholars, students, and other interested readers with introductions to important topics and themes from many academic fields/disciplines. Atlantic History, Criminology, Communication, Philosophy, and Sociology are among the modules already available. Later this month, Political Science and Psychology go live, with Education soon to follow.<br /><br /><a href="http://aboutobo.com/anthropology/">Anthropology </a>is slated for release early in 2012, and I have agreed to help edit that particular module. Oxford was able to put together a strong editorial board for the project, which included scholars from all four of American anthropology's major sub-fields: archaeology, linguistic anthropology, physical/biological anthropology, and cultural anthropology. These nine scholars helped to select and vet the entries on various topics (including Applied Anthropology, Cultural Evolution, Public Archaeology, Language Ideology, and Globalization). All in all, OBO's Anthropology site will launch with 50 entries penned by scholars from across the country and the world, including Michael Herzfeld on "Nationalism," Vernon J. 
Williams on "Franz Boas," Jeremy Sabloff on "Public Archaeology," Neni Panourgia on "Interpretive Anthropology," Kudzo Gavua on "Ethnoarchaeology," John Trumper on "Ethnoscience," and Christina Campbell on "Primatology" (just to name a random few).<br /><br />Four anthropologists (Marcus Banks, Maria Franklin, Jonathan Marks, and Bambi Schieffelin) have signed on to help read new entries once the site launches (about 25 or so will be added every year), and our authors and editors will all update entries as necessary (when new titles merit inclusion or emergent debates in specialties demand discussion). The idea is to make these entries living, breathing documents that morph with ongoing reconfigurations of our discipline.<br /><br />I only agreed to assist in this effort because I was intrigued by the idea of re-familiarizing myself with the so-called "four fields of anthropology" mentioned above. As a graduate student at Columbia in the 1990s, I was trained in a four-field department, even though I could get away with doing coursework in only two of those sub-fields. And after teaching for four years in Duke University's Department of Cultural Anthropology (where we all seemed to be in the same scholarly conversations), I am back in a four-field department that demands grad students pass exams in all of the sub-fields, one of the few programs in the country with such a stipulation.<br /><br />Although I don't consider anthropology's four fields a "sacred bundle" never to be disassembled under any circumstances, I am intrigued by the idea of forcing myself to learn more about the four farthest corners of this sprawling and hubris-filled discipline that imagines itself to cut across the humanities, the social sciences, and the natural sciences.<br /><br />Oxford's new initiative will allow anthropologists to think about how much (or how little?) we might really gain from conversations across the intradisciplinary domains that often divide us. 
OBO's intervention will help us to see how Physical Anthropologists and Cultural Anthropologists might differently approach topics such as "race" or "gender." Or we can determine what kind of reviewer an urban anthropologist working in contemporary Latin America would make for a piece on the histories of cities crafted by an archaeologist.<br /><br />I'm intrigued to see what (hopefully productive) sparks might fly from such contact, and I've already learned so much about those other anthropological spheres during the build-up to next year's OBO launch. So, if you are an anthropologist gearing up for this month's <a href="http://www.aaanet.org/meetings/">AAA meeting in Montreal</a>, please know that I might be asking you to contribute to this attempt at a somewhat experimental four-field rendering of our discipline's scholarly world. And please consider taking part.John L. Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.com1tag:blogger.com,1999:blog-3171291793264659096.post-80816732850453395302011-10-21T22:32:00.001-04:002011-10-21T22:35:25.490-04:00LA Screening of my new documentary this Friday...<span style="font-weight:bold;">Bad Friday: Rastafari after Coral Gardens</span> (63 mins.)<br />A documentary film <br />Directed by Deborah A. Thomas and John L. Jackson, Jr.<br />Produced by John L. Jackson, Jr., Deborah A. Thomas, and Junior "Gabu" Wedderburn<br /><br />When: Screening in Los Angeles at <span style="font-weight:bold;">2pm on October 28th, 2011</span><br /><br />Where: Laemmle’s Sunset 5<br />8000 Sunset Boulevard<br />West Hollywood, California 90046<br />www.laemmle.com<br />Theater #5: Seating capacity is 198<br /><br />Online ticket sales begin October 7, 2011. 
For complete details visit our website at <a href="http://www.hbff.org">www.hbff.org</a>.<br /> <br />Ticket Prices<br />$15 Screening Tickets*<br />$11 Children under 12/Seniors/Military/Students w/ID<br />$11 Matinees before 4:00 p.m.<br /><br />SHORT SYNOPSIS <br />Bad Friday chronicles the history of violence in Jamaica through the eyes of its most iconic community – Rastafari – and shows how people use their recollections of the Coral Gardens “incident” in 1963 to imagine new possibilities for the future.<br /><br />LONG SYNOPSIS<br />For many around the world, Jamaica conjures up images of pristine beach vacations with a pulsating reggae soundtrack. The country, however, also has one of the highest per capita murder rates in the world, and the population is actively grappling with legacies of Western imperialism, racial slavery, and political nationalism – the historical foundations of contemporary violence in Jamaica and throughout the Americas. Bad Friday focuses on a community of Rastafarians in western Jamaica who annually commemorate the 1963 Coral Gardens “incident,” a moment just after independence when the Jamaican government rounded up, jailed and tortured hundreds of Rastafarians. It chronicles the history of violence in Jamaica through the eyes of its most iconic community, and shows how people use their recollections of past traumas to imagine new possibilities for a collective future.<br /><br />REVIEWS <br />“Bad Friday is live evidence for reparations from the Government of Jamaica for the Coral Gardens atrocity of 11 April 1963. The Prime Minister Sir Alexander Bustamante’s order to “Bring in all Rastas, dead or alive!” is a crime against humanity that should not be forgotten.” - Ras Iyah V and Ras Flako, Rastafari Coral Gardens Committee<br /><br />“Amidst the proliferation of films on Rasta, none have managed to fathom the Rastafari experience of their Jamaican Babylon like Bad Friday. 
Now that Rasta is an increasingly co-opted global culture, this is as close as the untutored will get to understanding the meanings of being ‘Dread’ during the pre-reggae period when adherents were viewed as a ‘cult of outcasts’ and routinely victimized. A powerful and timely historical document that speaks to the ways that remembering-and-forgetting continue to shape Jamaica’s post-colonial identity.” - Jake Homiak, Curator of ‘Discovering Rastafari’, Smithsonian Institution<br /><br />“By bringing to us the poignant testimony of the men and women who witnessed and whose lives were forever scarred by these events, Bad Friday obliges us to confront the shocking level of state violence that was unleashed against not only the individuals involved, but also against the entire Rastafarian community of Jamaica. Now, thanks to this evocative film, we are able to appreciate the full horror of the events from that distant time and what they portended. I salute and congratulate everyone involved in the making of this redemptive and truly valuable work of historical memory.” - Robert A. Hill, University of California, Los AngelesJohn L. Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.com0tag:blogger.com,1999:blog-3171291793264659096.post-8806342432865779502011-10-18T15:58:00.002-04:002011-10-18T16:00:18.198-04:00Is Herman Cain Racist?cross-posted @ <a href="http://chronicle.com/blogs/brainstorm/is-herman-cain-racist/40199">The Chronicle</a>.<br /><br />I’m not following the lead-up to 2012’s presidential election the way I hung on every Democratic and Republican candidate’s words in 2006 and 2007. 
Even still, it is hard to miss the major headlines, no matter how much one might try: Obama’s plunging poll numbers, critiques of Romney’s religious persuasion, Rick Perry’s n-worded family home, and the conspicuously growing journalistic indifference to anything at all Bachmann-related.<br /><br />Herman Cain’s 9-9-9 tax plan might have gotten panned by several of his fellow candidates during last night’s Republican debate (including Bachmann’s likening it to a version of the Biblical “mark of the beast”), but it is the continued public “controversy” around Cain’s take on racism in America that seems to have everyone up in arms right now.<br /><br />This all started when Cain dismissed racism as a significant cause for African-American marginalization. “I don’t believe that racism today holds anybody back in a big way.” That was the way he put it last week on Fox News.<br /><br />Add to this the fact that just yesterday Cain characterized his black detractors as “more racist than the white people that they’re claiming to be racist,” a rehashing of chicken-egg counter-accusations of racism:<br /><br />“You’re a racist.”<br /><br />“Oh yeah, well, then you’re a racist for calling me a racist.”<br /><br />“See what I mean. Only a racist would believe that.”<br /><br />Of course, Cain’s comments about racism are unintelligible if disconnected from his consistent slamming of the “Occupy Wall Street” movement as a form of seemingly anti-American class warfare, a conspiracy between “unions and Obama supporters to distract the American people from the real problem, which is the failed policies of the Obama administration.”<br /><br />Just this Monday, Cain spoke with conservative commentator Sean Hannity about the Occupy Wall Street protesters. “They’re trying to legitimize themselves by comparing themselves to the Tea Party movement,” he claimed. 
“There is absolutely no comparison.” According to Cain, the Tea Party represents a legitimate form of political opposition and civic engagement, but the Occupy Wall Streeters are rabble-rousing thugs, merely an apolitical mob.<br /><br />It is a position shared by the black conservative pundit (can’t remember his name) who was on MSNBC last night complaining about all the public sex acts, violence, and other illegalities taking place under the banner of the Wall Street crowd. These exact same characterizations were mobilized by many conservative pundits during the Civil Rights Era to dismiss that movement as insincere and degenerate, as full of little more than sex freaks and criminals.<br /><br />The response to such dismissals is often the same: How can an African-American such as Cain (or that aforementioned and unnamed MSNBC pundit) celebrate an "anti-Obama" Tea Party over and against other forms of protest against social inequality and corporate greed?<br /><br />Do figures like Cain represent a form of internalized anti-black racism, which is the critique that Cain seems most keen on challenging? Or is it simply (as if any of this were simple) a case of class interests trumping racial solidarity for a wealthy former CEO of a national fast food chain? The party/candidate that wins this ongoing debate about the relative significance of race vs. class for contemporary American society might just end up with the next set of keys to the White House.John L. 
Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.com0tag:blogger.com,1999:blog-3171291793264659096.post-12785014108564085662011-09-25T09:22:00.001-04:002011-09-25T09:25:38.367-04:00Doing Diversity DifferentlyMany universities operationalize their commitments to “diversity” in curious and conflicted ways.<br /><br />In some academic quarters, the term has traditionally meant diversifying standing faculties and student bodies on college campuses by increasing the number of underrepresented minorities, but diversity is one of those ideals that people sometimes accept much more readily in theory than in practice, a principle supported in the abstract but harder to justify as a hard-and-fast campus policy with any real teeth to it, especially with legal threats of “reverse discrimination” lurking in the shadows.<br /><br />Campuses debate the very meaning of diversity these days, some seeing calls for, say, “internationalization” as a calculated attempt to replace ongoing university commitments to U.S.-based minority recruitment, others asking specifically for “ideological diversity” to address the low number of self-described “conservatives” teaching at many elite colleges and universities.<br /><br />Some of these same schools end up crafting faculty diversity initiatives that might seem to go against the grain of their institutions’ regular hiring practices, counting and categorizing bodies and subsequently creating search committees to lure accomplished scholars (especially scholars of color) from other places.<br /><br />The next trick, upon locating interesting candidates, entails convincing relevant departments to consider hiring them. If the names are big enough, the CVs impressive enough, a department might acquiesce. 
But there are several reasons why such a strategy, though arguably laudable for its directness and relative simplicity, may be a bit short-sighted.<br /><br />We could think of central administrations as the academic equivalent of our federal government, which is an analogy that would make specific departments akin to individual states. And in such a strained metaphorical context, just about every single faculty member we’d ever meet would be a staunch advocate for states’ rights. So any extra-departmental recommendations about future hires are usually treated as infringements on departmental autonomy (which they are), as academically unconstitutional (and unconscionable). If the departments don’t come up with those names on their own, there will almost certainly be a contingent of faculty members with a vested interest in thwarting the hires in question, and sometimes just on general principle. Of course, part of the problem is that some of those very same departments have a hard time coming up with any “diverse” job candidates on their own, often chalking it up to deficiencies in the pipeline.<br /><br />But if universities are trying to identify “diverse” bodies in such ways, they may be setting themselves up for failure in the long term, and not just because of any willfully obstructionist faculty. (Of course, at a time of ever-shrinking resources, departments might take new faculty lines any way they can get them.)<br /><br />The very process of having special hires for minority candidates makes the entire thing look like a “political” move and not an “intellectual” one, regardless of the caliber of the candidates in question (and even though all faculty hires are political and intellectual at the same time). 
Such a targeted process can feel qualitatively different from normal hiring practices, reinforcing a kind of ghettoized mentality about the entire endeavor.<br /><br />Indeed, once universities have gotten to the point of looking for bodies qua bodies, they may have already lost the diversity battle.<br /><br />For example, one of the reasons why departments aren’t as diverse as they could be might pivot on the intellectual projects and objectives that those departments privilege (or not), the way they prioritize their needs and intellectual goals for the future. It is an empirical question, but I wouldn’t be surprised, for example, to find out that there might be a kind of mismatch in many social science fields between what scholars of color are interested in studying and what departments are interested in hiring. If nothing else, it might make sense to have more conversations within academic departments about how five-year plans and statements of departmental goals may frame certain intellectual questions (and the scholars who pose them) out of serious consideration in terms of future departmental growth.<br /><br /><br />x-listed @ <a href="http://chronicle.com/blogs/brainstorm/doing-diversity/39190">The Chronicle of Higher Education</a>John L. 
Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.com0tag:blogger.com,1999:blog-3171291793264659096.post-4159502086713294552011-05-13T21:18:00.003-04:002011-05-13T21:23:19.472-04:00The End of the World: Next Week?<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi68lIEzySYXi7BnABTMmsp-HT8_PXR4dcQtNI0R2sJItVdxT7NtMUyvfy1_0F1-CV2lAPGh1mdVpU5mPmqgWysDt0nOvNBQtGjxkJHSjT-u39i91wTDcQ-g5rQ2HEm_O1kCuc7r4b4zm0/s1600/Family-Radio-Judgment-Day-May-21-12-e1304536542513.jpg"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 200px; height: 113px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi68lIEzySYXi7BnABTMmsp-HT8_PXR4dcQtNI0R2sJItVdxT7NtMUyvfy1_0F1-CV2lAPGh1mdVpU5mPmqgWysDt0nOvNBQtGjxkJHSjT-u39i91wTDcQ-g5rQ2HEm_O1kCuc7r4b4zm0/s200/Family-Radio-Judgment-Day-May-21-12-e1304536542513.jpg" border="0" alt="" id="BLOGGER_PHOTO_ID_5606375940048044050" /></a><br />image from kingdomexclusion.com/<br /><br />Some people invoke the ancient Mayans to contend that the world’s expiration date is quickly approaching, but there are other arguments afoot these days about just how imminent such an end might be. You may have seen the billboards for one of them: <a href="http://www.familyradio.com/index2.html">Family Radio’s</a> declaration that Judgment Day is “guaranteed” to begin (and quite conspicuously) on May 21st, 2011.<br /><br />I have been completely fascinated by this proclamation ever since I first heard Harold Camping’s matter-of-fact declaration earlier this year (while surfing the FM dial on a drive up to NYC). Camping is a long-time Christian radio broadcaster who has been a mainstay at Family Radio since the early 1960s. Shuffling through broadcast options in my Saturn, I knew Camping’s voice as soon as I heard it, mostly because I grew up on it. 
Extended family members always seemed to have his program on in their homes, so much so that it served as part of the soundtrack to my childhood (explaining, I’m sure, many of my subsequent scholarly interests).<br /><br />Hearing Camping’s distinctive voice earlier this year (some 20 years or so after the last time) made me immediately nostalgic. And then I heard his claim, which wasn’t being argued back in the 1980s and 90s (as far as I can remember), about the earth being slated for destruction next week, and that was it. I was riveted, spending the next hour listening to his contentious debates with listeners about the veracity of that spectacular prediction.<br /><br />Camping claims the Bible provides “absolute proof” that the Rapture will take place a week from this Saturday, and that God will then destroy the entire planet (and even the universe) on October 21st, five months later. This is God’s plan for the end of the world, Camping argues, and it is a function of our planet’s “wickedness.” Even still, his theological position is quite clear about our lack of agency in the matter. God chose the elect (in a predestinationist sense) at the beginning of time, about 200 million people in all, and his choices are a function of his own opaque will, not any of our actions, good ones or bad.<br /><br />Camping is considered something of a heretic in certain Christian circles (for allegedly claiming that the Holy Spirit has no connection to most institutionalized Christian churches/denominations today). And he supposedly set a previous date for the Lord’s return, which was slated for 1994.<br /><br />Family Radio is a religious radio network that spans over 100 markets in the United States, so I’m sure that hundreds of thousands (maybe even millions) of people have gotten a whiff of Camping’s new date.<br /><br />I should also say that I am particularly primed for such premillennialist talk. 
I grew up Seventh-Day Adventist, a denomination that emerged out of the Millerite movement’s inaccurate 19th-century prediction of God’s return. The Millerites picked a date and organized their entire lives around preparing for it.<br /><br />I’m currently working on a book about a group of African-Americans who made similar predictions in the late 20th century, a group that has built a vibrant transnational community on top of those earlier pronouncements. So you can see why I am particularly drawn to such contemporary contentions.<br /><br />I recently relayed Camping’s claim to a friend of mine, someone who isn’t a practicing Christian and doesn’t have much patience for predictions about the future based on interpretations of religious texts. Even still, he laughed, shook his head, and added, “but given how crazy things have been lately [by which, I think, he meant the chain-reaction of uprisings in the Middle East, the tsunami in Japan (and its nuclear aftermath), and the global financial crisis, amongst other things], I’ll probably wake up on May 22nd with a slight sigh of relief.”John L. Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.com0tag:blogger.com,1999:blog-3171291793264659096.post-18288909359502504352011-04-02T06:45:00.002-04:002011-04-02T06:49:44.307-04:00I hate Jazz MusicI’ve always wanted to start a piece of writing with that one provocation—maybe a bit of creative nonfiction, maybe a would-be short story.<br /><br />My purposefully dismissive declaration is meant to mark a two-fold resentment. First, not being a musician myself, I privilege jazz’s vocalists over its virtuosic drummers, saxophonists, and trumpeters, and for many jazz purists, that is my initial mistake: I want to hear Louis Armstrong sing more than blow his horn. 
Nina Simone, Arthur Prysock, Ella Fitzgerald, and Billie Holiday are towering figures, no doubt, with huge and loyal followings, but the Miles Davises and John Coltranes and Thelonious Monks undeniably define the music’s canonical core, especially for many would-be connoisseurs. And there begins my second complaint.<br /><br />At its most pretentious, jazz music sometimes gets mobilized (by a few of those aforementioned connoisseurs) to justify pompous brands of social sifting, a snobby elitism that functions as the class-coded policing of authentic African American cultural production. No other music, the claim goes, can hold a candle to its essential (and even existential) distillation of African American angst and aspiration. If the blues demands respect for its straightforward and vernacular profundities, jazz adds a learned and well-heeled dose of proficiency to the mix. And neither one is hip-hop, still occasionally invoked as “the anti-jazz.”<br /><br />In the not-so-distant past, musician Wynton Marsalis and cultural critic Stanley Crouch were the most vocal proponents of jazz’s qualitative difference from (and superiority over) hip-hop. Crouch, also an avid fan of the blues, has mused publicly about the “retarding effect” of hip-hop, a genre that, according to Crouch, takes relatively little talent and profits from the denigration of black culture. Marsalis has gone on record dismissing hip-hop as little more than “a safari for people who get their thrills from watching African-American people debase themselves, men dressing in gold, calling themselves stupid names like Ludacris or 50 Cent, spending money on expensive fluff.”<br /><br />“Jazz vs. hip-hop” is just one instantiation of a slow-burning intra-racial class warfare played out on the boneless (and, therefore, flexible) back of popular culture, pivoting on the politics of respectability in mixed-race company. 
Jazz is one black middle-class response to the threat of racial inauthenticity, its trump-card rejoinder to the equally problematic assumption that urban poverty is singularly constitutive of legitimate African-American subjectivity. And this is true even if the black middle class is deemed unable or unwilling to sustain jazz music, which leads to discussions (at least in Spike Lee joints) about the extent to which jazz has become “white music,” i.e., supported by mostly white audiences.<br /><br />Hip-hop artists Jazzy Jeff and M1 joined academics Jesse Shipley and Wilfredo Gomez at Haverford College last night to talk about hip-hop’s image, history, and technological innovations. Not only that, they placed their thoughts about hip-hop into conversation with other cultural practices and musical genres (though we didn’t hear that much about jazz).<br /><br />I only bring this all up because the journal <a href="http://onlinelibrary.wiley.com/doi/10.1111/traa.2011.19.issue-1/issuetoc">Transforming Anthropology</a> has just published a special issue on New Orleans that, amongst other things, contextualizes “popular culture” (jazz, hip-hop, and Mardi Gras) with recourse to larger questions about post-Katrina life in that region. (The opening of this post is an excerpt from my own piece in that volume.) <br /><br />This new issue of <span style="font-style:italic;">TA</span> is worth reading, especially since it includes a moving series of articles on the scholarship of Antonio Lauria. I’m not sure what Lauria knows about hip-hop (or if he shares any of my concerns about how jazz gets deployed in the “class wars”), but his research certainly has had a lasting impact on Caribbean anthropology. And beyond.John L. 
Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.com3tag:blogger.com,1999:blog-3171291793264659096.post-3490628310741578442011-02-26T15:46:00.005-05:002011-02-26T15:50:53.767-05:00Disrespected, Take Two(cross-posted at <span style="font-style:italic;"><a href="http://chronicle.com/blogs/brainstorm/author/jjackson">The Chronicle of Higher Education.) <br /></span></a><br /><br />I wanted to take a second to acknowledge the responses to my most recent post (about complaints several senior black faculty have expressed to me about their treatment in the academy). I appreciate the discussion that it has sparked, and I definitely want to follow-up on some comments and questions.<br /><br />goxewu thematizes one prominent (and very reasonable) response to my blog post: that it is just too doggone vague and ambiguous. Mere “blind-quote journalism,” goxewu writes, wanting more specificity to be convinced that there’s anything close to a there there. “There is a middle ground,” he writes, “between complete vagueness and anonymity (which is what Professor Jackson has now) and blowing everyone’s cover.”<br /><br />Responding to trendisnotdestiny’s question about whether bringing the topic up at all might be enough, goxewu responds: “I’m going to be a little more severe here: Prof. Jackson is essentially saying, “I know a lot of important black professors at elite schools and they confide in me. 
So take my word for it that in our private conversations, where they let their hair down like they wouldn’t with anybody else, they complain about racial disrespect.”<br /><br />Then goxewu begins a debate about the similarities and differences between a “paid blogger and a journalist,” arguing that the two are equivalent, which means, he contends, that I am obliged to provide more proof to ground my piece.<br /><br />marktropolis, who always has great feedback in such moments, pushed back against that blogger-as-journalist claim, but he agreed that it might be useful for me to add more particulars to my story.<br /><br />Several people wonder if I can go back and get some quotes, or at least add some more details to the descriptions of the faculty I’m invoking.<br /><br />Then Marc B and marktropolis have a discussion about the role (and potential cause) of under-representation in academia, a theme I invoke in the post. (And thanks, MB, for the link to your piece from the minnesota review. I just printed it out for an upcoming train ride.)<br /><br />“But what causes under-representation? Every time I give a talk,” Marc B writes, “whether it’s at an Ivy League school or a community college, you can hear a pin drop when I ask folks to speculate about a truth that everyone present already knows: Why are police departments more diverse than history departments?” Other commentators, including marktropolis, try to answer that question.<br /><br />Several other comments reinforce the idea that more of the story needs to be told. 
Professor Chuck Kleinhans wants more specifics and asks whether or not Clarence Thomas might be read as similarly disrespected (and in race-inflected ways), the latter comment spawning a series of sub-debates about the Supreme Court Justice’s relative ideological autonomy (with marktropolis explaining his skepticism about “bringing Clarence Thomas into this thread” and livefreeordie2 contending that marktropolis is disrespecting Thomas).<br /><br />joelcairo calls for “a full-blown ethnographic study” on the topic, which I do find intriguing. And wilkenslibrary asks about the degree to which “distinguished black women faculty feel as disrespected as their male counterparts.”<br /><br />Thanks for the comments, and here’s what I’m going to do. I’ll make a few phone calls this weekend and early next week to see if any of these scholars might allow me to provide more substantive details about their stories/sagas (without compromising their anonymity). Maybe I can even get someone to let me post a short Q&A with him about his particular concerns. In fact, someone might even be willing to go public in a less anonymous way, but I’ll find out. I’ll also try to chat with some distinguished black female scholars about their lives in the academy. See what I get. In many ways, what struck me about the scholars I brought up in my post was the fact that these stories were decidedly unsolicited, but it might still be valuable to ask some female faculty, point blank, about their own experiences.<br /><br />For now, let me just say that three of the scholars I mentioned are at Ivy League institutions, four more work at research institutions on the East Coast or the Midwest, and one teaches in the University of California system. A little more specificity, with possibly more to come.<br /><br />Disrespected?! (February 26, 2011)<br /><br />(cross-posted at <span style="font-style:italic;"><a href="http://chronicle.com/blogs/brainstorm/author/jjackson">The Chronicle of Higher Education</a></span>.)<br /><br />For the past year or so, I’ve been inadvertently collecting unpleasant and disconcerting stories from senior black faculty. These stories have come mainly (though not exclusively) from men, most of whom are incredibly accomplished and wildly influential in their fields. These academics are housed in several different disciplines across the humanities and social sciences, and their confidential disclosures demonstrate real unhappiness about their treatment in the academy.<br /><br />If I had to use one word to describe how these aforementioned scholars feel, it would be disrespected, profoundly disrespected.<br /><br />In these narratives, senior scholars of color describe themselves as under-appreciated by administrators, relatively marginalized (and even maligned) by fellow colleagues, and somewhat alienated from other experts in their fields.<br /><br />The first time I heard such a tale, over lunch at a coffee shop in California, I tried to dismiss it as an isolated incident, one person’s idiosyncratic experience. Maybe he was just being hypersensitive. Or I could have caught him on a particularly bad (and non-representative) day. 
But then I sat across from a few more senior scholars (in Michigan and Massachusetts, in New York and North Carolina) with similar stories to tell (of humiliating slights interpreted as race-based disrespect), and I had to admit that something more was going on than what some might imagine as a lone faculty member’s thin-skinned bellyaching.<br /><br />Of course, most of these scholars are sharing such stories with me (as their relatively junior colleague) for my own good, in hopes of steeling me for a similar (potential) future of professional discontent. Their point: No amount of publishing productivity or public notoriety exempts one from the vulnerabilities and burdens that come with under-representation in the academy.<br /><br />In all but one instance, these scholars weren’t lamenting the stain of “affirmative action,” the fear that their successes were tainted by other people’s assumptions about their achievements being predicated on something other than purely meritocratic grounds. Only one person seemed plagued by such a concern. The others were arguing the opposite (or close to it): that they had succeeded at a game decidedly stacked against them, and the thanks they received was a tacit (or not so tacit) attempt to ignore them, to demean them with cool indifference and a series of daily exclusions (from, say, important departmental discussions or substantive leadership roles at their universities).<br /><br />For the sake of protecting their anonymity, I won’t divulge the specifics of these anecdotes. Not one of the scholars shared their examples with me banking on the fact that I would eventually write about them in The Chronicle. In fact, some of these senior scholars probably don’t perform their disaffection in any conspicuous way, especially not in mixed company. 
But these intimate discussions have been so disheartening and depressing that I wanted to write something, even something relatively opaque and inadequate, to begin describing this troubling discourse.<br /><br />My brief post doesn’t nearly do justice to the stories I’ve been told. Or to the seething anger that those stories narrate. And there are many people who would argue that a lot of older faculty members, no matter how distinguished, feel the sting of disregard from younger colleagues. Race, they’d say, has nothing to do with it. But these scholars are thematizing their stories in explicitly racial terms. And even if they are swinging at mere windmills and making racial mountains out of race-less molehills (or mistaking ageist mountains for racial ones), it is still important to figure out why some senior black faculty, very senior black faculty, feel that they are more disrespected than their white colleagues.<br /><br />To Conference or Not to Conference (November 19, 2010)<br /><br />The last time I blogged about attending an academic conference, I found myself mercilessly pummeled by several very upset <span style="font-style:italic;">Chronicle of Higher Education</span> readers (the other place where I <a href="http://chronicle.com/blogs/brainstorm/author/jjackson">blog</a>--a little more consistently). They used my post as an excuse to rail against professors who (irresponsibly!) skip out on their classes in order to attend such "conferences," a practice dismissed as little more than a scam, the kind of racket that allows academics to go off gallivanting in exotic locales under the trumped-up auspices of professional development and research dissemination.<br /><br />“How many classes did you have to cancel to attend your little conference?” It started something like that. 
One reader asked me the equivalent of that very question several different times, trying to determine if my conference attendance was at the expense of my teaching obligations. Even after I explained that the conference didn’t require me to miss any of my scheduled class sessions, not one, said reader refused to register my response, asking that selfsame question (about how many of my classes I canceled for the conference) at least two more times.<br /><br />Several more unhappy readers decided that they were going to use the opportunity to make a larger argument about the complete uselessness (and pseudo-intellectualism) of academia’s self-indulgent tradition of conferencing. Some of them argued that scholars should exclusively tele-conference or deploy other new-media options in their would-be efforts to forge and maintain potentially powerful inter-institutional links with peers. Why, they asked, make a fetish of the face-to-face?<br /><br />Both those who anonymously posted their anti-conference comments on-line (and the many more who emailed me or called my office phone to express their displeasure over my uncritical celebration of academic conferences) seemed to get particularly upset about the post's characterization of conference-attendance as a mixture of informal chats with other academics in packed conference lobbies and laughter-laced drinking atop cushy stools at fancy hotel bars.<br /><br />I only ponder that previous debate now because I am currently in New Orleans at the 109th Annual Meeting of the American Anthropological Association. It is my first trip back to New Orleans since Katrina, which almost seems like a scandalous thing to admit. And coming to hang out in such a mystical town was clearly an added bonus of attending this year’s conference.<br /><br />I just got in Wednesday night, but I’ve already gone to several panels, one of which included an absolutely fantastic presentation by one of Penn’s anthropology graduate students. 
And I even checked out the first half of a rather hypnotic ethnographic film, <span style="font-style:italic;">Movement (R)evolution Africa</span>, which examines the evocative links between contemporary African choreography and newfangled understandings of African subjectivity and embodiment.<br /><br />Still, most of my day consisted of hallway-talk with colleagues I haven’t seen in a while and getting the word out about some new scholarly initiatives that I am helping to launch: a book series on the intersections between race and religion and an ambitious and expansive on-line bibliography for the discipline of anthropology. So, I’ll spend a lot of time in New Orleans leaving panels early, getting to panels late, and sipping cocktails well into the night. (Well, maybe not so late. Even as an undergrad, I got tired by about 10pm.) But I don’t buy the claim that any of this isn’t a legitimate way to make sure that I stay tied to disciplinary conversations.<br /><br />I realize that many of those aforementioned anti-conference readers will scoff at my claim, but at least I didn’t have to cancel class. Again, maybe that's some consolation.<br /><br />Not that that would have been a huge issue, either. Several students from my graduate class this semester arrived in New Orleans even before I did, which means that we could have engineered an impromptu seminar discussion in the hotel lobby if we absolutely had to. Drinks optional.<br /><br />The R Word (July 19, 2010)<br /><br />The R-word in question is <span style="font-style:italic;">racism</span>. 
Everyone's throwing it around these days, but very few people seem to agree on what it means.<br /><br />The NAACP recently asked Tea Party leaders to repudiate the movement's racist members, to stop displaying "continued tolerance for bigotry and bigoted statements."<br /><br />Mark Williams of the Tea Party Express responded by describing the NAACP's antiquated use of the word "colored" (in its name) as racist and declaring that the storied Civil Rights organization makes "more money off of race than any slave trader" ever did. (Just over the weekend, Williams was expelled from his own group because of a satirical letter he penned that has been described as racist.)<br /><br />Other right-wingers simply dismiss the NAACP's accusation of racism as racist, the socio-political equivalent of saying "I'm rubber; you're glue. Everything you say bounces off of me and sticks to you."<br /><br />Via tweet, Sarah Palin called the NAACP's very charge "appalling."<br /><br />In other racial news, Jesse Jackson is still being clowned and condemned for claiming that Cleveland Cavaliers owner Dan Gilbert can only see LeBron James as a high-priced "runaway slave," and Whoopi Goldberg has been defending her defense of Mel Gibson all week. For the last few days, we've been getting new tape-recorded snippets of a voice that sounds a lot like Gibson's (granted, a demonically possessed version) raging against the mother of his youngest child with a barrage of sexist expletives: c-words, b-words, f-bombs and just about every other letter in the alphabet. That same tape-recorded voice matter-of-factly deploys terms like "wetback" and the n-word to color its apoplectic attacks. <br /><br />"I have had a long relationship with Mel," Goldberg declared. 
"You can say he's being a bonehead, but I can't sit [here] and say that he's a racist, having spent time with him in my house with my kids."<br /><br />Detractors dismiss Whoopi as an apologist with a long history of defending the indefensibly racist, Ted Danson's blackface Friar's Club performance being their prime example. Whoopi's position is instructive though, and reminiscent of when African-American comic Paul Mooney took some heat for not demonizing Michael Richards after the latter's 2006 "meltdown," when Richards peppered his comedy club audience with a string of n-words and lynching imagery (in response to some black hecklers).<br /><br />But there are at least two important things to remember in any discussion about the facts or fictions of racism (and counter/accusations thereof). <br /><br />First, racism is almost never a smoking gun. It explains very little <span style="font-style:italic;">all by itself</span>. Social causality is much more complicated than that.<br /><br />Historians of early America have been unpacking and debating a version of this point for years. Our country's history of chattel slavery wasn't <span style="font-style:italic;">caused</span> (in any simplistic and straightforward sense) by racism. Was Trans-Atlantic slavery a clear-cut example of racism? Yes. Did racism (as ideology) facilitate, justify, and rationalize the dehumanization of African people? It did. But racism alone doesn't provide us with the system's motives and <span style="font-style:italic;">raison d'etre.</span> At the very least, we'd need to add economic arguments to that mix.<br /><br />All of that is simply to say that racists are never <span style="font-style:italic;">just</span> racists. Racism is not a mysterious island somewhere in the middle of the ocean. Eighteenth- and 19th-century slavemasters were racists, but they weren't only racist. They were also revolutionaries and humanitarians, adventurers and religionists. 
To call someone racist isn't about explanatory exclusivity. Racism is one important ingredient in the recipe for American apple pie, but there are still other details to be worked out about how much it adds, about when in the process it gets added, and about what else goes into the mixing bowl.<br /><br />Second, racism is less about what someone <span style="font-style:italic;">is</span> (absolutely and forever) than about what a person <span style="font-style:italic;">does</span> (in specific moments). Racism is at least as much about <span style="font-style:italic;">opportunity</span> as <span style="font-style:italic;">ontology</span> (to butcher a proper philosophical term). <br /><br />We often imagine ourselves to be looking for racists who are racist 365 days out of the year. To chronicle the several days each week or month or lifetime when they are not demonstrably racist is either (i) to dismiss such fallow periods as exceptions (or mere performance) or (ii) to offer them up as proof that said accusations are false. But it doesn't make sense to think of racism the way we think of, say, racial identity (as something we conspicuously carry around with us all the time, everywhere we go). That's one of the most powerful points demonstrated by Officer John Ryan, the disturbing character played by Matt Dillon in the award-winning 2004 film <span style="font-style:italic;">Crash</span>.<br /><br />In one scene, Ryan is a working-class cop who mercilessly harasses a middle-class black couple during a traffic stop, clearly relishing his racial privilege and lording it over his intimidated victims. In another scene, he can risk his own life to pry that same black woman from a burning car before it explodes.<br /><br />Critics knock the film for ignoring the lopsided specifics of America's racial history, making every example of racial prejudice (black on white, white on black, white on Latino, black on Latino, black on Asian...) 
equivalent to every other.<br /><br />Dillon's character was often singled out as a pathetic attempt to humanize and redeem white racism. But that's only one interpretation. The film also argues that a racial monster in one moment can be a self-sacrificing hero in the next. Very few people organize their every breath around racial animus. We often slip in and out of racism's seductive logic: sometimes rising to meet the better angels of our nature, sometimes falling victim to the easy lure of social scapegoating. That's what's so complicated about how racism animates our social lives today, helping to explain why Whoopi is right and wrong about Mel Gibson. Gibson might be a child-friendly, politically correct dinner guest one night and a maniacal phone caller spitting out the n-word in the morning.<br /><br />(Cross-posted at the <span style="font-style:italic;"><a href="http://chronicle.com/blogPost/The-R-Word-Again/25585/">Chronicle of Higher Education</a></span>)<br /><br />'White Guilt' and the Revolution (June 10, 2010)<br /><br />Is "white guilt" really real? Slavoj Žižek thinks so. <br /><br />The Slovenian political philosopher (once dubbed "the most dangerous philosopher in the West" by the <span style="font-style:italic;">New Republic</span> and "the Elvis of cultural theory" by <span style="font-style:italic;">The Chronicle of Higher Education</span>) has written a communist manifesto, <span style="font-style:italic;">First As Tragedy, Then As Farce</span>, challenging contemporary interpretations of 9/11 and of the global financial meltdown of 2008. 
I won't try to capture all the nuances of that ambitious and provocative work, but I will give you my version of its punch line: that only what Žižek calls "a dictatorship of the proletariat" can make up for the limitations and constitutive exclusions that inescapably define capitalism (and liberalism and socialism) in all of their various guises.<br /><br />Far from being a threat to capitalism's undeniable ubiquity and unchallenged global hegemony (as some Leftists attempt to interpret things), Žižek sees the current global recession as potentially clearing the way for even more ramped up capitalist hysteria/utopianism. He also frames it as the context/pretext for intensified tensions between "democracy" (as a political system) and "capitalism" (as an economic formation). What if "capitalism with Asian values" (i.e., the invisible hand of the free market tightly clasped with an iron fist of totalitarianism) proves to be a more efficient and effective way to capitalize on the fundamental logic of capitalism?<br /><br />French historian Pierre Rosanvallon claims that Scottish Enlightenment thinker Adam Smith was, in effect, arguing for "the withering away of politics," theorizing the emergence of a free market system that could potentially govern all of social life (rationally and fairly) without recourse to merely political concerns and considerations. Žižek's critique of the complicities between and among liberalism, socialism, and capitalism similarly asks what we might gain from thinking long and hard about how particular understandings of the relationship between politics and economics get naturalized.<br /><br />As part of his argument, Žižek rails against the pathetic hubris of "white guilt," what he labels "an inverted form of clinging to one's superiority." 
Quoting from a section of Frantz Fanon's <span style="font-style:italic;">Black Skin, White Masks</span>, a passage that Žižek describes as demonstrating Fanon's "refusal to capitalize on the guilt of the colonizers," Žižek demands that his readers inoculate themselves from the seductive sickness of "identity politics" in all of its "private" and non-universal forms (race, gender, sexuality, religion, and so on). <br /><br />To be fair, this last point is little more than an aside for Žižek, a drive-by theoretical shooting along a tiny stretch of the much longer highway that eventually leads home to the Communist idea, but any real discussion of "white guilt" (and the ostensible implications thereof) would have to reference the work of Shelby Steele. For Steele, white guilt isn't an aside. It is one of America's central dilemmas. His book on the subject, <span style="font-style:italic;">White Guilt: How Blacks and Whites Together Destroyed the Promise of the Civil Rights Era</span>, argues that "white guilt is quite literally the same thing as black power," the reduction of moral authority to a zero-sum game between blacks and whites wherein what was once the stigma of race becomes the neo-stigma of racism. The more guilty whites feel about race/racism, the more empowered blacks are to use accusations of racism (and invocations of America's racist history) as a disciplining rod. Steele cautions against the lure of white guilt: for blacks, as a form of political capital; for whites, as a performance of social penance. <br /><br />To hear Steele describe it, white guilt sounds like a metaphysical totality that overdetermines contemporary American life (and maybe not just the parts that have anything to do with racial issues). White guilt gets cast as the overarching organizing principle for race relations, but is that really true? 
Does white guilt explain the central dynamics of contemporary inter-racial exchanges and interactions?<br /><br />In this version of things, "playing the race card" is political slang for attempting to exploit forms of white guilt. Affirmative Action gets dismissed as a policy predicated on a misguided effort to manage and minimize white guilt. But is "white guilt" really real? I mean, any more so than, say, what we might call middle-class guilt (vis-à-vis poor people)? Or heterosexual guilt (vis-à-vis homosexuals)? Or even, say, Christian guilt (vis-a-vis Muslims)? What manner of "guilt" is this? And does it make sense to offer it up as <span style="font-style:italic;">the</span> analytical framework for our contemporary socio-racial moment? <br /><br />Indeed, white guilt doesn't seem to define the ethos behind the Tea Party push. Janeane Garofalo isn't the only one who wants to characterize them as reactionary and racist, as anti-Obama simply because they're anti-black. Self-professed Tea Partyers take offense to such accusations, and they also seem to display a decided lack of guilt about America's racial history, a guilt-freedom that serves as one of the engines powering their political efforts. I would think that even though most Americans (and most white Americans) aren't card carrying Tea Party types, they also aren't particularly angst-filled about America's racial history either. Few Americans are.<br /><br />Maybe a powerful film or book can provoke a pang of sadness, humanizing the past in ways that are poignant and real. And I wouldn't argue that white Americans never reflect on how or why under-represented minorities are so under-represented in elite spheres. But is it really accurate to claim that "white guilt" haunts the American psyche? 
Can we use that to explain anti-racist efforts any more than we can use that aforementioned "heterosexual guilt" as <span style="font-style:italic;">the</span> fundamental psychological drive behind the push to get rid of "don't ask, don't tell" in the military? In fact, people are increasingly willing to invoke bad genes or the "culture of poverty" (over and against America's sordid racial history) to explain contemporary racial disparities in education and employment. That seems like a powerful <span style="font-style:italic;">anti-guilt</span> move.<br /><br />Most of Obama's detractors might be extra careful about deploying their political rhetoric so that they don't find themselves described as racist, bowing to some of the mandates of a politically corrected public sphere, but they have no qualms at all about attacking America's first black president with all the gusto they can muster. They are trying to foment a revolution, and they don't feel guilty about that, not one bit.<br /><br />Of course, Steele was really talking about liberals, not conservatives. Žižek was too. But I'm not sure that "white guilt" is as big a problem as these cultural critics make it out to be. Moreover, the election of President Obama might be ushering in an era of "white rage" that is more than giving "white guilt" a run for its money.<br /><br />Race, Genetics, and Harvard Law School (May 6, 2010)<br /><br />Is it reasonable to simply ponder the "possibility," ever so idly and hypothetically, that bad genes might explain African American underachievement? 
It is an old and many-told tale, I know, but it just got a fresh re-telling at Harvard Law School this month.<br /><br />A Harvard Law student recently apologized for comments she emailed to friends and colleagues following what sounds like an intriguing and heated dinner-time discussion about Affirmative Action. After first expressing concern that some of her earlier comments during that aforementioned dinner were misconstrued as politically correct, the student attempted to clarify her take on the matter.<br /><br />"I absolutely do not rule out the possibility," she wrote, "that African-Americans are, on average, genetically predisposed to be less intelligent."<br /><br />Claiming that sound research could convince her otherwise, she seemed intent on dispelling any lingering sense among her friends that she might be too timid about the notion of considering potential linkages between race and intelligence.<br /><br />She went on: "I don't think it is that controversial of an opinion to say that I think it is at least possible that African-Americans are less intelligent on a genetic level. And I didn't mean to shy away from that position at dinner."<br /><br />The student then ended her email with a joke. "Please don't pull a Larry Summers on me," she wrote, citing the firestorm that Harvard's former president caused by broaching the idea that the under-representation of women in math and science might be predicated on their genetic endowment. Summers was eventually forced to resign his post. <br /><br />After a public reprimand from the law school's dean, Martha Minow, the student apologized for her email and took back her claim about being open to considering possible genetic links between race and intelligence.<br /><br />"I emphatically do not believe that African-Americans are inferior in any way," she said. "I understand why my words expressing even a doubt in that regard were and are offensive."<br /><br />But what is she apologizing for? The very thought? 
Is this an example of "politics" trumping science by deeming certain research questions impossible to ask? <br /><br />Ironically, the law student appears to have been reprimanded (during that earlier dinner conversation) for a form of political correctness, for not clearly accepting the premise that genetics might explain race-based differences in intelligence (and, by extension, social achievement), a premise that her friends appear to have chastised her for "shy[ing] away from." <br /><br />This Harvard student's email has been overshadowed by Harvard Professor Skip Gates's recent New York Times op-ed, which is equally controversial in terms of contemporary racial politics. The Gates essay emphasizes African complicities in the Trans-Atlantic slave trade as a way to problematize calls for reparations here in the United States. He asks, somewhat rhetorically, if African nations should be asked to fork over some cash, too. <br /><br />One reading of the Gates essay (and its critics abound) castigates him for "blaming the victim" and letting Europe and America off the hook, for pretending that every link in chattel slavery's horrible chain carried equal weight. <br /><br />Of course, it is easy enough to read genetic explanations for racial achievement gaps as another way of blaming victims (and, in that case, their biological makeup), of letting real (social and political) culprits off the hook. If racial thinking is "bad biology" (as social constructionists and many physical anthropologists currently proclaim), we should be suspicious of any too-easy and essentialist invocation of racial groups as "natural" hooks on which to hang causal claims about inequality. <br /><br />Gates isn't going to apologize for his (post-racial?) reading of history, and some people won't accept this law student's attempt at an apology. But, again, why is this student apologizing at all? That's one of the most important questions we can ask. 
Is it simply for offending African-Americans? For invoking race as nature rather than nurture? For racial insensitivity? For fear of being labelled a racist? And why do we often invoke genetics as some kind of holy grail that can reduce the messy machinations of everyday life to ostensible irrelevance? What kind of irrationality might that represent?John L. Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.com1tag:blogger.com,1999:blog-3171291793264659096.post-33859006657111235092010-04-12T15:17:00.002-04:002010-04-12T15:21:40.370-04:00Academia on Somebody Else's Terms?Jay Ruby cautions anthropologists against deploying film and video equipment on terms that are completely determined by an institutionalized media industry with its own assumptions about how stories are supposed to be told and circulated. He argues that anthropologists might need to organize their narratives (and distribute their films) in ways that run counter to industry (and even audience) expectations. There is a danger in approaching filmmaking the way others do, he says, a danger that includes potentially betraying anthropology's intellectual mission.<br /><br />Philosopher Lewis Gordon has recently penned a powerful piece that asks academics to reconsider current tendencies to perform intellectual authority in ways that traffic in neoliberal logics of financial accumulation and brand-name fetishization, logics that may similarly betray our basic intellectual mission. There is a danger, he argues, in performing scholastic subjectivity on terms that seem foreign (even antithetical) to academia's traditional considerations and methods of appraisal.<br /><br />Gordon's thoughtful and provocative piece, "<a href="http://www.truthout.org/the-market-colonization-intellectuals58310">The Market Colonization of Intellectuals</a>," reads something like a manifesto, and it made me think about my own too-easy acceptance of academia's hypermarketization. 
He argues that academics can't serve two masters, can't occupy two separate spheres at the selfsame time: the life of the mind and the mandates of the marketplace. Moreover, he claims that we are increasingly getting used to just such a bifurcated and contradictory existence. Gordon describes a "managerial academic class" of professional administrators charged with aligning academia's values and self-assessments with the organizing principles and measuring modalities of the market. "Market potentiality," he says, "governs everything [that many academics] produce." Gordon designates this "the market colonization of knowledge."<br /><br />Gordon also questions the branding of analytical concepts such that they are flattened out for public consumption and magically fused with their intellectual creators: deconstruction and Derrida being one of his prime examples. This isn't a critique of Derrida or a dismissal of deconstruction's epistemological purchase. It is a plea for, amongst other things, an academic model of productivity that doesn't reproduce and reinforce the ubiquitous cult of celebrity, one of the most powerful points of entry into a mass mediated public sphere and the overflowing bank accounts of its most recognizable occupants.<br /><br />The piece even takes on academia's impoverished commitment to (and operationalization of) what it means to be "smart." "In the academy," Gordon writes, "nothing is more marketable than the reputation of being smart. This makes sense: No one wants dumb intellectuals. The problem, of course, is how ‘smart' is defined. In a market-oriented society, that means knowing how to play the game of making oneself marketable. The problem here is evident if we make a comparison with ethics. I once asked an environmental activist, who argued that a more ethical ecological position is the key against looming disaster, which would bother her more: to be considered unethical or stupid? 
She admitted the latter."<br /><br />The piece asks what kind of academic world we've created if the universality of a certain apotheosis of smartness becomes our highest (maybe our only) moral value. Gordon demands of academics a more rigorous reflexivity, a critical self-consciousness that challenges what's become orthodoxy in contemporary academic life.<br /><br />Gordon and Ruby both demand such a critical self-reflexivity from their colleagues. Gordon argues that anything less than that compromises our scholarly significance. Ruby claims that ethnographic films, as one instantiation of intellectual projects, might need to look very different from other motion pictures.<br /><br />I'm teaching a graduate film course this semester that attempts to take up some of Ruby's challenge, asking students to de-familiarize mechanically reproduced audiovisual products just enough for them to start seeing such offerings in slightly newfangled ways. We are reading critical histories of early cinema (for example, Peter Decherney's analysis of early Hollywood's ties to academia; Jacqueline Stewart's evocative theorization of the links between popular cinema and the lives of African Americans during the Great Migration; Hannah Landecker on the central role of early medical films to any discussion about the creation/popularization of cinema) along with ethnographies of media/mediation (from folks like Roxanne Varzi, Alan Klima, Michael Taussig, Diane Nelson, and John Caldwell), and differently pitched philosophical treatments of film/video/digital products/processes (by Roland Barthes, Walter Benjamin, Kara Keeling, Susan Buck-Morss, Kirsten Ostherr, D.N. Rodowick, and others). 
We are also watching films/videos that challenge traditional ways of seeing (including Bill Morrison's <span style="font-style:italic;">Decasia</span>, Cheryl Dunye's <span style="font-style:italic;">The Watermelon Woman</span>, William Greaves's <span style="font-style:italic;">Symbiopsychotaxiplasm</span>, and Charlie Kaufman's <span style="font-style:italic;">Synecdoche, New York</span>).<br /><br />The course hopes to trouble some of the taken-for-granted presuppositions that we all have about ways of approaching the ubiquity of televisual, filmic, and digital representations. If the course works, students may not be quite as prone to unproductively normalized assumptions about how we interface with such technology.<br /><br />The film/video market and its logics can also colonize and cannibalize the minds and methods of anthropological filmmakers/film critics who can find themselves seduced in ways that mirror some of the criticisms delineated by Gordon's challenging essay. You don't have to agree with every facet of Gordon's piece to imagine it as a wonderfully productive starting point for a spirited conversation about what academia ought to be.John L. Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.com2tag:blogger.com,1999:blog-3171291793264659096.post-26055313332318861622010-04-10T09:34:00.002-04:002010-04-10T09:36:15.811-04:00Michael Steele's Race Card?The RNC's Michael Steele has recently made national headlines for "playing the race card" by agreeing with the claim that African-Americans like himself, in positions of power, have "a slimmer margin of error" in America. Steele included President Obama in that calculation, which was met by a swift dismissal from the White House press secretary. <br /><br />Critics always find it ironic (even pathetic) when proponents of purported color blindness frame their own problems in terms of "racial victimization." The "Left" is assumed to traffic in such sophistries. 
The "Right," however, is supposed to know better. Clarence Thomas calling his confirmation hearing a "high-tech lynching" stands as the quintessential example of such racial irony. Even the people who claim obliviousness to racial reasoning seem susceptible to its rhetorical seductiveness.<br /><br />But who really doesn't see race? When is it ever invisible? Immaterial? Irrelevant?<br /><br />I just talked to a small group in Philadelphia about my most recent book, <span style="font-style:italic;">Racial Paranoia</span>, and one of the listeners, an elderly white man, responded with a plea for the insignificance of race and racism as rubrics for understanding everyday life, especially his everyday life. He claimed that race had no impact on his daily activities. He wasn't a racist, he said. And he simply didn't see race. He had spent that very day teaching students, judging a science fair, and debating a group of university scholars. Race and racism, he assured me, had nothing to do with any of these experiences. And he made his case without anger, in clear and confident tones. <br /><br />I responded by basically telling him that he was wrong, which wasn't the best route to take. I admitted to him that I am always a tad suspicious of people who claim not to see race at all. Indeed, I think that the very aspiration of postracialism (in most of its guises) is misplaced and romantic, repression passing itself off as transcendence. He listened to my response and then restated his point, very matter-of-factly. Several audience members tried to push back against his claim, arguing that even when race isn't explicitly thematized in, say, a classroom setting (one of the locations that the man had invoked), it might still be a valuable analytical lens, a real social fact. It might still be there, even if we don't see it. 
Not because it is biologically real, but because culture is most powerful when we can't clearly see it.<br /><br />According to some theories on the matter, the only real racists left in America are the people unwilling to stop obsessing about race and racism, the folks who seem to see race behind every corner. If they just let race go, racism would wither and die away. The invocations of race and racism are incantations that keep bringing this beast back to life.<br /><br />But Steele is just the most recent example of how easily self-serving calls for color blindness can morph into equally self-serving color cognizance. And it might not be useful to imagine that we only have two options: fetishizing race or ignoring it altogether.John L. Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.com0tag:blogger.com,1999:blog-3171291793264659096.post-6367904679171673272010-04-01T12:54:00.005-04:002010-04-02T07:44:48.040-04:00It's Not Just HBO. It's TV.Did Congress ever pass health-care? Seriously. Lately, I've been trying to cultivate my own ignorance of all things "political." The news stories are just getting too bizarre: ongoing sagas in the wake of major earthquakes in Haiti and Chile; racial epithets that serve as soundtracks for Tea Parties; sex scandals that allegedly implicate, quite directly, a sitting Pope; Sarah Palin telling protesters to "re-load" in the context of actual violence linked to congressional votes and Tweets calling for Obama's assassination. With that as the backdrop, I've decided to issue my own self-moratorium on watching CNN, FOX and the evening news programs. <br /><br />Instead, I'm using my television for more otherworldly fare. And TV has never been better. Although it is the quintessential site for sensationalized news-mongering, it is also the best place to spy complicated fictional tales about human life. <br /><br />When (and why) did TV become so much better than motion picture film? 
I feel like that undeniable fact just kind of snuck up on the nation's couch potatoes. One minute we were awash in nothing but schlock melodramas and uninspired derivatives of <span style="font-style:italic;">Friends</span>; the next, <span style="font-style:italic;">The Wire, Chappelle's Show, Mad Men,</span> and <span style="font-style:italic;">The Sopranos</span> drastically raised our televisual expectations.<br /><br />In <span style="font-style:italic;">Production Culture: Industrial Reflexivity and Critical Practice in Film and Television</span>, John Thornton Caldwell argues that a show like <span style="font-style:italic;">24</span> radically altered the way television shows get made and further nuanced/complicated the narratives they deployed, a claim that anticipated part of the argument made in Steven Johnson's <span style="font-style:italic;">Everything Bad Is Good for You</span>. <br /><br />Last year, my colleague Elihu Katz used <span style="font-style:italic;">The ANNALS of the American Academy of Political and Social Science</span> to wonder aloud about TV's potential demise. His query: are we currently witnessing "The End of Television?" Katz's point is hardly reducible to the "repertoire of output (call it content)" that one can watch today. That was just one element in a much more nuanced discussion he facilitated about the place of the "old" medium in a changing (new) media landscape. But if we were to go by content alone, we'd probably have to say that TV is far from dead. It has probably never been more alive.<br /><br />Indeed, most people consider 2009 one of Hollywood's better years with respect to the quality of movies produced, big-budget fare (<span style="font-style:italic;">Avatar</span>) and more independent/low-budget films (<span style="font-style:italic;">District 9</span> and <span style="font-style:italic;">The Hurt Locker</span>). 
But I'd argue that the best of TV in 2009 was still far, far better, by leaps and bounds, than Hollywood's most celebrated offerings.<br /><br />Of course, TV is a mixed bag, but at its best, it can sometimes best Hollywood, even the latter's most impressive stuff. And I say this as a filmmaker and an enthusiastic film watcher.<br /><br />For one thing, the complexities of character development that one can witness over a TV show's entire season dwarf the best 2-hour attempts at cramming specificity into a protagonist's portrayal.<br /><br />TV (even network TV) also allows for taking more chances than Hollywood filmmaking currently affords. <span style="font-style:italic;">Precious </span>is a "controversial" and "daring" little film by Hollywood standards, but it would just be another HBO gem, and an even more impressive adaptation if we had gotten a chance to see Lee Daniels actually unfurl the other nuances of the book (over several weeks and months) that were bracketed out of the powerful film. (The irony, of course, is that TV adaptations of motion pictures are usually uninspired and short-lived, sometimes even unwatchable. But that's because the TV-makers with the most nerve and talent are more interested in bringing their own projects to the air.)<br /><br />In fact, you know how people say that movies are never as good as the books on which they are based? I'd go so far as to claim that TV series (at least the very good ones) have the potential of seriously rivalling novels in terms of nuance and artistic virtuosity, even upstaging them.<br /><br />It is probably reasonable to say that TV is no longer simply Hollywood's mistreated step-child. More and more Hollywood actors, directors and producers are using TV as a venue for their wares. That only makes a good situation better. 
One potential downside, I think, is what I'll call <span style="font-style:italic;">the Mad Men effect</span>: a too-short commitment to the slow-burn that weekly serials provide (maybe, in part, because it is hard to serve two masters, film and TV, at the same time).<br /><br />My concerns about its treatment of race notwithstanding, AMC's <span style="font-style:italic;">Mad Men</span> rewards "close reading." It is a well-crafted show. But it also seems to air something like eight episodes a "season." <span style="font-style:italic;">That isn't a season.</span> That's a fairly long movie broken up into a few pieces.<br /><br />Even the shows with more episodes a year tend to broadcast them in ways that destroy the continuity of their narratives and frustrate fans: two-, three-, even four-week breaks (sometimes more) between new installments. Now FOX's <span style="font-style:italic;">24</span>, which has just announced that it will not have a ninth season next year, is TV's gold standard: a weekly unfolding of 24 episodes. <span style="font-style:italic;">That's a season!</span> In fact, it spans two.<br /><br />HBO's <span style="font-style:italic;">How To Make It In America</span> feels like it just started yesterday, and this coming weekend is already its season finale? Did I hear that right? If so, give me a break! The producers might as well have just made a movie. (Of course, the networks sometimes only order a certain number of episodes, fewer rather than more, because they don't want to over-commit to a bust.) But <span style="font-style:italic;">HTMA </span>just started. I say, bring back <span style="font-style:italic;">True Blood</span> already, and make it last. If not, I might be forced to watch more of contemporary TV at its worst: those dreaded "news" shows. That's one thing that theatrical film clearly has over TV. It got rid of newsreels long ago.John L. 
Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.com0tag:blogger.com,1999:blog-3171291793264659096.post-15493278410932511122010-03-15T12:37:00.003-04:002010-03-15T13:03:26.666-04:00The Politicization of Everything (that the other side is doing)Frank Rich wrote a NYT op-ed this weekend that began by criticizing former White House Press Secretary Dana Perino and former NYC Mayor Rudolph Giuliani for their ideological readings of 9/11. Giuliani was appearing on ABC's Good Morning America in January; Perino, on FOX's Hannity last November. <br /><br />"We had no domestic attacks under Bush," Giuliani declared (though he probably meant after 9/11).<br /><br />"We did not have a terrorist attack on our country during President Bush's term," Perino stated. "I hope they [the Obama administration and the liberal wing of the press] are not looking at this politically. I do think we owe it to the American people to call it [the Ft. Hood shooting] what it is [a terrorist attack]."<br /><br />The Rich piece is really about the extent to which Karl Rove (in his recent memoir) and Keep America Safe (a new foreign policy advocacy group founded by Liz Cheney and Bill Kristol) engage in ideologically heavy-handed historical revisionism.<br /><br />"To hear them tell it," Rich writes, "9/11 was so completely Bill Clinton’s fault that it retroactively happened while he was still in office. The Bush White House is equally blameless for the post-9/11 resurgence of the Taliban, Al Qaeda and Iran. Instead it’s President Obama who is endangering America by coddling terrorists and stopping torture."<br /><br />But I'm most intrigued by Perino's request that others not mislabel last year's horrific Texas tragedy for politically motivated reasons. It is the hollowness of such a call that moves me. And so many people make it. 
These days, the opening salvo of just about any debate is usually grounded in the charge that the other side's position is over-determined by mere politics and extremist ideology (as opposed to the speaker's own relatively neutral, fact-based analysis). Admittedly, Rich's essay implicitly pivots on something close to that same move. As does my own posting. But it is a question of degree and kind. And of what one imagines to be the categorical difference between competing sides of any social issue. <br /><br />For instance, the claim that only left-leaning justices might be described as "activist judges" is silliness. Pure balderdash. Just this week, we find out that Virginia Thomas, wife of Supreme Court Justice Clarence Thomas, is starting a conservative lobbying organization (with links to Tea Party groups). It isn't the Justice himself, but her activist efforts will probably reflect the ideological assumptions behind the kinds of Supreme Court decisions that her husband has been making since the early 1990s. Why don't conservative pundits consider his opinions instantiations of judicial activism? Will that be harder to deny with his wife literally functioning as a political activist? (For those who want to imagine "originalism" as some kind of inoculation from petty politicking, read Matthew Engelke's A Problem of Presence. He's talking about Christian Scripture, not the Constitution, but he unpacks the "semiotic ideologies" that anchor claims about written words that are imagined to speak for themselves, or even to speak at all.)<br /><br />An invocation of "the political" (to describe "the other side" and its self-serving motivations) is probably one of the most political moves (by that very same definition of self-servingness) in our current rhetorical arsenal. 
It is also a catchall term, ubiquitous in its squishy polyvocality.<br /><br />For example, I can't tell you how many queries I get from Chronicle readers who want the inside scoop on the weekly's coverage of events: Why haven't they run an article on the racial angle of that Amy Bishop shooting? Do you know that the Chronicle you write for engages in some unethical censoring of its readership vis-a-vis their comments to articles, especially posts left by "conservative" readers? Just today, somebody was concerned that they hadn't found any coverage of the recent deaths at Cornell University. Is the Chronicle being pressured not to cover the story? The person asked this last question with implications that hover close to a more non-partisan invocation of the political (to describe "backstage" machinations with a conspiratorial tinge, an example of the political's amazing elasticity).<br /><br />Moreover, the political is cannibalistic. It feeds off other things, making it more difficult to disentangle political posturing from meaningful political practice. Political incentives can compel people to, say, pounce on Rep. Eric Massa. But that doesn't mean that Massa's actions should be defended, because his attackers smell political blood. (Of course, the logic of our current political/partisan system usually means that we defend our teammates almost no matter what, even to the point of hypocrisy and egregious double-standardism.)<br /><br />Everything is political. And you don't have to be a card-carrying Foucauldian to think so. Even still, two things seem worth mentioning (as ways to organize and ground such an ostensible truism).<br /><br />1) Claiming some kind of non-political Archimedean vantage point from which to survey the ideological landscape is unhelpful. And a lie. 
We can aspire toward greater degrees of objectivity without matter-of-factly declaring that our team (unlike the other side) has already achieved it.<br /><br />2) Attempts to dismiss other positions as merely political distract us from the point. The option isn't apolitical vs. political. And the folks who most adamantly declaim that the other guys have cornered the market on political motivations have drunk their own Kool-Aid. Or they are betting on the fact that they can get some of us to drink it for them.John L. Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.com1tag:blogger.com,1999:blog-3171291793264659096.post-41810370653538174522010-02-17T11:40:00.003-05:002010-02-17T11:56:08.900-05:00Academia in the Age of "Reactionary Foucauldianism"I'm taking part in a faculty discussion today on "teaching controversial issues." In preparation for that meeting, I started to jot down some thoughts on the matter. (I'll be responsible for saying a few words.)<br /><br />There is a hyper-politicization of higher education today, a hyper-politicization that I want to call "reactionary Foucauldianism." If Foucault's nothing-is-innocent post-structuralism gets marshaled to make arguments about knowledge production as a "power play," the same "metaphysics of power" informs reactionary critiques of academic culture. While Foucault is deployed to challenge "the state" and what he labels "governmentality," reactionary Foucauldianism is a critique of those critics (on similar knowledge/power grounds).<br /><br />To discuss, say, America's history of imperialism is to practice "communist indoctrination." (Of course, some of this is about the logic and language of punditry. 
Hyperbolic sound-bites are the coin of our realm, but that seems like very little consolation for a targeted faculty member.)<br /><br />Everything in academia has become controversial (or potentially controversial) as academics are consistently being asked to defend their ostensibly "liberal" leanings. I know of scholars who don't want to put their syllabi on-line for fear that "others" will troll the Internet, find the document, and use their required reading list to castigate them as ideologues. (And one gets very little traction by pointing out that, ironically enough, unabashed ideologues tend to be the folks most interested in such ideological witch-hunting.)<br /><br />The increasing hegemony of a think tank counter-academy is also part of the discussion, especially when its powerful publishing arms produce best-selling books by circumventing the so-called "leftist" mainstream.<br /><br />I teach quite a bit about race and religion, both of which are hot-button topics, growing more and more controversial by the semester. Any discussion of "religion" as something that is social, cultural and political (invariably how anthropologists frame their takes on the sacred) bleeds quite easily into the traps of partisan electoral politics vis-a-vis questions about the "war on terror," "Islamic Fundamentalism," and the "Christian Right" (just to name three of the most obvious ones). <br /><br />For many people, any talk about race at all is an example of racism. Period. According to some, it is the only contemporary manifestation of racism worth noting. This idea that race-talk is an instantiation of racism (nothing more) can mean that a curricular offering on the topic is only ever a venue for preaching to the choir and supposedly damning the unbelievers. Defensiveness (about being dismissed as a "liberal") meets defensiveness (about being labeled a "racist"), which doesn't make for particularly constructive conversations.John L. 
Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.com1tag:blogger.com,1999:blog-3171291793264659096.post-80027790069956702962010-02-15T11:10:00.003-05:002010-02-15T11:23:06.541-05:00In the Wake of Haiti: Jay Leno and Amy BishopIt feels callous, even pathetic, to go on with business-as-usual while Haiti continues to reel from such a singular catastrophe. Not that it is really a viable alternative to stand still, catatonic and mouth ajar, wallowing in all the graphic (sometimes gratuitous) images offered up all day, every day, by news outlets.<br /><br />Those same media outlets have toned down their coverage of Haiti considerably these past two weeks, which seems welcomingly merciful, I have to selfishly admit, even as it also shocks me how quickly the 24-hour news cycle can chew up and spit out any story, including one as massive as the Haitian disaster. A nor'easter seems hardly to merit displacing it at the top of anybody's news hour. <br /><br />Even after we've sent our checks (contra Rush Limbaugh's suggestions) and commiserated with friends about the tragedy (the injustice of the event itself, the high-profile mean-spiritedness of certain religious explanations for it, the frustrating tales about the difficulty of relief efforts and the plight of those "kidnapped" orphans), we still have to go on with the rest of our day, the rest of our lives, right? Anything else seems almost like courting psychosis, dancing with the devil of existential despair. <br /><br />And I have certainly taken that advice. I spent the end of January and the beginning of February staying up late at night to watch Conan O'Brien and David Letterman hurl insults (in the guise of "jokes") at Jay Leno. I guzzled down the season premiere of <span style="font-style:italic;">24</span>, and I made it my business to YouTube Mo'Nique's acceptance speech from the Golden Globes (just because everybody seemed to be talking about it). 
I did all of this with the Haitian earthquake's aftermath punctuating the noisy pauses between these silly vices.<br /><br />For me, it feels almost schizophrenic to be following the events in Haiti while, say, preparing for weekly sessions of my graduate course (on the <span style="font-style:italic;">noeme </span>of film). I also have a few grad students and recent grads on the academic job market, so I am writing recommendations and helping them to deal with the inevitable anxieties that such professional hurdles produce. This week, I'll spend much more time on that stuff than I will watching the news coverage from Haiti.<br /><br />All academic eyes are now focused on the shooting in Huntsville. A faculty member at the University of Alabama is denied tenure (another one of those anxiety-filled professional hurdles), and she makes some of her colleagues pay for it with their lives. The entire affair feels like academia's version of a natural disaster: what can happen when the tectonic plates beneath the ever-secretive tenure process shift just enough for others to really feel it. Of course, the person denied tenure already feels the devastation, but it is a decidedly individualized experience, mostly dealt with off-stage and out of public view.<br /><br />At the same time, very little really feels "natural" about this Law and Order-type murder story, its headlines coming fast and furious. <span style="font-style:italic;">The Chronicle Of Higher Education</span> has descended on the scene like academia's version of the <span style="font-style:italic;">New York Times</span>, and we are continuing to get details about the shooter's quirkiness and interpersonal oddities.<br /><br />An academic friend of mine claims to find it "strange" that such post-tenure shootings don't happen more often, especially given how "nutty" academics can be. 
And she readily includes herself in that unflattering characterization, which I respected, even as I demanded that she make an exception for my own self-avowed normalcy. But is she right? Given how fraught the climb up academia's ladder can be, is it shocking how infrequently such violent retaliation takes place? What, if anything, does this shooting really tell us about a "life of the mind" or about the way academics adjudicate it? How long before the Amy Bishop story gets bumped from the headlines, and is there something faculty members should actually learn from the entire thing before it does?John L. Jackson, Jr.http://www.blogger.com/profile/05994353364710886605noreply@blogger.com1tag:blogger.com,1999:blog-3171291793264659096.post-72132626989238940652010-02-12T16:41:00.002-05:002010-02-12T16:54:10.619-05:00What to do with GREs?<span style="font-style:italic;">Four Theories of the GRE's evaluative significance</span>:<br /><br />1. <span style="font-style:italic;">The Primacy of Quantitative Scores</span>: This position holds that high quant scores are a good indication of how crisply someone thinks, regardless of whether or not the discipline they are applying to demands any robust use of mathematics at all. The rebuttal maintains that unless someone is going to be working with numbers, the quantitative score can be completely discounted if the other two scores are high enough. <br /><br />2. <span style="font-style:italic;">All or Nothing</span>: Some reviewers of grad applications maintain that unless the GRE scores are quite high in all three domains, the student should be considered a bit of a risk. I've even been privy to a theory that links high-math/low-verbal scores to anti-social behavior. The high-verbal/low-math applicant sometimes gets dismissed by others as someone who talks a good game but doesn't have anything substantive to say. All style and no substance. <br /><br />3. 
<span style="font-style:italic;">The Declining Significance of GREs</span>: Some reviewers don't even look at GRE scores. They dismiss them out of hand. If the statement is strong and the letters are convincing/supportive, they don't need any other information. This position usually gets justified with recourse to discussions about the relative underperformance of minority candidates on standardized tests (whether that's chalked up to cultural biases written into those tests or to "stereotype threats" priming said students for failure). Of course, this anti-GRE position doesn't mesh well with university-wide attempts to clearly demonstrate the exceptionalism of incoming classes.<br /><br />4. <span style="font-style:italic;">Writing, Writing, Writing</span>: I've heard at least a couple of qualitative social scientists and humanists wax eloquent about the singular significance of the writing component of the test. (I don't even think we had a writing section when I took the test.) A high writing score means that whatever the prospective students know/learn will get translated into the coins of the realm in academia: the written conference talk, the term/final paper, the publishable article, and the Dissertation. This, they argue, is where the rubber hits the road for graduate studentry. And bad writers with good ideas have a more difficult time thriving in the academy. Good writers can survive even if their ideas aren't always instantiations of incomparable genius. <br /><br /> When I'm on a selection committee, I tend to go through all of the other materials in the files before I look at GREs. Starting with GRE scores can sometimes bias one's reading of the rest of a prospective student's application. Once I go through the written materials, I then compare my assessment of the student with his or her GRE scores, usually just to see how hard a case I'll have to make to my colleagues (if those scores are particularly low). 
Of course, I almost never win those low-GRE cases.<br /><br />Snow Days (February 10, 2010)<br /><br />What's the best way to spend a snow day?<br /><br />A nor'easter decided to add an exclamation point to the massive winter storm that pummeled Philadelphia (and the entire mid-Atlantic region) this past weekend, which means that schools famous for almost never closing due to weather concerns have cancelled their classes today. I'll have to pay for this later (trying to re-schedule campus meetings that were difficult to schedule the first time around), but there is one major upside: I can slash through a chunk of my growing To-Do List. <br /><br />First things first. I sent out 33 emails in an hour, emails churned out with a reckless disregard for grammar or even comprehension, which probably means that I'll have to spend more time sending follow-ups for clarification.<br /><br />I've already had three very useful phone calls with colleagues (related to my administrative roles on campus), and I am now all set for a big committee meeting this Friday. Check. Check.<br /><br />And then I slipped down a bit of a rabbit hole. Snow days are great for such wonderlandesque expeditions.<br /><br />I have a few doctoral students on the job market right now, and they keep telling me about those infamous "wiki" sites where applicants can get unofficial updates on the status of current academic job searches. This is madness! I am so glad that such sites didn't exist ten years ago, during my first real stint on the academic job market. It reeks of neurotic possibility.
<br /><br />I actually went through some old text messages this morning (from and about undergrads applying to doctoral programs), and I can't believe that the same kinds of cyber-sites are available for them: spaces where other prospective graduate students anonymously post any information they know (or have heard) about the results of departmental decisions about incoming cohorts. So, I have been meandering through these virtual wastelands and fretting over how much discipline it takes for graduate students and would-be graduate students to avoid the gravitational pull of such sparkling baubles. <br /><br />If I can spend a chunk of my morning shaking my head at the phenomenon, I wouldn't be surprised if snow days give students license to get swallowed whole in these bastions of high-end gossip-mongering. I can see the mesmerizing draw, even if I can't spend all of the time-out-of-time that is my snow day on these addictive sites.<br /><br />Why co-teaching is not a scam (February 4, 2010)<br /><br />I recently had someone tell me that co-teaching was one of the biggest academic scams going. "The biggest, in fact," he corrected. According to him, this was insult to injury in the context of a larger academic universe that was itself, by his estimation, one gigantic institutionalized racket of Mafioso (and "governmental") proportions. (A side note about his "governmental" critique: this person is a libertarian, and something of a conspiracy theorist.) <br /><br />And he wasn't just talking in the abstract. He was offering me a bit of a browbeating for the amount of co-teaching that I have done over the course of my professorial career.<br /><br />To hear him tell it, co-teaching is just a way for faculty members to get full credit for half the work.
They conspire with their colleagues to split a semester or quarter in two so that they don't have to prepare for (or attend) all of the sessions. With this illicitly gained free time, they can then selfishly work on their own projects, which was at least a better option, he admitted, than what he suspected was the usual alternative: doing absolutely nothing productive at all, like the closeted slackers all academics seemingly want to be.<br /><br />I have heard this critique of co-teaching many times, and I've seen examples of co-teaching that do seem to merit the cynicism, with colleagues structuring the "collaboration" such that students experience it as little more than two distinct pedagogical ships passing one another in the dark curricular night. (Of course, these same students tend not to enjoy such courses, or to consider them valuable educational experiences.)<br /><br />To complicate matters even more, there is also the question of how much co-teaching should really count toward faculty teaching loads: as a full course (like any other)? Half a course? (Even less than that, my interlocutor might argue, given his aforementioned assessment of things.)<br /><br />Done well, I would argue, co-teaching with a colleague could even count as two courses. Or at least a course and a half. That's because to really do it right, to do it well, means many more hours of preparation beforehand: debating the foundational structure of the course, comparing notes/takes on the material, and doing justice to two distinct perspectives on the subject matter. It can take as long as a year (even longer) for colleagues to effectively collaborate (over coffees, lunches and late-night bull sessions) on the conceptualization and organization of a substantive (and reasonably coherent) co-taught syllabus.
<br /><br />I've actually only ever co-taught courses where both of us attended all of the sessions, read all of the materials and prepared lectures/comments/questions for one another and the students every single week, but I realize that that isn't always possible, especially if an institution asks that such co-teaching be conducted as an overloaded add-on to a person's regular teaching schedule (which is how some academics have described the policies of their schools to me).<br /><br />In a course on "Film and Reality" that I co-taught with a Kantian philosopher at Duke, every class session was a learning experience for everyone involved. Some sessions he'd lead, and my role was to respond/rebut (from an anthropological perspective). When I led, he'd do the same (providing philosophical/analytical counterpoints/extensions to my positions). In a lecture on semiotics (and the ostensible differences between Ferdinand de Saussure's binaries and Charles Sanders Peirce's tripartitism), my co-instructor pushed back with a challenge to the distinctiveness of iconicity and indexicality vis-a-vis what I had described as the more arbitrary and un-motivated <span style="font-style:italic;">sign</span>. It was a great discussion. Not because we got lost in our own debate (another minefield to avoid on team-taught terrain), but because we were able to use that discussion as a way to structure a series of student questions/comments about the contemporary utility of semiotic approaches to social analysis (and discrepancies between them). <br /><br />Since coming to Penn, I've co-taught graduate courses and undergraduate courses, small seminars and large lecture offerings. 
In all of these instances, my collaborators and I met each week, <span style="font-style:italic;">before </span>the actual class sessions, discussing our divergent takes on the readings, sharing our thoughts on the specifics of the week's agenda, and making sure that we had a detailed set of expectations (of ourselves and our students) before we stepped into the classroom. When it works, this is an enriching experience for everyone, which makes the extra preparation worth it. <br /><br />In an academic world where interdisciplinarity is offered up as constitutive of the intellectual air we all breathe, co-teaching should become an increasingly valued way of training students to think across conventional disciplinary (and even methodological) dividing lines. <br /><br />(crossposted at <a href="http://chronicle.com/blogAuthor/Brainstorm/3/John-L-Jackson-Jr/82/">The Chronicle of Higher Education</a>)<br /><br />Anna Deavere Smith's Craft (January 27, 2010)<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg4AEzQdB2FxcHQENj4aRpdzH_c9x0VN2glRBlWOJdPMGRs9x3JhEzYGVpBnsIIO3QEAqlQv-Am-FyN08VhLhhWAczybPWrY_vEb-Mc_F20X29LXfc1E8_dSE2tvXm-nkD1VE-pOyutjtM/s1600-h/DEAVERE_shot6_big.jpg"><img style="float:left; margin:0 10px 10px 0;cursor:pointer; cursor:hand;width: 132px; height: 200px;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg4AEzQdB2FxcHQENj4aRpdzH_c9x0VN2glRBlWOJdPMGRs9x3JhEzYGVpBnsIIO3QEAqlQv-Am-FyN08VhLhhWAczybPWrY_vEb-Mc_F20X29LXfc1E8_dSE2tvXm-nkD1VE-pOyutjtM/s200/DEAVERE_shot6_big.jpg" border="0" alt="" id="BLOGGER_PHOTO_ID_5431466423568210882" /></a><br /><br /><br /><span style="font-weight:bold;">(crossposted at <a href="http://chronicle.com/blogAuthor/Brainstorm/3/John-L-Jackson-Jr/82/"><span style="font-style:italic;">The Chronicle of Higher Education</span></a>)</span><br /><br />Anna Deavere Smith describes her life-long project as an attempt to theorize the links between language and identity. She came to this realization about the fundamental nature of her actorly goals while still studying her craft (several decades ago) at the American Conservatory Theater in San Francisco. Last night, Smith presented excerpts from her most recent one-woman show, <span style="font-style:italic;">Let Me Down Easy</span>, at the University of Pennsylvania's Annenberg Center for the Performing Arts, and she tried to explain to a packed house just how her creative process works.
<br /><br />For those who don't know Anna Deavere Smith, she is famous for what has been called "documentary theater," a genre that, for her, entails interviewing people from various walks of life (interviews organized around a particular theme or event) and staging those juxtaposed interviews as monologues in critical conversation with one another.<br /><br /><span style="font-style:italic;">Fires in the Mirror</span> dealt with 1991's Crown Heights riots (between Afro-Caribbeans and Orthodox Jews in that small section of Brooklyn) and included interviews with rioters, African-American activists (such as Al Sharpton), rabbis, city officials, local residents, and other interested parties with a spin on the conflagration. <span style="font-style:italic;">Twilight: Los Angeles</span> dealt with that 1992 riot/uprising, bringing excerpts from her interviews to life on stage as a way to demonstrate the many angles from which Angelenos and others made sense of that public tragedy.<br /><br /><span style="font-style:italic;">Let Me Down Easy</span> is a commentary on death and dying in America, on the state of health care and on how the actions of health care providers are over-determined by cultural assumptions that get powerfully exposed when Smith places them on conspicuous theatrical display. Given the extent to which our current political conversation pivots on the "health care debate" and its political fallout (including the election of a Republican senator in Massachusetts), Smith's material is amazing, even uncanny, for its timeliness.<br /><br />Smith's power stems from the fact that her performative skills allow her to conjure up her interviewees in all of their demographic and idiosyncratic specificity, seemingly out of thin air, using their words, speaking styles, and bodily gestures to plop these beings onto the stage with an almost occult-like immediacy.
She also does a commendable job giving voice to many different swaths of the political spectrum, placing opposing viewpoints in conversation such that each side of the debate is rendered with nuanced humanity. Alas, if only our everyday political discourse followed a similar organizing principle. Indeed, one of her projects as a scholar-artist (she is, after all, an academic: University Professor at NYU) is to promote robust conversations across ideological divides. (She is the founding director of Harvard University's Institute on the Arts and Civic Dialogue.)<br /><br />As someone who spent the last few months of 2009 beginning my own attempt to think about staging ethnographic data for theatrical presentation (first, this year, at academic conferences and then, much later down the line, in a full-fledged one-man show), I found it encouraging and instructive to hear Smith describe her approach to such work. "Documentary theater" is a valuable example of what "ethnographic theater" could look like--and even of what anthropological theatricality might usefully define itself against. Several ethnographers have already begun to dabble in a version of what might be called "ethnographic theater," which is yet another way to continue ongoing discussions within anthropology about the political and poetic implications of ethnographic representation and cultural critique. It is also a different way to think about questions of observation, embodiment and intersubjectivity.<br /><br />Anna Deavere Smith was an inspiration last night, and not just for scholars interested in harnessing the electrical powers of theatrical space for their own scholastic purposes. <br /><br />Smith juggles her "documentary theater" work with stints on shows like NBC's <span style="font-style:italic;">The West Wing</span> and Showtime's <span style="font-style:italic;">Nurse Jackie</span>. That stuff pays the bills, she says, but documentary theater is really her passion.
It is also a way for her to show that social identities only emerge as fully meaningful and culturally intelligible once we are willing to slip our feet into other people's shoes, to wrap our mouths and minds around other people's words.<br /><br />What is Pat Robertson Really Saying About Haiti? (January 14, 2010)<br /><br />There are many reasonable people (and even some otherwise unreasonable ones) who would maintain that Pat Robertson's take on the recent earthquake in Haiti need not be dignified with a response. I understand that point, and I see where its adherents are coming from. But we are fooling ourselves if we think that Robertson represents an isolated quack. We ignore him at our own peril, especially since there are many people who accept his basic premises without question. So, I do feel like a few words are in order about the significance of his supernatural claims about divine justice.<br /><br />One thing to note is that the political "fringe" is no longer as fringe as it might once have seemed. I got about 10 messages (via Twitter, email, and Facebook) regarding Robertson's comments within a few hours of his making them. I've also seen his thoughts discussed on several cable news programs on several different channels more than just a few times in the last day and a half. His comments have gone viral, which means that, "dignified" or not, they are circulating quite widely already. <br /><br />If you are still one of the few people who haven't heard it, Robertson argues that 18th and early 19th century Haitians were able to throw off the chains of race-based slavery and colonial dependency by (literally!) making a pact with the devil.
As a function of that Faustian bargain, they have been cursed by God, which explains their history of violence and their contemporary degree of poverty.<br /><br />I got the surreal news (via text message) about the Haitian disaster on an Amtrak train from Washington DC to Philadelphia Tuesday evening (after attending the AAA symposium on race that I blogged about on Monday). And it just so happens that I was reading, in an almost eerie kind of irony, a small new book by Susan Buck-Morss during that ride, <span style="font-style:italic;">Hegel, Haiti, and Universal History</span>.<br /><br />The book is an expansion of her <span style="font-style:italic;">Critical Inquiry</span> article (from 2000), where she tried to argue that Hegel got his master-slave dialectic from the Haitian revolution, and that such a seemingly clear and self-evident historical fact has been sorely under-appreciated (in fact, missed just about entirely) by the best and brightest philosophers and historians who have worked on Hegel. She chalks these omissions up to a series of factors, including the narrowcast biases of disciplinization and academic specialization. Buck-Morss argues that the early Hegel was clearly influenced and inspired by the Haitian revolt (championing the psychic need for slaves to forcibly reclaim their full humanity by asserting it in the face of brutal reprisals), even if the later Hegel (of <span style="font-style:italic;">The Philosophy of History</span>) ends up dismissing all of Africa as radically ahistorical, uncivilized and unprepared for full sovereignty. <br /><br />In many ways, Robertson's pseudo-religious reading of the Haitian tragedy is a sensationalized version of the very logics that Buck-Morss critiques.<br /><br />I call it "pseudo-religious" because I think of Robertson's comments as self-serving political claims hiding behind the cloak of religiosity.
Of course, religion is inescapably political, but Robertson's own religious texts don't provide evidence for such wildly specific and offensive claims of satanic collusion. On what evidence, from what sacred book, does Robertson base his theory of Haitian history (or any of his past pronouncements, including the "argument" that 9/11 was divine retribution for America's legalization of abortion)? Is he merely performing a xenophobic reading of Voodoo's spiritual difference from his particular version of Christianity? <br /><br />Instead of seeing 18th and 19th century Haitian freedom fighters as subjects of history, agents capable of throwing off the shackles of foreign oppression (in a manner similar to America's 18th-century revolutionists, a group that I've never heard him call lapdogs of Satan), Robertson removes them from the political and geopolitical playing field altogether, dismissing their post-revolutionary plight as comeuppance for a bad deal with the devil. About that theory, two last things:<br /><br />First, I would recommend that Robertson read Randall Robinson's <span style="font-style:italic;">An Unbroken Agony: Haiti, from Revolution to the Kidnapping of a President</span>, which shows, quite compellingly, that Haiti's current politico-economic predicament is a direct result of how Europe and the United States responded to the country's 1804 assertion of autonomy: by very purposefully isolating and exploiting Haiti (politically and economically) for the next two hundred years. Therein lies much of the answer, Robinson demonstrates, to Haiti's current woes. (The details he provides, mostly uncontested and unhidden facts of history, will be shocking to many readers.) <br /><br />Second, if the Satan-theory is accurate, I would just ask that Robertson finally let Haitians out of their contract <span style="font-style:italic;">with him</span>.
As a function of the kinds of horrible and inhumane ideas he spews, Robertson must be the other contractual party of which he speaks. It would explain how he knows the details of such a secret compact.<br /><br />GUEST COMMENTARY (January 7, 2010)<br /><br /><span style="font-style:italic;">The career pipeline: Not leaking but pouring</span><br />By Katherine Sender<br /><br />At a recent meeting of Penn faculty members from across the University, the provost spoke with concern about “the leaky pipeline,” where large numbers of women and minority faculty drop out of the career track as they move towards senior positions. Then our president followed, announcing that Penn was moving from a position of Excellence to Eminence—in the twenty-first century university even Excellence isn’t good enough anymore. I was struck by the juxtaposition. Was there a relationship between this constant push to greater levels of distinction and the leaky pipeline?<br /><br />What does this leaky pipeline look like at Penn? A Gender Equity Report in 2007 found that women made up 28 percent of all faculty. How this plays out across rank is striking: women made up 42 percent of assistant professors, 30 percent of associate professors, and only 18 percent of full professors. This is not a case of more women coming up through the ranks: the proportion of standing women faculty had increased by only four percent since 1999.<br /><br />The leaky pipeline for racial minorities is just as dramatic. A Minority Equity Report of 2007 found that minorities made up 17 percent of Penn’s faculty. People of color made up 27 percent of assistant professors, 17 percent of associate professors, and only 9 percent of full professors.
We may take heart that the proportion of minority faculty has almost doubled since 1999, but of the current 17 percent of minority faculty, 11 percent are Asian, meaning that the proportions of African American and Latino/a faculty are very small indeed. <br /><br />Reliable career track information on gay, lesbian, bisexual, and transgender faculty is impossible to come by, but my sense is that the tenure and promotion process isn’t especially kind to this group either. Expressly queer faculty—politically irascible, non-heteronormative and even non-homonormative academics—are likely to have an especially hard time.<br /><br />I’m using Penn’s figures as an example, but Penn isn’t especially bad—or good—compared with its peers. I also know that some people are leaving academic careers for good, self-chosen, life-affirming reasons. But it’s worrisome that these departures are differentially distributed across gender, race, and probably sexuality. The pipeline isn’t leaking, it’s pouring.<br /><br />At a recent Gender Studies conference here at Penn the leaky pipeline was addressed as a family issue: the tenure clock is hostile to women who want to have children. Indeed, nationally, women with children are half as likely to get tenure as women without. But this is only part of the problem. If it were only a fertility issue, minority men would be doing just fine. <br /><br />The tenure and promotion process isn’t only inhuman for women who want and have children; it’s inhuman for everyone. Jerry Jacobs, a sociologist here at Penn, found in 2004 that both women and men faculty work more than 50 hours per week irrespective of rank, and about a third of them work more than 60 hours per week. The expectation of increased working hours is only likely to grow.
The MLA found in 2006 that all academic institutions, not only research universities, have greatly increased the number of articles and books they expect tenure-track faculty to publish for their tenure cases, without reducing their teaching hours. <br /><br />While expectations of productivity have increased, so too has the shift to employing more part-time faculty: in the US only a third of faculty are now full-time tenured or tenure-track, down from 55 percent in 1970. This puts increasing pressure on those full-timers to do additional service work, work that more often falls to women and that gets little credit in terms of promotions and merit pay. As we are increasingly asked to account for our productivity, I wonder how much of the intellectual and pastoral labor more often done by female and minority faculty is recognized as productive. <br /> <br />These increased pressures are on everybody, but they are experienced unequally by women and minority faculty because of how resources are differently distributed:<br /><br /><span style="font-style:italic;">Pay: In the US women faculty earn 85 cents to every male dollar, and this rate goes down at the higher ranks. [Couldn’t find comparable figs for minority faculty.]<br /><br />Time: Women faculty are much more likely to be partnered with another full-time worker and are more likely to be partnered with another academic—i.e. someone also working long hours. In heterosexual couples, women are much more likely to carry more responsibilities for childcare and domestic duties.<br /><br />Emotional resources: Women and minority faculty are less likely to feel confident about their performance. Educational research suggests that girls consistently rank their sense of their own abilities much lower than do boys, even though they perform better in assessments.
Students of color constantly have to work against teachers’ expectations of low achievement.<br /><br />Recognition: Who has a voice in the university and what are they allowed to say? Mark Anthony Neal has mentioned the chastisement of faculty who dare to “think while Black.” Tenure and promotion discourage speaking while Black, female, and gay.</span><br /> <br />The demands on all academics escalate, but different groups have varying access to resources that make those demands bearable. This is not only an issue of pressures on junior faculty to produce for their tenure file. Even those at the top of the ladder continue to work extraordinarily hard.<br /><br />Senior faculty and administrators need to recognize that few of their group would have met the standards currently set for tenure and promotion. They need to publicly scale back on expectations of quantity and focus more on quality. This is not only for the wellbeing of their junior colleagues; it is also likely to foster more careful, intellectually rigorous research. They also need to think imaginatively about kinds of productivity other than written scholarship in a changing multimedia world where monograph contracts are harder to score.<br /><br />But we also need to consider our own complicity. In my research I read a lot of scholarly concern about how reality television shows cultivate the ideal self-governing neoliberal citizen—someone who is adaptable, mobile, always a bit anxious, self-monitoring, and willing to work harder not only to get ahead but to stay in place. While we communication scholars worry about the effects of reality TV on its audiences, we need to look for the beam in our own eye: academics are the most obligingly self-governing citizens of all.
We can work whenever we want as long as we work all the time.<br /><br />Like many universities, corporations, and governments, Penn has adopted a strategy of “Sustainability.” I agree that huge communities like universities have a responsibility to address environmental issues. But sustainability can’t only be a matter for nations and institutions; we also have to think about sustainability at a human level. The demand for constant growth means that we extract more and more energy from a limited resource. How do the developing nations in the university world—women, men of color, and part-timers—unequally bear the brunt of overtaxed resources? And looking forward, what kind of labor legacy are we leaving for the generation of scholars we are nurturing into the profession? <br /><br />Don’t get me wrong, I love my job. But I don’t want to do only my job. We need to model livable lives for our students. We need to do more than just work, and not only if we want a family. We need to consider the law of diminishing returns and the possibility that creativity comes from working less. We need to make space for political and community engagements that feed our intellectual work in other ways. We need to think about why universities matter not only for the world but for the people working within them. <br /><br /><span style="font-style:italic;">Katherine Sender is the associate dean for graduate studies and an associate professor at the Annenberg School for Communication, University of Pennsylvania. She is the author of <span style="font-style:italic;"><span style="font-weight:bold;">Business, not Politics: The Making of the Gay Market</span></span> and the forthcoming <span style="font-style:italic;"><span style="font-weight:bold;">Makeover Television and its Audiences</span>.</span> </span>