Jay Ruby cautions anthropologists against deploying film and video equipment on terms completely determined by an institutionalized media industry with its own assumptions about how stories are supposed to be told and circulated. He argues that anthropologists might need to organize their narratives (and distribute their films) in ways that run counter to industry (and even audience) expectations. There is a danger, he says, in approaching filmmaking the way others do, a danger that includes potentially betraying anthropology's intellectual mission.
Philosopher Lewis Gordon has recently penned a powerful piece that asks academics to reconsider current tendencies to perform intellectual authority in ways that traffic in neoliberal logics of financial accumulation and brand-name fetishization, logics that may similarly betray our basic intellectual mission. There is a danger, he argues, in performing scholastic subjectivity on terms that seem foreign (even antithetical) to academia's traditional considerations and methods of appraisal.
Gordon's thoughtful and provocative piece, "The Market Colonization of Intellectuals," reads something like a manifesto, and it made me think about my own too-easy acceptance of academia's hypermarketization. He argues that academics can't serve two masters, can't occupy two separate spheres at the selfsame time: the life of the mind and the mandates of the marketplace. Moreover, he claims that we are increasingly getting used to just such a bifurcated and contradictory existence. Gordon describes a "managerial academic class" of professional administrators charged with aligning academia's values and self-assessments with the organizing principles and measuring modalities of the market. "Market potentiality," he says, "governs everything [that many academics] produce." Gordon designates this "the market colonization of knowledge."
Gordon also questions the branding of analytical concepts such that they are flattened out for public consumption and magically fused with their intellectual creators: deconstruction and Derrida being one of his prime examples. This isn't a critique of Derrida or a dismissal of deconstruction's epistemological purchase. It is a plea for, amongst other things, an academic model of productivity that doesn't reproduce and reinforce the ubiquitous cult of celebrity, one of the most powerful points of entry into a mass mediated public sphere and the overflowing bank accounts of its most recognizable occupants.
The piece even takes on academia's impoverished commitment to (and operationalization of) what it means to be "smart." "In the academy," Gordon writes, "nothing is more marketable than the reputation of being smart. This makes sense: No one wants dumb intellectuals. The problem, of course, is how 'smart' is defined. In a market-oriented society, that means knowing how to play the game of making oneself marketable. The problem here is evident if we make a comparison with ethics. I once asked an environmental activist, who argued that a more ethical ecological position is the key against looming disaster, which would bother her more: to be considered unethical or stupid? She admitted the latter."
The piece asks what kind of academic world we've created if a certain apotheosis of smartness becomes our highest (maybe our only) moral value. Gordon demands of academics a more rigorous reflexivity, a critical self-consciousness that challenges what has become orthodoxy in contemporary academic life.
Gordon and Ruby both demand such a critical self-reflexivity from their colleagues. Gordon argues that anything less than that compromises our scholarly significance. Ruby claims that ethnographic films, as one instantiation of intellectual projects, might need to look very different from other motion pictures.
I'm teaching a graduate film course this semester that attempts to take up some of Ruby's challenge, asking students to de-familiarize mechanically reproduced audiovisual products just enough for them to start seeing such offerings in slightly newfangled ways. We are reading critical histories of early cinema (for example, Peter Decherney's analysis of early Hollywood's ties to academia; Jacqueline Stewart's evocative theorization of the links between popular cinema and the lives of African Americans during the Great Migration; Hannah Landecker on the central role of early medical films in any discussion of the creation/popularization of cinema) along with ethnographies of media/mediation (from folks like Roxanne Varzi, Alan Klima, Michael Taussig, Diane Nelson, and John Caldwell), and differently pitched philosophical treatments of film/video/digital products/processes (by Roland Barthes, Walter Benjamin, Kara Keeling, Susan Buck-Morss, Kirsten Ostherr, D.N. Rodowick, and others). We are also watching films/videos that challenge traditional ways of seeing (including Bill Morrison's Decasia, Cheryl Dunye's The Watermelon Woman, William Greaves's Symbiopsychotaxiplasm, and Charlie Kaufman's Synecdoche, New York).
The course hopes to trouble some of the taken-for-granted presuppositions that we all have about ways of approaching the ubiquity of televisual, filmic, and digital representations. If the course works, students may not be quite as prone to unproductively normalized assumptions about how we interface with such technology.
The film/video market and its logics can also colonize and cannibalize the minds and methods of anthropological filmmakers/film critics who can find themselves seduced in ways that mirror some of the criticisms delineated by Gordon's challenging essay. You don't have to agree with every facet of Gordon's piece to imagine it as a wonderfully productive starting point for a spirited conversation about what academia ought to be.
Monday, April 12, 2010
Saturday, April 10, 2010
Michael Steele's Race Card?
The RNC's Michael Steele has recently made national headlines for "playing the race card" by agreeing with the claim that African-Americans like himself, in positions of power, have "a slimmer margin of error" in America. Steele included President Obama in that calculation, a claim that drew a swift dismissal from the White House press secretary.
Critics always find it ironic (even pathetic) when proponents of purported color blindness frame their own problems in terms of "racial victimization." The "Left" is assumed to traffic in such sophistries. The "Right," however, is supposed to know better. Clarence Thomas calling his confirmation hearing a "high-tech lynching" stands as the quintessential example of such racial irony. Even the people who claim obliviousness to racial reasoning seem susceptible to its rhetorical seductiveness.
But who really doesn't see race? When is it ever invisible? Immaterial? Irrelevant?
I just talked to a small group in Philadelphia about my most recent book, Racial Paranoia, and one of the listeners, an elderly white man, responded with a plea for the insignificance of race and racism as rubrics for understanding everyday life, especially his everyday life. He claimed that race had no impact on his daily activities. He wasn't a racist, he said. And he simply didn't see race. He had spent that very day teaching students, judging a science fair, and debating a group of university scholars. Race and racism, he assured me, had nothing to do with any of these experiences. And he made his case without anger, in clear and confident tones.
I responded by basically telling him that he was wrong, which wasn't the best route to take. I admitted to him that I am always a tad suspicious of people who claim not to see race at all. Indeed, I think that the very aspiration of postracialism (in most of its guises) is misplaced and romantic, repression passing itself off as transcendence. He listened to my response and then restated his point, very matter-of-factly. Several audience members tried to push back against his claim, arguing that even when race isn't explicitly thematized in, say, a classroom setting (one of the locations that the man had invoked), it might still be a valuable analytical lens, a real social fact. It might still be there, even if we don't see it. Not because it is biologically real, but because culture is most powerful when we can't clearly see it.
According to some theories on the matter, the only real racists left in America are the people unwilling to stop obsessing about race and racism, the folks who seem to see race around every corner. If they would just let race go, racism would wither and die away. The invocations of race and racism are incantations that keep bringing this beast back to life.
But Steele is just the most recent example of how easily self-serving calls for color blindness can morph into equally self-serving color cognizance. And it might not be useful to imagine that we only have two options: fetishizing race or ignoring it altogether.
Thursday, April 1, 2010
It's Not Just HBO. It's TV.
Did Congress ever pass health-care reform? Seriously. Lately, I've been trying to cultivate my own ignorance of all things "political." The news stories are just getting too bizarre: ongoing sagas in the wake of major earthquakes in Haiti and Chile; racial epithets serving as soundtracks for Tea Parties; sex scandals that allegedly implicate, quite directly, a sitting Pope; Sarah Palin telling protesters to "reload" in the context of actual violence linked to congressional votes and tweets calling for Obama's assassination. With that as the backdrop, I've decided to issue my own self-imposed moratorium on watching CNN, FOX, and the evening news programs.
Instead, I'm using my television for more otherworldly fare. And TV has never been better. Although it is the quintessential site for sensationalized news-mongering, it is also the best place to spy complicated fictional tales about human life.
When (and why) did TV become so much better than motion picture film? I feel like that undeniable fact just kind of snuck up on the nation's couch potatoes. One minute we were awash in nothing but schlock melodramas and uninspired derivatives of Friends; the next, The Wire, Chappelle's Show, Mad Men, and The Sopranos had drastically raised our televisual expectations.
In Production Culture: Industrial Reflexivity and Critical Practice in Film and Television, John Thornton Caldwell argues that a show like 24 radically altered the way television shows get made and further nuanced/complicated the narratives they deployed, a claim that anticipated part of the argument made in Steven Johnson's Everything Bad Is Good for You.
Last year, my colleague Elihu Katz used The ANNALS of the American Academy of Political and Social Science to wonder aloud about TV's potential demise. His query: are we currently witnessing "The End of Television?" Katz's point is hardly reducible to the "repertoire of output (call it content)" that one can watch today. That was just one element in a much more nuanced discussion he facilitated about the place of the "old" medium in a changing (new) media landscape. But if we were to go by content alone, we'd probably have to say that TV is far from dead. It has probably never been more alive.
Indeed, most people consider 2009 one of Hollywood's better years with respect to the quality of movies produced, from big-budget fare (Avatar) to more independent, low-budget films (District 9 and The Hurt Locker). But I'd argue that the best of TV in 2009 was still better, by leaps and bounds, than Hollywood's most celebrated offerings.
Of course, TV is a mixed bag, but at its best it can outdo even Hollywood's most impressive offerings. And I say this as a filmmaker and an enthusiastic film watcher.
For one thing, the complexities of character development that one can witness over a TV show's entire season dwarf even the best two-hour attempts at cramming specificity into a protagonist's portrayal.
TV (even network TV) also allows for more risk-taking than Hollywood filmmaking currently affords. Precious is a "controversial" and "daring" little film by Hollywood standards, but it would be just another HBO gem, and an even more impressive adaptation, if we had gotten the chance to see Lee Daniels unfurl (over several weeks and months) the other nuances of the book that were bracketed out of the powerful film. (The irony, of course, is that TV adaptations of motion pictures are usually uninspired and short-lived, sometimes even unwatchable. But that's because the TV-makers with the most nerve and talent are more interested in bringing their own projects to the air.)
In fact, you know how people say that movies are never as good as the books on which they are based? I'd go so far as to claim that TV series (at least the very good ones) have the potential to seriously rival novels in terms of nuance and artistic virtuosity, even to upstage them.
It is probably reasonable to say that TV is no longer simply Hollywood's mistreated stepchild. More and more Hollywood actors, directors, and producers are using TV as a venue for their wares. That only makes a good situation better. One potential downside, I think, is what I'll call the Mad Men effect: a too-short commitment to the slow burn that weekly serials provide (maybe, in part, because it is hard to serve two masters, film and TV, at the same time).
My concerns about its treatment of race notwithstanding, AMC's Mad Men rewards "close reading." It is a well-crafted show. But it also airs only about thirteen episodes a "season." That isn't a season. That's a fairly long movie broken up into a few pieces.
Even the shows with more episodes a year tend to broadcast them in ways that destroy the continuity of their narratives and frustrate fans: two-, three-, even four-week breaks (sometimes more) between new installments. Now FOX's 24, which has just announced that it will not have a ninth season next year, is TV's gold standard: a weekly unfolding of 24 episodes. That's a season! In fact, it spans two.
HBO's How To Make It In America feels like it just started yesterday, and this coming weekend is already its season finale? Did I hear that right? If so, give me a break! The producers might as well have just made a movie. (Of course, networks sometimes order fewer episodes rather than more because they don't want to over-commit to a bust. But HTMA just started.) I say, bring back True Blood already, and make it last. If not, I might be forced to watch more of contemporary TV at its worst: those dreaded "news" shows. That's one thing that theatrical film clearly has over TV: it got rid of newsreels long ago.