Monday, October 30, 2006

Feminism vs. femininity (by Laura Miller, Salon.com)


Feminism vs. femininity

In the impressive follow-up to her anti-monogamy polemic, Laura Kipnis explains why we feel a little uneasy when the possessor of a brand-new boob job proclaims, "I did it for myself."

By Laura Miller

Oct. 18, 2006 | Feminist punditry has long had a style problem. From the first, it's had a hard time separating how things ought to be from how they really are, which has undermined not only its credibility but its confidence. We all know that "no" does not always mean no, and to have to keep insisting it does over and over erodes even the speaker's faith in herself; stridency is usually a way of sounding more convinced than you actually feel, and it doesn't fool anyone. Then there's the matter of dancing through the eggshell-littered territory of contemporary feminist thinking, knowing that legions of your putative sisters are poised to thrash you for the slightest variation from their (sometimes mutually contradicting) positions. If you anger them, chances are your own life will be dragged out for intensive and merciless scrutiny. If you don't, most likely your caution has made you fatally dull.



On the other hand, for feminism's critics, every day is a field day. Whether it's a nondenominational bomb-thrower like Camille Paglia, a right-wing mouth-frother like Rush Limbaugh or a bargain-bin attack dog like Christina Hoff Sommers, it's hard not to sound like a fearless iconoclast when you're up against such mincing, mealy-mouthed good girls. Whether the good girls have a point or not becomes immaterial. Propriety, which is what too much of feminism has become, is the natural target of humor, too, and if you're funny enough often enough at feminism's expense, you can even get away with never making a coherent argument: case in point being the career of Caitlin Flanagan.



Laura Kipnis, a professor at Northwestern University best known for her provocative defense of adultery, "Against Love: A Polemic," does an impressive job of finessing this impasse in her new book, "The Female Thing: Dirt, Sex, Envy, Vulnerability." Despite the subtitle of her first book, "The Female Thing" is not the work of a polemicist -- nor does it put forth any especially innovative thoughts. Kipnis is like the intelligent woman's version of whatever Carrie Bradshaw was supposed to be on "Sex and the City." You've encountered most of the ideas in "The Female Thing" before, but Kipnis has a way of distilling them down to a well-turned sentence or two that's very pleasing. Hers isn't a gift to be taken lightly, since in the process she makes it clear how untenable many of those ideas are.



Kipnis' knack for epigrammatic sentences fills "The Female Thing" with what amount to some very high-nutrition one-liners. An example: "When it comes to murder, you're actually more than twice as likely to kill yourself as to be killed by someone else, giving weight to the old truism that you're your own worst enemy." She can be acutely funny, though (ironically) the less so the harder she tries to hit that Carrie Bradshaw sweet spot. If you read the four linked essays in this book in order -- and who are we trying to kid; you're going to read the one titled "Sex" first, like everyone else -- you'll have to cruise past a few wince-inducing references to Manolo Blahniks and terms like "the gal set," but don't let that deter you. There's plenty of steak in there, underneath the less convincing attempts at sizzle.



As Kipnis sees it, the situation that educated, middle-class Western women find themselves in is fundamentally absurd. To say so -- rather than pretending the solutions are obvious -- takes nerve. To say so with both humor and sang-froid -- unlike the legions of ethically tormented personal essayists or the pratfalling clones of Bridget Jones -- takes panache.



The absurdity comes from the disparity between our rapidly changing social landscape (including the advances of feminism) and the recalcitrant internal map Kipnis calls "the female psyche." Feminism, she writes, has collided with "an unanticipated opponent: the inner woman." The four essays in "The Female Thing" center on some of the most stubborn aspects of the inner woman, the impulses and irrational passions that suddenly rise up and swamp us despite our best efforts to stick to the designated feminist path. In fact, this rising up and swamping has happened so much in the past 30 years, and women have tried so diligently to redirect the path around the various trouble spots where it does, that by now the path itself is hopelessly muddled. It's like getting lost in the woods and following one promising little trail after another only to see it peter out in an impenetrable thicket.



Kipnis takes a modified Freudian view of this dilemma, which makes her exquisitely attuned to paradoxes. The strongest essay in the book -- "Vulnerability," which is about both sexual abuse and the fear of it -- contains two gemlike analyses of recent confessional writings by Naomi Wolf and the late Andrea Dworkin. Wolf recently favored the readers of New York magazine with a histrionic account of how, 20 years earlier, when she was one of his students at Yale, Harold Bloom put his hand on her thigh after a drunken dinner party. She presented this event -- and the refusal of Yale to address the matter when she finally decided to do something about it years later -- as a deep psychic wound.



"All this is a shade self-dramatizing," Kipnis writes, "but can we say that it's self-dramatizing in a particularly feminine way? The idioms employed have that feminist ring, but it's a genre of feminism dedicated to revivifying an utterly traditional femininity: wounded bird femininity, to borrow Joan Didion's useful formulation." Wolf's drama only makes sense (to the extent that it does make sense) when you understand that she regarded Bloom as so exalted an authority figure that she became "sick with excitement" at the prospect of meeting him, and that she expected nothing less than perfect satisfaction from Yale two decades after she failed to register a complaint. Kipnis' verdict: "this massive overinvestment in paternal figures and institutions has such an Oedipal flavor. The contradiction of Wolf-style devoted daughter feminism is its thralldom to the phallic mythos it's also so deeply offended by." That's very nicely put indeed, so well formulated that if it's not a new interpretation of this minor scandal, it might as well be. In wrestling with Dworkin's writings equating heterosexual intercourse with subjugation -- a more challenging task -- Kipnis is equally astute. "Dworkin didn't read the culture wrong: it's entirely true that all the idioms for penetration -- 'getting fucked,' 'screwed over' -- are about humiliation and exploitation. Which does make it hard to see how anyone can avoid a certain duality about the experience, even when it's pleasurable, as it often is! Dworkin is the great case study in the ambivalence of femininity: after all, she's hardly indifferent about penetration." As Kipnis notes earlier, Dworkin's key work, "Intercourse," hinged around her "wonderfully inflamed" indictment of the practice. "But," Kipnis goes on, "can there really be this much aversion without some corner of desire? The opposite of desire isn't aversion, it's indifference."



As you can see, Kipnis is a great parser of ambivalence -- and she views ambivalence as the defining condition of modern womanhood. In her essay on "Dirt" -- or, rather, about housework -- she reads a passage from Allison Pearson's novel "I Don't Know How She Does It," in which the heroine, a hedge fund executive, resentfully cleans the family kitchen at 2 a.m. after returning from a business trip. Kipnis wonders why so many women obsessively pursue a standard of cleanliness that no one else in the household considers essential. (Despite what such women will tell you, she notes, definitions of what's clean and what's not are neither universal nor unchanging.) "How is it," she writes, "that women have managed to overthrow the shackles of chastity -- to cite another rather significant vestige of traditional femininity -- more easily than bondage to the vacuum cleaner?"



She suspects that at the root of this preoccupation lies the buried, primitive association of women's bodies -- and especially menstruation -- with dirt. Kipnis blames this on "the human symbolic imagination, that archaic thing, which isn't fully in sync with external realities like social progress. Maybe some day it will catch up." It probably won't if most of us remain largely unaware of its subterranean influence. "If women didn't have vaginas," Kipnis goes on to speculate, "would we take fewer bubble baths, be less susceptible to the newest cleaning product marketing campaign, let up on the cleaning standards (for those prone to occupying the household enforcer role), and simply not do more than 50 percent of the housework?" Since the vaginas are non-negotiable, the implication is it's time for an overhaul of the symbolic imagination.



In the essay on "Sex," Kipnis mostly focuses on the "erotically mismatched world we've inherited" -- at least for the heterosexual heirs. The lamentable truth is that "the procreative act" -- that is, heterosexual intercourse -- seldom results in orgasm for the female partner, only 20 to 24 percent of the time according to surveys. Kipnis cites the "feminist evolutionary biologist" Elisabeth Lloyd, who has discovered the even worse news that studies of sexual response don't distinguish between women who reach orgasm by intercourse alone and those who need additional stimulation of the clitoris as a "final push." When you subtract those women who (sorry) need a hand, "orgasm-attainment figures are so stunningly low that they seem to imply that reaching orgasm during intercourse isn't normal for the female of the species."



Kipnis compares this situation, hilariously, to "owning one of those hybrid cars that still have a few kinks to work out as your sole source of transport: the engine shuts down unexpectedly, though even when the engine's revved, it can't always be relied upon to get you where you want to go." Combined with the sexual inhibitions most cultures instill in their female members, this leads to a whopping "orgasm gap."



Even the supposedly gone-wild younger generation falls prey to this inequity. Kipnis writes that young women describe themselves as "participating enthusiastically in hookup culture -- one-night stands and booty calls," then complain that "the men involved 'don't care if you're getting off or not.' Yet these girls keep hooking up with them! Without even getting dinner for it! Welcome to the new femininity -- at least under the old femininity, you got taken to dinner." In response to reports from sex researcher Shere Hite, who has interviewed women claiming to enjoy "'emotional orgasm ... an intense emotional peak' followed by feelings of closeness," Kipnis quips, "There's a name for someone who would call that an orgasm: female."



Kipnis sees the current mommy wars as an echo of the old "vaginal-orgasm-versus-clitoral-orgasm dichotomy," in which women who could only climax with clitoral stimulation were told they were insufficiently adapted to their true, natural role as women. "To begin with," she writes, "we have the same cast of characters: the womanly other-directed type versus the masculine-identified striving autonomous type. And in both cases, a socially organized choice masquerades as a natural one, manufacturing a big dilemma where one doesn't really have to exist."



For although Kipnis is willing to admit that some parts of the female psyche have proven ferociously resistant to change, she doesn't think that the situation is intractable. For all her puncturing of feminism's sanguine notions about the malleability of human nature, she doesn't believe that the deep layers of the "symbolic imagination" are hard-wired. Sociobiologists and evolutionary psychologists may be "the go-to guys of the moment when it comes to thorny questions about human nature and gender roles," but they've yet to come up with a convincing justification for the perverse configuration of the female orgasm, for instance. "This is the crowd," she writes, "who likes to tell us how men and women got to be who they are (and will remain for all eternity) by supplying colorful stories about the mating habits of our hominid ancestors and selected members of the animal kingdom," making the usual comparison to Rudyard Kipling's "Just So Stories" -- fables about how the leopard got his spots, and so on.



Kipnis' is an exceptionally sensible voice at a time when people seem to believe that any long-standing cultural norm that can't be completely overhauled in a single generation must therefore be indelibly carved on the stone tablets handed down to Charles Darwin at the foundation of the modern world. And for all her low-key Freudianism, she knows when it's time to follow the money instead of the unconscious. During all the foofaraw about the "opt-out revolution" -- those young Ivy League women who are now abandoning the career track to be stay-at-home moms -- haven't you been wishing someone would say exactly this: "Somehow, as highly educated as these girls are, they don't seem to have heard about the 50 percent divorce rate! Somehow, they imagine that their husbands' incomes -- and loyalties -- come with lifetime guarantees, thus no contingency plans for self-sufficiency will prove necessary ... Somewhere Betty Friedan must be cackling..."



In the first essay, "Envy" -- which is not about catfights, but rather about all the things that men have and women want -- Kipnis asks us to consider the slowly closing gender gap when it comes to pay equity. If you look carefully, she points out, you'll see that "women's wages are up to 80 percent of men's because male wages are down, which evens things out. It looks as though the dirty little secret of the last 30 years is that the job market played women off against men to depress pay." While the sexes rage at each other about dating ethics and dirty socks, somebody (probably that little Monopoly guy with the top hat and cigar) has been laughing all the way to the bank.



Perhaps the most daring statement in "The Female Thing" comes in this first essay. Kipnis observes that even so acclaimed a feminist spokesperson as Eve Ensler, creator of "The Vagina Monologues," can turn around and do an entire stage show about how much she hates her belly. "Ensler works herself into intellectual knots trying to come to terms with these painful body insecurities," Kipnis writes, "but there's a simple explanation for the dilemma she can't quite decipher, which is that feminism and femininity just aren't reconcilable." Think about that one for a moment and consider how much an entire school of tortured female rumination hangs on the avoidance of this insight. "Though if internal gymnastics burned calories," Kipnis adds, "we could all have flatter stomachs, with far fewer hours at the fucking gym."



Femininity -- which Kipnis defines as "tactical: a way of securing resources and positioning women as advantageously as possible on an uneven playing field, given the historical inequalities and anatomical disparities that make up the wonderful female condition" -- seeks to ameliorate all these disadvantages by "doing what it took to form strategic alliances with men." But that means that femininity "hinges on sustaining an underlying sense of female inadequacy," which puts it in opposition to the goals of feminism. No wonder we feel a little uneasy when the possessor of a brand new boob job proclaims, "I did it for myself." I believe this is what Marx called false consciousness.



Scolding other women for failing to embody (literally) an appropriately feminist outlook has never really worked, and Kipnis doesn't seem the type to interrupt yet another rousing chorus of "I Enjoy Being a Girl," even if she felt like it. (I don't think she does.) Instead, she's suggesting that we stop lying to ourselves by pretending we can run with the rabbits and hunt with the hounds. No girl should ever be surprised upon finding herself in that archetypal Carrie Bradshaw position of realizing that with all the cash she spent on ruinously expensive and joint-grinding high heels she could instead have bought a roof to put over her head. (That's the revelation that comes right before you learn you need knee surgery.) Don't say nobody ever warned you.

Friday, October 27, 2006

Love Me, I'm a Journalist (by Jack Shafer, Slate)


press box: Media criticism.
Love Me, I'm a Journalist

A profession's romance with itself.
By Jack Shafer
Posted Wednesday, Oct. 25, 2006, at 10:42 PM ET

On Monday, I singled out Tim Rutten of the Los Angeles Times and the Washington Post's Howard Kurtz as I admonished journalists for overreacting to staff cuts at newsrooms around the country.

In his columns, Rutten warns that the threatened cuts at the Los Angeles Times will injure democracy and the "stakeholders" (as opposed to stockholders) who rely on the Times' broad coverage. Kurtz declares that news organizations' "corporate slashing" will "mean fewer bodies to pore over records at City Hall, the statehouse or federal agencies"—even though he gives no examples of a newspaper slashing its hard-news staff.

In an e-mail to me, Kurtz expands his point, writing that "some papers are overstaffed, not all belt-tightening is bad and newspapers need to adapt to the digital age. So I don't think I was playing 'hands off the newsroom or we'll shoot this investigative reporter'—just noting that some of these incredibly shrinking papers are likely to find investigative work an unaffordable luxury."

Rutten writes, too. He concurs that a high head count doesn't necessarily guarantee journalistic excellence, "but I'm just as sure that there is a number below which excellence becomes impossible. Is that number in the Times' case 939 or 800? I don't know, and neither does anybody else. I do know, though, that at some point, you have too few good and experienced people to do the job and I, for one, don't want to flirt with that edge."

However appalling newsroom downsizing may be for journalists, it will ultimately reveal what the people who run and own newspapers really think their publications are for. Scratch a serious reporter, and he'll offer volumes about the "public service" his newspaper performs in the form of investigations: It watchdogs government. It keeps corporations honest. It uncovers the dastardly deeds of foreign dictators and prevents genocide. It exposes quacks and charlatans. (It turns the common man into a Socrates if he reads the editorials!)

Newspaper people have enormous egos, if you get my drift, and don't mind massaging the big hairy things in public. Yet the press is hardly the sentry and bulwark of society that reporters imagine it to be. I don't mean to disparage reporters who put their lives on the line to file from Iraq, nor the sleuths who sift through databases to uncover wrongdoing by pharmaceutical companies, or any other enterprising reporter. But too many journalists who wave the investigative banner merely act as the conduit for other people's probing, as George Washington University professor and former investigative journalist Mark Feldstein suggests in a paper-in-progress titled "Ventriloquist or Dummy?"

Feldstein cites a 1992 piece by the late Christopher Georges in the Washington Monthly to illustrate his thesis. Georges reviewed about 800 articles by investigative reporters from the Washington Post, the Los Angeles Times, and the New York Times published over three years and found that "nearly 85 percent of them have been follow-ups or advances of leaked or published government reports." Georges' study is anecdotal since his piece did not name the stories analyzed or describe his methodology, but my hunch is that his conclusions aren't far from the truth.

As Feldstein writes, there's nothing inherently wrong with investigative journalists throwing the spotlight on government reports as part of their mission, as long as the information is accurate and the journalists aren't being spun. But it detracts from the journalists' self-image as tireless messengers of truth turning millions of pages at the courthouse or the SEC.

"Hitching ourselves to government investigators' bandwagons does more than make us lazy; it leaves us—and the rest of America—thinking falsely that we are looking where the government isn't," Georges wrote. Feldstein adds this codicil: "[I]f investigative reporters can really be turned into something akin to ventriloquist dummies, how independent can other journalists really be?"

In my Monday piece, I noted that many of the so-called investigative scoops that originate inside government come from such nonprofit outfits as the Center for Public Integrity. As former New York Times-man Bill Kovach told Georges, "Most of what we call investigative journalism these days … is really reporting on investigations."

Before all you investigative reporters who survived on maggots and pomegranates for a year to uncover human rights atrocities in Afghanistan start sharpening your knives for my scalp, please relax. My admiration for original investigative reporting knows no bounds. But the defenders of journalistic excellence will have to make a better case for the connection between big staffs and great journalism before I don my helmet and rush to man the Los Angeles Times barricades.

******

Disclosure: Feldstein is a friend—not as good a friend as David Corn, but a lot better than Michael Isikoff. Also, if this column were 1 percent more like my Monday column, I'd sue me for copyright infringement. Send your lawsuit to slate.pressbox@gmail.com. (E-mail may be quoted by name unless the writer stipulates otherwise. Permanent disclosure: Slate is owned by the Washington Post Co.)

Wednesday, October 25, 2006

We will soon be lost for words (by John Humphrys, the Telegraph)


Note: 'Our language continues to be taken over by pseudo-management speak that is itself in danger of becoming meaningless'

We will soon be lost for words

Last Updated: 12:01am BST 24/10/2006

In the final exclusive extract from his new book on language, John Humphrys laments the death of formality and the dumbing down of classic texts

If language is a mirror for the society in which we live, no image could be reflected back more sharply than the dominance of consumerism in our culture. We have become a nation of consumers. We no longer watch television news, we "consume" it. The country itself is routinely called "UK plc", as though that's all we are, and a British education minister has referred to our universities as "UK Knowledge plc", which needs to keep up its "market share".


I know that universities need to raise money wherever they can, but using language like this has consequences. It's not surprising if students come to see themselves more as customers than as members of their universities. In one sense they are: they have to pay and they want value for money. Why not? But it seems that increasing numbers of them interpret that in the ordinary sense of customers' rights.

Customers are frequently disappointed. When that happens in the world of commerce they complain. And that's exactly what they are doing now in academia. There were five times as many complaints from students in 2005 as there had been in 2004, and many of them, it seems, were expressed in language you might use to complain about a rip-off merchant.

Baroness Deech, the first independent adjudicator for Britain's university sector, is not impressed: "In the course of looking at some complaints, we have seen e-mails from students to tutors which astonish me."

Alongside this commercialisation of our language, there has been an erosion of formality. Formality matters. It creates a space between us that allows for a measure of independence and freedom. Take it away and that space is open to all manner of intruders, not all of them commercial.

When, for example, did you last hear a public figure "send their condolences" to someone who'd been bereaved? Not recently, I suspect. Nowadays, if there has been a disaster of some sort, it tends to be: "Our thoughts go out to the loved ones…" Or even: "All our thoughts are with the families of those…"

It may be well meant, but it has the smack of insincerity, for the obvious reason that it's not true. "All" our thoughts do not "go out" to anyone. Of course all of us will feel a degree of sympathy, but trotting out the formula can actually be insensitive to the bereaved. It is the equivalent of that ghastly and much parodied "I feel your pain".

The new enforced intimacy is everywhere. The Queen – widely admired for keeping her distance and exercising iron control over her emotions – is now expected to show she cares. It seems a bit odd. Does anyone really believe she somehow became a different person when she was put under pressure to let us know publicly that she was moved by the death of the Princess of Wales?

Formality is disappearing, too, in how we address each other. The first time I met Tony Blair after the election in 1997 I asked him off-air what I should call him. "Tony, of course," he said. I suppose I knew that's what he would say – we'd known each other for a long time and were obviously on first-name terms – but there's something different about being prime minister. It is, after all, the highest elected rank in the land.

I tried to imagine using Margaret Thatcher's first name when she was at Number 10. I preferred to live.

It's clear that a lot of the public value old-fashioned formality in the way we talk to each other. If I had a pound for every listener who gets het up when politicians use the interviewer's first name I'd be almost as rich as Jonathan Ross. People hate it, so why do politicians do it?

Nor does it gain politicians any advantage when they pepper an interview with "John" or "Jim". If they expect us to react like puppies having our tummies tickled… well, you'd have thought they might have learned by now that it doesn't work like that. We should keep our distance. Formality is one way of doing so.

There can be no more grotesque illustration of the demise of formality on television than the rise of the monstrous confidence trick that goes by the name of "reality television". I do not deny that some of it is hugely entertaining. Indeed, one or two programmes, including Channel 4's Operatunity, have been superb.

But most reality television is a lie. It tries to create the illusion that we are watching people behaving naturally in horribly contrived circumstances. I had my own brush with it when I was invited to appear on a new programme for BBC2. The idea was that four "famous" people (how casually we throw around that word) would spend a fortnight at the Chelsea Art College being taught how to draw and paint.

The working title should have alerted me immediately: Celebrity Art School. But I loved the idea. Like half of the population, I can barely draw a bath and I've always wondered if that's because I was never taught properly. I eventually said yes. But I realised from the first hour of the first day what an idiot I'd been.

Although I was fully prepared to persevere, my perseverance was never called for because technique was never called for. The first time I mentioned the word (in about the first hour, as I recall) I was met with an amused tolerance. Poor chap, you could see them thinking, he really is very naive. Sorry, I said, so what is it about?

Whether or not we still have a firm grasp on the meaning of the word art was a question raised recently by the sculptor David Hensel. He made a piece, called One Day Closer to Paradise, of a human head frozen in laughter and balancing precariously on a slate plinth. He submitted it to the Royal Academy for its 2006 exhibition, but somehow the head and the plinth were separated in transit.

Nonetheless, the academy accepted his submission and displayed it. The strange thing was, though, that they thought the plinth was the work of art, not the head, which was nowhere to be seen.

As he put it ruefully: "I've seen the funny side but I've also seen the philosophical side."

At least some good came out of my art school experience. The other "students", including Radio One DJ Nihal, turned out to be great company. At first Nihal and I were slightly wary of each other and then I told him I wondered if an ageing Radio 4 presenter could learn "street". He humoured me and gave me a lesson.

I flatter myself that I have a reasonably good ear for language. I reckoned I could get away with a bit of "Hey, man… how ya doin?" But, no, it doesn't work like that. Street language is inventive and rich. Even a greeting in street is a complex business. "There's a million ways of not saying anything," says Nihal. "Two people could walk up and say: 'What's happenin? Cool, man. What's goin' on with you? Good? All good? Things are running? Peace. Safe'."

Peace means "I'm outa here" (it's a long story) and safe means "We're safe with each other"; there's no animosity. By contrast Nihal told me that if you want to insult someone in street you might call him "chief". No one seems quite sure why. Of course there is a well-known dark side to contemporary street rap. But the point of this intriguing language, according to Nihal, is "to separate me from you". He told me: "It's like Latin in the church. Knowledge is power." In fact, the moment older people do know is the moment the language dies. "Bling is a classic example," says Nihal. "As soon as you hear commissioning editors at Channel 4 using it it's dead."

Meanwhile, our language continues to be taken over by pseudo-management speak that is itself in danger of becoming meaningless. Take the world of charity, previously known as the voluntary sector. It is now, gradually, changing its name to the Third Sector. Older volunteers are "totally exasperated" not just with the alien language but with what it represents: the transformation of their charity from the kitchen table and the rattling tin to the computer terminal and the huge mailshots. They don't believe it helps them provide a better service.

This language is also entering our schools. Instead of simply teaching, teachers are now being invited to make a "personalised learning offer" to children. It's more than just a dreary piece of business-speak. It implies that a child is a client or a customer, the figure to whom the "offer" is made. The children, in turn, are invited to be "co-investors with the state in their own education".

Come again? I reckon if a child came up to me and said she saw herself as a co-investor with the state in her own education I'd have serious worries about her welfare. I'd start wondering whether management consultants have begun to form sinister sects, grabbing kids in playgrounds and indoctrinating them in business-speak.

And yet when it comes to giving our children a taste of Shakespeare and English at its most beautiful, suddenly we're all terrified. Might, like, turn off the kids… know wha' I mean. Instead they are offered alternative texts, issued by educational publishing houses, that supposedly make our greatest writer more palatable.

Here's a taste. Take a few original lines from Macbeth:

Is this a dagger which I see before me,
the handle toward my hand?

Compare them to the guide version:

Oooh! Would you look at that.

Yes, I know it sounds as if I'm making it up, but you can check it for yourself.

Inevitably the language of politics is changing too. A relatively new phrase in the repertoire is "direction of travel". It's another device for dodging specific detail and talking instead about the "broad picture". I spotted it first when the Government was trying to get its Education Bill through the House of Commons in the face of determined opposition from its own backbenchers.

But it was Guantanamo Bay that provided some of the best examples of how wayward and adrift from reality political language can become. These include a reference by Sandra Hodgkinson, the deputy director of the Office of War Crimes Issues (itself a wonderful linguistic formulation), to "the different care providers" at Guantanamo Bay.

At least some progress with more straightforward language is being made. When the American government realised that the phrase War on Terror was not having the desired effect round the world they came up with a new name. It is now called The Long War. Sometimes the simplest language is the most chilling.

This also brings us right back to why it's important to pay attention to language. Our society, which treats us so much as an audience to be entertained and as consumers to be led to market, often uses language as an anaesthetic.

If verbal blandishments can encourage us to sit back and relax, we can be taken care of in more ways than one. And unless we're trained to be alert to the use of language we're likely to end up duped.

The simple fact is we cannot afford to be careless with our language, because if we are careless with our language then we are careless with our world and sooner or later we will be lost for words to describe what we have allowed to happen to it.

# Beyond Words: How Language Reveals the Way We Live Now by John Humphrys is published by Hodder & Stoughton.


Monday, October 23, 2006

And the winner is? (by Jason Cowley, the Guardian/Observer)


Note to the Capture: Nobel Prize Ceremony takes place at Stockholm Concert Hall


And the winner is?

Michael Jackson has won 240 of them. Frank Gehry has bagged 130. The culture of prize-giving has gone mad. It has replaced the art of criticism in determining cultural value and shaping public taste. We enjoy the glamour of a Booker or an Oscar night, but we lose something too in this orgy of awards, says Jason Cowley


Jason Cowley
Sunday October 22, 2006

One of the most fascinating books I have read recently is David Lodge's The Year of Henry James, his account of the consequences of discovering that both he and Colm Toibin were simultaneously publishing novels about the life of James. The year was 2004, the same year Alan Hollinghurst won the Man Booker Prize for The Line of Beauty, the central character of which just happened to be writing a thesis about, yep, James. The fascination of Lodge's book lies less in its literary distinction than in what it reveals about the psychology of the career literary novelist at a time when, in this country at least, to be a literary novelist is, on the whole, a pretty lonely and miserable existence. Unless, that is, you have the luck to win a major prize, and then everything can change: you find a readership, your book is translated into many languages, your advances rise exponentially, Hollywood gets in touch.

Lodge won no prizes for his novel about James, Author, Author; he did not even make the Booker longlist. By contrast, Toibin's The Master won the £68,000 Impac prize, the world's richest award for a single work of fiction, and was shortlisted for the Booker. Lodge writes candidly, self-laceratingly, of how Toibin's recognition by the Booker judges caused him to suffer 'pangs of professional envy and jealousy', of the relief he felt when, watching coverage of the Booker ceremony at home on television, he saw that Toibin had not won, and of how even now he cannot bring himself to read The Master, so tormented is he by its wider success.

Reading Lodge's strange, self-revealing memoir, I began to understand how much the psychology of the artist - as well as the entire culture - is being changed by the rise and proliferation of cultural prizes and by what the American academic James English calls our economy of cultural production and prestige. As long ago as 1928 Ezra Pound could write that 'The whole system of prize-giving... belongs to an uncritical epoch; it is the act of people who, having learned the alphabet, refuse to learn how to spell.'

He would have been even more indignant today. For ours is truly the age of awards. Prizes are becoming the ultimate measure of cultural success and value. One prize inevitably spawns another, in imitation or reaction, as the perceived male dominance of the Booker spawned the Orange Prize for women's fiction. There are now so many, in so many different fields, that it can be difficult to find a professional artist, writer or journalist who has not been shortlisted for a prize.

The proliferation of prizes is perhaps greatest in the movie industry, where there are now twice as many cinema prizes (about 9,000) as there are feature films produced each year. The troubled pop star Michael Jackson has won more than 240 awards. The architect Frank Gehry has won 130. The novelist John Updike has won 39. Where will it end? Can it end?

According to English, author of the enthralling The Economy of Prestige: Prizes, Awards, and the Circulation of Cultural Value (Harvard University Press), we are reaching 'the point of a kind of cultural frenzy, with scarcely a day passing without the announcement of yet another newly founded prize'. Any number of large corporations, wealthy institutions and patrons are lining up to partake of the frenzy as sponsors and paymasters, though one wonders how much of this is to do with tax-avoidance issues and how much with the need to be seen as socially and culturally relevant and cool.

There was a time when, as Wordsworth wrote, 'Every great and original writer, in proportion as he is great and original, must himself create the taste by which he is to be relished.'

The culture is no longer so patient. In a time of information overload - of cultural excess and superabundance - our taste is being increasingly created for us by prize juries and award ceremonies. Art is beginning to resemble sport, with its roster of winners and losers and its spectacles of competition: the Oscars, the Baftas, the Brits. Indeed, the larger cultural festivals and prizes, such as the Venice Biennale, the Oscars and the Nobels, are consciously imitative of international sporting competitions like the Olympics.

The format for most major prizes conforms to the model of the Oscars. 'It's very much a case,' says English, 'of maintain perfect secrecy regarding the decision, assemble all the nominees, and roll the cameras in hope of catching bad behaviour, poor sportsmanship or just plain unhappiness.'

In the book world, prizes have long since supplanted reviews as our primary means of literary transmission, and now they are taking on the task, from the professional critics, of judgment as well. This of course is not just a literary phenomenon: the success of the Booker Prize, which was established in 1968, led in this country to a kind of Booker envy. Every arts bureaucrat, it seemed, wanted his or her own equivalent of the Booker, which led, in time, to the creation of the Turner Prize (1984), for the visual arts; the Mercury Prize (1992), for music; and the Stirling Prize (1996), for architecture, which was won this month for the first time by Richard Rogers, as if this global plutocrat, creator of spectacular public buildings and connoisseur of fine Italian cooking, needed the recognition.

In well-paid activities such as pop music and architecture, the prize fund often seems to be of incidental value to the winner; it has become something of a minor tradition for the winner of the Mercury Prize, before making a rambling, drink-slurred speech, to toss away the winning cheque, worth £20,000, as if it were a mere flyer picked up outside a Tube station. For most novelists, that £20,000 would be worth having. For the pop star, it is so much ticker-tape.

Clearly, then, something more than money is at stake here: recognition, symbolic capital, prestige. Prizes create cultural hierarchies and canons of value. They alert us to what we should be taking seriously: reading, watching, looking at, and listening to. We like to think that value simply blooms out of a novel or album or artwork - the romantic Wordsworthian ideal. We would like to separate aesthetics from economics, creation from production.

In reality, value has to be socially produced. 'The process involves power, money, politics,' English told me. 'Prizes create symbolic value astonishingly quickly and easily, because they bring together economic power, social connections, academic expertise and celebrity and enable rather complex transactions to take place.'

The modern era of prizes began with the Nobel Prize for Literature in 1901, funded by the estate of Alfred Nobel, the dynamite and munitions manufacturer. Its effect was immediate. If in Britain we have Booker envy, the rest of the world once had Nobel envy. In 1903, the Prix Goncourt and the Prix Femina were set up in France, with the Pulitzers soon to follow in the US. The first film awards, the Oscars, began in 1929; the Emmys were up and running in 1949. Today we often speak of prizes in terms of other prizes. The Caine Prize for African Writing is the 'African Booker'; the Pulitzers are the 'Oscars of journalism'; the $250,000 Lillian Gish is 'the Nobel of architecture'. Amusingly, the Prix Goncourt is known even to some in France as the 'French Booker', though it was set up many decades earlier.

In December 1997, the year I was a Booker judge, I travelled to Moscow as a guest of the Russian Booker Prize, which was set up by the late Sir Michael Caine, a former chairman of Booker plc. Moscow was then a city of terrifying extremes: anarchic, astoundingly expensive, and often brutal. The gangster capitalists were in control. An indigenous publishing industry was emerging unsteadily from the darkness and oppression of Soviet totalitarianism; most novels were being published in cultural magazines such as Novy Mir and Znamya - the so-called thick journals. Yet the Booker had succeeded in inspiring a new generation of Russian writers, as well as bringing hope and attention to those older ones who had laboured for so long in secret and without any expectation of reaching a wider public. The prize had created an entire culture of controversy around itself: it was, as in Britain, as much a journalistic as a literary event. Already it had imitators: the Little Booker Prize, for non-fiction; the Anti-Booker Prize, funded by the oligarch Boris Berezovsky, who now lives in disaffected exile in London; and the Solzhenitsyn Prize, supported by the great Russia-returned writer himself. Here was a culture being transformed and energised by prizes.

Most writers understand the cultural importance of prizes, even those such as Martin Amis and Philip Roth who purport to disdain them. For Amis, the Booker, which he has never won, is a 'kind of literary Big Brother' and the award dinner an occasion when writers 'sweat with greed and egocentricity'. Yet Amis, for all his elevated disapproval, is preoccupied perhaps more than any other writer of his generation by the larger literary game, by who is winning and losing.

In 1995 he wrote a novel, The Information, that was in large part about literary competition. Richard Tull and Gwyn Barry, both writers, are perpetual rivals. Richard beats Gwyn at chess, at snooker, at tennis. None of this matters to him because, when it comes to the literary high stakes, Gwyn is winning. He has everything that Richard wants: wealth, a readership, Hollywood interest in his work and a beautiful young aristocratic wife. As if this weren't enough, as the novel begins, Gwyn is shortlisted for a prize, the nicely named Profundity Requital - which, if he wins, will provide him with a lavish income for the rest of his life.

In some way, all artists of ambition, literary or otherwise, must be longing to win their version of the Profundity Requital. Even Philip Roth is not immune from the prize game, though in a recent BBC4 interview with Mark Lawson he made a point of saying that prizes were childish and of little concern, even if he has never been known to reject one. This, I thought, was disingenuous. Roth is thought to take a special interest in his book jackets, how they are presented and what is written on them. He has a chief sub-editor's eye for quality control. It is interesting then, considering what he told Lawson, to read the author blurb on the jacket of his most recent novel, Everyman: 'In 1997 Philip Roth won the Pulitzer Prize for American Pastoral. In 1998 he received the National Medal of Arts at the White House and in 2002 the highest award of the American Academy of Arts and Letters, the Gold Medal in Fiction... He has twice won the National Book Award, the Pen/Faulkner Award, and the National Book Critics Circle Award...'

And so it goes on, his capsule biography reduced merely to a list of prizes won, to an exercise in self-aggrandisement. It is as if we have no other language for praising an author or no other vocabulary of evaluation; and it is as if whoever writes these blurbs (the writers themselves, perhaps?) believes that readers are no longer curious to know about a writer's biography. There is also a sense of presumption in not noting where or when the writer was born: as if to say, 'You know exactly who the famous and excellent Philip Roth is, he needs no introduction.' Instead, we are told only what he won, as if past achievement validates the present offering.

One summer afternoon in 1999 I visited Jim Crace at home in Birmingham. It was a warm day and we sat eating lunch in his garden, overlooked by the tall houses of his neighbours. We were to talk about his new novel, Being Dead, but first he wanted to know what my hopes were for the novel that I was soon to publish. To win one of the smaller first-novel prizes would be fine, I told him.

'Don't you see?' he said, his voice quickening. 'Don't you see what you're letting yourself in for? If you're shortlisted for a small first novel prize, you'll want to win it. If you win it, you'll want to win something bigger. Don't you see that you'll never be satisfied? This is what it's like being a writer.'

Crace talked about the Booker and who would be in contention for the prize later that year. He seemed surprisingly interested in what I had to say and it was clear that his unarticulated desire - unarticulated to me, at least - was to win the prize. Later in the year, I thought of Crace with sadness when the shortlist was announced and Being Dead wasn't on it. He knew, as I did, that for a writer like him being shortlisted for the Booker is the difference between winning and losing, between finding a readership or merely remaining in the literary ghetto, respected and admired but not much known or read.

It is hard to think of another artist whose life has been more changed by winning a cultural prize than the American-born, London-resident Lionel Shriver, whose challenging epistolary novel We Need To Talk About Kevin won the Orange Prize in 2005 and has since sold more than 400,000 copies. Before Kevin, which is about a mother's attempts to understand a wicked and murderous son, Shriver was struggling to earn a living from scraps of journalism tossed to her from the high table of remote commissioning editors. Kevin is reported to be her seventh novel. In fact, it was her eighth - her seventh novel remains unpublished; no one wanted it. No one seemed to want Kevin either, until Serpent's Tail bought it for £2,500. The expectations were low. 'It's said that Kevin was rejected by more than 30 publishers before it came to me,' says Pete Ayrton, who runs the vibrant independent Serpent's Tail. 'Lionel's career was in the doldrums. Her track record wasn't good.'

And yet she continued to write, even as each new book quickly disappeared into the oblivion of the remainder bin and pulping pit. 'Cultural prizes are often given safely to someone who doesn't need one,' Shriver told me recently. 'In my case, the Orange Prize did what prizes are supposed to do - that is, to draw cultural attention to someone hitherto unknown and working very hard, which is why in my acceptance speech for the prize I said that there was a large population of such people.'

Since Kevin won the Orange, Shriver has become not only a bestselling author, with a backlist back in print and a lucrative new book deal from HarperCollins, but also a widely published commentator and columnist. Being a prize-winner has given her reach and authority; people listen to her. 'You do become resentful when you are working, as I did for 12 years, without being noticed,' she says now. 'It was becoming increasingly difficult to get my work into print. There is such a difference between having won one prize and none. You've got the cultural imprimatur. You feel anointed. But you shouldn't trust this thing. My agent keeps encouraging me to consolidate my gains by going on reading tours and so on. I guess it's all about building and keeping an audience. You keep doing it for now because, as a former nobody, you fear that your coach will turn into a pumpkin. I do feel lucky. And I do have a sense of a parallel future which could have been so different if I hadn't won the Orange. But if you ask me if I'd prefer to have had early success or what happened to me, I'd choose my story; I like my story. I like mine a lot.'

Shriver is indeed one of the lucky ones - and I like her story as well. But for every winner like her, there are tens of thousands of anonymous artists competing for recognition, their cultural capital undervalued, their currency depreciating with each new artwork that passes unacknowledged in our economy of prestige.

Are there too many prizes? Is this convergence of art and commerce a sign of a deeper cultural decadence, as Ezra Pound would have had it? These, I think, are the wrong questions. Of course the whole prize-giving culture is bound up with celebrity and commerce and globalisation and our omnipresent media landscape. It is also essentially part of a game, a jamboree. It is fun to go to or watch the awards ceremonies, fun to argue about who has been excluded, and even more fun to be on the inside as a judge and, above all, to win.

But it shouldn't be taken too seriously, especially when one recalls that the very first Nobel for Literature, in 1901, the award that set the modern prize train in motion, was won by, er, Sully Prudhomme. Yes, that's right, Sully Prudhomme. One of the unlucky losers that year was Leo Tolstoy. The author of War and Peace and Anna Karenina never won the prize. Sully Prudhomme, the author of... (well, you tell me!) did. Life is short, but art can be long indeed, with or without prizes.


Tuesday, October 17, 2006

The Postmodern Moralist (by Pankaj Mishra, the New York Times)


Note to the Capture: The Book!

Note to the Capture: The Author!

Review by PANKAJ MISHRA
Published: March 12, 2006


Reading David Foster Wallace's new collection of magazine articles, you could be forgiven for thinking that the author of such defiantly experimental fictions as "Infinite Jest" (1996) and "Oblivion" (2004) has been an old-fashioned moralist in postmodern disguise all along. The grotesqueries of the 15th annual Adult Video News Awards, which Wallace writes about at considerable length here, present an easy target. And so, to a lesser extent, do the corruptions of English usage in America and the right-wing radio host John Ziegler. But Wallace poses an unsettling challenge to the way many of us live now when, while visiting the Maine Lobster Festival on behalf of Gourmet magazine, he asks if it is "all right to boil a sentient creature alive just for our gustatory pleasure." His longing for the apparently rare virtues of frankness and sincerity in public life makes him admire John McCain, despite the senator's "scary" right-wing views.



Turning to literature in essays on Kafka, Dostoyevsky and Updike, Wallace employs a largely moral vocabulary to dismiss such older American novelists as Norman Mailer and Philip Roth as "Great Male Narcissists." For him, Updike is "both chronicler and voice of probably the single most self-absorbed generation since Louis XIV." In contrast, he is all praise for Dostoyevsky, largely because the Russian writer's "concern was always what it is to be a human being — that is, how to be an actual person, someone whose life is informed by values and principles, instead of just an especially shrewd kind of self-preserving animal."

Indeed, reading Dostoyevsky revives Wallace's old complaint that American writers face an unparalleled difficulty in trying to create a literature informed by ethical values and principles. In an earlier essay titled "E Unibus Pluram: Television and U.S. Fiction," Wallace claimed that television in its more sophisticated phase had appropriated the "rebellious irony" of the first postmodern writers (Pynchon, Barthelme, Gaddis, Barth), thereby pre-empting and defusing the "critical negation" that was the literary and moral responsibility of his generation of writers. More than a decade later, Wallace remains convinced that "many of the novelists of our own place and time look so thematically shallow and lightweight, so morally impoverished, in comparison to Gogol or Dostoyevsky."

This is strong stuff — Wallace's blithely assertive manner helps him cover much rhetorical ground very quickly, even if a firmer belief in understatement might have helped him avoid such unhelpful generalizations as "our present culture is, both developmentally and historically, adolescent." Given such vehemence, it seems fair to point out that compared with the Russian masters, most novelists of any time or place are likely to look shallow and lightweight. And, at their best, the Great Male Narcissists have appeared to possess the "degrees of passion, conviction and engagement with deep moral issues" that for Wallace distinguish the Russians from contemporary American writers.

You also wonder if television could really have squandered the ironic self-consciousness that was supposed to be Wallace's spiritual inheritance from the postmodernists. But there is not much point in denying Wallace his passion, his outraged sense that he has arrived much too late in history. For it is Wallace's nostalgia for a lost meaningfulness — as distinct from meaning — that gives his essays their particular urgency, their attractive mix of mordancy and humorous ruefulness.

This nostalgia explains, among other things, his attraction to the straight-talking senator from Arizona. Originally written for Rolling Stone, and reproduced in full here, his description of the week of the primaries during which McCain failed to survive Karl Rove's negative campaign is the strongest piece in this collection. Although Wallace never gets to meet his subject, he manages to show just how political spin-doctoring has evolved since 1972, when Timothy Crouse (in "The Boys on the Bus") and Hunter Thompson (in "Fear and Loathing on the Campaign Trail") covered the clumsy attempts at it by the Nixon and McGovern campaign staffs. He is bracingly insightful, too, about the equally cynical process whereby representatives of major TV networks and the mainstream press "select" their news.

But so vast is Wallace's intellectual energy and ambition that he always wants to do more than what anyone else can reasonably achieve in a magazine article — and he has some enviably indulgent editors. He wishes, as much in his nonfiction as in his fiction, "to antagonize," as he said in an interview in 1993, "the reader's sense that what she's experiencing as she reads is meditated through a human consciousness." Accordingly, Wallace appears as a character in his own reportage, and, though he may not like the comparison to a Great Male Narcissist, he reminds one most of the author of "Armies of the Night" as he strives for full self-disclosure.

He tells us, in eye-straining small print, how and why the McCain piece was commissioned and edited, and what the "dozen high-end journalists" who were with him looked like. This is the kind of ironic self-consciousness one would ordinarily be relieved to see confined to "Friends" and "Seinfeld." Happily, Wallace's dazzling powers of description often redeem his bloggerlike tendency to run on. Here, for instance, is his description of a New York Times reporter on the McCain campaign: "A slim calm kindly lady of maybe 45 who wears dark tights, pointy boots, a black sweater that looks home-crocheted and a perpetual look of concerned puzzlement, as if life were one long request for clarification."

Clarification is also what Wallace seeks, though not of the political kind. It may seem odd that he doesn't mention McCain's voting record in the Senate — the clearest indication of the candidate's politics, perhaps even of his sincerity or lack thereof — in an article more than 15,000 words long. But then he wants, above all, to figure out "whether John McCain is a real leader or merely a very talented political salesman, an entrepreneur who's seen a new market niche and devised a way to fill it." He credits McCain's appeal among the young to the fact that they are "starved" for "just some minimal level of genuineness in the men who want to 'lead' and 'inspire' them." He himself thinks it a "huge deal" that McCain, a former fighter pilot who bailed out over Hanoi, rejected, on pain of torture, an offer of unconditional release from his Vietnamese captors.

Wallace keeps stressing this exemplary war record, which seems sufficient proof to him of McCain's moral authority, if not of his political judgment. And much of the essay really works out the tension between Wallace the postmodernist obsessed with "packaging and marketing and strategy and media and spin," and Wallace the moralist seeking evidence of a rooted and authentic self. It is as though Wallace cannot stop expecting McCain to somehow transcend the deceptions and distortions of the spin doctors and the media and remain true to himself: to the McCain who refused to leave prison in Vietnam, and whose moral character has survived an even longer confinement inside the Beltway.

Wallace is never sure if McCain is "truly 'for real.' " But such doubts, repeatedly expressed, merely reveal the larger cultural assumption Wallace is working with: that some fixed essence — the real McCain — lies beyond the wilderness of signifiers unleashed by the spin doctors and the media, and that somewhere out there this all-American hero still exists, untouched by the compromises and expediencies of everyday politicking, and busily realizing the countercultural ideal of "authenticity."

A conventional, rather masculinist notion of personal identity and selfhood also infiltrates Wallace's review of the tennis player Tracy Austin's autobiography. Here, he mistakes precociously and ruthlessly honed skill in a commercialized sport for "genius." As he doggedly examines why Austin's child-prodigy brilliance as a tennis player does not translate into emotional and intellectual profundity, it is hard not to be reminded of Robert Musil's epic "The Man Without Qualities" (1930-43), in which the protagonist, Ulrich, is disturbed enough by the journalistic imputing of genius to sportsmen and racing horses to renounce his ambition for personal greatness.

Writing in the late 1920's, Musil recalled a recently superseded culture in which greatness "was exemplified by a person whose courage was moral courage, whose strength was the strength of a conviction, whose steadfastness was of heart and virtue, and who regarded speed as childish . . . and agility and verve as contrary to dignity."

Wallace does not have this sense of history, which was indispensable to a moralist like Musil — or, indeed, Mencken, Wallace's precursor in the distinguished American tradition of boisterous iconoclasm. What he has instead is nostalgia, for a time when writers possessed moral courage and conviction, and it is no less affecting. Still, it doesn't seem to liberate him entirely from the prejudices and assumptions of his own historical moment — and class. Something of the graduate-school seminar room still clings to his worldview. Trying to explain, for instance, why many American writers have "an ironic distance from deep convictions or desperate questions," he concludes that the modernists "elevated aesthetics to the level of ethics" and writers thereafter have had to meet the "requirement of textual self-consciousness imposed by postmodernism and literary theory."

Literary theorists may long for, but have never actually possessed, such power and influence. If some American writers have a carefully hedged relation with actuality, or prefer an evasive irony over passionate engagement, this has at least something to do with their membership, in these days of generous publishing advances, fellowships and grants, in their country's most privileged classes. Wallace is clearly an exception. Certainly, few of his young peers have spoken as eloquently and feelingly as he has about the hard tasks of the moral imagination that contemporary American life imposes on them. Yet he often appears to belong too much to his own times — the endless postmodern present — to persuasively explain his quarrel with them.



Pankaj Mishra's most recent book is "An End to Suffering: The Buddha in the World." His new book, "Temptations of the West: How to Be Modern in India, Pakistan, Tibet and Beyond," will be published in June.

The Nutty Professors (by Anthony Grafton, the New Yorker)



Photo caption: Theodor Mommsen

THE NUTTY PROFESSORS
The history of academic charisma.
by ANTHONY GRAFTON
Issue of 2006-10-23
Posted 2006-10-16

Anyone who has ever taught at a college or university must have had this experience. You’re in the middle of something that you do every day: standing at a lectern in a dusty room, for example, lecturing to a roomful of teen-agers above whom hang almost visible clouds of hormones; or running a seminar, hoping to find the question that will make people talk even though it’s spring and no one has done the reading; or sitting in a department meeting as your colleagues act out their various professional identities, the Russian historians spreading gloom, the Germanists accidentally taking Poland, the Asianists grumbling about Western ignorance and lack of civility, and the Americanists expressing surprise at the idea that the world has other continents. Suddenly, you find yourself wondering, like Kingsley Amis’s Lucky Jim, how you can possibly be doing this. Why, in the age of the World Wide Web, do professors still stand at podiums and blather for fifty minutes at unruly mobs of students, their lowered baseball caps imperfectly concealing the sleep buds that rim their eyes? Why do professors and students put on polyester gowns and funny hats and march, once a year, in the uncertain glory of the late spring? Why, when most of our graduate students are going to work as teachers, do we make them spend years grinding out massive, specialized dissertations, which, when revised and published, may reach a readership that numbers in the high two figures? These activities seem both bizarre and disconnected, from one another and from modern life, and it’s no wonder that they often provoke irritation, not only in professional pundits but also in parents, potential donors, and academic administrators.


Not that long ago, universities played a very different role in the public imagination, and top academics seemed to glitter as they walked. At a Berlin banquet in 1892, Mark Twain, himself a worldwide celebrity, stared in amazement as a crowd of a thousand young students “rose and shouted and stamped and clapped, and banged the beer-mugs” when the historian Theodor Mommsen entered the room:



This was one of those immense surprises that can happen only a few times in one’s life. I was not dreaming of him; he was to me only a giant myth, a world-shadowing specter, not a reality. The surprise of it all can be only comparable to a man’s suddenly coming upon Mont Blanc, with its awful form towering into the sky, when he didn’t suspect he was in its neighborhood. I would have walked a great many miles to get a sight of him, and here he was, without trouble, or tramp, or cost of any kind. Here he was, clothed in a titanic deceptive modesty which made him look like other men. Here he was, carrying the Roman world and all the Caesars in his hospitable skull, and doing it as easily as that other luminous vault, the skull of the universe, carries the Milky Way and the constellations.




Mommsen’s fantastic energy and work ethic—he published more than fifteen hundred scholarly works—had made him a hero, not only among scholars but to the general public, a figure without real parallels today. The first three volumes of his “History of Rome,” published in the eighteen-fifties, were best-sellers for decades and won him the Nobel Prize in Literature in 1902. Berlin tram conductors pointed him out as he stood in the street, leaning against a lamppost and reading: “That is the celebrated Professor Mommsen: he loses no time.” Mommsen was as passionately engaged with the noisy, industrializing present as with the ancient past. As a liberal member of the Prussian legislature, he fought racism, nationalism, and imperialism, and clashed with Bismarck. Yet Mommsen knew how to coöperate with the government on the things that really mattered. He favored reorganizing research in the humanities along the autocratic, entrepreneurial lines of the big businesses of his time—companies like Siemens and Zeiss, whose scientific work was establishing Germany as the leading industrial power in Europe. This approach essentially gave rise to the research team, a group of scholars headed by a distinguished figure which receives funding to achieve a particular goal. Mommsen’s view was that “large-scale scholarship—not pursued, but directed, by a single man—is a necessary element in our cultural evolution.” He won public support for such enterprises as a vast collection, still being amassed, of the tens of thousands of inscriptions that show, more vividly than any work of literature, what Roman life was like. He also advised the Prussian government on academic appointments, and helped make the University of Berlin and the Prussian Academy of Sciences the widely envied scientific center of the West—the Harvard, you might say, of the nineteenth century.



The model that Mommsen represented was revered and imitated around the world. In the United States, the new universities founded after the Civil War—Clark, Johns Hopkins, and Chicago—set out to gain prominence as Berlin had: by becoming research institutions and competing to attract faculty stars. In 1892, the University of Chicago, then two years old, wooed the historian Hermann von Holst away from Freiburg by promising him more than five times his previous salary. New labs and libraries popped up in cities and college towns across the country—at least until the Depression and the Second World War created other priorities. The age of academic prosperity that has lasted, with interruptions, from the nineteen-eighties to the present, and that has inspired campus novels and provoked skirmishes in the culture wars, has arguably been little more than an ironic replay of that late-nineteenth-century zenith, with academic stars fighting as hard for their own preferment as Mommsen did for the young and gifted.


But what does the academic agenda of the modern research-based university have to do with the other side of college life as we know it—with fraternity pledges, the choruses of “Gaudeamus igitur,” the stone façades of Victorian Gothic buildings? The mixed inheritance of the modern university is the subject of a new book with the somewhat oxymoronic title “Academic Charisma and the Origins of the Research University,” by William Clark, a historian who has spent his academic career at both American and European universities. Clark thinks that the modern university, with its passion for research, prominent professors, and, yes, black crêpe, took shape in Germany in the eighteenth and nineteenth centuries. And he makes his case with analytic shrewdness, an exuberant love of archival anecdote, and a wry sense of humor. It’s hard to resist a writer who begins by noting, “Befitting the subject, this is an odd book.”




Clark’s story starts in the Middle Ages. The organizations that became the first Western universities, schools that sprang up in Paris and Bologna, were in part an outgrowth of ecclesiastical institutions, and their teachers asserted their authority by sitting, like bishops, in thrones—which is why we still refer to professorships as chairs—and speaking in a prescribed way, about approved texts. “The lecture, like the sermon, had a liturgical cast and aura,” Clark writes. “One must be authorized to perform the rite, and must do it in an authorized manner. Only then does the chair convey genuine charisma to the lecturer.” Clark derives his notion of charisma, loosely but clearly, from the work of Max Weber, who developed the idea that authority assumes three forms. Traditional authority, the stable possession of kings and priests, rested on custom, “piety for what actually, allegedly or presumably has always existed.” Charismatic authority, wild and disruptive, derived from “the exceptional sanctity, heroism or exemplary character of an individual person.” Rational authority, the last of the three forms to emerge, represented the rise of bureaucratic procedure, dividing responsibilities and following precise rules.



As Weber pointed out, in real organizations these different forms of authority interact and collide. In the medieval classroom, for all its emphasis on tradition-bound hierarchy and order, a contrary force came into play, one that unleashed the charisma of talented individuals: the disputation, in which a respondent affirmed the thesis under discussion and an opponent attempted to refute it. (Unlike the lecture, the disputation hasn’t survived as an institution, but its modern legacy includes the oral defenses that Ph.D. candidates make of their theses, and the format of our legal trials.) Clark calls the disputation a “theater of warfare, combat, trial and joust,” and, indeed, early proponents likened it to the contests of athletic champions in ancient Rome.



One early academic champion was the Parisian master Abelard, who cunningly used the format of the disputation to point up the apparent inconsistencies in orthodox Christian doctrine. He lined up the discordant opinions of the Fathers of the Church under the deliberately provocative title “Sic et Non” (“Yes and No”) and invited all comers to debate how the conflicts might be resolved. His triumphs in these “combats” made him, arguably, the first glamorous Parisian intellectual. A female disciple, Héloïse, wrote to him, “Every wife, every young girl desired you in absence and was on fire in your presence.” Their story has become a legend because of what followed: Héloïse, unwed, had a child by Abelard, her kin castrated him in revenge, and they both lived out their lives, for the most part, in cloisters. But even after Abelard’s writings were condemned and burned, pupils came from across Europe hoping to study with him. He had the enduring magnetism of the hotshot who can outargue anyone in the room.



Traditionalist plodders and charismatic firebrands shared the university from the beginning. The heart of Clark’s story, however, takes place not during the Middle Ages but from the Renaissance through the Enlightenment, and not in France but in the German lands of the Holy Roman Empire. This complex assembly of tiny territorial states and half-timbered towns had no capital to rival Paris, but the little clockwork polities transformed the university through the simple mechanism of competition. German officials understood that a university could make a profit by attaining international stature. Every well-off native who stayed home to study and every foreign noble who came from abroad with his tutor—as Shakespeare’s Hamlet left Denmark to study in Saxon Wittenberg—meant more income. And the way to attract customers was to modernize and rationalize what professors and students did.



These German polities called themselves “police states”—not in the sense of being oppressive but, as Clark explains, in the sense that they tried “to achieve the good policing, die gute Policey, of the land by monitoring and regulating the behavior of subjects by paperwork.” At first, what Policey meant for the universities was just finding out what the professors were up to. Bureaucrats pressured universities to print catalogues of the courses they offered—the early modern ancestor of the bright brochures that spill from the crammed mailboxes of families with teen-age children. Gradually, the bureaucrats devised ways to insure that the academics were fulfilling their obligations. In Vienna, Clark notes, “a 1556 decree provided for paying two individuals to keep daily notes on lecturers and professors”; in Marburg, from 1564 on, the university beadle kept a list of skipped lectures and gave it, quarterly, to the rector, who imposed fines. Others demanded that professors fill in Professorenzetteln, slips of paper that gave a record of their teaching activities. Professorial responses to such bureaucratic intrusions seem to have varied as much then as they do now. Clark reproduces two Professorenzetteln from 1607 side by side. Michael Mästlin, an astronomer and mathematician who taught Kepler and was an early adopter of the Copernican view of the universe, gives an energetic full-page outline of his teaching. Meanwhile, Andreas Osiander, a theologian whose grandfather had been an important ally of Luther, writes one scornful sentence: “In explicating Luke I have reached chapter nine.”



Bureaucracy has its own logic, and officials pushed for results that looked rational: results that they could codify, sort, and explain to their masters. Glacially, the universities responded. The old disputations were discontinued. These had always placed greater emphasis on formal skill in argument than on truth of outcome, and during the Baroque period and the Enlightenment they came to seem sterile and farcical. (Rather like department meetings and creative-writing workshops today, they had begun to inspire biting satires.) Instead, the universities instituted formal examinations—exercises that were carefully graded and recorded by those who administered them. Doctoral candidates had to defend printed dissertations. Clark wonderfully describes these strenuous, scary exercises. When Dorothea Schlözer, the daughter of a professor, underwent her examination for a doctorate at Göttingen in 1787, she confronted a committee of seven examiners. In deference to her sex, she was seated not at the far end of the table, facing the professors, but between two of them. The examination—which was interrupted for tea—allowed for masterly displays of professorial snideness. One professor “pulled a rock out of his pocket and asked her to classify it. After a couple more questions, he said he was going to ask her one on the binomial theorem, but, as he reckoned most of his own colleagues knew nothing of it, he decided to skip it.” The student calmly outperformed her masters. When another professor asked about art history, she noted that she had not listed this topic on her résumé, and thus should not be asked about it—but then she answered anyway. After about two hours, a professor who had been silent until then interrupted a colleague to note that “it was 7:30 and time to quit.” Schlözer passed.



In an even more radical break with the past, professors began to be appointed on the basis of merit. In many universities, it had been routine for sons to succeed their fathers in chairs, and bright male students might hope to gain access to the privileged university caste by marrying a professor’s daughter. By the middle of the eighteenth century, however, reformers in Hanover and elsewhere tried to select and promote professors according to the quality of their published work, and an accepted hierarchy of positions emerged. The bureaucrats were upset when a gifted scholar like Immanuel Kant ignored this hierarchy and refused to leave the city of his choice to accept a desirable chair elsewhere. Around the turn of the nineteenth century, the pace of transformation reached a climax.


In these years, intellectuals inside and outside the university developed a new myth, one that Clark classes as Romantic. They argued that Wissenschaft—systematic, original research unencumbered by superstition or the authority of mere tradition—was the key to all academic achievement. If a university wanted to attract foreign students, it must appoint professors who could engage in such scholarship. At a great university like Göttingen or Berlin, students, too, would do original research, writing their own dissertations instead of paying the professors to do so, as their fathers probably had. Governments sought out famous professors and offered them high salaries and research funds, and stipends for their students. The fixation on Wissenschaft placed the long-standing competition among universities on an idealistic footing.



Between 1750 and 1825, the research enterprise established itself, along with institutions that now seem eternal and indispensable: the university library, with its acquisitions budget, large building, and elaborate catalogues; the laboratory; the academic department, with its fellowships and specialized training. So did a new form of teaching: the seminar, in which students learned by doing, presenting reports on their original research for the criticism of their teachers and colleagues. The new pedagogy prized novelty and discovery; it was stimulating, optimistic, and attractive to students around the world. Some ten thousand young Americans managed to study in Germany during the nineteenth century. There, they learned that research defined the university enterprise. And that is why we still make our graduate students write dissertations and our assistant professors write books. The multicultural, global faculty of the American university still inhabits the all-male, and virtually all-Christian, research universities of Mommsen’s day.



Clark leads the reader through these transformations, year by year and document by document. He also uses the ancient universities of Oxford and Cambridge as a traditionalist foil to the innovations of Germany. Well into the nineteenth century, these were the only two universities in England, and dons—who were not allowed to marry—lived side by side with undergraduates, in an environment that had about it more of the monastery than of modernity. The tutorial method, too, had changed little, and colleges were concerned less with producing great scholars than with cultivating a serviceable crop of civil servants, barristers, and clergymen. The eighteenth century, which saw the flowering of modern German academe, marked a nadir recorded by Edward Gibbon, the Magdalen College dropout who became the greatest historian of imperial Rome, in memorable (and slightly exaggerated) terms:



The fellows or monks of my time were decent easy men, who supinely enjoyed the gifts of the founder. Their days were filled by a series of uniform employments; the chapel and the hall, the coffee-house and the common room, till they retired, weary and well-satisfied, to a long slumber. From the toil of reading or thinking or writing they had absolved their conscience, and the first shoots of learning and ingenuity withered on the ground.




Yet, even at Oxford, some scientists and scholars offered innovative lecture courses, and, conversely, the innovative German universities did not abandon all the old ways of doing things. Professors continued to give lectures as well as to hold seminars. Academic ceremonies continued to take place, and continued to do a great deal for the reputations of universities—especially once the giving of honorary degrees began to attract the attention of newspapers. Invented traditions, moreover, proved as attractive as ancient ones—particularly at universities that drew young men of high birth and others with social pretensions. Nineteenth-century German students were even more dedicated to duelling with sabres and attending formal banquets (such as the one at which Twain saw Mommsen) than they were to original research. Twain himself was as charmed by the picturesque duelling corps and taverns of Heidelberg as he was by the avatars of modern Wissenschaft in Berlin.



Similarly, although the hiring of professors became more meritocratic, administrators faced the enduring problem of how to assess merit systematically. Clark demonstrates this by inviting us to accompany Friedrich Gedike, a Prussian minister, on the visits he made to fourteen universities in June and July, 1789, just as the French Revolution was breaking out. Where his sixteenth- and seventeenth-century predecessors would have asked about the character and teaching abilities of local professors—did they have an audience, were they punctual, were they too friendly with the students?—Gedike undertook a ruthless talent search in an academic world where states competed for researchers. At the University of Göttingen, for instance, a hub of innovation only half a century old, he found an interesting anomaly. Professors tended to remain frozen at their acquisition salaries unless they could extract more money with the leverage of an outside offer. And, because universities mostly wanted to hire professors whose greatest works were still ahead of them, junior professors were often paid more than senior ones. Hence, academics at Göttingen found the whole subject of salaries too embarrassing to discuss, and Gedike had to collect information from “sensible and well-informed students, rather than professors.”



Gedike asked sharp, precise questions, but his judgments were, necessarily, reliant on the words of the specialists he spoke to. His report offered a long and precise evaluation of Christian Gottlob Heyne, the classicist who had done more than any other professor to make Göttingen a world-class center of learning. But often he could do little more than offer character assessments—“timid,” “hypochondriac,” “very sinister and misanthropic”—of the eccentrics who dominated the various faculties. In essence, Gedike and his colleagues gathered academic gossip and passed it on. The opinions were compiled, the decisions were made, and the jobs were handed out, not solely on the basis of rational, informed scrutiny of candidates’ merits but also on the basis of what people who might know something had to say about who was hot and who was not. These procedures are all too familiar to anyone who has taken part in academic hiring decisions today. A committee sits in a room, discussing folders full of organized gossip—and, nowadays, densely technical reports—about professors at other universities. Then it does its best to decide which of them to hire and what it will take to attract them—even though no one in the room may be competent to sum up, much less assess, the work of the candidates in question. We apply our best hermeneutics to the C.V. and letters of recommendation, discount known feuds, add points for this and that—and then, somehow, arrive at a decision.



As Clark shows, the assessment of professors is only one instance of a much larger phenomenon. Universities are strange and discordant places because they are palimpsests of the ancient and the modern. Their history follows a Weberian narrative of rationalization, but it also reveals the limits of that rationalization. Mommsen, for all his modernity, spoke and wrote elegant, lucid Latin, like the humanists of the Renaissance, and enjoyed traditional academic ceremonies. Modern universities sincerely try to find the best scholars and scientists, those who work on the cutting edge of their fields, but they are also keen to preserve the traditional aspects of their culture and like their professors to wear their gowns with an air. They hope that some undefined combination of these qualities will attract the best crop of seventeen-year-olds available.


In the end, Clark never fully anatomizes how individual academics—those strange creatures flapping about in their batlike gowns—came to possess inherent charisma, as opposed to the authority conferred on them by chairs, titles, and the other “material practices” that form the core of his study. After all, charisma is to some extent irreducible; in the classroom a scholar can inspire by sheer force of intellect and personality, an effect to which bureaucratic reports seldom do justice. But Clark is shrewd in charting one aspect of academic charisma—namely, the importance of asceticism in creating an aura of greatness. Mommsen, with his heroic self-control and self-abnegation, had many precursors. The roots of academic asceticism surely lie in the university’s monastic prehistory. Indeed, Gadi Algazi, an Israeli historian, has shown that although German scholars, unlike their English counterparts, were allowed to marry and set up households from the fifteenth century onward, they took endless pains to show that they demanded big houses only so that they could work uninterrupted and married only so that they could have orderly, well-run homes.


In the eighteenth and nineteenth centuries, professorial asceticism moved from the home to the workplace, where it took new forms, most notably that of productivity on an epic, and sometimes eccentric, scale. The new model professor wore himself out: greatness of mind and depth of learning, like beauty, could be attained only through suffering. Christian Gottlob Heyne, who integrated the visual arts into the formal study of antiquity, also ran Göttingen’s university library—one of the largest and best organized in Europe—and published reviews of some eight thousand of the books that he obtained and catalogued for the university’s collection. Heyne’s pupil Friedrich August Wolf became legendary by similar means. As a scholar, his importance rested on his 1795 “Prolegomena to Homer”—an enormously successful book, though only the first volume ever appeared and it was written in Latin—which argued that the Iliad and the Odyssey were collections of originally oral poems, assembled by the poet-scholars of Hellenistic Alexandria. But what really made him a celebrity was his combination of daring and self-denial. Wolf insisted on registering as a student not of theology but of philology, even though the few available jobs for graduates were for ministers rather than scholars. Heyne showed him his desk, piled with letters from schoolteachers “who tell me that they would be glad to be hanged, from actual destitution,” but Wolf persevered. He replaced the student’s usual pigtail with a wig, so that he would not have to go to the barber; stayed away from the taverns where students caroused and the salons where they met young women; and even stopped attending lectures, since he thought that his time could be more productively spent reading the assigned books. He infuriated his teacher by reading ahead of the class and taking out all the library books that Heyne needed to prepare his lectures. And his reward came soon: a professorship at Halle, at the age of twenty-four. This brilliant, bitter nonconformist paradoxically became a model for later generations of students. No wonder observers praised Mommsen’s ceaseless industry so extravagantly half a century later: he was not only doing history at a superb level but also living an ascetic ideal that still mattered.




Today, academic charisma—and the ascetic life of scholarship that goes with it—retains a central place in the life of universities. Scholars in all fields continue to gain preferment because they are “productive” (the academic euphemism for obsessive), and students continue to emulate them. Future investment bankers pull all-nighters delving into subjects that they will never need to know about again, and years later, at reunions, they recall the intensity of the experience with something close to disbelief—and, often, passionate nostalgia. The university has never been a sleek, efficient corporation. It’s more like the military, an organization at once radically modern and steeped in color and tradition. And it’s not at all easy to say how much of the mystique could be stripped away without harming the whole institution. If you thoroughly rationalize charisma, can it remain charismatic?


If Clark helps us to understand why the contemporary university seems such an odd, unstable compound of novelty and conservatism, he also leaves us with some cause for unease. Mommsen may have liked to see himself as a buccaneering capitalist, but his money came from the state. Today, by contrast, dwindling public support has forced university administrators to look for other sources of funding, and to assess professors and programs through the paradigm of the efficient market. Outside backers tend to direct their support toward disciplines that offer practical, salable results—the biological sciences, for instance, and the quantitative social sciences—and universities themselves have an incentive to channel money into work that will generate patents for them. The new regime may be a good way to get results, but it’s hard to imagine that this style of management would have found much room for a pair of eccentrics like James Watson and Francis Crick, or for the kind of long-range research that they did. As for the humanities, once the core of the enterprise—well, humanists these days bring in less grant money than Mommsen, and their salaries and working conditions reflect that all too clearly. The inefficient and paradoxical ways of doing things that, for all their peculiarity, have made American universities the envy of the world are changing rapidly. What ironic story will William Clark have to tell a generation from now?