Monday, February 28, 2005

 

Pointers

New content was sparse last week, I know. Hopefully more this week. I have some posts to write, but I am making myself put in a day of work first.

In the meantime, check out the latest History Carnival at Detrimental Postulation. Also, thanks to Inside Higher Ed for linking to my dissertation haikus. If you haven't heard about Inside Higher Ed yet, haikued you not have?

Wednesday, February 23, 2005

 

Dissertation haiku

As we will see,
On the other hand, therefore ...
What am I saying?

Control-A! Delete!
Take that, you accursed crap!
Blink, blink ... Control-Z.

Gordian chapters
that tighten then unravel,
I will cut you yet.

Tuesday, February 22, 2005

 

On persecution and prophecy

We went out of town for a brief and refreshing weekend getaway, and I returned to find that there are many blogs to read and little time to do it.

I also returned to find that my post on Dorothy Stang has been getting quite a few search engine hits. It also garnered the following comment, which I'm still trying to decipher:
I love Sis. Stang's passion for standing up for Christ, and interceded for her killers, she gained life where her real home is now! You should'nt weep for Sis. Stang, you should weep for yourselves, after a while her death will be a memory, and you will go on doing the things of the world, ex. defiling, and cohabiting yourselves with women, cursing others, speaking badly about others, having hatred and bitterness towards others, etc. A persons death is like a drug you get sad and depressed until sickness and later you find so-called joy in the things of the world, which will lead you to be perished. Get use to leaders being persecuted, and killed, for their purpose is to serve God, there will be more to come!!!!!!
I'm not sure how to identify the "yourselves" that I should be weeping for. I do cohabit myself with a woman -- my wife. I also confess that, far more often than I would like, I fall into habits of hatred and bitterness. And, alas, I also find many things in the world a source of real joy. Weep for me.

Behold, I started this post resolving not to be snarky, and look how poorly I've done already. But enough snarkiness. For lurking somewhere behind this comment, and appearing only faintly between the lines, I think there is a serious (and seriously wrong) idea, which deserves serious refutation. The idea is that the persecution of Christians is somehow inevitable. This explains the ominous exclamation at the end that we should just "get use[d] to leaders being persecuted," because "there will be more to come!!!!!!"

I suspect, although this is conjectural, that this idea has become increasingly prevalent in recent years among many American Christians, thanks to the success of books like the Left Behind series. From what I can tell, based on reliable second-hand reports, those books and countless others about the "end times" represent "persecution" as just another sign of the times. That Christians will be persecuted must be inevitable, according to this genre of very popular Christian writing, because it is one of the markers on the road to the world's annihilation. It has been foretold, and therefore it will be.

That's why The Rapture Index (via Positive Liberty) lists "Anti-Christian" persecution as just one of the various hash-marks on "the prophetic speedometer of end-time activity." (Added later: See also the similar way that persecution and prophecy are represented by The Great Separation. You'll be interested to note that, according to this site, many prophets are bloggers.)

But the use of the word "prophetic" to describe this view of persecution relies on a false conception about what "prophecy" is, at least according to the scriptural traditions out of which Christianity took shape. As Walter Brueggemann argues in his book, The Prophetic Imagination, "the dominant conservative misconception, evident in manifold bumper stickers, is that the prophet is a future-teller, a predictor of things to come (mostly ominous), usually with specific reference to Jesus." In stark contrast, the prophetic tradition in the Hebrew Bible, the tradition that would have mattered most to the writers of the New Testament, is "concerned with the future as it impinges upon the present."

The conservative Christian view of the prophet as "future-teller" disconnects the future from the present. For future-telling prophecy does not necessarily call for change in the present, any more than a fortune cookie telling me that I will have success in business would change the way I conduct business. Why would it call for change in the present, since it merely predicts that what will be will be, regardless of what happens here and now?

I once opened a cookie with the following fortune, bookended by two smiley faces: "A nice cake is waiting for you." I still have the fortune pinned to my bulletin board. But obviously I leave it there as a joke. As serious advice, it would be less than worthless. Does such a fortune admonish me to do anything in particular? Does it entail any particular ethical obligations on my part? No. The cake's there for me. There's nothing I have to do to get it, and there's nothing I can do to avoid it. It's just there, waiting.

What Brueggemann describes as the conservative misconception of prophecy is the theological equivalent of "A nice cake is waiting for you." Just substitute "crown" for "cake," and you have a perfect summary of the eschatology that is entailed by a one-dimensional view of the prophet as predictor. Of course, to round out that eschatology you would have to include a few other permutations. First, a horrible persecution is waiting for you. Then, a Rapture is waiting for you. Then, a climactic battle with the Antichrist is waiting for you. Finally, a nice crown is waiting for you. But it doesn't really matter, from a causal perspective, what order these fortunes come in. They are simply arranged chronologically and mostly arbitrarily, like tarot cards of the end times. On this misleading view of Christian eschatology, there is nothing to do but to wait for these fates that are waiting for us. The clock is ticking, and there is nothing you can do to stop it. At best you can merely devise ingenious ways of telling time.

The prophetic tradition with which Jesus identified, however, and with which his earliest followers were familiar, is starkly opposed to this picture of prophecy as a kind of static fatalism. Since I've quoted Brueggemann, I might as well recapitulate briefly the argument of his book. Brueggemann argues that the kind of prophecy envisioned in the Hebrew Bible always invoked the future as a way of indicting, and therefore changing, the present. The value of the prophetic tradition was its use of imaginative (and often frightening) visions of future possibilities as ways of compelling change in the here and now.

Sometimes those visions of the future may have appeared to take the form of simple future-telling. For example, a good thumbnail summary of the book of Jeremiah might begin: "A not-so-nice humble pie is waiting for you." But it would not end there. Rather, Jeremiah's prophecies always include a "because." Rather than obliterating the causal relationship between possible futures and the present, the prophet makes clear that the humble pie is waiting for you because (to loosely paraphrase) you went around whoring with other gods and taking advantage of your neighbors. Stop doing that, Jeremiah prophesies, and a nice cake will wait for you instead.

Prophets like Isaiah and Jeremiah are represented within the corpus of biblical literature as persecuted people. And there's no wonder that they were persecuted. After all, there is no need to persecute a walking fortune cookie. Nothing sweeter, more sugary, or more harmless could be imagined, even if the fortune is less than glowing. But there is every reason to persecute a walking indictment of the way things are in the present.

A prophet who says "to the king and the queen mother: 'Take a lowly seat, for your beautiful crown has come down from your head'" is likely to be unpopular with the king and the queen mother. He is likely to be persecuted, however, if he goes on to say that the reason for this prophecy is that the king "builds his house by unrighteousness, and his upper rooms by injustice; [and] makes his neighbors work for nothing, and does not give them their wages." That kind of prophet is not just unpopular with the powers that be. He is dangerous to them because he is likely to be popular with those who have not been getting their wages.

In short, a prophet who merely foretells that the high and mighty will be brought down can be tolerated, because a fortune about the future does not entail any necessary change in the present. Everyone eventually is brought low by death, so foretelling this about a king does not change much of anything. But the prophet who connects that future with the way that the mighty are treating the lowly now can count on persecution from the people wearing crowns: "This man deserves the sentence of death."

Even these elliptical observations about prophets like Jeremiah show why it is no wonder that augury and sorcery are routinely portrayed in the Hebrew Bible as tools of kings, whereas genuine prophets are usually opposed to those in power. Mere "soothsaying" literally soothes those in power, for like any good fortune teller, a soothsayer tries to tell his client what he wants to hear. Real prophecy, on the other hand, discomforts those in power now. A truly prophetic vision of the future calls into question the arrangement of power in the present.

Okay, so what does all of this amateur theologizing (apologies, apologies) have to do with Dorothy Stang, or with the confusing comment someone left me about her? Well, to Christians who profess an interest in conforming themselves to biblical traditions, I want to suggest that certain ideas about prophecy and persecution, implicit in the comment, get the relevant biblical ideas wrong.

The comment, according to my reading of it, tells us that the fact that Stang was killed and that she was a Christian should not surprise us, because (here's where I'm interpolating into the comment) the Bible (or the Left Behind books) predicted these things would happen. Stang's death is not only unsurprising, on this view -- it also fails to change anything about the way we should behave in the here and now. Indeed, the comment bleakly offers no hope that anything will change: Stang's memory will fade, and everyone will go back to doing what they do. The best we can do is to get used to the idea of more "cake," so to speak, because there's more to come.

This bleak vision of Christian persecution, I've been arguing, stems from a mistaken view of prophecy, which holds that because the suffering of Christians has been foretold, it shall be. Period. Such a view actually empties Stang's death of any specific meaning, because it means that the particular reasons why she died -- the "because" of the prophecy -- matter very little. She was a Christian; Christians will be persecuted, inevitably; ergo, she was persecuted. Maybe the speedometer of end-time activity will tick up a notch or two, but that's about it.

In contrast to these views, represented obliquely by the comment, I've been working up to this point: The idea that Jesus had about prophecy, according to the gospels, was the kind of prophecy represented by figures like Jeremiah. Yes, he foretold that his followers would be persecuted, but they would be persecuted like the prophets. That is, they would be persecuted because they would represent, as Jesus and Jeremiah both did, a threat to the powers that were. Brueggemann writes:
The coming of Jesus meant the abrupt end of things as they were. Two texts are commonly cited as programmatic for the preaching of Jesus. In Mark 1:15 he announced the coming of the kingdom. But surely implicit in the announcement is the counterpart that present kingdoms will end and be displaced. In Luke 4:18-19 he announced that a new age was beginning, but that announcement carried in it a harsh criticism of all those powers and agents of the present order. His message was to the poor, but others kept them poor and benefited from their poverty. He addressed the captives (which means bonded slaves), but others surely wanted that arrangement unchanged. He named the oppressed, but there are never oppressed without oppressors.

His ministry carried out the threat implicit in these two fundamental announcements. The ministry of Jesus is, of course, criticism that leads to radical dismantling. And as is characteristic the guardians and profiteers of the present stability are acutely sensitive to any change that may question or challenge the present arrangement. Very early Jesus is correctly perceived as a clear and present danger to that order.
According to the gospels, Brueggemann is saying, Jesus was persecuted because of his perceived threat to the established social order, not as a result of some ineluctable prediction that he would be persecuted. Announcing a new kingdom in which those defined in the present as the least powerful (women, the poor, the enslaved, the socially outcast) would become the greatest -- well, that struck ruling kings as dangerous, and they decided, consequently, to execute him like a common criminal, mocking his pretensions to be a "king of the Jews." But if this story about Jesus' cross is correct, then when Jesus told his disciples that they would also be persecuted, he was not giving them a fortune cookie: "A cross is waiting for you." He was saying, essentially, that a cross would await them because they did as he did. Insofar as his disciples also spoke truth to power on behalf of the powerless, they would be persecuted.

What this means is that deaths like Stang's are not inevitable -- mere markers of the "end times." Stang's death was meaningful precisely because it was contingent, not inevitable. It should be meaningful to Christians because (as best we can tell) it was connected to her prophetic yet peaceful resistance to people who were defrauding the poor and flouting the law. She was killed because she was trying to change the present, as any good prophet does.

So a Christian informed by the biblical account of prophecy won't simply accept an interpretation of her death that makes it unsurprising and unremarkable. Such a Christian will not just chalk another one up to the end times and buckle her seat belt for more to come. Rather, the Christian should see Stang's death as a meaningful case of Christian persecution only because she was behaving as a Christian. And she was only behaving as a Christian if, like the Christian's namesake, she was threatening the present by announcing a kingdom in which the poor are rich, the least are greatest, and the last are first.

The commenter who inspired this post left behind a URL to Persecution.com. It's a site all about the persecution of Christians. Oddly enough, however, its statement of faith includes no reference to the idea that Christian persecution might be connected to the kind of challenge that Jesus posed to power. It contains no statement about Christian ethics, or any implication that those ethics might be causally related to Christian persecution.

In reading the site, and in reflecting on the potential sources of this comment, I recalled and returned to this passage in John Howard Yoder's masterwork, The Politics of Jesus:
Christian thought is accustomed to conceiving of "persecution" as a ritual or "religious" matter independent of any immediate ethical import. Christians are made to suffer because they worship the true God; what has this to do ... with an attitude to government, to violence, war, conflict? Is not being persecuted for the faith quite independent of social ethics?

Such a dichotomy between the religious and the social must be imported into the [biblical] texts; it cannot be found there. The "cross" of Jesus was a political punishment; and when Christians are made to suffer by government it is usually because of the practical import of their faith, and the doubt they cast upon the rulers' claim to be "Benefactor."
By disconnecting a "statement of faith" from the "practical import" of that faith, sites like Persecution.com disconnect persecution from ethics, prophecy from the present, and the cross from the announcement of a different kingdom. A good case could be made, I've tried to argue here, that such a theology of persecution and prophecy would have to be imported into the Bible, because it cannot be found there. It tries to turn the narrative of prophetic figures like Jeremiah and Jesus into a series of unrelated fortune-cookie slips.

I would not have spent so long belaboring that point if I did not feel that this woefully incomplete theology of persecution has serious political consequences in a country where many Christians seem to be rejoicing about their newfound electoral power.

For one thing, a view of persecution as inevitable discourages Christians from engaging in any efforts to make the world a more equitable place to live. There is no ethics behind the "nice cake" view of eschatology, only certain destruction. As Fred Clark put it in a different context,
"At a very basic level, this worldview opposes and undermines any long-term thinking, any sustained effort to make the world a better place -- replacing the hope of redemption with a perverse longing for apocalypse." This worldview was not Stang's, clearly. It was precisely her "sustained effort to make the world a better place" that led to her death. So it should not marshalled as evidence to support a "perverse longing" for the world to perish. Christians, presumably like the God they worship, do not desire the destruction of the world, but instead its renovation.

Second, and more obviously, it should be clear that a conservative misconception about prophecy and persecution fits neatly with the assumption of many American evangelical Christians that there's nothing wrong with the people in power. On this view, there's nothing wrong if a presumably Christian president pursues war, ignores poverty, or tells lies. Instead of being Jeremiahs in the court of Bush, most visible evangelical leaders are playing the role of soothsayer to the king. And facing no objections from a government that they fail to criticize, many American Christians instead dwell on their persecution at the hands of secular prophets of gloom and doom. The distorted view of persecution that characterizes much American Christianity today therefore leads to this truly bizarre conclusion: Christians can both be in power and still be persecuted.

Dorothy Stang, I'd like to believe, recognized that a prophet can't have it both ways, and of the two ways, the cross points to only one.

Friday, February 18, 2005

 

More on Stang

Thanks to Henry Richardson, Dorothy Stang's nephew, who stopped by to share some of his memories of his aunt.

 

Clippings

"Christianity has all too often meant withdrawal and the unwillingness to share the common suffering of humankind. But the world has rightly risen in protest against such piety. ... The care of another -- even material, bodily care -- is spiritual in essence. Bread for myself is a material question; bread for my neighbor is a spiritual one." -- Jacques Maritain

* * *

"We drifted along lazily, very happily, through the magical light of the late afternoon.

All those fall afternoons were the same, but I never got used to them. ... That hour always had the exultation of victory, of triumphant ending, like a hero's death -- heroes who died young and gloriously. It was a sudden transfiguration, a lifting-up of day."

-- From My Antonia by Willa Cather

* * *

However carefully we work, however devotedly we put our efforts into faithfulness and consistency and achievement and all the rest of it, there’s always something that escapes us. There’s a phrase in a poem by a Welsh Quaker poet I think you could only really translate as ‘the one who escapes the recruiting sergeants’. Jesus doesn’t let himself be recruited, and when you think you’ve got him he’s slipped round somewhere else.

So, there is that dimension of, I don’t know about divine laughter but… the irony of Christ’s presence and absence. The moment we think ‘This is it!’, that is my definition of the moment when it isn’t. In that sense, I suppose, God is always laughing. He’s always round that ironic corner for me.

And that’s rather liberating, because if I thought, ‘Right, I’ve got it now, God,’ it’s a terribly vulnerable place to be. You think it’s secure, but actually it is frightfully vulnerable, because if anything goes wrong, then God goes, too. Whereas if you think, ‘All right, I’ll do my best. I really think this is right,’ but round the corner is God who alone knows, it means that when things unravel there’s a spar to cling to in the ocean, which is the mystery that I’ve never yet got hold of.

-- Rowan Williams, in an interview in Third Way (thanks to Dave Rattigan for the link in a comment at The Parish)

* * *

"Political parties in America have been traditionally defined as competing organizations that 'put forward candidates for office, advovate particular courses of government action, and if their candidates win, create enough of a sense of joint responsibility for the direction of government.' However, political parties have increasingly become accepted by historians as reference groups through which voters 'define themselves.' Politics, then, 'acts as a sounding board for identities, values, fears, and aspirations.' And each political party's character is respectively shaped by the character of those groups whose concerns that party embodies. These conceptualizations suggest that much may be learned about a party by examining the social composition of its constituency."

-- Alan M. Kraut, "The Forgotten Reformers: A Profile of Third Party Abolitionists in Antebellum New York," in Antislavery Reconsidered (derived from this book meme that's been making the rounds for a while; I've waited to post on it until I had a book nearby that produced something interesting enough, which I guess is cheating)

Wednesday, February 16, 2005

 

Comments fixed

Ralph Luker informs me that my comment system has not been working. Sure enough, Ebenezer Orthodoxy had released a new version of his comment hack for Blogger, which I had not installed. Apparently the old version no longer works now that Blogger has updated its comment system. Thanks for pointing out the problem, Ralph.

I presume that this means comments have been broken for about a week, so I'm sure you're dying to let loose all of the comments you've been saving up since then.

 

On Douthat

Timothy Burke posted last Friday about Ross Douthat's critique of Harvard's curriculum in the Atlantic Monthly, which might have been appropriately and sardonically titled, "The Education of Ross Douthat."

Douthat doesn't go as far as Henry Adams; he doesn't refer to himself in the third person. But he might well have quoted Adams, his fellow Harvard alumnus, who made most of Douthat's points in 1918:
For generation after generation, Adamses and Brookses and Boylstons and Gorhams had gone to Harvard College, and although none of them, as far as known, had ever done any good there, or thought himself the better for it, custom, social ties, convenience, and above all, economy, kept each generation in the track. Any other education would have required a serious effort, but no one took Harvard College seriously. All went there because their friends went there, and the College was their ideal of social self-respect.

Harvard College, as far as it educated at all, was a mild and liberal school, which sent young men into the world with all they needed to make respectable citizens, and something of what they wanted to make useful ones. (p. 50)
Now, no one really takes Adams' lack of seriousness about Harvard seriously either. Indeed, Adams spends his entire autobiography complaining about the ill fit between his nineteenth-century education and his twentieth-century experience, but in the process of doing so he does a pretty good job convincing his readers that he's a pretty educated guy, in all senses of the word. Harvard must not have been that bad for Adams. After all, he went back to work there.

But back to Douthat. There has been more discussion of the article at Left2Right, Brad DeLong, and Matthew Yglesias. Douthat speaks back here and here.

Most of these discussions concern the covering laws that Douthat suggests about contemporary philosophy -- that the dearth of metaphysics and morality in philosophy departments has doomed the discipline to popular irrelevance. Several philosophers in the above discussions have bristled. Since this is a history blog, though, I might as well point out that historians make out even more poorly in Douthat's article, what with all their pointless essay questions, microhistory, and, of course, their postmodern sensibilities. (I might respond by invoking Henry Adams too: "In essence incoherent and immoral, history had either to be taught as such -- or falsified. Adams wanted to do neither." Ah, the eternal dilemma of the conscientious history teacher.)

In both the article and one of his blog responses to critics, Douthat takes historians and their disciplinary cousins to task for what he calls "the tendency of the humanities to become more scientistic in various ways over the last half-century -- via the dominance of Theory in the English and Literature departments; via the emphasis on primary research, material history, etc. in the History Departments; or more recently, via the rise of rational-choice theory in the realm of political science." This seems to me an odd way of thinking about what it means for the humanities to be "scientistic," and it is especially disorienting that the article dubs this baneful tendency "postmodern." Emphasizing primary research and adopting rational-choice theory also seem to me horses of very different colors; I'm not sure I see what tendency Douthat is identifying, unless it's the standard complaint that humanistic writing has become more science-like -- jargony and whatnot.

What I really wanted to point out, however, was Douthat's response to those (like me) who believe he underestimates the crucial role of the student in higher education. His response is that we overestimate that role. I'm inclined to agree (as Burke also seemed to suggest) that a balance has to be struck between administrative facilitation and student self-motivation in order for that mysterious cocktail called "education" to be shaken up. Douthat's complaint about those who place the onus on students reads like this:
I tend to think if you take a bunch of teenagers, however smart they may be, and drop them into a stress-ridden, hypercompetitive school in which the only academic guidance takes the form of a terrible, terrible Core Curriculum, most of them will take the path of least resistance, seek out easy classes and popular, potentially lucrative concentrations (hello, economics!), and generally fail to get the most of their four years. Is this a moral failure on the students' part, and therefore something that the administration and faculty shouldn't be concerned with? DeLong et. al. seem to think so. Their attitude is apparently that if you didn't do a good job picking, without any kind of guidance (I don't know what the advising system was like in DeLong's era, but it's nearly nonexistent now), thirty-two classes out of the hundreds and hundreds of potential offerings that Harvard flings at you -- well, then tough luck, buddy. And good luck at the consulting firm.
It struck me while reading this how uncannily it sounds like a liberal view of government. We know from others of his writings that Douthat doesn't like what he calls "left-liberalism." But his impassioned critique of the university as a passive institution, and his defense of the betrayed student from the charge of "moral failure," sound an awful lot like he's a closet liberal. Try this experiment: go back through the paragraph above, and read it this way:
I tend to think if you take a bunch of [people], however smart they may be, and drop them into a stress-ridden, hypercompetitive [market] ... most of them will take the path of least resistance ... and generally fail to get the most of their [potential]. Is this a moral failure on the [people's] part, and therefore something that the administration and [government] shouldn't be concerned with? DeLong et. al. seem to think so. Their attitude is apparently that if you didn't do a good job picking, without any kind of guidance ... [life options] out of the hundreds and hundreds of potential offerings that [the world] flings at you -- well, then tough luck, buddy. And good luck [on the unemployment line].
Okay, it's a highly selective edit, but it raises this question: Why are so many conservative critics willing to make structural arguments about the failures of "the system" when it comes to higher education, while they sneer at the insistence of liberal critics that "failure" in the school of hard knocks is not knock-down evidence of "moral failure" on the part of people who need help now? Are the rules that apply to universities different from those that apply to all other social institutions?

(Cross-posted at Cliopatria.)

Tuesday, February 15, 2005

 

Dorothy Stang

I've been reading about Sister Dorothy Stang, a Catholic nun from Ohio who was shot three times in the face over the weekend in Brazil.

A naturalized Brazilian citizen, Stang lived and labored in the Amazon for 22 years. She helped rural Amazonian farmers protect themselves from the illegal expropriation of their land by mercenary logging companies, who rule the region with the help of hired gunmen and corrupt local officials. She lived on the conviction that right now saving the rainforest is about more than saving trees. It is about saving lives and communities from forces that operate outside the boundaries of law and order.

For these efforts Stang was apparently accustomed to receiving death threats. But it was to report death threats against other people that Stang met recently with Brazil's human rights officials, claiming that loggers and landowners had been threatening rural farmers near her home. About a week after this meeting, she was murdered. According to an NPR interview I heard this afternoon, Brazilian mourners walked and rode bikes through as many as 20 miles of mud to attend her funeral today.

Stang was 73 years old.

The details of Stang's death are still trickling into the news, and I am skeptical enough about the media to realize that stories like hers will be assembled and told in a particular way. I also confess that I never knew of Stang while she was alive, nor do I know much about the countless others like her who work in obscurity until my part of the world turns its flickering and restless spotlight in their direction. None of this, however, mitigates the fact that Stang's death has meanings that many deaths do not. And it seems to me that any way you look at this, and regardless of how the storyline shapes up, Stang was killed because of particular practices in which she was involved, and she was involved in those particular practices because of specifically Christian convictions.

If you asked me what I think Christian convictions entail, I would point you not to more propositions, but to people like Stang. Here's a Christian, I would say: A 73-year-old woman considered so dangerous that her death is required in order for the powers-that-be to continue business as usual. I'm aware that such an answer would immediately entitle you to doubt whether I am a Christian. But if Stang raises doubts about any claims I might make to be a Christian, that's precisely as it should be.

A Christian does not exist apart from the practices that, over time, make a person more like people like Stang. To the extent that the practices in which I engage point me in any other direction, I'm not a Christian. To the extent that I engage in practices that do tend towards the possibility of a death like hers, I am. That doesn't mean that every Christian has to be shot in the Amazon to be a Christian, of course. But I think it means something terrifyingly close to that. For this is laying one's life down for friends -- not, as is usually thought, death suffered in the act of taking other lives, but the open-handed giving of one's own life.

A person like Stang should at least act as a standing indictment against any of us who thinks that facing professional opprobrium or public ridicule for one's faith is anything like carrying a cross. A person like Stang should also be a standing indictment of American Christians who believe they are being persecuted by animated sponges, or Kwanzaa. Sometimes contemplating the number of things that democratic prosperity allows Christians to think of as crucifixion strikes me as an exercise in turning towards the absurd.

Monday, February 14, 2005

 

Basketball and jazz

As a fan of basketball, jazz, and extended metaphors, I enjoyed reading Michael Sokolove's article in yesterday's NYT Magazine. According to Sokolove, the problem with the N.B.A. these days is that young players are conditioned to think that highlight reels and shoe deals are the path to stardom. And given the way that the N.B.A.'s star-making machine works, that often is the case. But what gets lost in the shuffle of slam dunks and individual accolades are the fundamentals of the game, and the elegance of one of the truest team sports ever invented.

Sokolove won me over by saying nice things about my own favored team, whose success is generally credited by coaches and players alike to their collective chemistry. And he also sweetened the deal by throwing in compelling analogies between a good basketball team and a good jazz ensemble. But Sokolove's prescription for fixing the N.B.A. -- banning slam dunks in the high school and college game (I'm serious) -- doesn't seem like the solution to me. Here's what Sokolove says about basketball and jazz:
Earl Monroe, a stylish guard who played for the New York Knicks in the 1970's, employed ''tempo changes only Thelonious Monk would understand,'' the music and social critic Nelson George has written. Many others over the years have seen basketball as jazz, an apt comparison when the game is played well -- as an amalgam of creativity, individuality, collaboration, improvisation and structure. Much of what makes basketball interesting is the give and take, the constant tension, between individual expression and team concepts. On the best teams, players take their turns as soloists, but not at the expense of others in the quintet.
I like the analogy, but in some ways it breaks down. Many people think of collective improvisation as the defining hallmark of good jazz music, but in New Orleans (the putative birthplace of collective improvisation), jazz developed more than anything out of individual competition. In the early twentieth century the city's premier jazz bands routinely performed by holding "cutting contests," in which the band's star soloist would use his "axe," or instrument, to chop down the opposing band's soloist. In other words, on Sokolove's analogy, the birth of jazz was an age of slam dunks and highlight reels. Now, one could make the argument, I suppose, that jazz "developed" away from that stage to a more mature group concept, but that would be a pretty loaded argument to make.

I have to admit that I'm one of those jazz fans who sometimes laments the way the current music industry seeks to identify individual proteges and stars for recording contracts, and who longs for halcyon days when jazz artists were not the exclusive property of particular labels (just as Sokolove longs for the days when basketball was not more or less owned by shoe companies). In fact, it would be plausible to turn Sokolove's analogy in the other direction, and complain that the contemporary jazz recording industry in the United States also privileges the "slam dunk" and the soloist over the cohesive group. (Interestingly, the European jazz recording industry and labels like ECM have a somewhat different aesthetic, just as European basketball teams are not the star vehicles that many N.B.A. teams are. ECM's leading acts are groups like the Keith Jarrett Trio and the Dave Holland Quintet. Although this is sometimes the case with American labels, too, I find that the mailings I get from Blue Note Records tend to be dominated by copy about how Jason Moran or Wynton Marsalis or some other player is the best jazzman around. All you have to do is visit their website and watch the reel on the main page to judge whether the label sees its stars as groups or as individuals.)

But even granted my sympathy for Sokolove's basic metaphor -- a metaphor that I think could go both ways -- I think his real insight is to stress that any definition of what constitutes good jazz would have to recognize the "constant tension" in the music between "individual expression and team concepts." The same is true of basketball -- there's a tension between the slam dunk and the extra pass, but it's not an easily resolvable one. Sometimes a good slam dunk, created by the extra pass, made possible by the motion offense, can be just as much a testament to team play as individual excellence. That's why outlawing the dunk seems to me the worst way to deal with the problem. It would be like outlawing the jazz solo.

That's the easy way out: the real art, both in athletics and in music, happens when individuals listen to each other and work with each other, but without the illusion that they can completely suppress their individualities. Good team basketball is hard to perform well, which is probably why so many star athletes would rather create their own shots and go for the shoe deal. But Sokolove's solution -- to simply legislate against star-making shots -- also fails to recognize how hard (and therefore how artful) good team basketball is. You can't legislate good team art.

Since I don't think of myself as a libertarian, I'm somewhat surprised by this post, but I suspect my libertarian reader(s) will be pleasantly surprised.

Saturday, February 12, 2005

 

Village idiocy

My wife and I just braved the bad reviews (there was, um, a forest full of them) and watched M. Night Shyamalan's The Village. (I think it's required that you say "M. Night Shyamalan's" before the title.)

At the end I made a crack about William Hurt's character ("I'm an American historian at the University of Pennsylvania, and I have this idea ..."). But then my wife asked, pointedly, whether I was making fun of an idealistic historian with utopian visions. I didn't really have a good comeback to that.

 

Antislavery scripts: Part II

This is the belated sequel to Antislavery scripts: Part I, but it will not be the last. As I've thought more about Hochschild's book and interview, as well as Marilynne Robinson's review, I've decided that really explaining my thinking about them requires a brief tour through the last 60 years of British antislavery historiography.

That comes across sounding pretty ambitious, but I intend this to be more like a brief tour than an extensive scholarly review, so take it in that spirit. There will be at least one more installment in this series, and perhaps two more. I think there will be a contemporary pay-off to this series in the end, in the form of some commentary on American patriotism and the relationship between freedom and empire, but I'm not yet sure what this pay-off will be because I'm thinking through these issues as I write. Hopefully that teaser will be enough to get you through, but if not, you could always just tune in at the end for the controversial stuff.

The earliest histories of British antislavery were hagiographies of abolitionists, who were often portrayed as self-sacrificial saints. And praise for abolitionists merged easily with praise for Britannia itself, the glorious empire that had heroically suppressed the Atlantic slave trade and stamped out slavery in its colonies. But these Whiggish scripts for the drama of British abolitionism were provocatively disrupted in 1944 by Eric Williams' seminal book, Capitalism and Slavery.

Williams argued that Caribbean slavery fueled the Industrial Revolution in the British metropolis, but that after the American Revolution, West Indian slavery's profitability and importance in the Atlantic economy declined. Emancipation was driven not by purely humanitarian motives, then, but by economic pressures. British industrialists had grown rich off the wealth of the colonies, said Williams, and only turned against slavery once it no longer served the interests of their economic sector. As the book's title made clear, he attributed the rise and fall of British slavery to the relationship between slavery and capitalism, which undercut the nationalist myths that linked abolition to the progressive unfolding of British liberty.

According to Williams, that narrative of emancipation especially ignored the equivocal positions of British antislavery industrialists after West Indian emancipation. As advocates of free trade, many supporters of abolition in the British colonies opposed tariffs on slave-grown products from Brazil, Cuba, and the United States, where slavery persisted well into the mid- and late-nineteenth century. For Williams, this was further proof that British capitalists had opposed British slavery on opportunistic rather than principled grounds.
The capitalists had first encouraged West Indian slavery and then helped to destroy it. When British capitalism depended on the West Indies, they ignored slavery or defended it. When British capitalism found the West Indian monopoly a nuisance, they destroyed West Indian slavery as the first step in the destruction of West Indian monopoly. That slavery to them was relative not absolute, and depended on latitude and longitude, is proved after 1833 by their attitude to slavery in Cuba, Brazil and the United States. They taunted their opponents with seeing slavery only where they saw sugar and limiting their observation to the circumference of a hogshead. They refused to frame their tariff on grounds of morality, erect a pulpit in every custom house, and make their landing-waiters enforce anti-slavery doctrines. (p. 169)
British capitalists were really opposed, in other words, to the principle of monopoly, not to the practice of chattel slavery. "The desire for cheap sugar after 1833 overcame all abhorrence of slavery," Williams said, as "Exeter Hall, the center of British humanitarianism, yielded to the Manchester School, the spearhead of British free trade" (p. 192). This idea that "the commercial part of the nation" viewed slavery from the perspective of Manchester counting-houses rather than from the perspective of London pulpits clearly departed from previous histories that had crowned the abolitionist "saints" with haloes of holiness.

Although he acknowledged that abolitionists like William Wilberforce and Thomas Clarkson were "the spearhead of the onslaught which destroyed the West Indian system and freed the Negro," Williams stressed that "their importance has been seriously misunderstood and grossly exaggerated by men who have sacrificed scholarship to sentimentality and, like the scholastics of old, placed faith before reason and evidence" (p. 178). He especially skewered Sir Reginald Coupland, whom he accused of seeing the abolition movement uncritically through the eyes of his own hero, Wilberforce:
Professor Coupland, in an imaginary interview with Wilberforce, asks him: "What do you think, sir, is the primary significance of your work, the lesson of the abolition of the slave system?" The instant answer is: "It was God's work. It signifies the triumph of His will over human selfishness. It teaches no obstacle of interest or prejudice is irremovable by faith and prayer" (p. 178).
Since Williams understood monopoly to be the aspect of the West Indian slave system that doomed it to death, he clearly could not accept such romantic ideas about abolitionism. The "student of the social sciences" had to avoid such "emotionalism," because "if, as so many have held, slavery falls into the realm of theology, monopoly most emphatically does not." Viewed from Williams' perspective, the idea that British Christianity had turned against slavery was a distraction from the main issue. Indeed, it was the kind of distraction that Williams believed aided British capitalists, for if they could get the British population to watch their song-and-dance on colonial slavery, they could divert attention from the increasingly oppressive consequence of the Industrial Revolution at home. (A similar accusation was directed at British abolitionists by West Indian slaveholders themselves, and one of the most controversial aspects of Williams' book was that he seemed to be endorsing their own view that the abolitionists' charity abroad was spite at home.) "The abolitionists were not radicals," Williams said. "In their attitude to domestic problems they were reactionary. The Methodists offered the English worker Bibles instead of bread and Wesleyan capitalists exhibited open contempt for the working class. Wilberforce was familiar with all that went on in the hold of a slave ship but ignored what went on at the bottom of a mineshaft" (p. 181-82).

Thus, whereas historians like Coupland had wreathed abolitionists with laurels, Williams drew attention to the blinders that they wore. He reserved particularly withering words for Wilberforce -- who, with his "effeminate face," was "small in stature," whose "smugness" made him "inept," "addicted to moderation, compromise and delay," and reliant on "aristocratic patronage." In Williams' hands the word "saint" became pejorative, a way of marking abolitionists like Wilberforce as soapy and supercilious. (Clarkson fared better in Williams' treatment; from what I've read Hochschild also seems to represent a particular strand in antislavery historiography that pits Clarkson's radicalism against Wilberforce's conservatism.)

To summarize, then, Williams' thesis combined several provocative claims: The primary factors both in the rise and in the demise of slavery were economic. Abolition was a result of the decline of slavery's profitability and the rise of capitalist views on political economy that opposed the monopolies that protected West Indian products. Although there were religious and humanitarian arguments made against slavery, they had limited force and questionable consistency, since many antislavery industrialists later opposed tariffs against slave-grown products from other places. Moreover, while abolitionists claimed the mantle of humanitarian religion, their concern was limited and inconsistent, since they opposed working-class political movements and showed little concern for the workers in the mines and "Satanic mills" of industrial England.

Over the years, almost every aspect of this complex thesis has been overturned or put into serious question by historians of British slavery and abolition. (See British Capitalism and Caribbean Slavery: The Legacy of Eric Williams, edited by Barbara Solow and Stanley Engerman.) Nonetheless, Williams' book exerted an unparalleled influence on postwar historians of these subjects, and well into the 1980s and 1990s, questions about how British capitalism, slavery, and antislavery were linked dominated the field. Indeed, these questions continue to be at the forefront of scholarship.

In a third installment of this series, however, I will show that as the particular aspects of Williams' thesis have been drawn into question, so too have historians moved away from purely economistic explanations of abolition to more cultural, political and social ones. Because of Williams, historians of antislavery are constantly on guard against a resurrection of Whiggish hagiographies of abolitionists; we are too conscious now of ironies and complexities in the lives and times of figures like Wilberforce. But as Williams' particular accusations against the "saints" have lost their force, different ironies and complexities have moved to center stage in histories of antislavery, which focus less on how abolitionism exalted a particular economic system (capitalism) and more on how abolitionism exalted a particular nation (Britain). I'll turn to that shift next, and eventually end up back with Hochschild.

Wednesday, February 09, 2005

 

The blogging graduate student

As the blogosphere turns, the question of whether and why graduate students should blog probably comes around with a fair degree of frequency. Laura McKenna suggested in December that what bloggers need is some kind of bibliography for such "recurrent topics ... so that we don't keep repeating ourselves." But a certain amount of repetition may be salutary since the demographics of blogdom are constantly changing. Graduate students (like myself) who were not paying attention to blogs in January 2004 will have missed discussions here and here about the potential professional dangers of graduate student blogging. So it may be good to spin the wheel back around to the subject every once in a while, for the benefit of those who have only recently gotten on this merry-go-round.

It seems, at any rate, a good subject to broach in my first post for Cliopatria. My joining this blog, at the very kind invitation of Ralph Luker and his fellow Cliopatriarchs, signifies that I have come to terms with being a graduate student who blogs. Feeling comfortable with that fact has not been easy, however, especially since when the subject does come up, it is often in the form of cautionary tales about "blogging from the bottom" of the academic totem pole. These cautionary tales are usually a round-up of usual suspicions: hiring committees will wonder about the work ethic of academic bloggers; they will raise eyebrows at the political or professional views expressed by their job candidates online; they will wonder about the mixing of personal and professional life on blogs. Of course, blogging graduate students are not the only academic bloggers who wonder about how their presence online affects their professional prospects. The subject of how blogging affects tenure committees also comes around the horn every so often, but rather than representing a different issue, it addresses the same basic concern: What is the relationship between academic blogging and professional security? The question is only most acute for graduate students because our horizons of professional possibility are the most open-ended.

One reason the jury remains out on that question is that the trial has only recently begun. Since blogging is a relatively recent phenomenon in academia, and because its profile in the mainstream and academic media is just beginning to rise, hard data on how blogging correlates to job hiring or tenure decisions are scarce. This means that discussions of the subject are rife with anecdotal evidence that warrants, at most, a certain agnosticism. As a recent profile of MLA bloggers (including Cliopatria's own Miriam Burstein) at Inside Higher Ed put it, "It's hard to draw too many conclusions about these blogs. They haven't been around that long (the oldest one in this group started in 2002). And it's hard to know what impact the blogs will have on these academics' careers (the oldest is 38 and none have tenure)."

It's easy to see, however, why the question continues to be of interest, even if it is hard to know what impact blogging really has on professional prospects. We do have hard data on the job market, and especially for historians, that data can be discouraging. So it would be entirely natural for history graduate students to conclude that, given the harrowing conditions of the job market, it is better to be safe than sorry. It is better not to do anything that might possibly jeopardize one's career. If the jury is still out, it's best not to disturb their deliberations by rapping on the jury room door.

But the fact that this tendency is a natural one is one of many instances in which agnosticism really serves as a thin mask for a full-fledged opinion. After all, if the evidence really is out on whether blogging is professionally damaging, then why is the most reasonable position to conclude that it probably is? Why can't agnosticism just as easily validate behaving as if the evidence might come out the other way? The problem here is not that graduate student bloggers are acting in the face of clear risk to their careers, but rather that graduate student bloggers are resolving to pursue a certain course under conditions of uncertainty.

As James Kloppenberg pointed out in his magisterial study of fin-de-siecle Progressive thinkers, Uncertain Victory, "Uncertainty can animate or disable. When certainty inhibits exploration, its loss can be liberating, but when conviction fortifies resolution, doubt can end in paralysis." In the case of blogging, agnosticism about professional implications might potentially liberate graduate students to explore new possibilities for intellectual discourse. Why, then, should uncertainty necessarily paralyze? Conversely, though, why resolve to blog in the absence of a clear conviction about its professional value?

I thought of such questions while reading Timothy Burke's recent essay on why he blogs. I find that many of my reasons for blogging are the same as his. But in a comment thread on another blog, one of Burke's colleagues asked him about the "power dynamics" of blogging for untenured professors -- dynamics that are equally relevant for graduate students. "Can they dare put out into the blogosphere what Tim (as a tenured professor) can put out there?" she asked. Burke's response was ...
... no, not unless they're unusually fearless.

It's not because the content might offend someone, but because most academics still perceive blogging (if they perceive at all) as greasy kid's stuff, as something done by marginal scholars. ...

What graduate students who blog need to remember is that even if they later abandon their blog, it will not disappear. If their name is out there and associated with particular arguments, sentiments, claims, it can be found if someone really wishes to find it. Though I think it's pretty rare that someone does: I doubt if more than the smallest fraction of academics google their job candidates' names, for example.
Those sentiments aptly summarize some of the confusing reservations that I feel as a graduate student blogger. On the one hand, I shouldn't worry about the content of my blogging offending someone. On the other hand, I should remember that my content will be forever associated with my name. On the one hand, I shouldn't be afraid of job committees finding my blog. On the other hand, to blog requires me to be "unusually fearless." My resolution to blog often founders on these vaguely incompatible shoals of uncertainty.

Does my blogging, then, in the face of such uncertainty, really require an "unusually" large amount of fearlessness? It depends. The graduate student blogger is only unusually fearless if the most fearsome possibility imaginable is failure to secure a certain kind of academic job. And surely there are much greater things to fear which are far more usual. In comparison to many things, the fear of suffering the opprobrium of a tenure committee pales. Nonetheless, blogging as a graduate student or an untenured professor does require resolution unfortified by certainty. But it would be a sad thing indeed if this kind of fearlessness really is unusual in academia.

The kind of intellectual exploration that academic life is supposed to encourage often depends on venturing into the public sphere without the assurance of certain rewards or the guarantee of approbation. Perhaps, then, there is something to be said for graduate student blogging as an apprenticeship in learning to be animated by doubt, rather than disabled. It may seem that this kind of intellectual "fearlessness" -- the courage not to be paralyzed by doubt -- is of a different sort than the resolution required to blog despite agnosticism about professional gains. But I'm not sure those two kinds of resolution are unrelated. As Kloppenberg's book demonstrated, the decline of certainty in epistemology and metaphysics coincided historically with the rise of academic professionalization in philosophy. The gate-keeping procedures of modern university life are attempts to replace philosophical certainty with professional certification.

Certification is the rough approximation that we have now for the epistemological certainty that thinkers before the nineteenth century usually possessed. It is a way of allowing us to continue to think, to learn, to teach, to adjudicate, even without the assurance of reaching solidly certain conclusions about intellectual matters. Professionalization, in other words, is a kind of intellectual therapy that keeps the modern mind from being disabled by its doubts. What would happen, then, if we allowed ourselves to be disabled by professional uncertainty? What would then fortify our convictions and keep us thinking?

By pursuing this line of thought, I don't mean to exalt the blogging graduate student as the last action hero of academic life. On the contrary, to be an academic these days is, unfortunately, to be uncertain about what the virtues of intellectual heroism would look like. The hermeneutic of suspicion bequeathed to us by Kloppenberg's uncertain philosophers makes us equally suspicious of either triumphalism or dismissiveness when it comes to blogging or, for that matter, most other social practices. But given that we find ourselves in this doubt-full position, we must at least find practices that encourage us to keep moving despite our uncertainty -- either about our convictions or our careers. For me, at least, blogging has been a practice (like the best therapy) that is sometimes uncomfortable, but that slowly increases my range of intellectual motion and keeps me from being paralyzed by doubts. I don't claim to be unusually fearless by blogging; rather, what I'm trying to acquire is the usual quota of fearlessness (such as it is) required by contemporary academic life.

(Cross-posted at Cliopatria.)

Friday, February 04, 2005

 

Clippings

As if to prove my claim earlier this week that I could win a "bad blogger" contest hands-down, I have not posted since making that claim. But I have been productively busy this week: a new class and a new chapter begun. I will be leisurely busy (I checked, and "leisurely" really is an adverb) this weekend: a quick trip back home.

Before I go, an announcement and a couple of clippings from the week. Thanks to the kind invitation of Ralph Luker and his blogging colleagues, I have joined the group blog Cliopatria. I'm honored and excited by this unearned "cliopatrimony," which makes it more regrettable that I have not been able to do much blogging this week. I'll continue to blog here at Mode for Caleb, but will cross-post and post at Cliopatria on subjects that might be of interest to the blogging historian or historian blogger. Speaking of which, it looks like there is much of interest at Cliopatria in the second History Carnival.

* * *

"What most people don't realize about writer's block is that it's not a dearth of words or ideas that plagues the author, but a multitude. The paralysis of writing is the paralysis of choice.

There is a blank page. There are an infinite number of stories one could tell. Even the brave act of marring the potential of that sheet with a sentence (and how to chose, when for all practical purposes there are an infinite number of sentences to pick from?) does nothing to limit the direction the writer might take. Human beings can be crippled by only a dozen or so choices. How does one even conceive of the infinite?

The fact that anything gets written at all is a miracle."

-- Siona at Nomen est Numen

* * *

"Freedom to take one's own chances amid the example of exceptionally skilled and humane practitioners of one's craft is the sum total of good graduate education ..."

-- Daniel Rodgers, in acknowledgements

Tuesday, February 01, 2005

 

Good writing contest, redux

You may remember me requesting nominations for a Good Writing Contest about two months ago. I said it would end about one month ago. Instead, I've lazily allowed the link to sit in my sidebar until now. Sepoy and Rob both made some great nominations, but since the pool of nominations was small, I've decided not to pick a winner.

In truth, the contest was always tongue-in-cheek, an attempt to criticize the pretensions and apocalyptic tone of the Bad Writing Contest. Secretly, I always knew that I would be able to spin the results of the contest either way, as I pointed out in a comment over at Locus Solus, responding to a post on my Good Writing Contest by Paul:
You're right that the same charges brought against the BWC -- lack of context, begging of the question, etc. -- could be aimed at the GWC.

That's part of my point: can the critics of the BWC reasonably defend what "good writing" is? And can they do so while avoiding the same pitfalls that they accuse "bad" writers of falling into?

I think they can, but we'll see. I'm running the contest by following a variation of the First Law of Cross-Examining: never run a contest for which you don't know how to spin the results. If there are lots of nominations, I can declare the humanities alive and well. If there are very few nominations, I can claim that if it is so hard to say what "good" writing is, then why is it so easy to identify the "bad"? (Leave aside for a moment the more likely interpretation that no one reads my blog. I suspend that explanation thanks to the First Law of Blogging, which is always to assume that your statcounter must not work very well.) The fact that the deck is stacked is also part of the point: the deck was always stacked with the BWC too.
I still believe there are plenty of good writers in the humanities, and I continue to resist the ideas that all jargon is inaccessible, that all subtlety is obscurantist, and that all scholarship is dense. But as for now, I'm laying the First Ever Good Writing Contest quietly to rest. Thanks to those who linked, participated, and snickered along with me.
