Monday, January 31, 2005

 

More on Jesus, Jefferson, and Prothero (with a little Moltmann on the side)

A word to the wise: this may break the record for my longest post, and I'm not sure it is substantive enough to deserve that dubious distinction. The title alone should indicate that it is scatter-brained, but at least I've separated it into some bite-sized pieces. I know I've promised a Part II to my post on antislavery historiography, but in the meantime I felt moved in a different direction. This post is the wandering (and wondering) result.

As I mentioned earlier, I've been using some of my spare moments to read Stephen Prothero's American Jesus: How the Son of God Became a National Icon. This post contains an interim review of the book so far, as well as some foolhardy conjectures about theology and politics toward the end.

* * *

First, the book report. I've now finished Part I, "Resurrections," which considers how successive generations of American Protestants have imagined the figure of Jesus. Each chapter in Part I follows a basic structure. Prothero begins by outlining some broad cultural transformation in ideas about Jesus and then suggests how this transformation has contributed to the shape of contemporary American Protestantism.

I've alluded to Chapter 1, "Enlightened Sage," in my earlier post. In the late eighteenth century, Enlightenment rationalists like Thomas Jefferson redescribed Jesus as a Great Thinker and Ethicist, stripping him of the various robes of royalty or divinity placed on his person by church traditions. Prothero sees echoes of Jesus the "Enlightened Sage" in modern projects like the Jesus Seminar, which also attempt to separate the "historical Jesus" from the Christ of faith.

Chapter 2, "Sweet Savior" deals with nineteenth-century evangelicals, who feminized the figure of Jesus and reimagined him as a sentimental friend. These developments sowed the seeds for the now pervasive idea among Protestant Christians that Jesus is a "personal Savior," to be either accepted or rejected by the individual believer. Contemporary evangelicals still reveal their roots in nineteenth-century revivalism when they speak today of inviting Jesus into one's heart, having a "personal relationship" with him, and so on. Ironically, however, Prothero shows that the "Sweet Savior" image of Jesus was also an outgrowth of the "Enlightened Sage." Both portraits emphasized the humanity of Jesus over his divinity, so it is not surprising that liberal Protestant theologians in the second half of the nineteenth century were influenced both by the quest for a historical Jesus and by the evangelicals' turn towards a more humane Jesus. "All but the most radical Protestant liberals affirmed the divinity of Jesus," Prothero says near the close of Chapter 2, "but they emphasized his humanness. Though some focused on the Jesus of history, most spoke of Jesus in experiential terms, placing him not in first-century Palestine but in their own hearts" (p. 83).

Chapter 3, "Manly Redeemer," follows Jesus into the turn of the twentieth century. Many cultural historians now depict this era -- the age of Teddy Roosevelt and Rough Riders -- as an age of endangered masculinity. From psychologists to physicians to Protestants, many Americans worried about things like the "over-civilization" of young boys and the emasculation of men by the quotidian routines of modern business. Many men (including the earliest members of the YMCA: consider what the acronym stands for) took up hobbies like boxing and hunting, hoping thereby to rediscover a raw and "strenuous life." In the process, their Jesus also became a rough and tumble Man with a capital "M." A "muscular Christianity" was born. Songs like "Onward Christian Soldiers" were sung -- with feeling. Billy Sunday, the hugely popular forerunner to celebrity evangelists like Billy Graham, told his audiences that Jesus "was no dough-faced, lick-spittle proposition." Instead, Sunday said, "Jesus was the greatest scrapper that ever lived," adding that "the manliest man is the man who will acknowledge Jesus Christ" (p. 94).

Jesus was not just a manly man; he was a businessman, too. In the early twentieth century, bestselling biographies of Jesus stressed his career as a carpenter, his interactions with people "on the job," his familiar and jocular way of associating with the common man, his "personality." Prothero argues that the American Jesus was caught up in a broader shift from a "culture of character" to a "culture of personality." "New notions of the hero and a new emphasis on personality produced a new understanding of Jesus as a man of action with a magnetic personality," he writes. And at a time when strenuous self-assertion was becoming widely accepted, "Jesus became a personality par excellence -- someone his followers could imitate only by endeavoring to discover in themselves their own true selves" (p. 111). Jesus was "bully," to borrow the idiom of TR.

Once again, and perhaps paradoxically, Prothero shows that Jesus as "Manly Redeemer" would not have been possible without Jesus the "Enlightened Sage" or Jesus the "Sweet Savior." And it's again not hard for Prothero to suggest connections between the Jesus "cult of personality" and the current culture of Protestant evangelicalism in the United States, which every year produces countless books on Jesus' "leadership style" and his gospel of prosperity and self-actualization. Although Prothero does not mention these titles in particular, Jesus the "Manly Redeemer," the ideal CEO, still makes regular appearances in books like Tender Warrior: God's Intention for a Man and the bestselling Wild at Heart: Discovering the Secrets of a Man's Soul.

Chapter 4, "Superstar," takes us into the late twentieth century and up to the present, beginning with the 1960s "Jesus movement" that set about rediscovering Jesus as a hippie. Theatrical productions like Godspell and Jesus Christ: Superstar, says Prothero, were manifestations of a general interest in making Jesus real and relevant to pop cultural audiences. Jesus became, for "Jesus freaks" and flower children, a homeboy and the highest "high" there is. We are still living, of course, with the effects of Jesus' rocket-ride into superstardom. A commercial empire of Christian retail still seeks to make Jesus "cool" to today's generation, just as the "Jesus movement" of the 1960s attempted to do its generation. Christian contemporary music, first marketed in the 1970s, has now become a mainstream music industry. Prothero also connects "Jesus the Superstar" to the rise of huge "seeker-sensitive" megachurches, since the intent of churches like the influential Willow Creek Community Church is to figure out what attracts contemporary seekers to Jesus, and then to give that Jesus to them. The synergy between Christian retail, Jesus the "Superstar," and the "seeker-sensitive" church model was clearly demonstrated by the marketing and commercial success of Mel Gibson's recent film on the Passion, in which, of course, Jesus played the starring role.

* * *

Historians are trained to ask relentlessly: "So what?" So, so what? Why does Prothero think it matters that we take note of these images of Jesus -- as Enlightened Sage, as Sweet Savior, as Manly Redeemer, as Superstar? As far as I can tell, Prothero has two kinds of answers to the "so what" question. The first answer is simply this: Americans really like Jesus. A lot. Prothero wants to argue that despite all the discontinuities in the history of American Christianity, one thing has remained constant: a fascination with the figure of Jesus. His personality has been a perennial palimpsest for the cultural values of American Protestants. Here is Prothero on p. 155:
What is intriguing about the history of American Christianity is that. . . . [n]o one has seen Jesus as inessential. Over the American centuries, some liberals have given up on miracles, the inspiration of the Bible, and (in the case of the "Death of God" theologians of the 1960s) divinity itself. Some conservatives have given up on creeds, while others have jettisoned doctrines once thought sacrosanct, including predestination, original sin, and the substitutionary atonement. But rather than killing Jesus, these adaptations have only made him stronger. It almost seems as if the Christians who subtracted this doctrine or that rite were beginning to question their own standing, and in order to convince themselves (and their neighbors) of their bona fides they bent over backwards to laud and magnify their Savior.
So Prothero's first answer to the "so what" question is this: We should find it significant that American Christians have persistently seen "Jesus" as an essential part of their religion. By itself, however, this claim strikes me as about as significant as a tautology. Christianity, by definition, is a religion that makes claims about Jesus. So why should it be especially "intriguing" that some idea about Jesus is the lowest common denominator of American Christianity? If we found some Christians who claimed that "Jesus" was inessential to their Christianity, we could rightly wonder whether we are still talking to Christians.

By saying that, I don't mean to become embroiled in a theological controversy about what a "real" Christian is. I simply presume that historians of Christianity need some working definition of the religion they are analyzing, and "the religion derived from Jesus Christ" seems as good a definition as any. If, however, that is the definition, then it seems little more than tautologous to point out that Christians just can't seem to let go of Jesus. Of course they can't. If they could, we would need some other word to describe their particular form of religious belief.

To be fair to Prothero, I think what he really means to say is that Protestant American Christians have placed an intriguing emphasis on the person of Jesus, especially when compared to other kinds of Christians. This would be a more historically interesting claim--not that Christians can't let go of Jesus, but that some Christians have a tighter grip on him than others. Yet Prothero's book is not well suited to answer that question, because it lacks the kind of comparative context in which such a claim would have to be assessed. Part I of the book focuses exclusively on Protestant American Christians and then concludes that Protestant American Christians are intriguingly Jesus-centric. That may be true (it strikes me as plausible), but to prove it would require (a) bringing other Christian traditions into one's analytical frame and (b) widening the same frame to include other national contexts. You need something with which to compare Protestant American Christianity if you want to claim that its particular focus on Jesus is particularly intriguing.

Successfully achieving (a) would have made Prothero's book considerably longer. But this may be a case where elaboration would have been justified, even necessary. In the opening pages of the Introduction, Prothero notes that his focus will be exclusively on mainstream American Protestants, and especially evangelicals: "Artifacts of the American Jesus number in the millions, and one book obviously cannot cover them all. So this project is by necessity selective and by admission idiosyncratic. Here I ignore Native American and Hispanic Jesuses, and devote scant attention to liturgical traditions such as Roman Catholicism, Episcopalianism, and Lutheranism" (pp. 14-15). This is the standard throat-clearing that every historian must do, and I have no problem with selectivity. As I've argued before, histories are by necessity selective and focused. But this does not mean that every way of limiting one's project is as good as any other. And in Prothero's case I suspect he knows that leaving out vast swaths of American Christianity like Roman Catholicism (!) is problematic.

Other parts of Prothero's introduction hint at the fact that he has these excluded traditions in the back of his mind when he writes that Protestants are especially Jesus-centric. For example, Prothero notes in passing that his story really begins "in the ancient Mediterranean, where Jesus was sustained in the scriptures, creeds, and rites of Roman Catholic and Orthodox Christians." In these liturgical traditions, he says, "Jesus receded from popular view, overshadowed by God the Father and, among many Roman Catholics, by saints such as the Blessed Virgin Mary" (p. 12). Now there is a crucial and extremely relevant claim: that for some Christian traditions, in certain places and at certain times, Jesus may not have been the lowest common denominator for their adherents. I have my doubts about that claim, too, but those doubts at least prove that these kinds of comparisons need to be explored within the book rather than simply asserted. By limiting the frame of the book to evangelical Protestant images of Jesus, while simultaneously gesturing to off-stage traditions which have putatively different images of Jesus, Prothero tempers the significance of his claim that "although the Christians highlighted in [Part I] often disagree about just who Jesus is, they all affirm his standing as a unique figure in sacred history" (p. 15). Don't all Christians affirm Jesus as a "unique figure"?

* * *

Fortunately, however, I don't think that Prothero's main answer to the "so what" question is simply to point out the Jesus-centrism of American Protestants. What's more interesting about the book is what goes on between the lines, where Prothero implicitly (sometimes explicitly) draws connections between each of the Jesus images he outlines. Contours of the "Enlightened Sage" can be spotted in portraits of the "Sweet Savior." The tell-tale brushstrokes used to paint Jesus as a "Manly Redeemer" were also used to portray him as a "Superstar." More important, the idea of Jesus as an "Enlightened Sage" set up the easel for all subsequent Jesus freaks and put the paintbrush in their hands. For what is significant about all of these Jesus portraits is how intensely privatized they are, how well they reflect the peccadilloes of their artists. Historically, all of them have depended on the idea that "the church" in one guise or another had gotten Jesus wrong, and that he therefore needed a makeover. In this sense, I think Prothero is right -- and provocatively so -- that Jefferson prepared the canvas for America's "Jesus nation" by defending the right of individual admirers to cut and paste their own Jesus together. All of the Jesuses that followed Jefferson's were made possible, so to speak, by the radical privatization of Jesus that he and other Enlightenment thinkers encouraged.

So what? Well, if Jesus the "Enlightened Sage" is intimately related to Jesus the "Sweet Savior," and if the "Sweet Savior" served in turn as a baseline drawing for the "Manly Redeemer" and the "Superstar," then criticizing what American Christianity has become might require going back to the drawing board altogether, or even going back to before there was a drawing board (to the extent that this would be possible). As Prothero notes on p. 156, critics of CCM or Christian retail or megachurches have more to erase than just Jesus the "Superstar," because that Jesus is really a collage of Jesus the "Sweet Savior," the "Manly Redeemer," and the "Enlightened Sage."
The modern-day Calvinists who decry the loss of doctrine that has beset even born-again Christianity have a legitimate complaint. But in order to carry the day, they need to take on not only the Jesus movement, the megachurches, and CCM but also Dwight Moody (who advertised aggressively), Ira Sankey (who set Christian songs to popular music), Billy Sunday (who was allergic to theology), and Billy Graham (who perfected the evangelism business). They might need to take on as well the nineteenth-century evangelical congregations whose decision to embrace Jesus as a sweet and tender Savior made them the first "seeker-sensitive" churches in the nation.
Prothero doesn't take this paragraph one step further, but he could. He could also say that critics of today's born-again Christianity, which has "personality" in spades but is often short on suits like doctrine and tradition, would have to take on Jefferson and his "Enlightened Sage" as well. Jefferson, after all, was also "seeker-sensitive." He too rejected antiquated church dogmas in favor of a Jesus who could be cut out of tradition and pasted together again in modern garb. Across the pages of Jefferson's edited New Testament fall the shadows of "Do You Know My Jesus?" and "What Would Jesus Do?"

Those who admire Jefferson as a staunch defender of the separation of church and state might be uncomfortable with my implication that Jefferson could be at all responsible for the state of today's evangelical megachurches, which seem to threaten the very wall that Jefferson labored to construct. (Members of megachurches might be equally uncomfortable with the idea that they share something in common with Jefferson.) So before either camp gets too uncomfortable, let me stress that I don't hold Jefferson directly responsible for Jesus the "Superstar" or Jesus the "Manly Redeemer." I'm moving into the part of the post where I play fast and loose with causation in order to make a conjectural point. And that point is this: the freedom with which American Christians now cling to their own personal Savior would be harder to imagine if Jefferson and his followers had not relegated Jesus to the private sphere. In other words, if nineteenth-century Christians went looking for a "Sweet Savior" in their hearts rather than in their churches, that's precisely where Jefferson had looked as well. I think that this is the subtle implication of Prothero's Part I, and if so it is a significant one.

* * *

Jefferson said, in 1814, that "our particular principles of religion are a subject of accountability to our god alone. I enquire after no man's, and trouble none with mine." Here is a paradigmatic statement of the view that religion is a private matter, a personal thing. Religion is a personality trait. Is it any wonder that the same historical trajectories that produced Jefferson later produced Jesus the manly man and the superstar? For the claim that religion is a personality trait and the claim that Jesus was a "personality" par excellence may be morphologically distinct, but they are genetically related. They share the gene that celebrates the atomized, individual person and regards the traditional community with automatic skepticism.

I do not want this gene to be suppressed, since I regard it as one of the healthiest features of American intellectual and religious life. Neither, however, do I think that the dominance of this strain of individualism has been an unqualified good. If it becomes the wholly dominant strain, it can encourage us to think that we are accountable to no one but ourselves for our beliefs. And by making religion simply a matter between me and my god alone, the apotheosis of personality can make matters difficult between me and you.

We can find examples of such difficulties littering the political landscape of contemporary America. What if my god, my personal Savior, says you must deserve your lot in life? Are you poor? Well, you must not have been the kind of carpenter that Jesus was; your manliness must not have been up to snuff. What's that? That's not what Jesus says about the poor? Sorry, but my particular principles of religion are a subject of accountability to my god alone. I don't have to give you reasons for thinking Jesus was a "manly redeemer," so long as my reasons satisfy me.

The conclusion I'm trying to draw out is this: Many critics of evangelical Protestants -- particularly critics of their political conservatism -- believe the solution to the problem of religion in American life is to reinforce the idea that religion is a private matter. But that idea is also part of the problem that religion in American life has become. If you don't like Jesus the Superstar, Jesus the General, Jesus the President, then you have to come to grips with the fact that a radically Jeffersonian privatization of religion -- a belief that one is only accountable to one's own god -- creates the conditions under which all of those other American Jesuses can exist.

More significant, the radical privatization of religion fits hand in glove with the simplistic reduction of complicated social and political problems into issues of personal character, personal integrity, personal relationships, personal savings accounts. The theologian Jürgen Moltmann, along with other liberation theologians, made this point as well as anyone in his Theology of Hope, whose closing pages lamented the way that modern society has turned religion into a cult of subjectivity and personal preference. "The primary conception of religion in modern society," Moltmann says, "assigns to religion the saving and preserving of personal, individual and private humanity." Don't trouble me with your private humanity, and I won't trouble you with mine. But Moltmann goes on to argue that this modern conception of religion can also encourage one to simply say "don't trouble me," period. On this radically subjective view of religion,
Faith is the receiving of one's self from God. This places it in a position of radical loneliness, makes it "individual". ... [The] Christian ethic is then reduced to the "ethical demand" to accept one's self and take responsibility for the world in general. But it is no longer able to give any pertinent ethical instructions for the ordering of social and political life. Christian love accordingly quits the realm of justice and of the social order. ... The "neighbour" who is the object of Christian love is then the man who encounters us at any given moment, our fellow man in his selfhood, but he can no longer be known, respected and loved in his juridical person and his social role. Our "neighbour" comes on the scene only in personal encounter, but not in his social reality. It is the man within arm's length or at our door who is our neighbour, but not man as he appears in the social juridical order, in questions of aid to under-developed countries and race relationships, in social callings, roles and claims. [314-315]
This is, admittedly, a dense and thorny passage, but I think Moltmann is suggesting one answer to a question that has preoccupied many pundits of American politics for the last several months: Why does evangelical Christianity in America, once the seedbed of progressive movements like abolitionism and women's rights, seem to have suddenly "quit the realm of justice and of the social order"? Immediately after the election, The Decembrist asked the question this way:
The right question, I think, is not whether religion has an undue influence, but why it is that the current flourishing of religious faith has ... virtually no element of social justice? Why is its public phase so exclusively focused on issues of private and personal behavior? Is this caused by trends in the nature of religious worship itself? Is it a displacement of economic or social pressures? Will that change? What are the factors that might cause it to change?
This post is not intended to settle those questions; it doesn't even address most of them. At most I'm merely suggesting another related question: Is there some connection between the fact that American Christianity is now so "exclusively focused on issues of private and personal behavior" and the Jeffersonian idea that religion is an issue of private and personal behavior? To put things back in the terms of Prothero's book, is there a relationship between the extremely emotional, masculine, seeker-sensitive, privatized Jesus (who makes claims on Christians only as individual seekers and not as a community grounded in tradition) and Jefferson's Jesus (whose teachings are boiled down from church orthodoxy to the "ethical demand to accept one's self," as Moltmann puts it)?

I intend that sincerely as a question, and not as a declarative sentence. It is a question that needs careful thought. Perhaps to salvage the credibility of my question, I'll conclude by pointing out that smarter people than I have been considering similar questions lately. In the November 2004 issue of Harper's, David Hollinger wrote, in his review of two recent books on religion in America:
The very church-state separation that might lead one to expect a more robust secularist tradition in the United States has, ironically, promoted a dynamic of religious affiliation in its stead. Religions, old and new, conservative and liberal, are likely to maintain their considerable clout in the United States for some time, influencing indirectly the public affairs of the nation--no matter how the Supreme Court interprets the constitution's provisions for the free exercise of religion and against the incursion of religious authority on matters of state.

Perhaps acceptance, however grudging, of this enduring condition could inspire a more vigorous and forthright debate about religious ideas themselves. It is far from obvious that the future of Christianity in the United States belongs to Jerry Falwell, Pat Robertson, and the Catholic bishops who want to deny communion to politicians who deviate from Rome's pronouncements on abortion, but candid discussions of religious ideas are rare in the United States today. Believers argue among themselves and occasionally attack the academic and media elites for their godless ways, and secularists usually let the believers alone, treating their ideas as private matters to be respected or tolerated but not challenged. Only when confronted with something immediately threatening and scientifically obscurantist--such as the banning of evolution lessons by some willfully ignorant school board--do secularists actually bestir themselves to refute what is being said in God's name. When Al Gore claims to resolve life's tough questions by asking himself, "What Would Jesus Do?" he can count on the respectful silence of those who privately doubt the guidance promised by this pious principle of applied ethics.
The point I think Hollinger is making is that the radical privatization of religion, which makes the religious believer accountable only to himself or herself, makes possible the religious believer's unchallenged adoption of privatized theories of applied ethics -- that a Jeffersonian banishment of religion to "a position of radical loneliness" (Moltmann) contributes to the fact that religion takes such radically individualistic postures when it returns from its banishment back into the public sphere.

In other words, the basic belief that made possible Jefferson's "Enlightened Sage" -- that a lone individual is authorized to create a stand-alone Jesus -- made it possible for so many Americans to create the "Sweet Manly Superstar" that now vexes Jefferson's heirs. But Jefferson's basic premises now also make it impossible for either side's Jesus to say anything to the other. As a Christian, I can't take issue with your "Enlightened Sage" Jesus, and as a secularist, you can't take issue with my personal savior either. "Candid discussions of religious ideas" are silenced before they can even begin.

I'm not sure what the answer to this problem is, and I don't want to end with any rushed conclusions. I'm still groping toward a clearer statement of the problem, and since this is only a blog, I'm afraid that this highly unsatisfactory conclusion will have to do for now.

Friday, January 28, 2005

 

Bibliomania

Baldanders goes to The Strand.

Thursday, January 27, 2005

 

Clippings

"I would like to care less about the things other people say about me, but I can't imagine caring less. I think people pay heavy prices for armor and callousness." -- Tony Kushner, in an interview for The New Yorker, 3 January 2005

* * *

"There are a thousand thousand reasons to live this life, every one of them sufficient." -- From Gilead by Marilynne Robinson.

* * *

Nought can Deform the Human Race
Like to the Armour's iron brace.
When Gold and Gems adorn the Plow
To Peaceful Arts shall Envy Bow.

-- William Blake, from "Auguries of Innocence"

* * *


"If there is any reaction to the Greeks which may be called typical of our age as compared with preceding times, it is, I think, a feeling that they were a very odd people indeed, so much so that when we come across something they wrote which seems familiar to our own way of thinking, we immediately suspect that we have misunderstood the passage. It is the unlikeness of the Greeks to ourselves, the gulf between the kind of assumptions they made, the kind of questions they asked and our own that strikes us more than anything else. ...

"Take, for instance, the following passage from the Timaeus:
"Such was the whole plan of the eternal God about the god that was to be, to whom for this reason he gave a body, smooth and even, having a surface in every direction equidistant from the centre, a body entire and perfect, and formed out of perfect bodies. And in the centre he put the soul, which he diffused throughout the body, making it also to be the exterior environment of it; and he made the Universe a circle moving in a circle, one and solitary, yet by reason of its excellence able to converse with itself, and needing no other friendship or acquaintance. Having these purposes in view he created the world a blessed god."
"... Even those of us whose mathematical equipment is of the most meager, have so imbibed the modern conception of number as an instrument for explaining nature, that we can no more think ourselves back into a state of mind where numbers were regarded as physical or metaphysical entities so that one number was 'better' than another than we can return to a belief in sympathetic magic. Nor is the Platonic assumption about the moral nature of the godhead any less peculiar to us [than] his shape. We may or may not believe that god exists, but the only kind of god in which we can think of believing is a god who suffers ... the kind of god who is both self-sufficient and content to remain so could not interest us enough to raise the question of his existence."

-- W. H. Auden, from "The Greeks and Us"

* * *

I may as well confess myself the author
Of several books against the world in general.
To take them as against a special state
Or even nation's to restrict my meaning.
I'm what's called a sensibilist,
Or otherwise an environmentalist.
I refuse to adapt myself a mite
To any change from hot to cold, from wet
To dry, from poor to rich, or back again.
I make a virtue of my suffering
From nearly everything that goes on round me.
In other words, I know wherever I am,
Being the creature of literature I am,
I shall not lack for pain to keep me awake.
...
Samoa, Russia, Ireland I complain of,
No less than England, France, and Italy.
Because I wrote my novels in New Hampshire
Is no proof that I aimed them at New Hampshire.


-- Robert Frost, from "New Hampshire"

Wednesday, January 26, 2005

 

Antislavery scripts: Part I

A few weeks ago the Sunday Times published Marilynne Robinson's review of two new books on British abolitionism: Steven M. Wise's Though the Heavens May Fall, and Adam Hochschild's Bury the Chains. I have not read either book, although I did read Hochschild's shorter article on British abolitionism published in Mother Jones a year ago. Both that article and this review may suggest why, as I noted earlier, abolitionists have been so much in the news of late.

Abolitionism, whether in America or in Britain, is the movement that everyone wants to claim. In November, both Republicans and Democrats tripped over each other trying to wear the mantle of Lincoln, as both parties do every four years. And since the election has become defined as a referendum on "moral values," people from all sides of the political spectrum have sought to identify with a movement whose moral values seem unimpeachable. We are all abolitionists now -- or, at least, this has become a powerful cultural presumption in Western liberal democracies. Forms of slavery persist in the world, even at this late date, and apparently slavery's defenders persist as well. (Via Whig Hill.) Nevertheless, our national identity is bound up with a trinity of ideas: that slavery is dead, that "we" -- either Britons or Americans -- killed it, and that this is a good thing.

Because consensus on these matters is so overwhelming, abolitionism is a ready-made moral high ground in contemporary debates. As one particularly bizarre moment from the presidential debates made clear, some pro-life advocates on the Religious Right seek out this high ground by comparing Roe v. Wade to the Dred Scott decision. On the other end of the spectrum, Christians who like their politics progressive point to the abolitionists as examples of how religion can be a force for good in the public sphere. In his recent appearance on The Daily Show, Jim Wallis expressed his belief that movements change history, counting off the usual examples on his left hand -- women's suffrage, civil rights, and, of course, abolitionism.

Progressives who like their politics secular also point to the abolitionists as their own, as I noted in a previous post. It was not coincidental that Hochschild's article appeared in Mother Jones, the progressive magazine he helped to found. For the piece was clearly intended as an exercise in edification, a pep-talk to progressives who had taken to the streets in vast numbers the year before to protest a war that happened anyway. History is still on our side, Hochschild seemed to be arguing, and abolitionism was his Exhibit A:
Though born in the age of swords, wigs, and stagecoaches, the British anti-slavery movement leaves us an extraordinary legacy. Every day activists use the tools it helped pioneer: consumer boycotts, newsletters, petitions, political posters and buttons, national campaigns with local committees, and much more. But far more important is the boldness of its vision. Look at the problems that confront the world today: global warming; the vast gap between rich and poor nations; the relentless spread of nuclear weapons; the poisoning of the earth's soil, air, and water; the habit of war. To solve almost any one of these, a realist might say, is surely the work of centuries; to think otherwise is naive. But many a hardheaded realist could--and did--say exactly the same thing to those who first proposed to end slavery. After all, was it not in one form or another woven into the economy of most of the world? Had it not existed for millennia? Was it not older, even, than money and the written word? Surely anyone expecting to change all of that was a dreamer. But the realists turned out to be wrong. "Never doubt," said Margaret Mead, "that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has."
I gather from Robinson's review that this paragraph would serve as a fitting coda to Hochschild's book as well as his article: His abolitionists are knight exemplars. Though "born in the age of swords, wigs, and stagecoaches," they still have an air of swashbuckling derring-do and a flair for the dramatic. Consider an interview with Hochschild in this month's Mother Jones, one year after his article appeared. In answer to the very first question -- "How did you come across the topic for this book?" -- Hochschild gives a revealing response:
As with many books, I started off trying to do something else entirely. I had long been fascinated by the character of John Newton, because I always like stories of personal transformation. The idea that this former slave trader had become an abolitionist and would write this beautiful hymn long intrigued me. So I had the idea of doing a biography of him. I started looking into his life and fairly quickly discovered that didn’t fit the script that I had in mind. He left the trade for medical reasons and not out of belief and kept all his savings invested with his former employers, even while he was a minister and started to write these hymns. He never said a word in public about slavery for more than 30 years after leaving the sea, and only spoke up when some guy named Thomas Clarkson, whom I’d never heard of, came to see him. So I began to wonder, “Who is Thomas Clarkson?” Gradually, it dawned on me after three or four months of going up the wrong path that the story was the movement and not Newton.
Newton "didn't fit the script" Hochschild "had in mind" -- the Margaret Mead script of history. Neither, we discover from the interview, did William Wilberforce work well as Hochschild's leading man: "Wilberforce has always been more politically convenient to lionize as the hero. He was such a respectable figure of the establishment, while Clarkson was quite a radical and quite a rabble-rouser, especially in his younger days. To me, he is by far the more interesting figure: riding 35,000 miles by horseback all over England, and going out again in his 40s and his 60s and making the rounds. An incredible man. He really got shortchanged by history." Radical. Rabble-rouser. Anti-establishment. Young. Clearly Clarkson is the better role model for today's generation of anti-war and anti-globalization activists.

In pointing out that Hochschild is following a script, my aim is not to dismiss his book. I have it on very good authority that "Hochschild's book is by far the best general survey we have of the British abolitionist movement." Besides, you should never believe a historian who tells you she works without a script -- especially a historian of antislavery. History is always brought to you written, directed, and produced by a particular historian. This does not mean that historians of abolitionism are simply writers of fiction, or that they all shoehorn their characters into preconceived dramatic molds. (History has its share of Steven Spielbergs and M. Night Shyamalans, but there are also plenty of directors who are less omnipresent in their works.) Yet after every good historian has made sure to do the relevant research and to get her facts straight, she has to mold those facts into some sort of narrative shape. And although a good director gets out of the way of her actors as much as possible, no director can remove herself entirely from the performance. There is always some kind of script. Dramas are more realistic than melodramas, but not because only one genre has scripts.

Nor do I necessarily take issue with Hochschild's belief that lessons can be learned from the past. You should also never believe a historian who tells you we historians never bestow any praise or blame on historical figures, because that historian is actually doing that very thing -- passing a normative judgment on the Whiggish history of a bygone age. The proper attitude towards histories that seem to pass moral judgments is skepticism, but not outright dismissal. This skepticism hopefully will encourage us to do what I have done above. Figure out where the historian is coming from. Try to set his history in the context of its own history. Some historians have the wrong idea that the aim of these appropriately skeptical methods is to rid history of any bias or subjectivity. Yet I'm skeptical about that, too. We fool ourselves if we think that all judgment can be rooted out of historical practice, since that idea is itself a normative one. Rather, the best historians can do is to keep being historians -- try to isolate the bias, not for the purpose of destroying it, but for the purpose of understanding it as contingent and constructed.

So what I found interesting about Robinson's review is not that Hochschild (and apparently Wise, too) appears to have a script. Nor am I prepared to dismiss these books automatically because their authors appear to admire the abolitionists. I admire many abolitionists, too. Rather, what is interesting to me is that Hochschild's script is pretty well-worn. (Again, I speak without having read the book, so take what I say here with a grain of salt.) In some ways the script dates back to emancipation itself: a few heroic abolitionists, with no gain for themselves and "against all odds," succeeded in persuading the British public and its leaders to purge the empire of slavery.

Robinson seems aware that this script needs some work. It leaves the characters underdeveloped, it leaves some characters out completely, and there are not really enough scene changes. Most of the action takes place in Britain, which seems problematic. So Robinson knows that there needs to be a little more complexity and irony in Hochschild's story. But here's what is interesting: there are several ways of deepening the story's complexity, and Robinson is not sure which more ironic, more complicated script to use.

One possible way to tweak the script is to suggest that Hochschild's abolitionists were selective philanthropists. This is an old line, one that also dates back to the era of emancipation: the abolitionists' charity abroad was neglect at home. According to this script, abolitionists turned a blind eye to the sufferings of the English working class, screening out those evils while denouncing the evil of slavery. Another way to tweak the script is to point out some of the ironies of British history after emancipation: Britain's claim to be a liberating nation authorized the expansion of its supposedly enlightened empire into Africa and Asia. These scripts are not necessarily incompatible, but they have developed sequentially. The second alternative script is a more recent development in the history of antislavery history. (You may also remember me using it in an earlier post.)

In Part II, I will argue that Robinson's review reveals the influence of several different impulses in antislavery historiography. All of these impulses take issue, in some sense, with Hochschild's story. But they do so for different reasons and in slightly different ways. There may be a lesson we can learn about ourselves by figuring out which scripts seem most compelling right now.

Monday, January 24, 2005

 

Age of anxiety

This weekend the latest issue of Perspectives, the magazine of the American Historical Association, arrived in the mailboxes of historians around the country. Apparently Jason Kuznicki and I had very similar reactions to two of the articles.

There was yet another job market report that seemed to suggest things are bad -- and not getting any better. The supply of job candidates continues to outstrip the demand for tenure-track junior faculty, and the numbers are especially discouraging for an American historian like me, since my field is overrepresented among the doctorates awarded every year.

Perhaps the editorial staff at Perspectives knew the effect that such an article would have on anxious graduate students like Jason and me, because they packaged the report with a feature on the Beyond Academe website. The site is run by two historians who are not academics, and it is an upbeat and generally convincing account of the grass on the other side. Apparently, some of it really is greener. But the fact that I have to insist that it really is greener (no, really) points to one of the problems with the academic job market. Why do so many graduate students reach the precipice of graduation with the idea that the alternatives to an academic job are hopelessly bleak?

One explanation is institutional. As Jason points out, many graduate programs in history do little to encourage students to seek out nonacademic jobs, nor do they prepare students for the possibility that they will need to compete for some job other than a faculty position.
Forgive me for repeating myself, but if only one third of all PhDs in history ever end up working as academic historians, and if two thirds end up elsewhere, shouldn't the academy focus more--rather than less--on securing jobs for the majority? It may be easier to help place students in teaching positions, but it can't be that hard to place them elsewhere: Two nonacademics with a website are already doing more in their spare time than many entire departments.
As the Beyond Academe authors point out, however, the resistance of departments to training for jobs outside the academy has little to do with how easy it would be from an institutional perspective. The resistance is better explained by socialization. There is an unspoken presumption among academics that leaving the academy represents either professional failure or a loss of interest in intellectual life. As Jason says, "it's not usually an explicit prejudice in academia, but it's certainly there beneath the surface."

What prevents graduate students from seeking jobs outside the academy may not be a lack of preparation or professional training for such jobs. I suspect that, for many potential employers, simply having a doctorate gives a job candidate an immediate advantage. Corporations and non-profit organizations, unlike universities, are used to having to provide additional on-the-job training to employees. So I don't really fault history graduate programs for preparing people to be academics, since this preparation does a reasonably good job of preparing one for nonacademic jobs as well. Rather, what needs to be corrected is the idea--a relatively recent one in the long run--that only academics are intellectuals, and that only students who move immediately into a tenure-track position have succeeded.

Job market woes are always fueled by particular definitions of the job market. If we enter graduate school with the expectation that it will guarantee us a posh position at a tier-one research university in an urbane location, with a package that includes plenty of time and money to jet around to conferences in Hawaii and Italy, then we are bound to be disappointed. We know this. But what we are less likely to accept is that even our expectations of receiving an academic job might be defined too narrowly.

There is something about academia that is almost redolent of gymnastics programs in Eastern European countries. We are drilled to accept the idea that our deprivations (low stipends and a lack of financial self-sufficiency) will all seem worth it when our anthem is sung and the medal is hung around our neck. No wonder we find ourselves in tears at the prospect of receiving a bronze medal! We have been conditioned to think of bronze as failure, because our trainers know that is the only condition under which we will accept the self-discipline required to go for the gold. The obvious way out of this dilemma, though, is to banish the idea that there is a chasm of difference between bronze and gold, or even that the beauty and athleticism of bodies and minds can be assigned absolute values.

Anxiety about the job market befalls every graduate student sooner or later. We can prepare ourselves for academic jobs or non-academic jobs, but either way we will feel a certain amount of anxiety about leaving behind the known for the unknown. Recently I read parts of a short book on writing by Heather Sellers, in which she advises the aspiring writer to "study anxiety." She even recommends this exercise to writers who constantly struggle with self-doubt. (I tried it, so I can vouch for the method.) Take a blank sheet of paper, give yourself three minutes, and write down your 25 top writing fears, without stopping to think about them -- remember, you only have three minutes, so you have to tap your anxiety and let it flow.

The idea is to acquaint yourself with your anxieties -- even befriend them. Fears and self-doubts come with the territory of writing, so when they come around, that's a good thing. As Sellers says, it means you are a writer. Plus, once you are acquainted with your worst fears about your writing, your anxiety loses the strategic element of surprise. The next time those anxieties begin creeping up from the corners of your mind, you can invite them to pull up a chair and sit around, just like overly familiar, slightly annoying, but oddly comforting friends. This advice about writing may be worth applying to the job market as well. Study anxiety. Figure out what it really is that worries me. Defuse the power of my fears with the antidote of familiarity.

One conclusion I've drawn from my anxiety studies is this: Fears about job security are a normal part of being young. High school graduates who go to college manage to put off these fears about the future for four years or so. Those who go on to graduate school postpone the fears for another four or five or six years. By the time the graduate student approaches the job market -- academic or otherwise -- these fears have often grown more acute precisely because more time has passed. Life seems shorter. The urgency of fears about the future is heightened. And this understandably makes those of us who have been in school for years uneasy. But it helps to remember that it is not an unusual fear. It is simply one that we are feeling later than many of our peers.

Everyone faces a similarly liminal moment in life -- a precarious point of balance between the familiar past and the unfamiliar future. Realizing the normalcy of that anxiety takes some of the edge off, or at least it should. In Auden's masterful longer poem, "The Age of Anxiety," one of the characters, Emble, personifies the kind of liminal state I'm describing. In a few strokes, Auden sketches a man who is not very different from many graduate students:
Having enlisted in the Navy during his sophomore year at a Mid-Western university, he suffered from that anxiety about himself and his future which haunts, like a bad smell, the minds of most young men, though most of them are under the illusion that their lack of confidence is a unique and shameful fear which, if confessed, would make them an object of derision to their normal contemporaries. Accordingly, they watch others with a covert but passionate curiosity. What makes them tick? What would it feel like to be a success? Here is someone who is nobody in particular, there even an obvious failure, yet they do not seem to mind. How is that possible? What is their secret?
If graduate students cannot recognize some of themselves in Emble, his later monologue on the Fifth Age of Man cannot fail to sound familiar:
EMBLE said:
Why leave out the worst
Pang of youth? The princes of fiction,
Who ride through risks to rescue their loves,
Know their business, are not really
As young as they look. To be young means
To be all on edge, to be held waiting in
A packed lounge for a Personal Call
From Long Distance, for the low voice that
Defines one's future. The fears we know
Are of not knowing. Will nightfall bring us
Some awful order--Keep a hardware store
In a small town. ... Teach science for life to
Progressive girls--? It is getting late.
Shall we ever be asked for? Are we simply
Not wanted at all?
I wonder whether Auden cribbed that speech from his own top-25 list of writing fears. If so, perhaps verbalizing the anxiety and giving it a name helped. Hello, Emble, old friend. Pull up a chair. As long as you're here, I know that my pangs are those of youth. And those, in many ways, are still the sweetest pangs of all.

Friday, January 21, 2005

 

How to be a jazz snob

Good evening, ladies and gentlemen. I hope that you have all rested up after last week's class on lip-curling and eye-rolling. (Remember: The curl should not be rushed, and the eye-roll should be counter-clockwise.) This week's reading is Francis Davis' "X Jazz" in the latest Atlantic Monthly. Here Davis wonderfully demonstrates some rules of thumb for beginning jazz snobs like yourselves. For this lecture, I'll be turning my back to you, which will demonstrate another key tool of jazz snobbery, pioneered by that other Davis.

1. Always appear irritated when someone who doesn't know better asks you for jazz recommendations. When this happens, perhaps at a party, it's a good time to bring out those lip-curls and eye-rolls. Try to affect an air of: "Where would I even begin with you?" And always assume the questioners are only asking because they want to be as cool as you are. It's best to start your answer off with some obscure reference to names having to do with jazz (use only a first name or a last name). You want to set boundaries right away: you are the jazz snob, they are the wanna-bes. Observe:
I think of Donald Barthelme's short story "The King of Jazz" whenever I'm at a party and people at a loss for appropriate small talk after I've said I write about jazz ask me to name a good place in town to hear some. [Insert eye-roll here.] They want me to point them to a hangout like the one that Hokie Mokie, Barthelme's king of jazz, strolls into after inheriting the crown from the deceased Spicy MacLammermoor--"Hi Bucky! Hi Zoot! Hi Freddie! Hi Thad! Hi Roy! Hi Dexter! Hi Jo! Hi Willie! Hi Greens!" A hangout with all the giants on the bandstand or at the bar, being fawned over by an audience for whom the music is incidental to the satisfaction of not being square.
2. When your inquiring outsider appears disappointed that you have not given them a straight answer about a good jazz place to visit, or a good jazz musician to hear, make clear that this is not because you don't deserve the "king of jazz" crown. The king is only dead, remember, because jazz is dying. That's the thing you need to emphasize as quickly as possible. The weight of the jazz world is on your shoulders: you can't be bothered by pesky questions from wanna-be hipsters. Make clear that there is a whole universe of jazz debate and insider knowledge and personal angst that they can only glimpse. Observe:
In reality, it's been two full generations since being a jazz insider was taken as proof of being hip, and almost as long since jazz fans or musicians agreed on such basic issues as what jazz is and who the legitimate heirs of Louis Armstrong, Duke Ellington, Charlie Parker, and John Coltrane are. [You can switch to using full names for the purpose of showing the wanna-bes that the only names they recognize are dead. Now we need heirs.] The problem with wanting to dig the scene is that there isn't a scene anymore--not one that could live up to the fanciful expectations [excellent phrase, class: remember, jazz snobs are members of the reality-based community now] of the people I politely excuse myself from at parties, by that point not merely pretending to need another drink.
3. You might think that Tip #2 should be the end of this lecture. If there is no jazz scene, if jazz is dead, what are we doing here trying to learn how to be jazz snobs? I'll let you in on a little secret only jazz snobs know. Jazz is not dead, it's just underground. That's right, buried alive. That's how you can consistently declare the scene to be gone and yet move on to demonstrate that you alone know where the scene is. It's sort of like a Greek mystery religion. Observe:
What jazz does offer today, along with a bewildering profusion of sub-genres and hybrids, is vest-pocket scenes, the most vital of which is the most marginalized [see! the alive is dead, the dead is alive; the "vital" is "marginalized"]--banished to the furthest reaches of bohemia in its home base of New York City, and documented chiefly on musicians' vanity labels and small labels here and abroad.
4. Inevitably, your wanna-be friends will want to know how this happened. One word: fusion. It was not the fault of avant-garde jazz artists that younger audiences began leaving jazz in the 1960s. That would be like saying it was your fault, the jazz snob's fault. Notice how I'm curling my lip. No, the answer is fusion. Jazz sold out. Observe:
The latter-day free-jazz scene of which Shipp is a part--along with Parker, Brown, Ware, the drummer Susie Ibarra, and the saxophonists Rob Brown, Daniel Carter, Charles Gayle, and Assif Tsahar, among others--is the only one in jazz right now with younger faces noticeably represented in the audience. I don't mean young, mind you; that would be hoping for too much. [Underline that last sentence in your textbooks, class.] I mean people a decade or two younger than Baby Boomers like Steve Dalachinsky and me. [You are a dying breed, fellow jazz snobs. Make that clear. Repeat after me: "I am the last of the Bohemians."]

Free jazz was wrongly blamed for chasing people away in the late 1960s, around the time that the graying of the jazz audience first became a grave concern. The truth was more complicated. By then soul music and psychedelic rock not only had achieved greater popularity than jazz ever dared to hope for but also, in some odd way, had eliminated any need for it. No longer greasy kid stuff, pop suddenly offered simplified [key word: fans of pop music are always "simple," and I mean that in the snobbiest of ways] and easier-to-find versions of everything that had once drawn certain kinds of listeners to jazz: its own Charlie Parker and John Coltrane in Eric Clapton and Jimi Hendrix, its own Stan Kenton in Frank Zappa, its own wigged-out Ornette Coleman and Sun Ra in Captain Beefheart and George Clinton. ...

Jazz survived despite all of this, but just barely. It joined classical music as one of those fine arts that people pay lip service to out of guilt but shy away from out of fear they might be too difficult. [Nice shout-out to our fellow classical music snobs.]
"Just barely" survived. That's the key. It's not dark yet, but it's getting there. This makes you, the jazz snob, like Saint Denis. According to legend, "after his head was chopped off, St Denis picked it up and walked several miles, all the time preaching a sermon." That's you! It's like you've had your head chopped off by Jimi Hendrix's axe, but you still manage to stumble across the Brooklyn Bridge and walk down to the Vanguard every week, head in hand. Remember the point I made in my second lecture: only a few letters separate saintliness from snobiness.

5. Saint Denis preached a sermon, so to conclude, make sure that you keep preaching. The best sermon is one that makes sweeping generalizations about how jazz might be saved. For instance, you've noticed that there are some younger faces (not too young) at a jazz venue. There's a mystery to it, I know. But you need an answer for this. Here's how to come up with one: What do you know about young people these days? Have an answer in your head? Great! That's what you can use to explain their inexplicable attraction to jazz! You may be worried that the only thing that popped into your head is the only thing you know about young people these days. That's okay. If you say something absurd about youth culture, this will only reinforce your standing as a jazz snob--aloof, saintly, perpetually crucified. Observe:
Today's youth culture is a body culture, and both Shipp's music and free jazz as a whole are far too cerebral ever to become a significant part of it. [Hopefully by now you know which words to underline without me telling you.] Even so, I suspect that in listening to free jazz, many intellectually curious younger people vicariously experience [remember: they can only like jazz vicariously; you're the one on the cross] a thrill similar to the one experienced by participants in skateboarding, motocross, BMX, and the sports featured at the annual X (for "extreme") Games. ... A good name for what Shipp and his fellow revivalists are up to might be "X Jazz"--even when it's acoustic, it's amped. [If you want, you can throw in some other words here like "bodacious," "radical," and "gnarly," just to get you some "younger people" cachet.]
Well, folks, that concludes this week's class. Be sure to study up for next week's quiz on "inside jokes." Have a gloomy Sunday.

Thursday, January 20, 2005

 

Different drummers

I've spent most of the day preparing this draft syllabus (PDF) for the course I'm teaching this semester, "Black Abolitionists." Below is a rough draft of the course description for the first page, which I include for students who enjoy poring over their syllabi after the first day of class (like I did). Right now it might be a little too elliptical and metaphorical. Feedback on either the description or the syllabus would be welcome.

FOR A CENTURY after the Civil War, most historians credited white reformers as the primary movers behind the abolition of American slavery. The year 1831, when fiery white editor William Lloyd Garrison founded the Liberator, was often treated as the starting point for the timeline of American antislavery, a timeline that went from Garrison to Harriet Beecher Stowe, John Brown, and the “Great Emancipator” himself, Abraham Lincoln. If black abolitionists were recognized as part of this timeline, they were often portrayed (except by pioneering African American historians) as little more than the followers of white abolitionists.

This standard timeline began to be revised in the 1960s and 1970s, with the publication of new histories like Benjamin Quarles’ Black Abolitionists (1969). Quarles criticized the perennial neglect of African American contributions to the crusade against slavery. And he showed that “the black abolitionist phalanx was not just another group of camp followers.” In many cases, their efforts and ideas preceded and made possible those of white abolitionists like Garrison. Black abolitionists, according to Quarles, were the movement’s “different drummers.”

This course is about those “different drummers.” Over the course of the semester, we will try to become, like Quarles and others, historians of black abolitionists. To do that, we will first have to listen closely to the historical record, so we can hear the drumbeats that were often drowned out in nineteenth- and early-twentieth-century histories of abolitionism. Most of our reading will be from “primary sources” written by black abolitionists themselves. And one of your major assignments will be to do your own primary research and write a paper about a black abolitionist of your choice.

Yet once we begin to hear these “drummers,” as historians we must also ask: How “different” were they, and in what ways? To extend the metaphor that Quarles used, we will be hearing an antislavery movement that was polyrhythmic. There were not simply two “drummers”—the white abolitionist on the one hand and the black abolitionist on the other. There were more than two drummers—many more. Even among black abolitionists, there were disagreements about which beat to march to. Both white and black abolitionists thus had to be disc jockeys as well as drummers. They often sampled ideas that they heard both from other reformers and from the culture around them, looping and layering those ideas on top of each other. And out of these intellectual “remixes,” leading black abolitionists improvised new rhythms that emphasized certain beats and muted others.

One thing all black abolitionists shared: a courageous willingness to drum up protests against slavery and white racism. But there were different ways of doing this—various kinds of percussion to be used, and different beats for marching to. Should African Americans stay in the United States and beat the drum for full citizenship? Should they emigrate to Africa, Canada, or the Caribbean and start new communities there? Were black Americans degraded by their enslavement, or were they racially superior to Europeans? Was violence justified in the pursuit of emancipation? These were only some of the questions that black abolitionists asked. And different drummers gave different answers.

Tuesday, January 18, 2005

 

Favorite tracks

Thanks to all who helped me sort out my feelings about archives. I've decided to keep the archives, but lower on the sidebar. And I'm going to start keeping a list of representative posts under "Favorite tracks." My goal is to keep the list right around ten items long, so adding to it will always mean making judgments about what gets bumped off.

Hopefully new readers (and thanks, Ed and Jason, for sending me so many today!) will find the list to be a helpful snapshot of the blog. As for those of you with the dubious distinction of being old readers, feel free to holler if you think I've missed a post that I should not have, or distinguished a post that would have been better to leave in the dustbin.

 

Abolitionists everywhere

Lately it seems that "my" abolitionists have been popping up everywhere in the MSM. In the January 10 issue of The New Yorker, there is a nice little article on Bronson Alcott, Louisa May Alcott's father and one of the numerous utopian seers, education reformers, and abolitionists who sprouted in antebellum New England. And in the current issue of the Atlantic Monthly, one of the most prominent American evangelical abolitionists, Lewis Tappan, gets a mention in Benjamin Schwarz's review of Scott Sandage's Born Losers: A History of Failure in America. (By the way, has anyone else ever wondered if Schwarz, the Atlantic's main book editor, is or was an historian? He seems intent on giving academic history books prominent billing on the glossy pages, and I, for one, am not complaining.)

Tappan is mentioned in the review because of his role in establishing one of the first credit-rating systems in the United States, and Schwarz calls him a "busybody, moral reformer, fervent abolitionist, and one of history's great snoops." The humorous index that is now published at the end of every Atlantic lists this entry: "Tappan, Lewis, as moral reformer, 160; as busybody, 160; as spy, 160." Perhaps not entirely fair to Tappan, but certainly not entirely unfair either.

I'm simply happy to have large audiences of readers discover how perpetually fascinating abolitionists are. They never cease to amaze me, and I mean that in the obsolete sense of the word. Why, just the other day, I was reading along in an issue of the Liberty Bell, an annual gift-book published by Garrisonian abolitionists in Boston, and ran across this gem by David Lee Child, in an article that echoes many of Bronson Alcott's ideas about education reform:
Thus in all things we see the dawning rays of a better age. In all things the Equalizing Principle manifests itself. In Anti-slavery associations, the democracy of morals; in Free Schools, the democracy of Intellect; in Phrenology, the democracy of metaphysics; in Daguerrotype, the democracy of painting; in the invention of the Hand-Harmonicon, and the theory that every man may be a singer, the democracy of music; in Free Trade, the democracy of commerce; and in Jacotot's system, the genuine democracy of education. Verily, all is in all.
How can you not be fascinated by people who saw the same "Equalizing Principle" in the harmonica, abolitionism, phrenology, and free trade? Sometime I'll have to post about how some Garrisonians believed medicine should be democratized, too, since doctors were as tyrannical in their control of the body as slaveholders were. Think about that the next time the doctor is hitting your knee with a blunt object and forcing a piece of cardboard down your throat.

But with abolitionists everywhere, their highest-profile appearance came in this recent review in the New York Times of two new books about British abolitionism. I have much more to say about this review, but it will have to wait, because speaking of abolitionists, I have a dissertation on them to write, and a syllabus about them to prepare for this semester (scroll down to 100.168).

Monday, January 17, 2005

 

Would I be a bad blogger if ...

... I took the links to my archives off my main page? I'm contemplating replacing these links with links to the posts that I am most satisfied with as representations of what I still think. Here's the deal: I've always been uncomfortable with the archives. I know this makes me suspect as a blogger; a strange social psychology has developed around blogging, one that seems to make it an ethical requirement that all of your posts remain readily available in perpetuity. So before I sin so boldly against the bloggy gods, I wanted the feedback of the many more experienced bloggers who grace these pages with their presence.

First, in my defense, let me point out that removing the archive links would not abolish the archives. My "search bar" at the top of the page makes it possible to pull up any post from the past, and it doesn't take too much thinking to figure out how one could, say, return all the results from "August 2004" by using the search function. Also, because of the way my "Recent Posts" list works, you can work your way in reverse through the entire history of Mode for Caleb by repeatedly clicking the last link in the list. So I'm not deleting the past, only making it accessible in different ways.

I know that archives seem to be a constitutive feature of what makes a blog what it is: a dated series of writings. But although blogs have the appearance of an ordinary time-stamped journal or diary, their visual organization is in fact very synchronous. Blogs screw up my sense of a writer's development of ideas over time. In the first place, this is because they are organized with the most recent posts first, so whenever I click on an archive page, I'm encouraged to read backwards, instead of in the proper chronological order.

The other synchronous feature of blogging has to do with its character as hyper-text. You can leap with abandon from July 2004 to November 2004 in my life, without any sense of the passage of time between these months, or the possible transformation of my moods and thoughts over that period. Of course, you could do the same thing if you were reading a physical diary or volume of correspondence. But there is a difference in the experience of reading a bound book and a blog: if you skip around, the physical weight of the pages before and after the one you are looking at makes it impossible for you to forget that there are many things you are skipping. When you jump to an archive page of a blog, which looks exactly like the main page, these tangible reminders of time's passage are absent. Your eye easily tricks your mind into ignoring the dates, and the past comes to seem every bit as present as the present.

Okay, that's the nerdy philosophical justification for what I'm contemplating. On a practical level, though, I just don't see the need for the archive links. It seems to me that there are two reasons why a reader might use them. First, someone might be looking for an old post that they have read. In that case, the search bar seems like a much better tool. Second (and more likely), a new reader might want to get a sense of who I am as a writer and what this blog is about. But in that case, why should I not help them by giving them a list of the posts that I feel most represent me and the blog? That seems more accurate and efficient than the scatter-shot introduction that comes from clicking around the archives.

So ... what do you think? Would I be a very bad blogger if I did this? Would you care? Should I?

Saturday, January 15, 2005

 

Are wars necessary or chosen?

Two things have become clear about our war against Iraq. First, since weapons of mass destruction have not been found and are no longer being sought, it is hard to resist the following impression: We chose this war not because we were sure that the enemy possessed the weapons we most fear, but precisely because we were not sure that it did.

President Bush was right from the very beginning when he said that we claim the power to fight our wars at the times and places of our choosing. This has been the aspiration of our defense policy for half a century -- not permanent peace, but the permanent ability to choose only those wars we believe we can win. Our definition of securing the homeland is to so arm ourselves with overweening might that every conceivable war can be won.

Yet if it is increasingly clear that we chose to fight the war against Iraq because we judged it to be our most vincible enemy, it is ironically also becoming clear that we cannot even win the war we judged to be the easiest.

Americans often forget that the United States resisted entering World War II as long as possible, because we were not sure that we could win. It has only been since the end of that conflict that we have been possessed by the illusion that we can create, by the force of our own military hegemony, conditions in which there is no war that cannot be won. But it is only since this illusion gripped us -- that we are or can be invincible -- that we have been defeated in war. That is the perpetual irony of our national security calculus: we want to arrange things so that victory is assured, but so far this has proven to assure our only defeats. We entered this war because its success was determined to be the most probable, yet even it will not give us the satisfaction of complete success.

So why choose to fight wars at all? Because wars have been declared upon us: that is the most common response. Our deaths have been caused by our enemies, so we must cause our enemies' deaths. But is war the only response to declarations of war, which come in the form of murderous and infamous acts against our loved ones and compatriots?

In other words: Is causing death the necessary response to caused deaths?

The answer to that last question is clearly negative. Our reaction to the recent deaths of more than 150,000 South Asians is proof that we are capable of responding to death with charity and peaceable goodwill toward the living. It even demonstrates that we can respond charitably and peaceably by using the very same resources -- planes, helicopters, aircraft carriers -- that we call upon to deliver destruction and death.

Why has our response to these undeserved deaths been different from our response to the far fewer undeserved deaths of September 11, 2001? The answer, of course, seems obvious; the question ridiculous even to ask. Against what enemy would we wage a war to avenge the lost lives in Banda Aceh? Would we beat against wind and tide? What can we obliterate in response to this obliteration?

Our reaction to deaths caused by natural disasters suggests that the same calculus that governs our choices about war also guides our decisions about when to declare war impossible, and charity thus the only response. For in the case of the Indian Ocean, we have run up against a force of mass destruction that we cannot ourselves destroy. We cannot kill this enemy; the asymmetry of its power to ours is overwhelmingly in its favor. So of course we respond to the death it causes not by causing death, but by nurturing life.

All wars are finally chosen only because they can be waged against beings that can die. When compared to the absurd idea of winning a war against the brute forces of nature -- tsunamis, mudslides, lightning bolts -- all wars against human beings seem relatively capable of being won. We choose war in general for the same reasons that we chose the Iraq war in particular: because victory against any human enemy appears more probable than victory against the cosmos itself. That is an opposing force whose hegemony we cannot even approach.

But here is where this line of thinking leaves us: Our politicians construed this war against Iraq as necessary, when it is now clear that the war was caused not by necessity, but by the accidental truth that it was deemed capable of being won.

The same is true, however, of war itself. We construe war as the necessary response to caused deaths, when in fact war is chosen in light of the accidental fact that there is something to defeat, which is to say someone whose death we also can cause.

If Iraq was chosen as our battleground because we could not win against North Korea, for instance, then war itself is chosen only because we cannot win against things we cannot kill. When these things kill us, we have no choice but to respond in peace. But that proves that in all other circumstances, we actually do have a choice -- either to wage war or to wage peace. We convince ourselves that war is ineluctable, but then the world itself reminds us how to define ineluctability, and shows us that war is completely avoidable by comparison.

Being able to respond to deaths with charity and goodwill thus requires being able to see that the murders of innocents by tsunamis are no different from murders caused by terrorists. While the difference between these kinds of death seems essential, they are in fact only contingently different -- made so not by an absolute difference in the grief suffered or the loss sustained, but by the accidental fact that one kind of death is caused by a being also vulnerable to grief and loss. We fight wars at the time and place of our own choosing.

Friday, January 14, 2005

 

History carnival

The first-ever History Carnival is now posted at Early Modern Notes. Sharon has done an incredible job organizing a fascinating group of links. Take a look!

 

Brief notices

At Easily Distracted, Timothy Burke has another of his characteristically insightful and inspiring essays about academia. He argues that academics need to think more carefully about whether the indices we use to measure productive scholarship (articles published, conference talks given) have any actual connection to the primary aspirations of our institutions. Here's an excerpt:
So my simple suggestion is this: stop. Administrations and faculties need to stop caring how much someone writes or publishes or says, or even how important what they’ve published is according to some measurable or quantifiable metric. Not only because trying to measure productivity in terms of scholarship destroys scholarship, but because it detracts from the truly important kind of productivity in an academic institution.

What really matters is this: how different are your students when they graduate from what they would have been had they not attended your institution, and how clearly can you attribute that difference to the things that you actively do in your classrooms and your institution as a whole? What, in short, did you teach them that they would not have otherwise known? How did you change them as people in a way that has some positive connection to their later lives?
The universities that have been trying out the National Survey of Student Engagement at least seem to have the right idea about what to measure. (See Michael Arnzen's post about NSSE, which links to the 2004 report. NSSE was also profiled last November in the Atlantic Monthly.) The idea behind the survey is to determine how well universities actually teach their students (novel idea, I know), but the jury is still out on how effective such a survey can be. It relies heavily on students' self-reports, in which they are asked to correlate how they think and what they know directly to their classroom experience; but as Burke points out, it can be difficult even for someone who loved college to separate how much of their learning was connected directly to their classes, and how much was connected to autodidactic efforts and individual charismatic professors. All interesting reads.

Also: I wonder whether my jumping up and down and cheering while reading Burke's essay has something to do with the fact that I'm still a graduate student, and hence on the lowest rung of the academic ladder. Yep, I think it probably does. The Will to Produce (which roughly parallels the Will to Power in academia) is therefore bound to look more daunting to me than it does to someone who has already produced enough to secure the right not to produce as much (i.e., tenure).

That may be the problem inherent in getting a sea change to happen in the way we evaluate scholarly and institutional performance. Those whose performance has already been demonstrated have no reason to invest in seeing things differently (schools rated well by U.S. News don't need NSSE), while those whose performance has not been demonstrated in the conventional way are bound to be looked at skeptically (perhaps we want to change things because we cannot pass the bar). Still, I don't lose hope as long as there are thoughtful scholars like Burke and Arnzen above me on the ladder.

On another note, Rob at Detrimental Postulation links to a new online venture by the University of California Press. The Press has digitized 1400 of its books and made them searchable, although only 400 are made available full-text to the public (I know, only 400!). A couple of public texts that look especially interesting to me are Roads to Rome: The Antebellum Protestant Encounter with Catholicism, which I've already been meaning to look at, and The French Revolution and the Birth of Modernity. Enjoy!

Thursday, January 13, 2005

 

Jefferson's Jesus Nation

I've just started reading Stephen Prothero's American Jesus: How the Son of God Became a National Icon. I was going to wait until finishing it to make any comments, but the first chapter is relevant to a thread on this recent post at In the Agora. There you will find the latest installment of a perennial debate about what to call Thomas Jefferson -- an atheist? a deist? a Unitarian? a freethinker? a Christian? The debate, of course, is connected to the perennial debate about whether the United States is a "Christian nation."

Even if a nation could be Christian (and personally, I do not believe that particular noun can be modified by that particular adjective), profiling the religious beliefs of Founding Fathers would be more or less irrelevant to determining whether this one is. Many members of the Religious Right would bristle at the following arguments: Jefferson was a racist, therefore this is a racist nation; Jefferson was a man, therefore this is a masculine nation. But many participants in the debate on church-state separation accept arguments of the same basic form: Jefferson was a Christian, therefore this is a Christian nation; or Jefferson was not a believer, so this is a secular nation. Suffice it to say that I think the question of how religion fits into the public sphere of a liberal democracy is too complex to be settled by a tug-of-war over individual Founding Fathers.

Prothero's book helpfully brackets the "Christian nation" question in the introduction. Instead, Prothero sets out to understand how America became a "Jesus nation" (a phrase that I like somewhat better because it sets two nouns at odds, as if to testify even grammatically that these two things do not mix well). Whatever one thinks about the "Christian nation" question, it is undeniable that Americans of all stripes have been obsessed throughout our history with Jesus. What Prothero proposes to show, in fact, is that Americans have frequently used their various interpretations of Jesus, whether historical or theological, as critical tools against "Christianity."
Jesus became a major personality in the United States because of the ability of religious insiders to make him culturally inescapable. He became a national icon because outsiders have always felt free to interpret him in their own fashion. To put it another way, while Christian insiders have had the authority to dictate that others interpret Jesus, they have not had the authority to dictate how these others would do so. In the United States, thinkers from Frederick Douglass and Rabbi Stephen Wise to Swami Yogananda and Malcolm X have boldly distinguished between the religion of Christianity and the religion of Jesus. And while they have rejected the former, they have embraced the latter as their own. ... The vast majority of U.S. citizens today are committed Christians. Yet no one group has an interpretive monopoly. Everyone is free to understand Jesus in his or her own way. And Americans have exercised that freedom with wild abandon. [p. 16]
I'm not sure whether I'll agree with all the points Prothero makes by the end of the book, but here's why his first chapter is relevant to the question of Jefferson's religious beliefs. In Chapter 1, "Enlightened Sage," Prothero presents Jefferson as one of the first in a long line of Americans (culminating, Prothero says, with the Jesus Seminarians of the 1990s) who attempted to separate the Jesus of history from the Christ of faith.

This was why Jefferson made his famous redactions of the gospels, first in an essay called "The Philosophy of Jesus of Nazareth" (no longer extant), and then in a second edition called "The Life and Morals of Jesus of Nazareth." (Interestingly, Prothero says that the first edition, which became the Q gospel, so to speak, for the second, took Jefferson only two or three evenings of leisure time to complete. I wonder what he would make of our putting so much weight on the text now.)

Jefferson called the first essay a "precious morsel of ethics." This was how he viewed Jesus, above all: as a great ethicist. His intent in editing the gospels was actually not primarily to expunge them of miracles (though of course he would have done that, too, given his deistic leanings), but to create a syllabus of Jesus's ethics, which stood opposed in Jefferson's mind to the metaphysical and theological knots that American divines of his day excelled at tying.

Here's Prothero's summary:
After he completed "The Philosophy of Jesus of Nazareth," Jefferson claimed in correspondence with a friend that his Bible demonstrated his bona fides as a Christian: "It is a document in proof that I am a real Christian, that is to say, a disciple of the doctrines of Jesus." Earlier he had told Benjamin Rush, "I am a Christian, in the only sense in which he wished any one to be; sincerely attached to his doctrines, in preference to all others; ascribing to himself every human excellence, and believing he never claimed any other." Whether Jefferson really was a Christian has been much debated, both in his time and in ours. Over the last two hundred years, Jefferson has been called an atheist and an infidel, a theist and a Deist, a Unitarian and an Anglican, an Epicurean and a secular humanist. In fact, the list of historical Jeffersons is nearly as long (and creative) as the list of historical Jesuses.

What is most clear about Jefferson's faith is what he was not, and what he was not was a traditional Christian [my emphasis]. Jefferson unequivocally rejected the Nicene Creed, which has defined orthodoxy for the overwhelming majority of Christians since 381, as well as the Council of Chalcedon (451) formula of Jesus as "truly God and truly man." He sneered at Calvinist verities such as predestination, which throughout his political career dominated American religious thought, and was particularly contemptuous of the doctrine of the Trinity ("mere Abracadabra" and "hocus-pocus phantasm," he said, distinguishable from paganism "only by being more unintelligible.") [p. 26]
If we really wanted to figure out what Jefferson believed, this would be a good start to a subtler account. But what is clear even from this start is that Jefferson's ideas about Jesus were "Christian" only in a highly qualified sense. What is even clearer is that Jefferson intended his Jesus to stand in judgment against Christianity. It is telling that he underlined his status as a "real" Christian. He was not claiming Christian faith in such statements, but was (as many Americans have done since) attempting to evacuate the term and fill it with new meaning. And, again like many subsequent Americans, he wanted to claim that this new meaning of Christian was the original, unadulterated one. Jefferson's new/old definition of Christian was basically "a disciple of the doctrines of Jesus."

Perhaps the real shame is that this critical edge to Jefferson gets lost in the shuffle of labeling him. His whole point was to recoil from the label "Christian" and get back to the ethical germ of Jesus's teachings. How well he succeeded in doing that is another question entirely. But whether he was a deist or a Unitarian or an atheist, one thing at least is clear: the attempt of contemporary Christians to label him "Christian" on the basis of his metaphysical and ontological views about the "Supreme Being," rather than by assessing his adherence to the "doctrines of Jesus," must have him rolling in his grave. For this is the very confusion his revised gospels were designed to dispel.

[P.S. I've also written more on Jesus, Jefferson, and Prothero, and I've made minor revisions to this post.]

 

Thanks, whoever you are

I discovered this morning that the name of this blog has been tossed in the general direction of a Koufax awards category: "Most Deserving of Wider Recognition."

To be honest, the idea that some kind soul thought of Mode for Caleb when presented with such a prompt is recognition enough for me. Whoever you are, thanks. I voted for Slacktivist, but wish I could have voted for Velveteen Rabbi at the same time. I discovered some new blogs, too. Thanks to the list, the circle of recognition for Adventus has now widened at least enough to include me.
