Friday, September 30, 2005


Friday shuffle

1. "Who Knows Where the Time Goes?" by Deanna Kirk, from Mariana Trench
2. "You Know My Name (Look Up the Number)" by The Beatles, from Past Masters, Vol. 2
3. "The Man of Metropolis Steals Our Hearts," by Sufjan Stevens, from Come On Feel the Illinoise!
4. "Helena," by Nickel Creek, from Why Should the Fire Die?
5. "The Boy Who Wouldn't Hoe Corn," by Alison Krauss and Union Station, from New Favorite
6. "Free," by Alana Davis, from Blame It On Me
7. "Under You," by Better Than Ezra, from How Does Your Garden Grow?
8. "Shady Grove," by Bryan Sutton, from Appalachian Picking Society
9. "Neither Luminary," by The Moon Seven Times, from Sunburnt
10. "I Want to Sing that Rock and Roll," by Gillian Welch, from Time (The Revelator)



Rob on Ivan Tribble

Rob MacDougall has a brilliant post on recent debates, sparked by the pseudonymous Ivan Tribble, about the perils of academic blogging. (The post gives his extended answers to Rebecca Goetz's survey of graduate student bloggers.)

Rob makes two points worth making about reactions to Tribble. First, the Tribble kerfuffle shows how much the academic job market creates a culture of fear:
The job market is scary and stressful and competitive, of that there is no doubt. But I also think there’s a culture of fear in grad school that goes far beyond what is necessary or healthy. At least there is at our particular grad school, and I doubt that Harvard is alone. That fear is often fed by well-meaning career advice workshops and Chronicle of Higher Ed. columns. … For some people maybe recounting stories of job hunt disasters and pitfalls are cathartic, but I can’t stand them. I never needed outside help coming up with things to worry about.
Ironically, Rob adds, fears about the job market are accompanied by an illusion of control: it is because graduate students "cling" to that illusion that they are afraid of doing anything (like blogging) that might lessen their control over job prospects.

I will be the first to confess that I am not immune to this fear, but I've tried to argue before that blogging can be a way of overcoming fear of disapprobation, which has a dampening effect on intellectual life. And I've also argued that the anxiety graduate students feel about jobs is part of a normal phase of life: realizing it is normal might help us keep from going beyond the level of fear that is "necessary or healthy," as Rob puts it.

The second point that Rob makes--and he's so right--is that the Tribble column reveals how totalizing a claim academia makes on academic lives. The underlying insinuation of Tribble's column is that blogging hurts a job applicant because it appears frivolous, because it makes job committees wonder where a productive academic gets the time to send silly little missives into the oblivion of cyberspace. Rob argues that there is value--even intellectual value--in protecting our right to engage in pursuits that are not strictly utilitarian, that are leisurely, that are convivial. As Rob puts it:
What does disturb me about the Tribble article … is the tenacity of the idea, sometimes spoken, often just internalized, that as an academic you are not entitled to have a life outside of work. You lose your right to be frivolous. You are not supposed to have hobbies, to let your hair down with your friends, to geek out about comic books or the oeuvre of Joss Whedon, to get into flame wars, whatever. That tendency has to be fought.
While reading Rob's post (read the whole thing) it occurred to me that his two major points are related. Why are graduate students so afraid of the academic job market? Because academia often encourages the belief that there is no life beyond it. The culture of fear goes hand in hand with a theory of academics as a no-leisure class. Perhaps the way to respond to Tribble is to reject both, and thus kill two errors with one blog.

Friday, September 23, 2005


Friday shuffle

1. "Christmas," by Leona Naess, from Leona Naess
2. "Thankful," by Glen Phillips, from Winter Pays for Summer
3. "The Fiddling Ladies," by The Chieftains, from Tears of Stone
4. "Make You Feel My Love," by Bob Dylan, from Time Out of Mind
5. "Nobody's Crying," by Patty Griffin, from 1000 Kisses
6. "Waltz Across Texas Tonight," by Emmylou Harris, from Wrecking Ball
7. "Carry Me Ohio," by Sun Kil Moon, from Ghosts of the Great Highway
8. "The Nearness of You," by Norah Jones, from Come Away With Me
9. "Country Feedback," by R.E.M., from Out of Time
10. "Red Hill Mining Town," by U2, from The Joshua Tree

If anyone out there likes Son Volt, you might like Sun Kil Moon (#7 above). He sounds a lot like Jay Farrar, except more frequently on-key.



Sincere, smart, and reasonable people disagree about how quickly American troops should be withdrawn from Iraq. That's why I don't mind listening to the arguments of those who believe that immediate withdrawal is a bad idea. Reasonable people can think that; here's an example.

But that doesn't mean that all reasons for opposing immediate withdrawal are equally good. For example, the reasons that President Bush gave yesterday just don't strike me as good ones. I freely admit I'm a layman when it comes to debates about what to do in Iraq; but good citizenship demands that I say how things strike me. And from where I sit, President Bush's reasons for staying the course are terribly confused.

The president said that withdrawing now would "allow the terrorists to claim a historic victory over the United States." And he added to that assertion the claim that the failure of previous administrations to commit and maintain troops in the Middle East is what has allowed terrorism to flourish in the last twenty-five years.
"The terrorists saw our response to the hostage crisis in Iran, the bombings of the Marine barracks in Lebanon, the first World Trade Center attack, the killing of American soldiers in Somalia, the destruction of two US embassies in Africa, and the attack on the USS Cole," Bush said. "The terrorists concluded that we lacked the courage and character to defend ourselves, and so they attacked us."
The problems with this line of reasoning seem, to me, to be twofold.

First, it seems painfully simplistic to believe, four years after September 11, that the terrorists "attacked us" because they "concluded that we lacked the courage and character to defend ourselves." Al-Qaeda did not attack the United States because it thought we would not strike back, because it had concluded that we were a bunch of sissy faces. On the contrary, terrorism "works" because its architects know that their victims will strike back. In invading an Arab country, I believe we acted out the precise role that terrorists had scripted for us--the role that now allows them to persuade would-be insurgents and new terrorists that our designs in the Arab world are imperialistic. I could be wrong about this construal of terrorists' strategic thinking, but that interpretation of what Bin Laden in particular is up to seems much more believable, to me, than the interpretation that the President gives: They thought we were cowards, so they attacked us. At the very least, how can anyone believe that Bin Laden or any of his associates expected us not to do anything after September 11? If Al-Qaeda really did learn anything from previous examples of terrorism like the first World Trade Center bombing and the bombing of the Cole, it was that those attacks were not spectacular enough to get us to commit troops. They didn't see it as a victory that we didn't come crashing out of the gates talking about crusades and axes of evil; they saw it as a failure, and a reason to try again.

But here's the second problem, as I see it, with President Bush's logic: that we shouldn't withdraw immediately because insurgents and terrorists could declare a "historic victory." That may be true; it probably is true. But it's not a specific reason against immediate withdrawal. Whenever we withdraw from Iraq--whether now or ten years from now--some terrorist somewhere in the world will claim a "historic victory." You can bet on it. It doesn't matter what godforsaken cave he broadcasts from: shortly after the last American soldier leaves Iraq, some terrorist will send a tape to Al-Jazeera saying the Great Satan is defeated and so on and so on. That's not going to change if we stay another year or another two years. It's going to happen; a "historic victory" is going to be claimed whether it really is a victory or not.

(If you read this blog regularly, you know I'm actually more of an idealist than that: I think we assume too quickly that we know how terrorists would react if we made some truly surprising, nonviolent change of course. Simple withdrawal, though, probably would not be that surprising, so the realist in me is reasonably sure that whenever we withdraw, some terrorists will still make use of anti-American rhetoric.)

So the real question is not, "How can we make sure no terrorist claims victory?," but rather, "How can we make sure that as few people as possible join the terrorists in throwing confetti?" When the terrorists claim victory after our withdrawal (remember: it's a given; they will), that will simply be part of their long-standing ad campaign. What we need to think about is how to minimize the number of people likely to "buy" their product. And that's where immediate withdrawal starts to make more sense, at least if the primary objective is to reduce the recruitment of terrorists in the future. Because the longer we stay, it seems, the more people there will be who are likely to put their fists in the air when we eventually do withdraw.

Unless, that is, one believes that we can (1) leave Iraq having killed or captured every terrorist in the world or (2) never leave Iraq. Since those aren't reasonable options, since we have to withdraw at some point even knowing that terrorists remain, our primary objective should be to minimize the appeal of the inevitable victory celebration that will occur when we leave. And that objective is better served, I think, by a sooner withdrawal than a later one.

(I realize in this post that I'm playing fast and loose with phrases like "the terrorists." Generalization about the motives of "the terrorists" is always dangerous. But I think these points stick even if you fill in that blank with more specific referents.)

UPDATE: Matthew Yglesias shows how President Bush's logic in yesterday's speech does point towards the option I labeled as (2) above, because it defines withdrawing--ever--as by definition a "loss."
Losing, in other words, is leaving Iraq. Winning, by contrast, is staying there. So when do the troops come home? Not until we win. But if they come home, then we lose. So they can't come home. Apparently ever. But at other times Bush has said that "as Iraqis stand up, we will stand down." According to the administration, however, Iraqis are standing out, as witnessed by the Iraqi militia-turned-army's performance in Tal Afar. So why don't we start standing down? Because standing down would mean losing.

There's a problem here. The organizers of the al-Qaeda movement, as the president himself has had occasion to remark, aren't the sort of people who are ever going to sign a surrender document on the deck of a battleship stationed in the Persian Gulf. As a result, we more-or-less need to define our own policy objectives. Insofar as we define our objective as "not withdrawing," which is what the president seems to have done, we're dooming ourselves to pointless fighting and some kind of national crisis. If we left early next year not because "we lost, got scared, and ran away" but because "we came to topple Saddam and install an elected government and now that's done so we won" we would be able to end the war and to win it.

Saturday, September 17, 2005


History links

The latest History Carnival is up at Respectful Insolence. So is the latest issue of History Now from the Gilder Lehrman Institute. It's a great issue on abolitionism and reform. (Full disclosure: one of the articles is by my advisor.)

Speaking of abolitionists, I also cannot resist linking to this story from Minnesota Public Radio. Last month, over one hundred of William Lloyd Garrison's descendants descended on Boston to discuss their bold and balding forebear, eat a birthday cake with Garrison's picture on it, and photograph each other's hands. (You'll have to click to find out why. Listening to the audio is best.)

Thursday, September 15, 2005


Currently playing

Recorded four years ago tonight on September 15, 2001: Sonny Rollins' Without a Song (The 9/11 Concert).

Downloaded from iTunes.

There's a nice interview with the Colossus himself here. (Hat-tip: Tim at Jazz and Blues, who has become a fanboy.)


Is this progress? Part II

A city is struck by a natural disaster. Thousands die, their corpses allowed to decompose in the streets, the mingled stench of smoldering fires and moldering bodies quickly becoming unbearable. The poor of the city are disproportionately affected, since the wealthiest citizens were more likely to have somewhere to go and somehow to get there. And among the poorest of the city's poor are the black men and women, left behind among the dead or dying. Yet the first widely seen accounts of the disaster suggest that black and impoverished citizens stayed behind for another reason: to "loot" the deserted city and profit from the misfortune of others.

* * *

That's a rough summary of what happened in Philadelphia in 1793, when a yellow fever epidemic struck the national capital, killing as many as 5,000 people, usually within three or four days of their having contracted the disease. The wealthiest citizens, including government officials like George Washington and Thomas Jefferson, fled the city early on, or were able to afford expensive and experimental medical treatments. (Alexander Hamilton and his wife were taken with the fever but managed to be cured by a doctor, who gave them quinine.) But the poor who stayed behind had to try, in vain, to ward off the disease, which no one knew to be borne by mosquitoes. Acting on rumors that smoke helped to screen the disease, many set fires on the streets. Acting on other rumors, some chewed on cigars and carried garlic in their pockets. Acting on scientific knowledge that was not much more advanced than such rumors, the city's most esteemed doctors, including Benjamin Rush, tried to purge patients of the disease by cutting open their arms.

Among those who stayed behind were Richard Allen and Absalom Jones, two leaders of Philadelphia's free black community, which in 1793 was one of the largest in the nation. As relatively wealthy members of their community, Allen and Jones could have fled if they had chosen. But they were asked to stay by the mayor and by Rush, who had assisted them the year before in raising funds to build an independent African Church. Rush asked Jones and Allen to assist in relief efforts, partly because another rumor had suggested, conveniently, that black bodies were less susceptible to the fever than white ones--a rumor that proved just as disastrously false as the others.

Eager to prove their competence, especially to the many white Philadelphians who were hostile to the growing number of black Philadelphians, Jones and Allen agreed to stay. They rallied their parishioners together to aid the sick. Because yellow fever was mistaken for a contagious disease, it was certainly not easy to find volunteers for such work; contemporaries lamented the fact that even family members left family members behind, fearing that they would catch the fever from their loved ones. In many cases, therefore, it fell to black Philadelphians to bleed patients, bury the dead, and transport the dying to a makeshift hospital on the outskirts of the city. Being as susceptible to mosquito bites as anyone, many of them died too.

* * *

After the epidemic subsided, Matthew Carey, an Irish immigrant and founder of some of Philadelphia's earliest magazines, published a best-selling Short Account of the Malignant Fever, which quickly went through several editions.

Carey, who had fled the city, nonetheless had much to say about what had reportedly happened in his absence. First, he interpreted the behavior of many Philadelphians during the crisis--husbands leaving behind infected wives, for instance, and parents separating from their children--as evidence of how low humanity could sink in the face of disaster. The virtue of the young republic's capital had been thrown into doubt. Indeed, perhaps the disaster had been "man-made" as much as natural; Carey said that the fever culminated a long declension of morals in the city, leaving readers who were inclined to see connections between such things to make of them what they would. Carey spent many pages, though, praising the efforts of white philanthropists in the city during the crisis--a sign, perhaps, that virtue had not deserted the city entirely.

But Carey added only a paragraph mentioning (positively) the efforts of Jones and Allen, and most of that paragraph was spent noting the "salutary" effect of early rumors that blacks were less vulnerable to the fever. Even though that rumor proved "erroneous," it provided the city with a ready supply of nurses when very few white nurses "could be procured." (Since most, like Carey, had gotten out of Dodge.) But Carey went on to add, in the conclusion to his paragraph on Allen and Jones, that "the great demand for nurses," created by fears of contagion, "afforded an opportunity for imposition, which was eagerly seized by some of those who acted in that capacity, both coloured and white. They extorted two, three, four, and even five dollars a night for such attendance, as would have been well paid for, by a single dollar. Some of them were even detected in plundering the houses of the sick." [Source for quotes.]

Carey suggested, in other words, that nurses had become looters and profiteers. True, he had mentioned that "both coloured and white" nurses had been guilty of such crimes. (But this was immediately after pointing out that most nurses were "coloured" because white ones were so hard to procure. And he chose to mention the looting in the paragraph on Allen and Jones, not in the chapter devoted to white philanthropists who organized relief efforts.) And true, in later editions, Carey would modify this paragraph. In the fourth edition, for instance, he added a qualification to the above lines: "it is wrong to cast a censure on the whole for this sort of conduct, as many have done," Carey said. "The services of Jones, Allen, and [William] Gray, and others of their colour, have been very great, and demand public gratitude."

But from the perspective of Jones and Allen, the damage to Philadelphia's people of color had already been done in earlier editions. So they published a response to Carey: A Narrative of the Proceedings of the Black People, During the Late Awful Calamity in Philadelphia in the Year 1793 and a Refutation of Some Censures, Thrown upon them in some late Publications. Although they acknowledged Carey's corrective revisions in the fourth edition, they pointed out that thousands had probably already read the uncorrected early editions, which made them fearful of the effects that Carey's "partial representation" of "the Black People" would have on popular opinion.

The narrative of Allen and Jones gave a very different picture of the behavior of people of color during the epidemic. First, they demolished Carey's suggestions that some nurses had charged outrageous prices for their work by giving a careful accounting of their expenses and income. The ledger showed that Allen and Jones had personally accrued large debts after paying the nurses they had hired to work for them; their income had certainly not allowed them to turn a profit. They pointed out that if anyone had turned a profit from the disaster, it was Carey. ("Is it a greater crime for a black to pilfer, than for a white to privateer?")

Second, they detailed the courage of black Philadelphians, which Carey had not done. (Carey was inclined to say that black people had not been as terrified as whites because they were believed to be immune. But Allen and Jones made clear that black nurses had faced gruesome and terrible risks and met the challenge with courage born of benevolence, not ignorance.)

Third, they pointed out that while there had been some looting, such criminals had been in the minority and had included whites as often as blacks. Carey's late qualifications were not enough to offset his earlier insinuations. Many of his racist readers, no doubt, would gladly skip over Carey's exculpatory asides and dwell instead on the evidence that confirmed what they already expected: that black people were barely human, or that black people showed how low humanity could sink, or that black people were willing to turn the misfortune of a city and a nation into an occasion for riot.

Sadly, over the following fifty years, black Philadelphians had more to fear from white rioters than the other way around. In 1838, mobs burned down Pennsylvania Hall, a building erected as an antislavery headquarters, because of rumors that abolitionists encouraged interracial coupling. And between 1829 and 1842, black Philadelphians experienced six race riots, including one in 1842 that destroyed black homes and injured African American reformers who were marching in a temperance parade.

After this latest cataclysm, black abolitionist Robert Purvis wrote to his friend Henry Clarke Wright that "I feel that my life and those tendrils of my heart, dearer than life to me, would find no change in death, but a glorious riddance of a life, weighed down & cursed by a despotism whose sway makes Hell of Earth--We the tormented, our persecutors the tormentors. But I must stop; I am sick--miserably sick--every thing around me is as dark as the grave. Here & there the bright countenance of a true friend is to be seen, save that--nothing redeeming, nothing hopeful, despair as black as the pall of Death hangs over us. And the bloody Will is in the heart of the community to destroy us." [Witness for Freedom, p. 62.]

Allen and Jones could sense the strength of that "bloody Will" even in 1793. They could see clearly the potential chain of causation from the "pall of Death" caused by the yellow fever and the "pall of Death" that successors like Purvis would experience. Was it possible that the sons of the same community they nursed in 1793 would one day have the heart to "destroy" the black people? There certainly were connections between these times, between the epidemic of 1793 and the antebellum scourge of racial violence. For as one racist white Philadelphian would later say in 1830, the "aspirings and little vanities [of black people] have been rapidly growing since they got those separate churches," which Allen and Jones had pioneered the year before the yellow fever. "Thirty to forty years ago, they were much humbler, more esteemed in their places, and more useful to themselves and others.” (Forging Freedom, p. 275.) Now, he implied, it had become clear that Allen and Jones had been the exceptions; the looters were the rule. Black people were uppity, vicious, dangerous.

And some antebellum Philadelphians would have added, without hesitation, that they were better off back in Africa--or dead.

* * *

The fact that I've been thinking about the yellow fever epidemic over the past couple of weeks, while trying to process all of the news that has come out of New Orleans and the Gulf Coast, may simply be evidence that historians are too prone to notice parallels between the present and the areas of the past that they happen to be studying. Rob MacDougall happened to be preparing a syllabus discussing the Louisiana Flood of 1927 when Katrina struck. I happened to be preparing a class on the yellow fever epidemic.

In an earlier post, I explained the first day of my survey class. I tried to convey to students that history is necessarily selective, that any historical narrative leaves all kinds of events and details out. But on the second day, I tried to make the further point that even once historians decide what events to include in a narrative, they are faced with the problem that there are multiple ways of looking at the same historical events, multiple narratives by contemporaries about the way things were. We have to make interpretive choices not just in deciding what to include in our historical narratives, but also in deciding how to write about what we do include, when we often have sources presenting multiple perspectives on the same event or (in many other cases) sources presenting only one perspective even when we know there were others. To illustrate this point, I assigned excerpts from Carey's Short Account and Allen's and Jones's Narrative. We discussed them, I believe, the day after the levees broke.

It's sometimes hard to say how historical reflection can help us think about the present. The past doesn't always provide practical "lessons" or clear prescriptions about what to do when the past seems to be repeating itself. The Philadelphia of 1793 was very different from the New Orleans of 2005. (If the past is a foreign country, maybe the lesson of similarities between the past and the present is this: the present is a foreign country, too.) And I confess even after reflecting on what seems like a depressingly familiar episode in our national past, I haven't emerged with some easy bromide about how this will never happen again.

Instead, I'm mainly left with the realization that there is a long history of white Americans leaving black Americans behind. Left behind to bury Philadelphia's dead. Left behind, in "them dark days," on the dismal rice plantations of South Carolina, to die of malaria and lockjaw while wealthy planters spent the summer in Charleston. Left behind in urban ghettos while jobs and white residents fled to the suburbs. Left behind in the Superdome. The point is not to say that all of these examples of leaving behind were the same, but just to notice how much the world still lacks the virtue that Michael Eric Dyson calls "spiritual empathy--not to be confused with maudlin emotion, or pitying affirmation, but a willingness to be kept awake in another's bed of pain before lashing them for being morally asleep." (From Is Bill Cosby Right?)

I don't mean to deny that our nation has made great progress between 1793 and 2005, as Condoleezza Rice has been insisting: "when I'm talking to my colleagues, I say yeah, we have a problem when race and poverty comes together, we really do. And it's a vestige of our history. It's a vestige of particularly the Old South in this case. But don't misread that there has been no progress on issues of race in America." Of course there's been progress. But the historian in me can't resist pointing out that even in our earlier episodes of leaving black people behind, there were many who palliated those desertions by pointing out what progressive times they were living in. Charles Manigault, one of those planters who tolerated abysmal mortality rates among his slaves in the swampy lowcountry, where black men and women waded waist-deep through flooded paddies to cultivate rice, could nonetheless boast about the wonders of American democracy in 1847, "our state of social advancement in every thing ... in the most flourishing Condition, unaided by Government" (p. 44).

And yes, of course there's been progress in the number of people of color "in America's cabinet, in America's Foreign Service, in America's business community, in America's journalistic community." But one thing this metric of progress ignores is that there were people of color in America's business community in 1793. James Forten, Purvis's father-in-law, was the wealthiest sailmaker in early national Philadelphia, and Purvis himself was well-off. But that in and of itself didn't protect black homes from racist rioters in 1842, any more than the diversity in America's Foreign Service puts food on the table for those who are still left behind. The point here is just that gauging progress according to only one measure of progress--the diversity of professional, white-collar America--is misleading.

(Incidentally, one reason the professions that Secretary Rice points to are especially misleading is that these professions are public and high-profile. Decision makers in these professions know that public opinion expects diversity to show, and that in and of itself is a real sign of progress. But the owner of a machine shop in Memphis who is trying to decide between hiring a white mechanic and a black mechanic does not have to think about how that hiring decision will be perceived. And it is in these millions of daily decisions, beyond the realm of public scrutiny, that we obviously still have a lot of progress to make.)

Of course, there has been objective progress in America, notwithstanding the fact that, subjectively speaking, every generation of Americans has always said that progress has been made. But there comes a time when pointing out that progress has been made doesn't actually help us make more progress--when it simply becomes a defensive maneuver by those who think they are being blamed for the lack of progress. Secretary Rice, for instance, told the New York Times that she hopes "that around the world it's noted that on matters of race, the United States is about 100 percent ahead of any place else in the world in issues of race. And I say that absolutely fundamentally. You go to any other meeting around the world and show me the kind of diversity that you see [here]. ... Show me that kind of diversity any place else in the world, and I'm prepared to be lectured about race."

That's the dilemma we have right now in the United States: we can't perceive frank discussion of our deficiencies and faults as anything other than a "lecture." Talk about progress still to be made is necessarily a "blame game," and hey, buddy, if you're going to blame us, look at all those other countries out there and then come back and talk to me. But all of that's beside the point: you can read a post like this one as a finger-pointing "lecture" if you want, but I'm not pointing my finger at you so much as I am pointing my finger at the persistence of a problem. Although, of course, the more we stare at the persistence of that problem, the more the finger points at us, at each of us, at me, at you.

Perhaps what the Philadelphia yellow fever epidemic can teach us is that representations about race matter. That's what Allen and Jones understood: that partial representations can do enormous damage. As they argued to Carey, you can correct those representations later, but not everyone who sees your first representation is going to see the correction, and you never know how far and how wide the damage done by your representation will be. Sure, you can say the Fox News reporter was just trying to keep viewers when he said, over footage of a downed rescue helicopter, "Let's hope it wasn't shot down by snipers, since we know that's been happening at some of the hospitals." You can say that better reporting eventually clarified that the helicopter crash was an accident. But how many people switched off the TV with the false representation of a sniper shooting down a helicopter in their heads? How many people read Matthew Carey's first edition and never read the fourth? How many people never read Allen and Jones's rejoinder? And how much cultural work, how much psychological work, did their partial and incomplete perceptions do, the next time they saw a person of color on the streets of Philadelphia, or on their television screens?

To say that racial inequality is just a "vestige" of our history suggests it is a part of the body politic that no longer functions as it once did: but I suspect that the representations of black citizens of New Orleans as disobedient idlers or thuggish looters, the narratives of pilfering and "complaining," are still doing lots of work for white Americans inclined to think that there is no progress to be made. (And there are many of them, including some truly poisonous ones like the lurkers that Rachel Sullivan found on Craigslist.) A vestigial organ has lost its power. Race has not.

Tuesday, September 06, 2005


Is there a doctor in the neighborhood?

Yes, there is. Congratulations, Jason.

Friday, September 02, 2005


Friday shuffle

1. "Summer Song," by Louis Armstrong
2. "When It's Sleepy Time Down South," by Louis Armstrong
3. "St. Louis Blues," by Louis Armstrong
4. "Body and Soul," by Louis Armstrong
5. "Just a Gigolo," by Louis Armstrong
6. "I Surrender, Dear," by Louis Armstrong
7. "Memories of You," by Louis Armstrong
8. "I'm Not Rough," by Louis Armstrong
9. "Love, You Funny Thing," by Louis Armstrong
10. "On the Sunny Side of the Street," by Louis Armstrong


Thursday, September 01, 2005


Suffering strangers

All of us who can, ought to give. All of us who can, ought to give.

In the late 1980s, as part of a now famous historiographical debate about the rise of the antislavery movement, historian Thomas Haskell argued that our ideas about moral responsibility are tied up with our conventional ideas about causation. Haskell pointed out that "ought" implies "can," but that our ideas about what we "can" do are contingent and conventional. To illustrate his point, Haskell used what he called "the case of the starving stranger." All of us know that at this very moment, there are people in distant--and not so distant--parts of our world who are literally starving. We also know that we could get on an airplane, fly to these starving strangers, and save their lives with only a fraction of the food that molds and sours in our cupboards and refrigerators. Should we do this?

Answering that question, said Haskell, would depend on our understandings about what we reasonably can do. If the starving stranger were across the street, rather than across an ocean, that would naturally seem to raise my moral culpability for his death, if I allow him to go on without food. And our sense of moral responsibility for the starving stranger in Africa is probably higher than it was for the abolitionists, Haskell argued. In their day, at least, going to the stranger's aid was far more onerous than boarding an airplane. Surely we now can do more, and so surely we feel that we ought to do more. But this suggests, according to Haskell, that "new technology," which he defines as any instrumental means for accomplishing ends, "can change the moral universe in which we live. Technological innovation can perform this startling feat, because it supplies us with new ways of acting at a distance and new ways of influencing future events and thereby imposes on us new occasions for the attribution of responsibility and guilt."

Writing in 1987, before the World Wide Web, Haskell mused about some "as yet uninvented technology, far more advanced than the airplane, that will enable us to save the starving stranger with minimal expenditure of time and energy, no disruption of our ordinary routine. If we could save him by just reaching out to press a button, then a failure to act would become indefensible."

Here's a button. Why not press it?

Haskell went on to argue that the rise of humanitarian sentiment that accompanied the antislavery movement was spurred on partly by changes in technology--not just new machines or methods of transportation, but new social mechanisms, like expansive overseas trade and imperial apparatuses, that made suffering strangers seem closer to Europeans than before.

As a specific explanation of abolitionism, Haskell's argument (which I've grossly simplified here) can be faulted. But the basic premise--that our sense about what we are capable of affects our sense about what we are culpable for, and vice versa--is difficult to dispute. And undoubtedly, we can do more today to help distant, suffering strangers, without even getting up from our chairs or opening a new browser window, than previous generations could. So we ought to do at least some of what we can. Sure, you can point out that your dollars might not go directly to the person you just saw on CNN, that waste or greed or ineptitude might waylay your alms. But are we really prepared to argue that this absolves us of giving, considering how little twenty dollars costs us compared to how much it could do? (If you are prepared to argue that, read this and get back to me.)

The weight of these arguments bears heavily on my mind and heart. But there's another side of Haskell's argument about the ability of technology to change our "moral universe." The fact is that the same technology that shrinks that moral universe, and makes it possible for us to do more, literally at the click of a button, also makes us more aware of all the suffering strangers there are to help. So much suffering in so many places at the same time. So much suffering. So much suffering. So much suffering. The technology that makes it possible for us to care for more distant strangers does not necessarily leave us feeling empowered; it can, at the same time, make us feel more powerless.

That was true for the abolitionists as well, who were repeatedly seized by moments of doubt. On the one hand, they sensed they could do more than generations before them, as steamships began to cross the Atlantic, as public opinion became first a national and then an international force to be reckoned with, as lines of communication became freer and faster.

But they too wondered whether all of this really meant they could do more. As the Boston abolitionist Wendell Phillips wrote to a British friend in 1842, "When you think of all the oppressions that are done under the sun from the Emperor Nicholas with 14 million serfs to the domestic [servant] in the outer room under an (equally?) false social system, does it ever seem [to be] idle to attempt laboring to leave the world better than we found it?" In another letter written the same day, Phillips expanded the point: was not "the work ... greater than what we can attempt, & so hopeless"?

"New technology" may have impacted the abolitionists' "moral universe," but it brought with it as many questions as emphatic answers. Abolitionists like Phillips often answered those questions with religious hopes. The important point is that they did not give up hope. They too struggled with the overwhelming sense that there were millions of starving strangers, seemingly within reach and yet constantly proliferating just out of their reach, like the detritus of the past that Walter Benjamin imagined always receding beyond the wingtips of the Angel of History. Yet in so many cases, they were not overwhelmed. Technology did not make it a cinch to save the starving stranger; the old question of "what can I do" remained, as persistent as ever. But so many of them persisted too.

I don't mean to make the abolitionists out to be angelic heroes. On the contrary, what moves me about letters like Phillips's is the palpable message that abolitionists were human--flawed, limited humans who sensed their limitation, who wrestled with "ought" and "can" just as we continue to do, even in an age of one-button technology.

All of us who can, ought to give.

* * *

The famous abolitionist William Lloyd Garrison welcomed new technologies like the steamship, mostly for their potential to draw the moral universe closer together, but also because he never found his sea legs. The faster he could cross the Atlantic, the better, because the longer it took, the more time he usually spent leaning over the side of his ship. The ocean was simply not Garrison's friend. "Though I am fond of agitation," he punned to a friend after crossing the ocean in 1846, "it does not run in that line."

Garrison was also fond of quoting the biblical prophecy that in a new heaven and a new earth, there would be "no more sea." Partly this was a matter of "personal accommodation," Garrison often joked, given his dislike of seafaring. But that prophecy was also part of his hope that one day, the distance between human beings, with all the pain that attended it, would be obliterated. For Garrison and other reformers like him, to hope that the sea would be no more was to hope that there would be no more suffering, and no such thing as a stranger. Dear God please, no more sea.


Carnivals galore

Be sure to check out the latest History Carnival at Clioweb, and the inaugural edition of the Teaching Carnival, which focuses on teaching in higher education, at Thanks for Not Being a Zombie. You can also view even more posts for the Teaching Carnival here.
