Thursday, September 30, 2004
Certainly wrong
I wanted to publish the sequel to my post on transnational history sometime today, but I'm still working on it. I was certain that I would have time to finish it, but, as Senator Kerry eloquently pointed out tonight, "It's one thing to be certain, but you can be certain and be wrong."
Transnational history
[For a complete list of my posts on transnational history, see here.]
My dissertation identifies me as a "transnational historian." Transnational historians, instead of focusing on the official doings of nation-states, emphasize the migrations of people, ideas, and goods across national borders. They speak of "borderlands" and "diasporas," of encounters between nations, of travels across geographical boundaries. All of these things interest me too. But I'm somewhat ambivalent about my own genre.
To the extent that "transnational history" even is an identifiable genre. In the past eight or ten years, many scholars have issued manifestoes calling for histories of the United States that are less centered on the nation-state. But one sometimes wonders whether there are more manifestoes for transnational history than there are examples of it; we transnational historians are somewhat like Communists in that respect. You might also see already that we transnational historians are paradoxical people: how can we ask, with a straight face, for histories of the United States that are less centered on the nation?
But please, for a moment, suspend your disbelief about that particular tension at the heart of transnational history. Let me pose a different question: Why are there so many calls for transnational history at this particular moment in time? After all, historians have always paid some attention to the travels of people, ideas, and goods across geo-political borders. Only thirty paragraphs into the Histories, Herodotus talks about how the Athenian legislator "Solon set out upon his travels, in the course of which he went to Egypt to the court of Amasis, and also to Croesus at Sardis." Transnational history, in some form or another, is as old as history itself.
But the problem, as today's transnational historians see it, is that even those histories which deal with travels across borders reify those borders and make them rigid. Historians since the beginning of our craft have invested nations with essential or exceptional characteristics, even if they have also shown how national borders have been traversed by people and things. This, certainly, was Herodotus's assumption. He talked about travels between Greece and Egypt, but there was no question that those lands were essentially different. As he says in Book 2:
Concerning Egypt itself I shall extend my remarks to a great length, because there is no country that possesses so many wonders, nor any that has such a number of works which defy description. Not only is the climate different from that of the rest of the world, and the river unlike any other rivers, but the people also, in most of their manners and customs, exactly reverse the common practice of mankind. The women attend the markets and trade, while the men sit at home at the loom; and here, while the rest of the world works the woof up the warp, the Egyptians work it down; the women likewise carry burdens upon their shoulders, while the men carry them upon their heads. The women urinate standing, the men crouching. ...
And so on, "to a great length." These are the kinds of generalizations with which contemporary historians are uncomfortable. The reason we talk about border-crossings now is that we believe those crossings destabilize the very concept of monolithic nations. Borders are not natural and impervious, but permeable and fluid. Nations, too, are imagined communities, rather than entities rooted in a country's climate or topography.
Even if some form of transnational history has been practiced for a long time, it is only relatively recently that you will find historians making points like these. For the greater part of the past, most transnational history involved making comparisons between well-defined nations. Or it involved demonstrating that a particular nation was exceptionally superior to others. Consider the Athenian exceptionalism of Pericles, in his famous funeral oration recorded by the Greek historian Thucydides: "Our constitution does not copy the laws of neighbouring states; we are rather a pattern to others than imitators ourselves. ... In short, I say that as a city we are the school of Hellas; while I doubt if the world can produce a man who, where he has only himself to depend upon, is equal to so many emergencies, and graced by so happy a versatility as the Athenian. ... For Athens alone of her contemporaries is found when tested to be greater than her reputation."
Sound familiar? Substitute "American" and "America" for "Athenian" and "Athens," and Pericles could have given the oration at Ronald Reagan's funeral. Exceptionalism characterizes the way many people think of American history and America--we are the "school" of the world, a country like no other. Transnational histories, by arguing that nations are invented, that their boundaries are protean, try to avoid the related errors of national essentialism and national exceptionalism. That is the first reason why you find scholars these days calling for more transnational histories. They can provide an antidote to national histories that accept, uncritically, various kinds of American exceptionalism. And as an aspiring transnational historian, I sympathize with this mission.
But this brings me to the second main reason why transnational history is in vogue at the present moment. You may have already guessed it ... Transnational history is in vogue because "globalization" is. Later today I'll put up a post about why that bothers me. [The sequel is now posted here.]
Incidentally, I'm hoping that this post and its sequel can serve as my entries for Dissertation Week, even though they may not answer Sepoy's call for some of the "nitty-gritty." I'm glad that he's encouraged those of us who are blogging graduate students to use this forum as a dissertation workshop, since that's one of the reasons I started this blog. (See my very first post.) I'd be lying if I said I don't share some of Ed's reticence about blogging at length about his dissertation, but I also see the logic of this post at Culture Cat; posting ideas gives them a time-stamp and a "posted by."
Another reason for reticence: I don't want my academic ramblings on this blog to be seen unequivocally as "scholarship" in the conventional sense of that word. (See related discussions here.) My posts are not finely honed or fully vetted. But that doesn't mean that this forum is not a valuable place to try out ideas, or to work through my fear of letting ideas out of the bag because they are not wholly formed.
This blog isn't scholarship in the sense that it meets professional standards of peer review, footnoting, etc. But it is scholarship in another sense of the word: it is "learning" done in public. When I do share what I'm working on in my dissertation, it's not because I want to take the posture of an "expert," but because I want to take the posture of a "student," which is, after all, what being a "scholar" means. And in that sense, I think of all of you who have somehow found this blog as fellow scholars.
Tuesday, September 28, 2004
An afterthought
I'm afraid I've been too busy this evening to write a full post, which means, unfortunately, that there won't be any "edumacated ponterificatin'" tonight. One benefit of this, at least for me, is that I won't carelessly commit myself to arguing for Platonic ideals, as I seem to have done below. But fear not: full-blown pontification will resume tomorrow.
In the meantime, I had an afterthought about my earlier post on climates of opinion, which I've argued might be worth dusting off and using again as a literary device in intellectual history. Here's the thought: The world of blogging might indicate that "climates of opinion" are on their way back in, or at least that using the metaphor can be more palatable than it once was. What I'm referring to, of course, is the neologism "blogosphere."
It would be interesting to trace the evolution of metaphors for the Internet, beginning with the image of a "net" itself, a linear network of connected nodes. Closely akin is the idea of a "world wide web," which can literally be represented visually as a web. The notion of "cyberspace" is distinct from this image of a "web"--both are spatial metaphors, but one focuses on connections between nodes while the other takes in both the silk and the interstices. The "blogosphere," on the other hand, is metaphorically more fluid than either "web" or "space": at least to me it suggests that there is now an online "atmosphere"--a "climate" of opinion.
An alternative etymology might prove that "blogosphere" is a cognate of "public sphere." But the Wiki entry on the word credits The BradLands for coining the term. Its jocular usage in the original post seems to have "atmosphere" in mind. I think it's likely this is the sense in which most bloggers mean "blogosphere." Although blogs do have the essential features of "the Web"--they are nodes in complex networks, connected by links--they also seem to have the unpredictable moodiness of weather. And those who breathe in the blogosphere each day are likely to be aware of certain opinions or news items by the end of the day, without being able to trace exactly where in the web those ideas began or ended. "Blogosphere" thus might serve some of the same metaphorical purposes that I've attributed to "climates of opinion." Perhaps we should also have a new word for meteorological events like the recent CBS memos incident ... May I suggest "bloggicane"?
Well, what do you know ... I've managed to do a little pontificating after all. Or maybe it's just (a) bull.
P.S. Apropos of this pontification on blogospheres, biospheres, and the like, is this post by Michael Berube.
Saturday, September 25, 2004
Inspecting gadgets
Yet another post here at Mode for Caleb has been inspired by Paul Musgrave, whose excellent blog has fast become a daily read. Recently Paul wrote about what he calls the Gadget Index, a tool for measuring one's daily reliance on digital tools. "At the moment," Paul writes, "I have: a Dell Axim PDA, a Maxtor external hard drive, an iPod, a Fujifilm digital camera, a Sony Ericsson camera phone, and, of course, the Dell Inspiron laptop. Six gadgets."
The Gadget Index, using Paul's terminology, measures how many of these six gadgets he can't leave home without. Lately, the Index has been at a high "3"--the phone, the iPod, and the camera. And the Index never falls below a "1" for Paul because the cell phone is a must. My index is "1" if I'm lucky, because I have a bad habit of forgetting my cell phone at home; on some days, the index is "2" because I take my laptop and phone with me to campus. My max is probably also "3" with the third gadget being my Archos Jukebox MP3 player.
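(Just for fun, here is the Index as a scrap of Python. This is my own illustrative sketch, not anything from Paul's post; the gadget names and the example days are assumptions of mine.)

# A playful sketch of the "Gadget Index": the number of one's gadgets
# that feel indispensable on a given day. The gadget list and the
# example tallies below are hypothetical illustrations.
GADGETS = {"PDA", "external hard drive", "MP3 player",
           "digital camera", "camera phone", "laptop"}

def gadget_index(must_haves):
    """Count how many of the six gadgets one can't leave home without."""
    return len(GADGETS & set(must_haves))

print(gadget_index(["camera phone", "MP3 player", "digital camera"]))  # a high of 3
print(gadget_index(["camera phone"]))                                  # the floor: 1
print(gadget_index(["camera phone", "laptop"]))                        # a campus day: 2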
Paul's post moves from these reflections on his personal Gadget Index to a meditation on technological change and progress. (I'm putting the word "progress" into Paul's mouth, but I think it's consistent with the gist of the post. You be the judge.) In a few years, Paul speculates, the Gadget Index might drop and stabilize at "1" with the invention of a "Swiss Army Knife of digital tools"--an all-in-one personal assistant, phone, camera, MP3 player, and God knows what else. Before the coming of this Tool of Tools, there will be another decade of "spectacular improvements in consumer technology," writes Paul. "The Gadget Index will probably fall to zero, or perhaps one, in the wealthy world. Or, as is more likely, the Gadget Index will simply no longer be a matter of comment, no matter how many accoutrements we acquire."
A few thoughts ... It's unclear to me that market forces will progressively drive the Gadget Index down. Gadgets are, as Paul notes, "consumer technology." The makers of gadgets therefore have a vested interest in making sure that the Index remains high--they need us to buy more gadgets. Of course, suppliers make the kinds of tools we demand, and it's clear that wrapping several gadgets into one is where the demand curve is heading. But does the invention of these new gadgets constitute real technological progress and innovation? Or are they manifestations of conspicuous consumption, a product of our need to have the latest and greatest versions of gizmos, even when the ones we have work just fine?
When it comes to other gadgets, it is easier to recognize how companies introduce superfluous innovations in order to convince us to buy a new version. I'm thinking, for instance, of the new Glade Wisp, "the only home fragrancer that automatically releases a measured puff of fragrance every few seconds. Unlike electric air fresheners, Glade® Wisp® Home Fragrancer has a microchip that ensures a consistent release of fragrance. With Glade® Wisp® Home Fragrancer there's always the same fresh fragrance in the air. The proof is in the puff—you can see it working!" As far as I can tell, this air freshener does what all the previous ones have done: it freshens air. But this is the first freshener that releases a puff into the air, in case you ever wondered whether your Glade Plug-In was working. This is a brilliant consumer innovation, because the advertising tries to convince you that the company's own previous products were "faulty" in some way. How do you know that old freshener we sold you really works? Buy a new one.
Another infamous example of this kind of consumer "gadgetage" is the continual iteration of men's razors. Here's the hard truth: shaving hurts. Give me the best razor in the world, and I'll still cut myself and irritate my skin. Give me the closest shave possible, and I'll still have a little stubble. Razor makers like Gillette know this, but they use your misery to their advantage by releasing, every couple of months, a brand new razor that is going to revolutionize shaving. Your old razor had two blades? This one has three ... make that four ... make that four with a gel strip. The latest in this line is the Gillette M3Power, a "MACH 3 innovation." The M3Power (a brilliant play on MP3?) runs on a AAA battery that loads into the handle. The battery causes the razor to vibrate, and "the pulsing action stimulates hair upward and away from the skin, making it dramatically easier to shave more thoroughly in one easy power stroke." Once again, even though we also sold you your old razor, it's no good any more--this one's dramatically better.
I know that these kinds of gadgets are different from phones, personal assistants, and cameras, but I think it is very hard to specify how different they are. The makers of both types of gadgets have become extremely adept at using technological jargon as selling points. Notice that Glade says the Wisp has a special "microchip," and the product website (link above) includes an amusing diagram of the ghost inside the machine. Likewise, the M3Power's blades "are enhanced by a new coating process, called 'thin uniform telomer,' which provides a perceptible improvement in shaving comfort throughout the life of the blade." Perhaps the difference between "Celeron" and "Pentium" is different from the difference between "thin uniform telomer" and the old coating process. But perhaps, on the other hand, having more and more and more processor speed, or more and more and more hard drive space, really is akin to having an old razor or a pulsating one, or to having three or four or five blades.
Aside from the way in which all of these gadgets ape technological authority, the same consumer logic applies in their production. The difference is that right now, the suppliers of digital gadgets don't have to work as hard to convince us that their innovations are not gratuitous, but essential. (Sometimes, between real version leaps, they have to strain. Remember in the early days of text-messaging, all those wacky commercials trying to come up with some reason, any reason, why you might need to use the service?) But are these innovations really changing our lives in progressive ways? The question is not whether they are simply affecting our quotidian routines--Paul makes clear that they are. The question is whether they are really making our quotidian routines better to a measurable extent. Will we someday look on the production of cell phones the way we now do the production of razors? I intend that as an open, not a rhetorical, question.
A second important reaction to Paul's post is this. Innovations in "gadgetage" cannot be measured objectively in a vacuum. "Technological progress" is a cultural construction, not an indisputable fact of life. And usually, people are unable to see this from the vantage point of the present: it's hard for us to imagine a real technological leap, so the most we can do is imagine successive refinements of what we have now. And as I'll suggest in a moment, it's not insignificant that we sometimes miss seeing the ways in which our views of technology are constructed.
It's helpful, in this regard, to look back to the past. When the first transatlantic telegraph lines were laid in the middle of the nineteenth century, people spoke as if time and space themselves had been obliterated. A new age of universal peace and harmony seemed just around the bend. In 1846, for example, Elihu Burritt, an eclectic American peace activist, wrote an essay in his newspaper on "Agents or Elements of Universal Brotherhood." Among these agents, Burritt listed the "gadgetage" of his day.
... there is the great steam engine at work with all the indomitable enthusiasm of its glowing heart, contracting space, reducing oceans to a river’s width, bringing the compass of a continent within the travel of a day; compressing sea-divided nations into immediate neighborhood; ... strapping countries together with railway bars--countries which kept each other's borders red with blood for centuries; transplanting the seated hills; ... The whole bent of this iron sinewed giant seems to be, to collocate the different tribes of mankind within a family circle, and around the central idea of Universal Brotherhood. Then there is the Magnetic Telegraph. ...
Reading passages like these makes it easier to see myself through some future historian's eyes. For Burritt, it seemed manifestly obvious that railway bars and telegraphs were the apogees of technological progress. We can see that they were not, but can we also see some of Burritt's beliefs in our own views about computers and digital technology?
Hopefully, reading Burritt induces some humility about our own gadgets and their seeming progressiveness. I use the word "humility" intentionally, because valorizing certain kinds of technology as essential or universally good can have unintended effects. For example, the conclusion Burritt drew from his observations on technology was this:
If Christianity keeps pace with Commerce, will there not be a glorious brotherhood, a nice family circle of mankind, by the time these literary lightnings shall be mounted, and running to and fro over the whole earth? But who are doing all this? Why, who else but that wonderful Anglo-Saxon race, that is diffusing itself and its genius over the world? That wonderful race, which thrives better abroad than at home; conforms to any climate or condition; whose language is fast absorbing or displacing all the spiritless tongues and dialects of the heathen world; in which millions of young pagans in the far-off ocean isles, "from Greenland’s icy mountains to India’s coral strand," and thence to the Yellow Sea, North and South American Indians, Polynesians, Australians, Hottentots, Caffres, Egyptians, Hindoos, Seikhs, and Japanese, are now learning their first lessons in civilization and Christianity. If British and American Christians do their duty, the boy is at school who will live to see half the human family speaking the English language, and half the habitable surface of the globe covered with the Anglo-Saxon race, and blessed with its civilization. The railway engines that shall thunder through the heart of Asia, Africa, and the American continent, will speak and teach the English language, and so will the mounted lightnings on all the highways and wire bridges of thought that shall be erected for the converse of the world’s extremes.
By making steamships and telegraphs "Anglo-Saxon," Burritt authorized an imperial vision of civilizing the world through the spread of technological innovation. Likewise, in an 1887 travelogue on China, James Harrison Wilson argued that the Chinese, whom he deemed inferior, “must be led to adopt our ways by showing them that our ways are better than theirs.” This superiority was proved by “the greatest industrial movement of all time,” which had “annihilated time and space,” “overcome Nature,” and was now spreading “its beneficent fruits to all nations and races of men.” (I've quoted directly from Wilson's book, which I found out about from Michael Adas's Machines as the Measure of Men: Science, Technology, and Ideologies of Western Dominance, an indispensable work on this subject.)
I've obviously diverged very far from Paul's post. I hope he won't take offense at the divergence, because I'm not imputing to him Burritt's or Wilson's views. If I'm talking to anyone here, it's primarily myself. (That's what blogging is all about, right?) After all, I too have the gadgets Paul describes, and I have come to believe that they are essential parts of my daily life.
But it's important to be constantly reminded that this belief is the product of my particular situation in the world, rather than a pointer to universal truths about technology. (Again, not saying Paul is saying this.) As Paul points out in his closing thoughts, "the Gadget Index will probably fall to zero, or perhaps one, in the wealthy world." (My emphasis.) In Darfur, the Gadget Index for many is already zero, but for very different reasons. It's also important to stress that this fact only proves that I live in "the wealthy world," not that Darfur is an inferior world. (See Epictetus's second teaching in my earlier post.)
Although I am not in any way attaching them to Paul, views connecting technology and civilization are not unheard of today. You won't hear people stridently saying that cell phones are vehicles for the fruits of the Anglo-Saxon race. But you will hear people implying that spreading gadgets and technology goes hand in hand with spreading democracy. How different, in the end, is Burritt's vision of "railway engines that shall thunder through the heart of Asia" from the Bush administration's view that building brand new roads and schools with ceiling fans justifies our occupation of Iraq?
On a White House page offering the grateful testimonies of liberated Iraqis, there is this assessment from the National Review, quoted right alongside tales of Saddam's torture and coercion. "They [Iraqis] have never been so free and prosperous, and they expect things will get better still. There's been banking and currency reform, with lines of credit now readily available. Markets are thriving, property values are rising. Welcome novelties include free speech and almost 200 periodicals; Internet cafes, bloggers, and cellphones are everywhere." I'm hoping you noticed that "blogging" is being ranked with gadgets as proof that Iraqis have never been so "free and prosperous." The entire page is worth reading by the way. It includes this:
"Mister good!"No mention of the Anglo-Saxon race ... But are the references to "broken English," or "cellphones" as agents of freedom, substantively different--in a rhetorical sense--from Burritt's description of railroads as agents of human brotherhood? Consider, in conclusion, the lede to this story in USA Today: "Denied many modern luxuries under Saddam Hussein, Baghdad's consumers welcomed the arrival of cell phone service over the weekend." See the hidden presumption? Political backwardness is linked with a low Gadget Index. Should we include, among the "modern luxuries" that Saddam Hussein denied Iraqis, the Glade Wisp and the Gillette M3Power?
--Iraqi children, in broken English, to British soldiers in Basra, The Boston Globe, November 11, 2003
Life before death
I cannot resist linking to the website for Christian Aid, a fair trade organization in the United Kingdom, because its motto is too good to miss: "We believe in life before death." I found the site via a sermon by Tony Price, which sermon I found via his two posts at Storyteller's World.
Friday, September 24, 2004
A cure for toe gout
I'm headed up to Philadelphia this afternoon, so there probably will not be a new post today. I have some ideas brewing. In the meantime, head over to ::: wood s lot :::, a blog that was nice enough to link to my post on used book stores. There are some fantastic photographs and poems up on the blog today, as well as a link to Daniel Defoe's 1704 "Essay on the Regulation of the Press." Here's an excerpt:
To Cure the ill Use of Liberty, with a Deprivation of Liberty, is like cutting off the Leg to cure the Gout in the Toe, like expelling Poison with too Rank a Poison, where both may struggle which Poison shall prevail, but which soever prevails, the Patient suffers.
If the Exorbitance of some few People in Printing Seditious and Dangerous Books, must Abridge all the Men of Learning in the Nation of their Liberty in Printing, what after exceeding toil and unwearied Pains they are willing to Communicate to Posterity, then who will Study, who will breed up their Children to Letters, when all the Fruits of their Labours are liable to the Blast of the Arbitrary Breath of Mercenary Men.
Thursday, September 23, 2004
Hotter school
You can add to this earlier post some more anecdotal and diverting evidence that schools either are sorely underfunded or have badly administered budgets. Today at the high school where my wife teaches, an elevator and mechanical room caught on fire. The students and teachers were evacuated, and the fire engines were called. When one teacher saw smoke at the end of his hall, he grabbed the nearest fire extinguisher, a logical thing to do. But he was told to desist by a custodian who said, "Those cost too much money to refill. Just wait for the trucks."
Used book stores
"Likewise we ought to read simple and devout books as willingly as learned and profound ones. We ought not to be swayed by the authority of the writer, whether he be a great literary light or an insignificant person, but by the love of simple truth. We ought not to ask who is speaking, but mark what is said." Thomas a Kempis
In a recent post, I somewhat enigmatically said that "I hope to regain and then to retain, until late in life, the spirit I had as an undergraduate in a used book store." Have a look at Paul Musgrave's reflections on undergraduates in bookstores, and be sure to follow the link to the Orwell essay.
Paul wonders about this "spirit" of which I speak. Part of what I'm referring to is a certain nostalgia for that time in my life when everything was education--education in the Henry-Adams sense, not just in the classroom-sense. It is also a nostalgia for place, akin to the nostalgia one feels for college hangouts and favorite coffee shops. There simply are not very many good used book stores in Baltimore, and I miss them. Unlike Orwell, apparently, I also have a tangible nostalgia for the smell of dusty books, for the labyrinthine shelves with books precariously piled on the tops, for the surprise of turning a corner and finding the store's resident cat sitting on top of a discount table.
But the "spirit" I mentioned in my post had to do more with the way I looked at books as an undergraduate, before I acquired the book-buying tics of a graduate student. When I was an undergrad, I really and truly browsed at used book stores. I picked up books just because they looked interesting. I bought and read books by authors I had never heard of, and never will hear of again. I read books because I was looking, with a mixture of trepidation and urgency, for answers to profound problems. I guess I did think of myself, as Paul puts it, as being on an "intellectual adventure."
I'm not saying I don't still have that sense now about books. But academic training can sometimes compete with this sense of "adventure" by giving you detailed maps to the treasure, complete with complex keys and dotted lines. I find books now by following footnotes more often than I do by browsing shelves. (Although I should say that even in this age of digital libraries, I often have great success in finding new books by locating a book I was looking for and then scanning the surrounding volumes. But even then, I have ended up at that set of call numbers for very specific reasons. I've followed some kind of map.)
These maps do not by themselves rob reading of its adventure. But when I pick up a volume in a used shop now, I often judge the book, almost against my will, by its cover. "Who published it?" I hate that I notice immediately whether it is published by a university press. This is the kind of thing that would have seemed incidental to me as an undergraduate, but that matters to me now. "Who wrote it?" As an undergraduate, if I hadn't heard of the author, I would assume--as one should--that this pointed to a deficiency in my knowledge, rather than suggesting some immediate deficiency in the author. But when you're a graduate student, not having heard of the author sends up red flags, again against your will. (If you don't recognize the markings on the ship, maybe it's a pirate--or, gasp, a popular historian.) My undergrad reaction to a new author was the proper one--unabashed curiosity and reserved judgment. My graduate reaction, sad to say, is often automatic suspicion or hasty condemnation. Another symptom of the disease I'm describing is that I often flip quickly to the acknowledgements of a new book--the acknowledgements! When I see myself as an undergraduate in a used book store, I see someone genuinely searching for knowledge from books, not someone looking for acknowledgement(s).
Of course, I have caricatured both my undergraduate and graduate selves. I do retain the "spirit" I had then--to say that I need to "regain" it might have been too strong a word. And I'm sure even as an undergraduate the "spirit" I've been alluding to was sometimes weak. But I do feel that I must actively continue to tear down many of the gate-keeping devices that my graduate student brain has erected. I have to unlearn some things if I want to really learn.
I sometimes hear people describing graduate school as "soul-crushing," either because of the amount of work, or because of the sense of inferiority and anxiety it can breed. But if there is anything potentially "soul-crushing" about graduate education, it is that it can potentially destroy the sense of curiosity and fair-mindedness that genuine readers have about books. I don't intend to let graduate school do that.
P.S. A somewhat related post at Giornale Nuovo, via Cliopatria. Also, there's a wonderful reflection on reading and books at Hoarded Ordinaries, via wood s lot (be sure to scroll down for the picture of "The Bookworm").
Tuesday, September 21, 2004
Climates of opinion
[FAIR WARNING: Pedantic rambling ahead. Proceed at your own risk.]
You rarely find historians these days talking at length about "intellectual climates," but they used to be quite comfortable talking about "climates of opinion." Carl Becker took the title of the first chapter of his 1932 book, The Heavenly City of the Eighteenth-Century Philosophers, from the phrase. The phrase itself, according to Becker, was of seventeenth-century origin. But he was glad to see that it was being "restored to circulation" by historians of ideas.
To give an example of what he meant by a "climate of opinion," Becker used this thought experiment: Suppose, he told his readers, that you suddenly found yourself face-to-face with a resuscitated Dante or Thomas Aquinas. Imagine trying to argue with them, Becker said, about some (then) contemporary issue like the viability of the League of Nations. Dante and Aquinas would doubtlessly make arguments for the League premised on a kind of Christian universalism or on the idea of "natural law." Many of these arguments would have little purchase, though, for twentieth-century interlocutors. The problem would not be that Aquinas and Dante were stupid or their arguments formally invalid; the problem would be that their worldviews are not easily compatible with modern "climates of opinion." So why, then, have historians largely stopped talking about "climates of opinion"?
Well, one reason is that the metaphor itself is faulty. It implies that opinions can be disembodied, that intellectual "worldviews" somehow float above the heads of historical actors like a fog or a layer of ozone. It implies, too, that we can extrapolate the "climate" of an entire period in history merely from the writings of especially visible thinkers like Thomas Aquinas or the eighteenth-century philosophes--this is the "dead white men" problem of traditional intellectual history. The successive arrivals of social and cultural history have rightly cast doubt on the idea that we can infer things about an era's "world pattern" from the writings of a few elites. Finally, historians today probably feel that the climatological metaphor is too structuralist and naturalistic. You can't change the weather, after all, and speaking of ideas as a "climate" makes it seem as though culture can be objectified and made independent of human agency.
Instead of talking about intellectual "climates," most intellectual historians now talk about "discourses," a terminological substitution that has at least two virtues. The metaphor is more modest; rather than proposing a metanarrative about the "world pattern" of an age, it can be used in smaller narratives about the patterns of particular intellectual communities. And at the same time that "discourse" is more modest, it also feels less constraining: it metaphorically gives individual thinkers more control over the shape of their ideas.
Let me elaborate on and obfuscate what I mean. First, talking about "discourse" forces historians to specify a particular community of intellectuals as their subject, instead of talking abstractly about an epochal Weltanschauung. If you're going to be talking about the discourse of elite intellectuals, you say that up front, instead of concealing the sleight of hand that turns the ideas of a few into a "world pattern."
Secondly, "discourse" (especially in the sense meant by French poststructuralist thinkers) conveys something of the constraints placed on thinkers by shared presumptions, while at the same time conveying something of the creativity with which thinkers can interrogate those presumptions. "Climates of opinion," one could argue, was all structure and no play.
Yet lately I've been thinking that "discourse" faces some of the same metaphorical pitfalls that once vexed "climates of opinion." Like all metaphors, its application has limitations.
For example, in its literal meaning, a "discourse" is a conversation--a conversation between members of an intellectual community. But intellectual historians often use the word "discourse" without carefully drawing lines of connection between every interlocutor in that conversation. When I speak of the discourse of "secularism" in the Enlightenment, for instance, do I need to demonstrate that every time a thinker voiced secular ideas, they were doing so in conversation with another thinker? Surely what I want to say, instead, is that secular ideas were ready at hand to Enlightenment intellectuals, that they formed the preconditions for conversation between certain intellectuals rather than always being part of the conversation itself.
And of course, this is often what historians mean when they use the word "discourse." By "discourse," they mean to refer to the presuppositions, the things that could be taken for granted, in conversations between certain thinkers. But in that case, does "discourse" (as a metaphor) really improve on Becker's use of "climates" to refer to the "instinctively held preconceptions" that certain communities shared?
"But don't forget," you might say, "the other problem with the climates metaphor." And I haven't; the real problem with Becker might not be that he spoke of "preconceptions" held in common by intellectuals. The real problem is that he generalized from these preconceptions to speak of the shared beliefs of an entire age. Now, personally, I have my doubts whether Becker really thought any such thing. There was a looseness in his language, granted, but there is a looseness in all metaphors. Besides, historians who would fault "climates of opinion" as an elitist metaphor often do the same kind of generalization with the word "discourse." We talk freely about discourses of race, discourses of gender, discourses of nationalism, etc. etc., without bothering every time to specify the "interlocutors" in those discourses.
By point is that as a metaphor for "culture" or "opinion," "discourse" has flaws of its own that are not wholly dissimilar from the flaws that led to the abandonment of "climates." And other metaphors of culture that are used more frequently, like Clifford Geertz's idea of culture as a "web" has problems too. The most important problem with the "web" metaphor--which suggests that "man is an animal suspended in webs of significance he himself has spun," in Geertz's words--is that it too ignores differentials in power between elite and non-elite thinkers. Culture is spun, indeed, but some people do more of the spinning, while others spend most of their time being suspended in those webs. And as I think Isaiah Berlin said (maybe Jason knows where), freedom for the spider is death for the fly.
All of which is a much too long way of saying, why can't I use "climate of opinion"? Why can't I refer to the fact that some ideas are sort of like the air you breathe? You didn't come up with the ideas; they were there before you; they do surround you in some ways, and you do tend to take them for granted, just like you take for granted that you just took a breath. This doesn't mean you don't have any control over the ideas around you, or that you can't not take them for granted. When I just referred to your taking a breath, you probably thought about your breathing. And if you didn't before, surely you are now. In the same way, we can think deliberately about the climates of opinion around us. We can even, to follow the metaphor a little further, exercise the freedom to hold our breath. But it is somewhat accurate to say that you can't hold your breath forever. There are certain ideas in your head that you probably can't willfully get rid of without the help of amnesia or brain damage.
Yes, the metaphor breaks down. But my point is that all metaphors do. And sometimes it's better to use ones that we know break down, because their breaking down tells us something interesting about the phenomenon we are trying to talk about.
I'll leave you with a final thought (if you haven't left me already). Perhaps "climates of opinion" are safer to use now that we have a different understanding of climate than historians of Becker did. Becker used "climate" as a metaphor because he wanted to refer to relatively stable patterns of thought. But the more we learn about climate (notice, no quotes), the more we discover how volatile and unstable it can be. Witness this hurricane season for instance. If we realize that climates are not unchanging structures, then why are we still afraid to talk about "climates of opinion"?
In a famous essay, for instance, Roger Chartier once took issue with the various metaphors that historian Robert Darnton used to interpret certain bizarre episodes of French culture. Darnton variously speaks of culture as "shared, like the air we breathe," as webs (since he was influenced by teaching at Princeton with Geertz), and as a system of symbols or a discourse. Chartier thinks the problem with all of these metaphors is that they make cultural meanings too stable. For this and other reasons, he writes that "metaphorical use of the vocabulary of linguistics" to describe culture "comports a certain danger." And "it seems risky," he writes, "to claim that symbols are 'shared, like the air we breathe.' Quite to the contrary, their significations are unstable, mobile, equivocal."
But the "air" is also unstable and mobile and unpredictable. So what's wrong with using it as a metaphor? Talking about "climates of opinion" does not prohibit us from speaking of "climate change," or "hurricanes of opinion." On the contrary, speaking of "intellectual climates" might lend itself to talking about an "ecology of thought," in which we study both the stable environment of a "thinking organism" (intellectual) and its own impact on that ecosystem. Using Becker's metaphor in this kind of way might make him roll in his grave, but I am certain there are plenty of things about contemporary historiography that have him doing jumping jacks already.
P.S. Siris has some thoughts on this post. Alas for the fact that Blogger does not support Trackback, and for the fact that I am too lazy to implement Haloscan.
You rarely find historians these days talking at length about "intellectual climates," but they used to be quite comfortable talking about "climates of opinion." In his 1932 book, The Heavenly City of the Eighteenth-Century Philosophers, Carl Becker's first chapter took its title from the phrase. The phrase itself, according to Becker, was of seventeenth-century origin. But he was glad to see that it was being "restored to circulation" by historians of ideas.
To give an example of what he meant by a "climate of opinion," Becker used this thought experiment: Suppose, he told his readers, that you suddenly found yourself face-to-face with a resuscitated Dante or Thomas Aquinas. Imagine trying to argue with them, Becker said, about some (then) contemporary issue like the viability of the League of Nations. Dante and Aquinas would doubtlessly make arguments for the League premised on a kind of Christian universalism or on the idea of "natural law." Many of these arguments would have little purchase, though, for twentieth-century interlocutors. The problem would not be that Aquinas and Dante were stupid or their arguments formally invalid; the problem would be that their worldviews are not easily compatible with modern "climates of opinion." As Becker puts it,
Whether arguments command assent or not depends less upon the logic that conveys them than upon the climate of opinion in which they are sustained. What renders Dante's argument or St. Thomas' definition meaningless to us is not bad logic or want of intelligence, but the medieval climate of opinion--those instinctively held preconceptions in the broad sense, that Weltanschauung or world pattern--which imposed upon Dante and St. Thomas a peculiar use of the intelligence and a special type of logic. To understand why we cannot easily follow Dante or St. Thomas it is necessary to understand (as well as may be) the nature of this climate of opinion. (p. 5)
Among recent generations of professional historians, this kind of talk has been fading into obsolescence. The phrase "climates of opinion" itself is now exceedingly rare. But I'm not sure why we can't use the metaphor of "climates" to talk about intellectual formations. And sometimes, while writing about commonly held ideas in a culture, "climates" is the word in my toolbox that I find myself wanting to reach for. Why can't I pick it up?
Well, one reason is that the metaphor itself is faulty. It implies that opinions can be disembodied, that intellectual "worldviews" somehow float above the heads of historical actors like a fog or a layer of ozone. It implies, too, that we can extrapolate the "climate" of an entire period in history merely from the writings of especially visible thinkers like Thomas Aquinas or the eighteenth-century philosophes--this is the "dead white men" problem of traditional intellectual history. The successive arrivals of social and cultural history have rightly cast doubt on the idea that we can infer things about an era's "world pattern" from the writings of a few elites. Finally, historians today probably feel that the climatological metaphor is too structuralist and naturalistic. You can't change the weather, after all, and speaking of ideas as a "climate" makes it seem as though culture can be objectified and made independent of human agency.
Instead of talking about intellectual "climates," most intellectual historians now talk about "discourses," a terminological substitution that has at least two virtues. The metaphor is more modest; rather than proposing a metanarrative about the "world pattern" of an age, it can be used in smaller narratives about the patterns of particular intellectual communities. And at the same time that "discourse" is more modest, it also feels less constraining: it metaphorically gives individual thinkers more control over the shape of their ideas.
Let me elaborate on and obfuscate what I mean. First, talking about "discourse" forces historians to specify a particular community of intellectuals as their subject, instead of talking abstractly about an epochal Weltanschauung. If you're going to be talking about the discourse of elite intellectuals, you say that up front, instead of concealing the sleight of hand that turns the ideas of a few into a "world pattern."
Secondly, "discourse" (especially in the sense meant by French poststructuralist thinkers) conveys something of the constraints placed on thinkers by shared presumptions, while at the same time conveying something of the creativity with which thinkers can interrogate those presumptions. "Climates of opinion," one could argue, was all structure and no play.
Yet lately I've been thinking that "discourse" faces some of the same metaphorical pitfalls that once vexed "climates of opinion." Like all metaphors, its application has limitations.
For example, in its literal meaning, a "discourse" is a conversation--a conversation between members of an intellectual community. But intellectual historians often use the word "discourse" without carefully drawing lines of connection between every interlocutor in that conversation. When I speak of the discourse of "secularism" in the Enlightenment, for instance, do I need to demonstrate that every time a thinker voiced secular ideas, they were doing so in conversation with another thinker? Surely what I want to say, instead, is that secular ideas were ready at hand to Enlightenment intellectuals, that they formed the preconditions for conversation between certain intellectuals rather than always being part of the conversation itself.
And of course, this is often what historians mean when they use the word "discourse." By "discourse," they mean to refer to the presuppositions, the things that could be taken for granted, in conversations between certain thinkers. But in that case, does "discourse" (as a metaphor) really improve on Becker's use of "climates" to refer to the "instinctively held preconceptions" that certain communities shared?
"But don't forget," you might say, "the other problem with the climates metaphor." And I haven't; the real problem with Becker might not be that he spoke of "preconceptions" held in common by intellectuals. The real problem is that he generalized from these preconceptions to speak of the shared beliefs of an entire age. Now, personally, I have my doubts whether Becker really thought any such thing. There was a looseness in his language, granted, but there is a looseness in all metaphors. Besides, historians who would fault "climates of opinion" as an elitist metaphor often do the same kind of generalization with the word "discourse." We talk freely about discourses of race, discourses of gender, discourses of nationalism, etc. etc., without bothering every time to specify the "interlocutors" in those discourses.
My point is that, as a metaphor for "culture" or "opinion," "discourse" has flaws of its own that are not wholly dissimilar from the flaws that led to the abandonment of "climates." And other metaphors of culture that are used more frequently, like Clifford Geertz's idea of culture as a "web," have problems too. The most important problem with the "web" metaphor--which suggests that "man is an animal suspended in webs of significance he himself has spun," in Geertz's words--is that it too ignores differentials in power between elite and non-elite thinkers. Culture is spun, indeed, but some people do more of the spinning, while others spend most of their time being suspended in those webs. And as I think Isaiah Berlin said (maybe Jason knows where), freedom for the spider is death for the fly.
All of which is a much too long way of saying, why can't I use "climate of opinion"? Why can't I refer to the fact that some ideas are sort of like the air you breathe? You didn't come up with the ideas; they were there before you; they do surround you in some ways, and you do tend to take them for granted, just like you take for granted that you just took a breath. This doesn't mean you have no control over the ideas around you, or that you are condemned to take them for granted. When I just referred to your taking a breath, you probably thought about your breathing. And if you weren't before, surely you are now. In the same way, we can think deliberately about the climates of opinion around us. We can even, to follow the metaphor a little further, exercise the freedom to hold our breath. But it is also accurate to say that you can't hold your breath forever. There are certain ideas in your head that you probably can't willfully get rid of without the help of amnesia or brain damage.
Yes, the metaphor breaks down. But my point is that all metaphors do. And sometimes it's better to use ones that we know break down, because their breaking down tells us something interesting about the phenomenon we are trying to talk about.
I'll leave you with a final thought (if you haven't left me already). Perhaps "climates of opinion" are safer to use now that we have a different understanding of climate than historians of Becker's day did. Becker used "climate" as a metaphor because he wanted to refer to relatively stable patterns of thought. But the more we learn about climate (notice, no quotes), the more we discover how volatile and unstable it can be. Witness this hurricane season, for instance. If we realize that climates are not unchanging structures, then why are we still afraid to talk about "climates of opinion"?
In a famous essay, for instance, Roger Chartier once took issue with the various metaphors that historian Robert Darnton used to interpret certain bizarre episodes of French culture. Darnton variously speaks of culture as "shared, like the air we breathe," as webs (since he was influenced by teaching at Princeton with Geertz), and as a system of symbols or a discourse. Chartier thinks the problem with all of these metaphors is that they make cultural meanings too stable. For this and other reasons, he writes that "metaphorical use of the vocabulary of linguistics" to describe culture "comports a certain danger." And "it seems risky," he writes, "to claim that symbols are 'shared, like the air we breathe.' Quite to the contrary, their significations are unstable, mobile, equivocal."
But the "air" is also unstable and mobile and unpredictable. So what's wrong with using it as a metaphor? Talking about "climates of opinion" does not prohibit us from speaking of "climate change," or "hurricanes of opinion." On the contrary, speaking of "intellectual climates" might lend itself to talking about an "ecology of thought," in which we study both the stable environment of a "thinking organism" (intellectual) and its own impact on that ecosystem. Using Becker's metaphor in this kind of way might make him roll in his grave, but I am certain there are plenty of things about contemporary historiography that have him doing jumping jacks already.
P.S. Siris has some thoughts on this post. Alas for the fact that Blogger does not support Trackback, and for the fact that I am too lazy to implement Haloscan.
Sunday, September 19, 2004
Things stoic
Paul Musgrave inspired me to pull out an old copy of Epictetus's Discourses, which I remember picking up at a used book store while I was an undergraduate. I hope to regain and then to retain, until late in life, the spirit I had as an undergraduate in a used book store. Here are some things Epictetus said:
What things are to be learned, in order to know how to conduct an argument, the philosophers of our sect have accurately taught; but we are altogether unpracticed in the proper application of them. Only give to any one of us whom you will some illiterate person for an antagonist, and he will not find out how to treat him. But when he has moved the man a little, if he happens to answer at cross purposes, the questioner does not know how to deal with him any further, but either reviles or laughs at him, and says: "He is an illiterate fellow; there is no making anything of him." Yet a guide, when he perceives his charge going out of the way, does not revile and ridicule and then leave him, but leads him into the right path. Do you also show your antagonist the truth, and you will see that he will follow. But till you show it, do not ridicule him; but rather recognize your own incapacity. ...
These reasonings have no logical connection: "I am richer than you; therefore I am your superior." "I am more eloquent than you; therefore I am your superior." The true logical connection is rather this: "I am richer than you; therefore my possessions must exceed yours." "I am more eloquent than you; therefore my style must surpass yours." But you, after all, consist neither in property nor in style. ...
When any person treats you badly, or speaks ill of you, remember that he acts or speaks from an impression that it is right for him to do so. Now, it is not possible that he should follow what appears right to you, but only what appears so to himself. Therefore, if he judges from false appearances, he is the person hurt; since he too is the person deceived. For if anyone takes a true proposition to be false, the proposition is not hurt, but only the man is deceived. ...
More on war
"Is that you, my brother? Is that you?" -- An Iraqi civilian to his injured brother, shortly before being shot and killed by an American helicopter
"The war horse is a vain hope for victory, and by its great might it cannot save." Psalms 33:17 (NRSV)
War is hell. On this point, most people agree, especially after wars have begun or after they have ended. And yet wars continue. Why? Many explanations are given; alleged justifications for war (and hell) abound. But my question is not one about arcane theories of "just war." It is simpler than that: why do people who would quickly agree that war is hell if it touched them directly nonetheless accept its continuation?
The answer is partly psychological, but it also has to do with a powerful repertoire of metaphors and assumptions that people use to romanticize war and lend it legitimacy. Most people who believe in war do not rationalize their belief with complicated theories about justice and history. Rather, they reason in favor of war by appeal to imagined commonalities between large-scale conflicts and smaller acts of violence, acts that are less likely to be questioned. For example, whenever a strict pacifist or principled advocate of nonviolence talks for long with someone who believes in war, she will inevitably be faced with some variation on a "self-defense" metaphor, which goes something like this. What if someone with a shotgun broke into your house and pointed the gun at your child? Wouldn't you be justified in acting violently--in killing to save? The emotional power of this scene is so overwhelming that the defender of war does not need careful logic to bring the point home. That's what war is, he can argue--killing to save.
I find the shotgun story troubling and compelling. It highlights the rhetorical power of "self-defense" as a rationale for violence. And it spotlights the visceral nature of violent acts. No amount of calm debate about the moral status of violence could possibly prepare one for the horrifying experience of looking down the barrel of a shotgun in the presence of one's child. As someone who leans very hard in the direction of pacifism, I am still suspicious of pacifists who say they know how they would act in that situation, even if they have beliefs about what they should do.
But I am also suspicious of the leap from this "self-defense" metaphor to justifications for war. The metaphor works as a defense if and only if war respects the parable's pellucid boundaries between the aggressor and the victim, the guilty and the innocent, the attacker and the defender. The implication is that we can sometimes go to war with the same confidence and clarity with which a father would defend his son from an armed intruder. In fact, however, war is hell precisely because it seldom (if ever) offers this kind of clarity. And when it does, it often clearly reverses the story's roles. The metaphorical "father" who goes to war to defend the metaphorical "child" ends up, in reality, being the one pointing a shotgun at another father's children.
This lack of clarity in war is especially evident in Iraq. Last week, American helicopters fired on a crowd that included unarmed civilians, killing or injuring dozens, many of them young boys. The crowd was swarming around a burning Bradley armored vehicle. A military statement identified the crowd with "anti-Iraqi" forces who might have looted the Bradley, and said that the helicopters fired to prevent the "loss of sensitive equipment and weapons."
Do you see what has happened here? Who is the proverbial "parent" in this war story? Who is the proverbial, unprotected "child"? Who is the intruder holding the "shotgun"? These questions cannot be unproblematically answered. At this point, the defender of war will clutch the metaphor, try to prevent it from slipping away. "But someone in the crowd draped a terrorist flag on the Bradley ..." or "But there was some small-arms fire coming from the general vicinity ..." Go ahead, grope for the parallels. They simply are not there. Look at the picture of the wounded pre-teenagers, and tell yourself that they are terrorists. But they are not, though they may be now. Tell yourself that "small arms fire" pointed at an armored helicopter is the same as the shotgun pointed at the child. But it is not.
In fact, if you want to map the metaphor to the reality, to achieve a perfect one-to-one symmetry between what happened and the parable of the parent, it would have to be this. The "child" being protected here was a Bradley vehicle's "sensitive equipment." The "persons holding the shotgun" were children. Or were the "persons holding the shotgun" the helicopters? The metaphors crumble like houses of cards.
Yet the metaphors continue to stand up for many people. For every clear proof that war is a hellish fog, in which one's "duty" can be to kill in order to protect weapons, there is some other case that makes the metaphor seem right. This is why the American media reports every instance of a kidnapped American or European, but glosses over Iraqi civilian deaths by burying them in body counts. Images of hostages correspond more exactly to the metaphor: the hostages are innocent, the terrorists are shotgun-toting villains, and we are the preternaturally enraged parents. But of course, images of Iraqi civilian deaths also correspond well to the metaphor, except that the children are Iraqis, and "we" are sometimes the ones with the shotguns.
It never ceases to amaze me that in the media hype over Fahrenheit 9/11, reviewers returned again and again to "My Pet Goat" as the most disturbing and damning scene. For me, it was infinitely more disturbing to watch American soldiers, equipped with night-vision goggles, raid an Iraqi home and point their machine guns squarely at the head of a terrified child. Or the scene in which an Iraqi woman wails to Allah about the destruction that has befallen her because of an American air strike. But then, I know why these scenes do not garner attention: because they too deeply challenge the tidiness of our war metaphors. Or rather, they are tidy examples of the metaphor, but with Americans playing the wrong roles.
As lamentable as it is that Americans screen out these images that disturb their metaphors, it is especially disturbing to me that American Christians do this. Christians who support war sometimes do not even trouble themselves with the parable of the parent. They have recourse to an easier and even more glib justification: war happens, they say. It's in the Bible. Right there in the "Old Testament." God willed it. So be it. War happens.
Earlier this summer, I remember standing on a street here in Baltimore, passing out flyers against the war in Iraq. A very friendly woman politely explained to me that there has always been war, and always will be. She believed in the Bible, she told me, and Israel fought wars. There was nothing we could do about war, she argued. It had been around forever.
I wish I had been more articulate in my response. I wish that I had pointed out that the same argument was offered in defense of slavery--it's in the Bible, and it's been around forever. ("Writers, by nature, tend to be people [who] ... are always thinking of the perfect riposte after the moment for saying it has passed. So they take a few years longer and put it in print," wrote Louis Menand. I haven't even waited a few years.) But I wish I had also said that the Bible's narratives and commentaries on war also fail to support the simplicity of the parable.
How to deal with ancient Israel's conquest narratives is a sticky theological and historical question, and I don't want to get stuck there now. What I want to point out is that alongside the Bible's chronicles of war, there are very incongruous statements about its futility and waste. There is not a consistent glorification of battle and valor, of the sort you might find in Homer's Iliad. Far from providing extended paeans to flashing helmets and bronze shields, the Bible often chastises Israel for placing faith in chariots, or for seeking military alliances. Israel is told by its liturgical songs not to trust in mighty armies or in war horses, not to rely on military prowess to save. Chariots are usually identified as belonging to the enemy--to Pharaoh, to the Canaanites. And yet, American Christians--while thumping the Bible--uncritically valorize our war horses with an unthinking fatalism. Now, we not only trust in "chariots," but are willing to kill in order to protect their "sensitive equipment."
That's what happens in war. Glib metaphors do not work. You can talk about defending your child from the guy with the shotgun, but in war you'll end up killing to protect the shotgun. In war, you'll end up being the one who points the shotgun at the child. In war, the man pointing the shotgun at you will be the parent whose child you have killed. And in war, you might kill the villain only to end up pointing the shotgun at the one you set out to save. So do not use metaphors--do not use my emotional intuitions--to dismiss pacifism. There seems to be a presumption in our culture that the pacifists are the bleary-eyed utopians, the ones who cannot see things clearly. It seems to me that the opposite is true: it is those who believe that war is an act of self-defense who are fleeing from its realities and pulling the metaphor over their eyes.
Perhaps, however, the metaphorical defender of this war in Iraq will return to the ultimate argument, and urge me to remember September 11, 2001. But even here, the metaphor does not work as well as you want it to. The group around the Bradley vehicle is not identical with the group who massacred Americans on that day. And even if they were, by what moral calculus do deaths cancel out deaths? In the first official accounting of the number of civilian deaths in Iraq, the Iraqi Ministry of Health has reported, based largely on hospital tallies, that "3,186 Iraqi civilians, men, women and children, [have] died as a result of either terrorist incidents or in clashes involving US-led multinational forces." And that is only since April of this year.
Can you defend those deaths with hypothetical situations? Can you draw lines of causation that make them any more justified than the deaths of men, women and children on September 11? War confounds causation, and it refutes hypotheses. War is, simply, hell.
"The war horse is a vain hope for victory, and by its great might it cannot save." Psalms 33:17 (NRSV)
War is hell. On this point, most people agree, especially after wars have begun or after they have ended. And yet wars continue. Why? Many explanations are given; alleged justifications for war (and hell) abound. But my question is not one about arcane theories of "just war." It is simpler than that: why do people who would quickly agree that war is hell if it touched them directly, nonetheless accept its continuation?
The answer is partly psychological, but it also has to do with a powerful repertoire of metaphors and assumptions that people use to romanticize war and lend it legitimacy. Most people who believe in war do not rationalize their belief with complicated theories about justice and history. Rather, they reason in favor of war by appeal to imagined commonalities between large-scale conflicts and smaller acts of violence, acts that are less likely to be questioned. For example, whenever a strict pacifist or principled advocate of nonviolence talks for long with someone who believes in war, she will inevitably be faced with some variation on a "self-defense" metaphor, which goes something like this. What if someone with a shotgun broke into your house and pointed the gun at your child? Wouldn't you be justified in acting violently--in killing to save? The emotional power of this scene is so overweening that the defender of war does not need careful logic to bring the point home. That's what war is, he can argue--killing to save.
I find the shotgun story troubling and compelling. It highlights the rhetorical power of "self-defense" as a rationale for violence. And it spotlights the visceral nature of violent acts. No amount of calm debate about the moral status of violence could possibly prepare one for the horrifying experience of looking down the barrel of a shotgun in the presence of one's child. As someone who leans very hard in the direction of pacifism, I am still suspicious of pacifists who say they know how they would act in that situation, even if they have beliefs about what they should do.
But I am also suspicious of the leap from this "self-defense" metaphor to justifications for war. The metaphor works as a defense if and only if war respects the parable's pellucid boundaries between the aggressor and the victim, the guilty and the innocent, the attacker and the defender. The implication is that we can sometimes go to war with the same confidence and clarity with which a father would defend his son from an armed intruder. In fact, however, war is hell precisely because it seldom (if ever) offers this kind of clarity. And when it does, it often clearly reverses the story's roles. The metaphorical "father" who goes to war to defend the metaphorical "child" ends up, in reality, being the one pointing a shotgun at another father's children.
This lack of clarity in war is especially clear in Iraq. Last week, American helicopters fired on a crowd that included unarmed civilians, killing or injuring dozens of civilians, many young boys. The crowd was swarming around a burning Bradley armored vehicle. A military statement identified the crowd with "anti-Iraqi" forces who might have looted the Bradley, and said that helicopters defended the "loss of sensitive equipment and weapons."
Do you see what has happened here? Who is the proverbial "parent" in this war story? Who is the proverbial, unprotected "child"? Who is the intruder holding the "shotgun"? These questions cannot be unproblematically answered. At this point, the defender of war will clutch the metaphor, try to prevent it from slipping away. "But someone in the crowd draped a terrorist flag on the Bradley ..." or "But there was some small-arms fire coming from the general vicinity ..." Go ahead, grope for the parallels. They simply are not there. Look at the picture of the wounded pre-teenagers, and tell yourself that they are terrorists. But they are not, though they may be now. Tell yourself that "small arms fire" pointed at an armored helicopter is the same as the shotgun pointed at the child. But it is not.
In fact, if you want to map the metaphor to the reality, to achieve a prefect one-to-one symmetry between what happened and the parable of the parent, it would have to be this. The "child" being protected here was a Bradley vehicle's "sensitive equipment." The "persons holding the shotgun" were children. Or were the "persons holding the shotgun" the helicopters? The metaphors crumble like houses of cards.
Yet the metaphors continue to stand up for many people. For every clear proof that war is a hellish fog, in which one's "duty" can be to kill in order to protect weapons, there is some other case that makes the metaphor seem right. This is why the American media reports every instance of a kidnapped American or European, but glosses over Iraqi civilian deaths by burying them in body counts. Images of hostages correspond more exactly to the metaphor: the hostages are innocent, the terrorists are shotgun-toting villains, and we are the preternaturally enraged parents. But of course, images like these also correspond well to the metaphor, except the children are Iraqis, and "we" are sometimes the ones with the shotguns.
It never ceases to amaze me that in the media hype over Fahrenheit 911, reviewers returned again and again to "My Pet Goat" as the most disturbing and damning scene. For me, it was infinitely more disturbing to watch American soldiers, equipped with night goggles, raid an Iraqi home and point their machine guns squarely at the head of a terrified child. Or the scene in which an Iraqi woman wails to Allah about the destruction that has befallen her because of an American air strike. But then, I know why these scenes do not garner attention: because they too deeply challenge the tidiness of our war metaphors. Or rather, they are tidy examples of the metaphor, but with Americans playing the wrong roles.
As lamentable as it is that Americans screen out these images that disturb their metaphors, it is especially disturbing to me that American Christians do this. Christians who support war sometimes do not even trouble themselves with the parable of the parent. They have recourse to an easier and even more glib justification: war happens, they say. It's in the Bible. Right there in the "Old Testament." God willed it. So be it. War happens.
Earlier this summer, I remember standing on a street here in Baltimore, passing out flyers against the war in Iraq. A very friendly woman politely explained to me that there has always been war, and always will be. She believes in the Bible, she tells me, and Israel fought wars. There's nothing we can do about war, she argues. It's been around forever.
I wish I had been more articulate in my response. I wish that I had pointed out that the same argument was offered in defense of slavery--it's in the Bible, and it's been around forever. ("Writers, by nature, tend to be people [who] ... are always thinking of the perfect riposte after the moment for saying it has passed. So they take a few years longer and put it in print," wrote Louis Menand. I haven't even waited a few years.) But I wish I had also said that the Bible's narratives and commentaries on war also fail to support the simplicity of the parable.
How to deal with ancient Israel's conquest narratives is a sticky theological and historical question, and I don't want to get stuck there now. What I want to point out is that alongside the Bible's chronicles of war, there are very incongruous statements about its futility and waste. There is not a consistent glorification of battle and valor, of the sort you might find in Homer's Iliad. Far from providing extended paeans to flashing helmets and bronze shields, Israel is often chastised for placing faith in chariots, or seeking military alliances. Israel is told by its liturgical songs not to trust in mighty armies or in war horses, not to rely on military prowess to save. Chariots are usually identified as belonging to the enemy--to Pharaoh, to the Canaanites. And yet, American Christians--while thumping the Bible--uncritically valorize our war horses with an unthinking fatalism. Now, we not only trust in "chariots," but are willing to kill in order to protect their "sensitive equipment."
That's what happens in war. Glib metaphors do not work. You can talk about defending your child from the guy with the shotgun, but in war you'll end up killing to protect the shotgun. In war, you'll end up being the one who points the shotgun at the child. In war, the man pointing the shotgun at you will be the parent whose child you have killed. And in war, you might kill the villain only to end up pointing the shotgun at the one you set out to save. So do not use metaphors--do not use my emotional intuitions--to dismiss pacifism. There seems to be a presumption in our culture that the pacifists are the bleary-eyed utopians, the ones who cannot see things clearly. It seems to me that the opposite is true: it is those who believe that war is an act of self-defense who are fleeing from its realities and pulling the metaphor over their eyes.
Perhaps, however, the metaphorical defender of this war in Iraq will return to the ultimate argument, and urge me to remember September 11, 2001. But even here, the metaphor does not work as well as you want it to. The group around the Bradley vehicle is not identical with the group who massacred Americans on that day. And even if they were, by what moral calculus do deaths cancel out deaths? In the first official accounting of the amount of civilian deaths in Iraq, the Iraqi Ministry of Health has reported, based largely on hospital tallies, that "3,186 Iraqi civilians, men, women and children, [have] died as a result of either terrorist incidents or in clashes involving US-led multinational forces." And that is only since April of this year.
Can you defend those deaths with hypothetical situations? Can you draw lines of causation that make them any more justified than the deaths of men, women and children on September 11? War confounds causation, and it refutes hypotheses. War is, simply, hell.
Friday, September 17, 2004
The way it is
From the Times this morning, "Iraq Study Finds Desire for Arms, but Not Capacity":
A new report on Iraq's illicit weapons program is expected to conclude that Saddam Hussein's government had a clear intent to produce nuclear, chemical and biological weapons if United Nations sanctions were lifted, government officials said Thursday. But, like earlier reports, it finds no evidence that Iraq had begun any large-scale program for weapons production by the time of the American invasion last year, the officials said. ...
Mr. Bush, who warned before the war that Iraq's illicit weapons posed an urgent threat to the United States, now generally describes Iraq as having been a "gathering threat," a phrase he has used at least 11 times since Aug. 12. In a Sept. 9 campaign speech, Mr. Bush told voters in Ohio: "Remember, Saddam Hussein had the capability of making weapons; he could have passed that capability on to the enemy."
From What Is To Be Done? (1863), by Nikolai Chernyshevsky:
Isn't that always the way it is: if a person's inclined to look for something, he finds it wherever he looks. Even if there is no trace of it, he still finds clear evidence. Even if there's not even a shadow, still he sees not only a shadow of what he's looking for but everything he's looking for.
Teaching dreams
I've been struggling with writer's insomnia lately. I lie down in bed, and my head keeps swimming with phrases and footnotes that I need to put in my dissertation. Visions of abolitionists dance through my brain. Then, when I finally do fall asleep a few hours later, I have "teaching dreams." My wife warned me about these before the semester started.
Last night I dreamed that I had, for reasons I could not remember, assigned Clifford Geertz's The Interpretation of Cultures to my class without specifying any particular pages or essays. Disaster, of course, ensued. Plus, in the dream, my dissertation advisor happened to be observing the class. Go ahead and "thickly describe" that. (Greg, I blame you for putting Geertz back in my subconscious since I've noticed that he's on your "Recently Read" list.)
Wednesday, September 15, 2004
Jazz pilgrimages
For my birthday last month, I received several new jazz CDs. Two of them are live records, one by Joe Lovano (I'm All for You) and the other by Greg Osby (Public). Both albums make me happy. They bring back memories of this past June, when I heard both Lovano and Osby live in New York.
I was in the City doing research for my dissertation at some of the numerous archives there, and subletting a place near Columbia. Doing research in a great city is one of the joys of graduate study. On the one hand, it can be tiring to spend all day cloistered in a reading room, poring over dusty stuff and trying to read terrible handwriting. (To save paper, early nineteenth-century correspondents often wrote in an extremely small hand. After filling a page of translucent tissue paper, they would turn the paper ninety degrees and then fill the page the other direction. This is a recipe for illegibility. And this is not just a historian's complaint. Sydney Howard Gay, who for a time edited the National Anti-Slavery Standard, told one of the paper's regular columnists, "I don't wonder that you complain of [the] sad work we sometimes make of your mss. but I marvel that we do as well as we do. We can't afford to employ [the] best compositors & I am mainly my own proofreader, & you certainly write a terrible hand. I pray you avoid thin paper, even at [the] expense of more postage ..." While reading this, I found myself wishing Gay's correspondent had listened.) But for the most part, digging through the archives is one of the best things about being a historian. It brings out the antiquarian in me.
The other great thing about working in archives is that they close. When you are at home, it is hard to make yourself stop working. There are no "after hours" when there are more pages of the dissertation to be written, and always more books and articles to be read. But the archives close. You can walk out the door feeling that you have put in a solid day of work. You are free to indulge yourself in leisurely activities. And for me, being in New York after the archives closed meant indulging in jazz.
There are so few places left in the world where you can hop on the subway, head in a general direction, and be sure to find some great jazz being played on any given night. Even in New York, sets are less spontaneous than they once were. You don't simply meander into jazz clubs anymore and stay until dawn; sets are scheduled, gigs are arranged, prices are fixed. And prices are high. Most of the premier clubs now have cover charges ranging from $25 to $35, in addition to drink and food minimums. And those prices are per set. If you want to stay for a second or third set, you have to pay the minimum again in most places. But even so, there is no place to hear live jazz like New York.
So first I went to hear Joe Lovano playing at the Iridium Jazz Club. I had been to the Iridium before to hear McCoy Tyner playing with Al Foster and George Mraz. If I had been in a critical mood on either evening, I could have dwelt on the fact that the Iridium charges an arm and a leg for what feel like extremely short sets. Situated right off Times Square, with a large neon sign that screams "JAZZ," it also beckons tourists who want the New York jazz experience, who wander in without any idea about who is playing. But I was not in a critical mood. There are worse problems than accidental tourists chancing to hear McCoy Tyner play the piano. And besides, on both occasions, I knew who I was there to see.
Lovano was playing in support of I'm All For You, on which he is surrounded by a truly all-star band--the legendary Hank Jones on piano, Paul Motian on drums, and George Mraz again on bass. Jones is 86 years old and still exudes genius. As he introduces the band, Lovano seems to grope for words to describe how he feels to be on the stage with Jones in particular, and with Motian and Mraz to boot. It doesn't feel right to call them just a "rhythm section," he says. They proceed to play a truly exquisite set of ballads, capped off by a nice rendition of John Coltrane's "Countdown." Jones rattles off line after line of just-right melodies, every once in a while glancing over at his wife, seated at a table by the stage. They exchange nods or nostalgic laughs at certain voicings or chords. It is wonderful to hear and behold.
The couple sitting next to me agrees. We strike up a conversation before the music starts. He is a pianist himself, graduated from the New School and now gigging in Philadelphia while his wife attends art school. It is the second and final set of the night, but they have stayed over from the first set. They drove up from Philadelphia because he had a gig in the City. Earlier in the day, they tell me, they received a parking ticket because their car's nose stuck slightly into a bus stop outside the Metropolitan Museum of Art. The ticket cancelled out everything he had made at the gig, which was going to pay for the trip. At the first set at the Iridium, they ate dinner. Now, just to make the food minimum for the second set, they have ordered two pieces of cheesecake, an order of french fries, and a soft drink. But as I watch the way he studiously observes Jones, sitting extremely still throughout the set, I can tell he believes it was worth coming all the way from Philadelphia to see. And it was.
The Philly pianist asks if I play. No, I say. I dabble at the piano, but mainly I am a listener. He is surprised and says so. Usually when he sees guys his age in jazz clubs, they are all musicians. It seems to boost his spirits to think there are still non-musician jazz fans in existence, and who are able to identify Hank Jones. There is a part of me that sometimes wishes I were a musician, that I could play jazz. But usually, I love being a listener who is only slightly educated about music. I relish the mystery, and I don't want to be disenchanted. When Joe Lovano nods at something Hank Jones "says," I don't always know what was there to nod about. But for me, that's part of the allure of the music I love. I don't want to pull back the curtain and see how the magic is made.
I guess that makes me somewhat of a true believer when it comes to jazz. If so, my next big jazz outing in New York was something of a pilgrimage. I made my way down to the Village Vanguard, the greatest jazz club still in existence, where I heard Greg Osby's quartet playing in support of Public. It is an unassuming place. I'm there early and manage to take a table right next to the piano bench. The red carpet is worn, and on the walls are cheaply framed photographs of Mingus and Coltrane and Bill Evans. I think about Trane and Eric Dolphy in the space where I am sitting, or Paul Motian recording alongside Bill Evans and bassist Scott LaFaro, on a Sunday just before LaFaro's tragic death. It brings out the antiquarian in me.
Osby's quartet is excellent, although on the night I am there, the young piano sensation Megumi Yonezawa is the only other representative from the cast of Public. Osby is fast becoming the Art Blakey of this generation. He has a tremendous ability to find outstanding young jazz musicians, and an even more extraordinary willingness to bring them into his band. The drummer for the night is, I think, a student at the New School. You can tell how eager he is to be backing Osby at the Vanguard. After the first cut, I notice Osby discreetly go over and motion with his hand. Keep it down, take it easy, he says with a gesture. The drummer smiles and shines for the rest of the evening. Osby, like Lovano, moves off the stage while the other musicians solo; in the shadows I can see him nodding every once in a while, approvingly. I also approve, but for reasons that are mysterious even to me. I'd like to keep the reasons mysterious.
After the set, I linger in the space, look closely at all the photographs. The band goes backstage (which is actually not behind the stage at the Vanguard) for a few minutes but then returns to mingle. I pass Yonezawa, and I tell her the same thing that I told Paul Motian when he bounded past me on the stairs at the Iridium. "Thank you." As a grateful listener and a jazz pilgrim, that's the most coherent thing I can think to say.
I was in the City doing research for my dissertation at some of the numerous archives there, and subletting a place near Columbia. Doing research in a great city is one of the joys of graduate study. On the one hand, it can be tiring to spend all day cloistered in a reading room, poring over dusty stuff and trying to read terrible handwriting. (To save paper, early nineteenth-century correspondents often wrote in an extremely small hand. After filling a page of transluscent tissue paper, they would turn the paper ninety degrees and then fill the page the other direction. This is a recipe for illegibility. And this is not just a historian's complaint. Sydney Howard Gay, who for a time edited the National Anti-Slavery Standard, told one of the paper's regular columnists, "I don't wonder that you complain of [the] sad work we sometimes make of your mss. but I marvel that we do as well as we do. We can't afford to employ [the] best compositors & I am mainly my own proofreader, & you certainly write a terrible hand. I pray you avoid thin paper, even at [the] expense of more postage ..." While reading this, I found myself wishing Gay's correspondent had listened.) But for the most part, digging through the archives is one of best things about being a historian. It brings out the antiquarian in me.
The other great thing about working in archives is that they close. When you are at home, it is hard to make yourself stop working. There are no "after hours" when there are more pages of the dissertation to be written, and always more books and articles to be read. But the archives close. You can walk out the door feeling that you have put in a solid day of work. You are free to indulge yourself in leisurely activities. And for me, being in New York after the archives closed meant indulging in jazz.
There are so few places left in the world where you can hop on the subway, head in a general direction, and be sure to find some great jazz being played on any given night. Even in New York, sets are less spontaneous than they once were in jazz history. You don't simply meander into jazz clubs anymore and stay until dawn; sets are scheduled, gigs are arranged, prices are fixed. And prices are high. Most of the premier clubs now have cover charges ranging from $25 to $35, in addition to drink and food minimums. And those prices are per set. If you want to stay for a second or third set, you have to pay the minimum again in most places. But even so, there is no place to hear live jazz like New York.
So first I went to hear Joe Lovano playing at the Iridium Jazz Club. I had been to the Iridium before to hear McCoy Tyner playing with Al Foster and George Mraz. If I had been in a critical mood on either evening, I could have dwelt on the fact that the Iridium charges an arm and a leg for what feel like extremely short sets. Situated right off Times Square, with a large neon sign that screams "JAZZ," it also beckons tourists who want the New York jazz experience, who wander in without any idea about who is playing. But I was not in a critical mood. There are worse problems than accidental tourists chancing to hear McCoy Tyner play the piano. And besides, on both occasions, I knew who I was there to see.
Lovano was playing in support of I'm All For You, on which he is surrounded by a truly all-star band--the legendary Hank Jones on piano, Paul Motian on drums, and George Mraz again on bass. Jones is 86 years old and still exudes genius. As he introduces the band, Lovano seems to grope for words to describe how he feels to be on the stage with Jones in particular, and with Motian and Mraz to boot. It doesn't feel right to call them just a "rhythm section," he says. They proceed to play a truly exquisite set of ballads, capped off by a nice rendition of John Coltrane's "Countdown." Jones rattles off line after line of just-right melodies, every once in a while glancing over at his wife, seated at a table by the stage. They exchange nods or nostalgic laughs at certain voicings or chords. It is wonderful to hear and behold.
The couple sitting next to me agrees. We strike up a conversation before the music starts. He is a pianist himself, graduated from the New School and now gigging in Philadelphia while his wife attends art school. It is the second and final set of the night, but they have stayed over from the first set. They drove up from Philadelphia because he had a gig in the City. Earlier in the day, they tell me, they received a parking ticket because their car's nose stuck slightly into a bus stop outside the Metropolitan Museum of Art. The ticket cancelled out everything he had made at the gig, which was going to pay for the trip. At the first set at the Iridium, they ate dinner. Now, just to make the food minimum for the second set, they have ordered two pieces of cheescake, an order of french fries, and a soft drink. But as I watch the way he studiously observes Jones, sitting extremely still throughout the set, I can tell he believes it was worth coming all the way from Philadelphia to see. And it was.
The Philly pianist asks if I play. No, I say. I dabble at the piano, but mainly I am a listener. He is surprised and says so. Usually when he sees guys his age in jazz clubs, they are all musicians. It seems to boost his spirits to think there are still non-musician jazz fans in existence who can identify Hank Jones. There is a part of me that sometimes wishes I were a musician, that I could play jazz. But usually, I love being a listener who is only slightly educated about music. I relish the mystery, and I don't want to be disenchanted. When Joe Lovano nods at something Hank Jones "says," I don't always know what was there to nod about. But for me, that's part of the allure of the music I love. I don't want to pull back the curtain and see how the magic is made.
I guess that makes me somewhat of a true believer when it comes to jazz. If so, my next big jazz outing in New York was something of a pilgrimage. I made my way down to the Village Vanguard, the greatest jazz club still in existence, where I heard Greg Osby's quartet playing in support of Public. It is an unassuming place. I'm there early and manage to take a table right next to the piano bench. The red carpet is worn, and on the walls are cheaply framed photographs of Mingus and Coltrane and Bill Evans. I think about Trane and Eric Dolphy in the space where I am sitting, or Paul Motian recording alongside Bill Evans and bassist Scott LaFaro, on a Sunday just before LaFaro's tragic death. It brings out the antiquarian in me.
Osby's quartet is excellent, although on the night I am there, the young piano sensation Megumi Yonezawa is the only other representative from the cast of Public. Osby is fast becoming the Art Blakey of this generation. He has a tremendous ability to find outstanding young jazz musicians, and an even more extraordinary willingness to bring them into his band. The drummer for the night is, I think, a student at the New School. You can tell how eager he is to be backing Osby at the Vanguard. After the first cut, I notice Osby discreetly go over and motion with his hand. Keep it down, take it easy, he says with a gesture. The drummer smiles and shines for the rest of the evening. Osby, like Lovano, moves off the stage while the other musicians solo; in the shadows I can see him nodding every once in a while, approvingly. I also approve, but for reasons that are mysterious even to me. I'd like to keep the reasons mysterious.
After the set, I linger in the space, look closely at all the photographs. The band goes backstage (which is actually not behind the stage at the Vanguard) for a few minutes but then returns to mingle. I pass Yonezawa, and I tell her the same thing that I told Paul Motian when he bounded past me on the stairs at the Iridium. "Thank you." As a grateful listener and a jazz pilgrim, that's the most coherent thing I can think to say.
Monday, September 13, 2004
Dissertation dialectics
"Every dialectical movement terminates with a synthesis, but not every synthesis brings the dialectical process to a stop ..." Peter Singer, in Hegel
Thesis (1): The dissertation is just a degree requirement. It is a hoop that you must jump through in order to receive those extra letters behind your name. In that sense, it is like an undergraduate senior thesis; something to check off in order to graduate from being a graduate student. Think of it as a really, really big term paper.
Antithesis (1): The dissertation is a job requirement. Upon it rests your fate as a professional academic; your career hinges on the quality of this work. It is therefore unlike anything you ever wrote as an undergraduate, or as a graduate student, for that matter. This is not something that only your professor will read, or something for which you will receive a grade in the registrar's office, or a simple requirement for graduation. This will not simply be filed away. Your future co-workers will read this, too; indeed, they will decide whether to be your co-workers on the basis of the work you produced as a student. So you are not really a student, even though you are; you are already at work, and this is the first big project you must complete. It is a hoop to jump through, but not just a hoop for a degree. This hoop is for the job and the career. Think of it more like a flaming hoop.
Synthesis (1): The dissertation is for a degree and a job. It is the beginning and the end. The end of the beginning, and the beginning of the end. It is a hoop, but one in a continuing series of hoops that stretches far beyond graduation and terminates only in tenure. Your job is not only to earn the degree in the first place, but to continue to validate that degree after the fact. So school is your job, and the job will be school. The junior student is a scholar, and the junior scholar is a student. This is what it means to be a professional academic. It is hoops all the way down.
Thesis (2): The dissertation is just a draft. When you "finish" the dissertation, it will be turned into a book. So even the completed work is a draft of another work. That means what you are writing now--the fragments of chapters, the halting lines, the provisional remarks--are only drafts of a draft of a draft that will, at the final stage, still be a rough draft for the book.
Antithesis (2): The dissertation is a book. Given the constraints of the tenure clock and the teaching load you will have upon graduation, you will not have time to write from scratch the book you need to jump through the tenure hoops. So large parts of the dissertation will remain intact in the book, on which your professional career also depends. Do not treat lightly what you write, even now, because it is a book in the making. These are not scraps of ideas; this is scholarship.
Synthesis (2): The dissertation is both a draft and a book. And what are books, if not simply widely circulated drafts? Books may appear final, but scholarship is impossible without the foundational assumption that the final word is never said. All books are drafts to be criticized and possibly revised. Your dissertation is therefore a private book; your book will be merely a more widely publicized and more finely tuned draft. The book, too, is but the end of the beginning. It is drafts all the way down.
Saturday, September 11, 2004
September 1, 1939
By W.H. Auden
[Three years ago, this poem by one of my favorite poets was circulated in some newspapers. It still seems appropriate to me today, perhaps even more so. It is about wresting hope from the jaws of hopelessness. From Selected Poems, pp. 86-89.]
I sit in one of the dives
On Fifty-Second Street
Uncertain and afraid
As the clever hopes expire
Of a low dishonest decade:
Waves of anger and fear
Circulate over the bright
And darkened lands of the earth,
Obsessing our private lives;
The unmentionable odour of death
Offends the September night.
Accurate scholarship can
Unearth the whole offence
From Luther until now
That has driven a culture mad,
Find what occurred at Linz,
What huge imago made
A psychopathic god:
I and the public know
What all schoolchildren learn,
Those to whom evil is done
Do evil in return.
Exiled Thucydides knew
All that a speech can say
About Democracy,
And what dictators do,
The elderly rubbish they talk
To an apathetic grave;
Analysed all in his book,
The enlightenment driven away,
The habit-forming pain,
Mismanagement and grief:
We must suffer them all again.
Into this neutral air
Where blind skyscrapers use
Their full height to proclaim
The strength of Collective Man,
Each language pours its vain
Competitive excuse:
But who can live for long
In an euphoric dream;
Out of the mirror they stare,
Imperialism's face
And the international wrong.
Faces along the bar
Cling to their average day:
The lights must never go out,
The music must always play,
All the conventions conspire
To make this fort assume
The furniture of home;
Lest we should see where we are,
Lost in a haunted wood,
Children afraid of the night
Who have never been happy or good.
The windiest militant trash
Important Persons shout
Is not so crude as our wish:
What mad Nijinsky wrote
About Diaghilev
Is true of the normal heart;
For the error bred in the bone
Of each woman and each man
Craves what it cannot have,
Not universal love
But to be loved alone.
From the conservative dark
Into the ethical life
The dense commuters come,
Repeating their morning vow,
"I will be true to the wife,
I'll concentrate more on my work,"
And helpless governors wake
To resume their compulsory game:
Who can release them now,
Who can reach the deaf,
Who can speak for the dumb?
All I have is a voice
To undo the folded lie,
The romantic lie in the brain
Of the sensual man-in-the-street
And the lie of Authority
Whose buildings grope the sky:
There is no such thing as the State
And no one exists alone;
Hunger allows no choice
To the citizen or the police;
We must love one another or die.
Defenceless under the night
Our world in stupor lies;
Yet, dotted everywhere,
Ironic points of light
Flash out wherever the Just
Exchange their messages:
May I, composed like them
Of Eros and of dust,
Beleaguered by the same
Negation and despair,
Show an affirming flame.
Thursday, September 09, 2004
Loss of accountability
The New York Times reports that the Army has disclosed more scandalous facts about the CIA's treatment of detainees in Iraq. The CIA held dozens of detainees, perhaps as many as 100, in undisclosed locations without any paper trail in order to prevent the Red Cross from inspecting their treatment.
The disclosure added to questions about the C.I.A.'s practices in Iraq, including why the agency took custody of certain Iraqi prisoners, what interrogation techniques it used and what became of the ghost detainees, including whether they were ever returned to military custody. To date, two cases have been made public in which prisoners in C.I.A. custody were removed from Iraq for a period of several months and held in detention centers outside the country.

... Military officials have said the C.I.A.'s practice of using Army-run prisons in Iraq to hide prisoners held for questioning violated military regulations and international law, and led to "a loss of accountability at the prison."

Under the Geneva Conventions, the temporary failure to disclose the identities of prisoners to the Red Cross is permitted under an exemption for military necessity. But the Army generals said they were certain that the practice used by the C.I.A. in Iraq went far beyond that.

If I'm reading this right, the Army is now publicly admitting that the CIA went "far beyond" the Geneva Conventions, that it operated outside the bounds of international law. As voters, we cannot hold the CIA chief accountable for these crimes, because George Tenet resigned before the worst of this scandal broke. And President Bush has yet to hold any senior officials in his administration accountable for these errors by firing them, despite the growing evidence that the causes of these grave and illegal violations go all the way to the Pentagon. If the President is the decisive man he wants us to believe he is, if he wants us to compare himself and his occupation with Harry Truman and his, why does he not say that the buck stops on his desk?
Because he is not the man he wants us to believe he is. That's why if anyone is going to be held accountable for this, it has to be President Bush. And the only ones who can hold him accountable are voters like you and me. We, the voters, are the only ones who can show the world that our country does not look lightly on flagrant attempts to hide the truth, that we do not accept our leaders circumventing respected humanitarian organizations and international laws.
This scandal does not just amount to a "loss of accountability" at a prison in Iraq. It amounts to a "loss of accountability" in our government. But we can hold President Bush accountable, and we must. We cannot be put off by promises that these things are under investigation; they have been investigated, and the investigators have spoken. Now our voice needs to be heard.
UPDATE: Ditto to this Times editorial.

[Harold Brown], who served [as Secretary of Defense] under President Jimmy Carter, also pointed a finger of blame beyond Mr. Rumsfeld to the "very top" of the Bush administration for what he called "the responsibility for failing to plan for what actually happened after the overthrow of Saddam Hussein."
And while not calling for resignations, Mr. Brown, in his testimony before the House committee, said judgments about the administration's conduct in Iraq, on Abu Ghraib and other matters, were now up to voters to make. "When it comes to overall performance, there's another way of dealing with it, and that's called an election," Mr. Brown said.
Those wacky abolitionists
I just finished reading a roundtable discussion in the August issue of Harper's on the future of progressive politics in America. "The Democratic Party at the moment presents no message that can be heard as even a mild objection to the Republican program of privatization, extravagant military spending, tax cuts for the rich, welfare cuts for the poor," begins the lede, in the magazine's typically trenchant but somewhat hyperbolic fashion. To talk about how the progressive agenda "might return to the nation's political arena, Harper's Magazine invited five notable progressive thinkers"--Ron D. Daniels, Eric Foner, Ralph Nader, Kevin Phillips, and Frances Fox Piven--"to sit down together and consider the problem."
The forum contains some insightful quips. I think Kevin Phillips is probably right, for instance, that "Democrats have been anesthetized by campaign contributions. ... Their neediness cripples them." I also like what Frances Fox Piven says: "Our rhetorical task is far easier than that of the Republicans. ... We should talk about reclaiming democracy by reducing corporate power and reducing inequality, especially the inequalities that affect working people and poor people." You mean our rhetorical task would be easier if we actually talked more about democracy and equality? Consider this sobering fact. According to Google, John Kerry's official campaign website contains 564 hits for the word "equality" and 4,500 hits for the word "democracy." "Liberty" gets 1,530. "Poverty" gets 1,980. How many hits does "security" get? Let's see ... 13,800. And "military"? 10,900.
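(For the curious: those counts came from simple site-restricted Google searches. The little Python sketch below shows roughly how one could gather them; it assumes Google's results page still announces an estimated total in an "of about N" phrase, which changes often, and Google may refuse automated queries, so treat it as an illustration of the method rather than a reliable tool. The site and word list are the ones mentioned above.)

    import re
    import urllib.request

    WORDS = ["equality", "democracy", "liberty", "poverty", "security", "military"]

    def google_site_hits(site, word):
        # Build a site-restricted query, e.g. site:johnkerry.com equality
        url = "http://www.google.com/search?q=site:%s+%s" % (site, word)
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        html = urllib.request.urlopen(req).read().decode("utf-8", errors="ignore")
        # The results page typically reports an estimate like "of about <b>4,500</b>";
        # this markup is an assumption and may not match the live page.
        match = re.search(r"of about <b>([\d,]+)</b>", html)
        return int(match.group(1).replace(",", "")) if match else None

    for word in WORDS:
        print(word, google_site_hits("johnkerry.com", word))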
There were also points made in the forum which gave me pause. I have always been a somewhat reluctant subscriber to Harper's, because it often takes a thoughtless posture towards religion. It wears its anti-clericalism on its sleeve, the same one it wipes its nose on. I appreciate Harper's because it provides carefully argued but highly opinionated pieces. But its contempt for religion is highly opinionated without being carefully argued.
Lewis Lapham, for instance, opines in the forum that what holds conservative "factions together--the Christian and neoconservative right, the racists and the homophobes--is their common tendency to believe in such a thing as absolute truth, the bright transcendent line between good and evil, right and wrong. The religious rather than the secular habit of mind." Elsewhere, he notes that "the reactionary right isn't afflicted with the disease of cognitive dissonance," apparently evidence of its non-secular "habit of mind." As though it is impossible for those of us who are religious to come down with a case of "cognitive dissonance," as if the line between good and evil is always "bright" for us! I guess it's the Brights who get to do all the doubting--except when it comes to drawing the one between progressive and reactionary, or between naturalism and supernaturalism.
The forum also has typically caricatured portraits of crazy Christians. "If moderate Republicans have one thing besides Iraq that makes them want to vote against Bush," says Kevin Phillips, "it's all these fundamentalists coming out of the woodwork from Armpit, Alabama. It's this biblical worldview in which Baghdad is the new Babylon. The average Presbyterian Republican in suburbia thinks those people are wackos." Ah, yes, now I remember the reason why I subscribe to Harper's: to see people with whom I already disagree called "wackos."
But this is a guilty pleasure. Ultimately, the "wackos" thing bothers me. When I see smart people calling other people "wackos," I'm worried. If all the smart people are busy calling people "wackos," who is going to have the respect and patience required to actually talk with those people? In the same forum, Ralph Nader complains that the progressive message is missing "emotional content, in the best sense of the phrase. One of the reasons is that liberals aren't good haters. Whereas the agents and apostles of the right, they really are haters." So when we call you "apostles" of the right "wackos," it's because we love you.
A final thing catches my eye about the forum. Several times, the contributors point to the abolitionist movement as a model for progressive movements today. As in this comment from Piven:
The left has a communication problem. [Cf. the calling people "wackos" thing.] The right has multiple modes of communication. They have enormous influence with the corporate media; they have their think tanks, which have evolved into propaganda machines; and they also have the social movements of the right--God, gays, and guns. All the left has are its social movements. But they have tremendous communicative power. Think of the abolitionists, the labor movement, the civil rights movement. These kinds of movements get a lot of people on the street. They disrupt things. And that attracts a lot of attention.

I don't necessarily have a problem with finding role models in past social movements. As I've said before, one reason I am drawn to the abolitionists is because I think there is much in their social vision that is worth recovering. But Harper's-style progressives seem to easily forget that abolitionists were also held together by what Lapham calls the "religious habit of mind."
For example, here is William Lloyd Garrison in one of his letters:
But the mere abolition of slavery is not the reconciliation of the world to God, or of man to his brother man; though there can never be such reconciliation without it. I want to see ... Jesus, the Messiah, as the only King and Ruler on earth--the establishment of his kingdom to the subversion of all others--the prostration of all national barriers, castes, and boundaries--the mingling of the whole human race, 'like kindred drops into one'--the forgiveness of enemies, without any resort to brute force, even after the example of Christ--the overthrow of all military and naval power, by the substitution of spiritual for carnal weapons--the adoption of a common language, to the suppression of the Babel dialects which now divide and curse mankind. Such ... I hope to see ... before this mortal shall have put on immortality. It will produce a mighty sensation throughout the earth, and be more terrible to tyranny and misrule than 'an army with banners.' Seizing upon it by faith, and yearning to behold it as a reality, I am constrained to exclaim, 'How long, dear Savior, oh how long / Shall that bright hour delay? / Fly swifter round, ye wheels of time, / And bring the welcome day!'

I imagine the "average Presbyterian Republican in suburbia" would think of Garrison as a "wacko," too. Of course, Garrison, despite his wacky "religious habit of mind," did not recoil from criticizing Christianity. Later in the same letter, he said, "Let the truth be told, though the whole of Christendom be thereby convicted of infidelity." (Those were the kinds of things that got him in trouble with the religious powers-that-were.) But then how should we categorize Garrison? As a disruptive progressive, a la Piven, or as a reactionary wacko, a la Lapham and Phillips?
If we are going to use the abolitionists to draw historical lessons about the present, then at least one of the conclusions we have to draw is that a "religious habit of mind" is not inconsistent with the kinds of progressive movements that "get a lot of people on the street [to] disrupt things." Garrison's religious mind was clearly very disruptive. In fact, however theologically and biblically suspect the Armpit preacher's eschatology might be (and let me make very clear that the new Babylon is not Baghdad), Garrison's social vision was clearly based on eschatology.
But his eschatology was not of the pie-in-the-sky variety. He yearned to see the kingdom of Christ coming before "this mortal shall have put on immortality," to behold it as a reality in the here and now. His belief was that since "the reconciliation of the world to God" was coming, we might as well get about the job of reconciling with each other now. His was not the small eschatology of the Armpit fundamentalist; rather, he said, This is Who We Shall Be, so This is Who We Must Be. "Why should we be fiends," he once wrote his brother, "when we may become angels?" I know, I know ... What a "wacko"!
But my point here is not to defend those wacky abolitionists as either progressives or reactionaries. My main point (which I've also made elsewhere) is that attempts to find heroes among the abolitionists usually falter on the irreducible complexity of the past. As the Harper's forum shows, when we attempt to pick out exemplary forebears, we tend to ignore the things about them that we find distasteful. We tend to avoid cognitive dissonance--even those of us who have a "secular habit of mind." And instead of really learning from or listening to their stories, we imagine (to paraphrase Peter Novick) that we can talk to the dead by prefacing their answers with our questions. Perhaps we would be better served by looking at their own questions, seeing how they answered them, and then looking again at our questions and our answers.
UPDATE: Another progressive appropriates the abolitionists: this time it's Howard Zinn. A little known fact about Zinn, whom I had the pleasure to see speaking in Boston this past spring, is that one of his earliest published academic pieces was an essay on the abolitionists, in which I believe he compared abolitionists and Freedom Riders.
UPDATE: There is a good review of Harper's and its mercurial editor at Slate. It argues, persuasively, that the magazine "has grown increasingly pompous and predictable in recent years," although not for the same reasons that I give here.
Wednesday, September 08, 2004
Miscellany
Apologies for the light blogging over the past few days. I plead two excuses: yesterday was the first day of the course I am teaching, and for the majority of the last 24 hours, Blogger has been out of commission.
If you haven't seen Ralph Luker's helpful list of blogs by history graduate students, check it out. I discovered quite a few blogs that I had not known about before. I added Positive Liberty to the list, and I probably should have also suggested Snoblog, published by a graduate student at UCLA. It's not updated very frequently, but it is usually an interesting read. On the other side of the great pond, there is also Historiological Notes.
P.S. On another miscellaneous note, I have six Gmail invitations to give out. I've been pleased with Gmail so far, although it does have some annoying quirks. There is a good run-down of the pros and cons at No Fancy Name. If you're interested in trying it out, just post a comment or "gmail.com" me at "calebmcd."