Tuesday, August 31, 2004

 

Hot school

Here's more fodder for people with this bumper sticker.

Today was the first full day of classes at the Maryland high school where my wife teaches. The school is mostly un-air-conditioned, and the temperature inside the buildings reached as high as 85 degrees according to her indoor thermometer. Three students fainted or nearly fainted, and one was taken to the hospital after a 911 call. My wife (who teaches in the only room in the school without either windows or an air-conditioning unit) was so physically exhausted by the heat that she spent her last class period trying not to throw up, and she could not drive herself home.

You can blame this on school funding, I guess, or you can chalk it up as one more reason why Houston is worth it. It may be hot and humid in Texas, but Texans would never think of sending students and teachers into an un-air-conditioned, poorly ventilated school during the heat of summer and expecting these to be optimal conditions for learning.

 

On war

As so often lately, I'm not sure whether to laugh, scream, or weep at the latest headlines. The most recent cause of my emotional confusion is President Bush's conflicting claims that the war on terror is not winnable in a conventional sense, but that we will win it anyway. This is either a matter of emphasis or a reversal, depending on your point of view.

I'm inclined to agree with Josh Marshall that although "the president deserves every whack he gets for changing his position twice in three days on the issue he has made the centerpiece of his campaign[,] ... folks should also start using his bobbling to make the point that the issue is less whether the president thinks the 'terror war' is winnable than the fact that he doesn't even have any clear idea of how to fight it."

But I'm also inclined to argue that the issue goes even deeper than this. Bush's argument, by trying to distinguish between "conventional" wars and "other" wars, obscures what all wars have in common--raw military force, untold amounts of bloodshed, and catastrophic social and political effects. Saying this is a "different" kind of war rhetorically (if only implicitly) suggests that this war may not have to include those things. Since this war is unconventional, it can spread liberty. It can be taken to the enemy "over there" and not involve us at all "here at home." It can create democracies instead of gangrenous resentment. It can actually end all evil, instead of eroding a person's moral inhibitions against killing other human beings. This is a "different" kind of war, after all. You know, the good kind.

At other times, of course, it serves President Bush's purpose to argue that this war is unique because it is exceptionally bad. It is an "unconventional" war, with the danger of "unconventional" weapons, so you can forget about a "conventional" peace or "conventional" ways of winning. It has to be fought on different terms. The ordinary (Geneva?) conventions do not apply with binding force. Abiding by them becomes an act of benevolence, supererogatory rather than obligatory, because this is not a conventional war.

Following these tangled threads makes it hard for me to suppress my doubt that there is such a thing as a "conventional" war at all. War is by definition the suspension of conventions. Here I tend to agree with the standard Hobbesian line that the state of war is the absence of society. Good luck trying to govern war, when war is the putative chaos that exists when government breaks down. Besides, even if we could agree in the abstract that wars have conventions, who could possibly devise a generalization that would cover even most of the cases? The "rules" of war are defined only by their exceptions, and I would be willing to bet that every war ever fought has been deemed in some sense exceptional by those who fought it.

The convention being cited by Bush at the moment is that "in this different kind of war, we may never sit down at a peace table." But the "peace table" is a romantic, red-herring image for the termination of wars. The Civil War was over before Appomattox, and Jackson won the Battle of New Orleans after the War of 1812 was already over. In reality, sitting down at a "peace table" is not proof that there is a conventional understanding about how to end all wars. Rather, a treaty is proof that one side has sufficient power to define (for its own purposes) how this particular war should end.

But I'm hard-pressed to say which is more dismaying--what President Bush has said, or the fact that John Kerry has fired back that he certainly does know what this war is about, and he can win it. And is raising the cry of "flip-flopping" the best that we can do in response to President Bush's grab-bag of platitudes on war? Can't we do better than beating Bush with his own stick? Can't we actually interrogate our contradictory and ultimately misleading assumptions about war, instead of simply shouting back and forth about who can fight better and longer and with more courage?

This is asking too much of our national debate, you might say. Tell that to Abraham Lincoln. Since Rudy Giuliani and David Brooks have lately taken to linking Bush and other Republicans to Lincoln, maybe, just maybe, the Republicans will listen to him.

I doubt that you would find a resuscitated Lincoln today making arcane distinctions between conventional and unconventional wars--even as a wartime president, Lincoln held a view of combat too full of tragic irony to leave room for pedantic discriminations between what is an ordinary war and what is not. How different are President Bush's formulaic statements that war is hard and must go on from the deeply moving and ironic realization, in Lincoln's Second Inaugural Address, that the "mighty scourge" of the Civil War had no end in sight!

Perhaps the Republicans would also do well to remember that Lincoln came to his "wartime" presidency with a reputation as an "anti-war" politician. His first claim to political fame on the national scene came from his steadfast opposition to the Mexican War in 1848. A speech he gave on the war seems eerily appropriate today.

Representative Lincoln vehemently criticized President James Polk (who talked a lot about how wars spread liberty) for not having a clear justification for war with Mexico, and for not having a consistent plan for its termination. He especially ridiculed Polk's argument that even after American armies had pushed Mexico out of Texas, the United States needed to conquer more of Mexico as an "indemnity" for the nation's military expenses. Lincoln pointed out that this would involve taking all of Mexico, since every inch of territorial acquisition intended to pay back America's expenses would inevitably involve more expenses, which would then have to be repaid with more territory. That's right--Polk was saying that this was not a conventional war, which meant that it would not end in a conventional way. Lincoln saw that by Polk's logic, the war could not end at all. But I'll let Lincoln speak for himself, with excerpts from his Selected Speeches and Writings:
How like the half insane mumbling of a fever-dream, is the whole war part of [the President's] late message! At one time telling us that Mexico has nothing whatever, that we can get, but territory; at another, showing us how we can support the war, by levying contributions on Mexico. At one time, urging the national honor, the security of the future, the prevention of foreign interference, and even, the good of Mexico herself, as among the objects of the war; at another, telling us, that "to reject indemnity, by refusing to accept a cession of territory, would be to abandon all our just demands, and to wage the war, bearing all it's [sic] expenses, without a purpose or definite object." So then, the national honor, security of the future, and every thing but territorial indemnity, may be considered the no-purposes, and indefinite, objects of the war!
...
As to the mode of terminating the war, and securing peace, the President is equally wandering and indefinite. First, it is to be done by a more vigorous prosecution of the war in the vital parts of the enemies [sic] country; and, after apparently, talking himself tired, on this point, the President drops down into a half despairing tone, and tells us that "with a people distracted and divided by contending factions, and a government subject to constant changes, by successive revolutions, the continued success of our arms may fail to secure a satisfactory peace." Then he suggests the propriety of wheedling the Mexican people to desert the counsels of their own leaders, and trusting in our protection, to set up a government from which we can secure a satisfactory peace; telling us, that "this may become the only mode of obtaining such a peace." But soon he falls into doubt of this too; and then drops back on to the already half abandoned ground of "more vigorous prossecution."
Hm ... "More vigorous prossecution" of the war is the only way to peace. Don't ask tough questions about when peace will have actually arrived. Sound familiar? If not, how about this ...
All this shows that the President is, in no wise, satisfied with his own positions. First he takes up one, and in attempting to argue us into it, he argues himself out of it; then seizes another, and goes through the same process; and then, confused at being able to think of nothing new, he snatches up the old one again, which he has some time before cast off. His mind, tasked beyond it's [sic] power, is running hither and thither, like some tortured creature, on a burning surface, finding no position, on which it can settle down, and be at ease. ... As I have before said, he knows not where he is. He is a bewildered, confounded, and miserably perplexed man. God grant that he may be able to show, there is not something about his conscience, more painful than all his mental perplexity!
Should I laugh or cry?

UPDATE: The attempt to claim Lincoln's mantle for Bush continued in Tuesday night's speeches, both by Laura Bush and Arnold Schwarzenegger. I also read in this morning's paper (use BugMeNot to get past the registration) that earlier in the day, Mrs. Bush and her daughters tested the teleprompter and microphone by reading the first line of the Gettysburg Address. For the benefit of the reporters present, perhaps?

UPDATE: Charles Sheehan-Miles, a veteran of the first Gulf War, makes some of my points in this post better than I ever could. Please read this.

Monday, August 30, 2004

 

New template

I think I like this better. But I'm still ironing out the wrinkles. Love it? Hate it? Couldn't care less? Let me know. (I'd appreciate a comment if this looks funny at your screen resolution. I'd appreciate it even more if you could tell me how to fix the problem, since I'm a CSS amateur!)

 

Journalism's great weakness

This morning I've noticed that some people are rightly upset about newspaper coverage of yesterday's massive protests in New York City. Despite the fact that the great majority of the protests apparently occurred without incident, many reports featured the handful of arrests. Our local news anchor headlined the protest story by saying that the protests were "relatively" peaceful; the qualification was dripping with disappointment, since this meant (by local TV news standards) that there was nothing left to say.

Apparently, many journalistic outlets also feel there is not much more to say about these massive protests than that nothing very illegal happened. This is not scientific proof, but head over to Google News and notice how hard it already is to click through quickly to a story on the protests. Even though they just happened yesterday, the links are already buried.

All of this made me think of the following passage from G. K. Chesterton's obscure 1909 novel, The Ball and the Cross. Since Chesterton was himself a prolific journalist, it counts as an insider's critique:
It is the one great weakness of journalism as a picture of our modern existence, that it must be a picture made up entirely of exceptions. We announce on flaring posters that a man has fallen off a scaffolding. We do not announce on flaring posters that a man has not fallen off a scaffolding. Yet this latter fact is fundamentally more exciting, as indicating that that moving tower of terror and mystery, a man, is still abroad upon the earth. That the man has not fallen off a scaffolding is really more sensational; and it is also some thousand times more common. But journalism cannot reasonably be expected thus to insist upon the permanent miracles. Busy editors cannot be expected to put on their posters, "Mr. Wilkinson Still Safe," or "Mr. Jones, of Worthing, Not Dead Yet." They cannot announce the happiness of mankind at all. They cannot describe all the forks that are not stolen, or all the marriages that are not judiciously dissolved. Hence the complete picture they give of life is of necessity fallacious; they can only represent what is unusual. However democratic they may be, they are only concerned with the minority.

Saturday, August 28, 2004

 

Manuuuuuuuuu!

Yes, the U.S. Men's Basketball Team settled for bronze at the Olympics, led by the foul-hampered Tim Duncan. The gold went to Argentina, led by the spectacular play of Manu Ginobili. As a fan of the San Antonio Spurs, I bring an unusual perspective to this outcome, since both Duncan and Ginobili play for my favorite team. I could not be more disappointed for Duncan, or more pleased for Ginobili.

The fact that both Duncan and Ginobili live in San Antonio, Texas (my hometown), and play in the NBA also inevitably reminds me of some of my earlier musings about the peculiar "internationalism" of the modern Olympics. Various NBA pundits have demonstrated their razor-sharp analytical skills by diagnosing the presumed failure of the U.S. team thusly: The "international" level of competition has gotten better. There are more "international" players in the NBA now than ever before, which explains why an NBA Dream Team appears to be less dreamlike. By "international," they actually mean "non-Americans" or "foreigners."

Are these the conclusions we should draw about the basketball results? Are these the words we should use to describe them? Or, rather, is the state of "international" basketball just another sign that words like "international" are becoming more and more complicated these days? After all, Tim Duncan was not born in the United States (he grew up in the Virgin Islands), but he played for the U.S. Team. Is he an "international" player? Ginobili was born in Argentina but lives and works in the United States; indeed, this summer he signed a multi-year, multi-million-dollar contract with the Spurs. Is he an "international" player? Across a variety of sports, the Olympics similarly categorize athletes by "nationality" even as that concept is losing much of its lexical precision, thanks to global markets and migration.

I'm reminded of Aihwa Ong's concept of "flexible citizenship."
"I use the term flexible citizenship to refer especially to the strategies and effects of mobile managers, technocrats, and professionals who seek to both circumvent and benefit from different nation-state regimes by selecting different sites for investments, work, and family relocation. Such repositioning in relation to global markets, however, should not lead one to assume that the nation-state is losing control of its borders. ... From the perspective of such immigrants as well-heeled Hong Kongers, however, citizenship becomes an issue of handling the diverse rules or 'governmentality' of host societies where they may be economically correct in terms of human capital, but culturally incorrect in terms of ethnicity." (From Aihwa Ong's "Flexible Citizenship among Chinese Cosmopolitans," in Cosmopolitics, p. 136)
Might we add professional athletes to Ong's list of flexible citizens--the "managers, technocrats, and professionals"? Or is this just another sad example of the fact that graduate students cannot turn off their dissertations long enough even to watch a sport they love? You decide.

Wednesday, August 25, 2004

 

Dissertation horticulture

In the past two weeks I've been wrangling with Chapter 1 of my dissertation. I have had fragments of this chapter written for over a year, but the last two weeks convinced me that this has been a hindrance rather than a help.

The first seeds of this chapter were written for a conference paper in the summer of 2003. I signed up for the conference thinking that it would provide me with an early deadline for completing part of the dissertation. Reasonable enough, you might say. But here's what happened: the conference paper took on a life of its own. It shot roots down into the small patch of ground enclosed by my panel's subject. Then, it sprouted wild branches that sprawled off in unplanned directions. Finally, these adventitious branches blossomed, producing exotic fruits to which I became attached--the kinds of little rhetorical flourishes that all writers vainly admire behind the walls of their secret gardens.

I gave the paper at the conference and put it into hibernation. Then came the season that has tried my soul. This summer, I tried to take the conference paper and transplant it into Chapter 2. The effect was somewhat like attempting to graft a bonsai tree onto a grapevine. It became a huge and unwieldy hybrid. Sometime in June, I decided that the conference paper had to be pruned away; it would have to become its own separate chapter. This decision worked out well for Chapter 2. With its roots no longer being choked by the conference paper, Chapter 2 took on a life of its own.

So now I had an aging bonsai that needed to fill the hole dug for a larger tree. I tried for a couple of months to nurture it, to encourage its growth. But it never did take. I finally realized that it was simply unnatural to force the conference paper into the dissertation narrative. The lives of the two works were different. To breed them would have required genetic modification that might, for all I know, have made my readers sick. So early this week, I decided to lay the ax to the root. My little conference paper had become a weed, and the only thing to do was to pull it out and start again.

You'll pardon my extended metaphor, I hope, because this whole experience has made me realize how biological and organic writing is. When we are taught to write, our teachers misleadingly use modular and linear metaphors for the structure of a piece: they say it has an intro, a body, a conclusion--paragraphs that can be cast, picked up and put back like so many blocks of pig iron. Then, as we grow into older writers, we hear academics speak of their writings like alchemists, blithely talking of turning a chapter into an article, a conference paper into a chapter, a dissertation into a book, as though these transformations were not dark arts.

Essays, I've found, do have bodies, but they are more like human bodies than solid masses. Paragraphs are more like organs than building blocks. To transplant them into other bodies is a delicate procedure; the new host might well reject them. And always, vestiges of the old body remain, eager at any time to be inflamed in spontaneous but urgent attacks of appendicitis.

Revision by alchemy or engineering is a superstitious fantasy. Revision is more like surgery or horticulture. Sometimes, it requires realizing that even a beloved, manicured plant--a conference paper, if you will--is better off as compost for a new beginning. Other times, natural selection takes its course. The mutations are not pretty at first, but they are often for the best.

Tuesday, August 24, 2004

 

Hijacked faith?

I've been noticing a trend lately among politically progressive Christians. Increasingly, their frustration with the Religious Right is being boiled down to this sound bite: "Conservative evangelicals have hijacked our faith."

Usually, I find myself nodding in agreement with this kind of talk. But today, as I started to notice how pervasive the "hijacking" metaphor is becoming among critics of the Religious Right, I realized that something about it bothers me, too. Bear with me as I try to figure out why.

First, here are some examples. Jim Wallis, leader of Call to Renewal and editor of Sojourners magazine, had an op-ed piece in the Boston Globe last month entitled "Recovering a hijacked faith." It begins:
MANY OF US feel that our faith has been stolen, and it's time to take it back. A misrepresentation of Christianity has taken place. Many people around the world now think Christian faith stands for political commitments that are almost the opposite of its true meaning. How did the faith of Jesus come to be known as pro-rich, pro-war, and pro-American?
This month's Sojourners also contains a speech by Bill Moyers which concludes (emphasis added):
OVER THE PAST few years, as the poor got poorer, the health care crisis worsened, wealth and media became more and more concentrated, and our political system was bought out from under us, prophetic Christianity lost its voice. The Religious Right drowned everyone else out.

And they hijacked Jesus. The very Jesus who stood in Nazareth and proclaimed, "The Lord has anointed me to preach the good news to the poor." The very Jesus who told 5,000 hungry people that all of you will be fed, not just some of you. The very Jesus who challenged the religious orthodoxy of the day by feeding the hungry on the Sabbath, who offered kindness to the prostitute and hospitality to the outcast, who raised the status of women and treated even the tax collector like a child of God. The very Jesus who drove the money changers from the temple. This Jesus has been hijacked and turned into a guardian of privilege instead of a champion of the dispossessed. Hijacked, he was made over into a militarist, hedonist, and lobbyist, sent prowling the halls of Congress in Guccis, seeking tax breaks and loopholes for the powerful, costly new weapon systems that don't work, and punitive public policies.

Let's get Jesus back.
And most recently, in an interview promoting his new book, Tony Campolo said this:
My purpose in writing the book was to communicate loud and clear that I felt that evangelical Christianity had been hijacked. When did it become anti-feminist? When did evangelical Christianity become anti-gay? When did it become supportive of capital punishment? Pro-war? When did it become so negative towards other religious groups?
For some other disparate examples, see here, here, and here.

This increasingly prevalent use of the "hijacking" metaphor has culminated in a new campaign called "Take Back Our Faith," which includes a fund-raising campaign for a full-page newspaper ad and a petition. But a little looking around on the Internet led me to discover that the "hijacking" metaphor is nothing new. This Google search of the Sojourners website reveals that this year is not the first time that the "hijacking" metaphor has been applied to evangelicals. See, for instance, the magazine's March/April 1995 issue, which includes other articles by Wallis and Campolo. The idea that the Religious Right has "hijacked" Christianity is not being invented just now, but it is undergoing a revival during this election year.

In the interest of full disclosure, I should start by saying that I agree generally with what Wallis, Moyers, Campolo, and others are trying to do. In fact, I signed the "Take Back Our Faith" petition. I'm hesitant to criticize anything about it. I too am angered that the Bush-Cheney campaign appears convinced that it has the evangelical vote in hand. And I'm all for media coverage that dispels the myth that to be Christian means to be Republican.

And yet ... something about the use of the word "hijacked" bothers me. Partly, I think, I'm bothered by the connotation that the word has in this post-9/11 political climate. In 1995, to say that the Religious Right was "hijacking" religion meant one thing; in 2004, the same statement has an entirely different valence. Surely these progressive Christians know that they are framing their criticism in an intensely visceral and loaded way. To say something is being "hijacked" now is to garner almost immediate sympathy for a cause, because "hijacker" means "terrorist." Those who say the Religious Right has "hijacked" our faith may not intend to call evangelicals "terrorists," but they would be naive (or disingenuous) to claim they never anticipated their audiences making that mental equation.

One of the most craven and cynical things about the Bush administration is its manipulation of words like "terrorist." They have used this word, along with related terms like "axis of evil" and "war on terror," to carve the world up into "us" and "them." (Those who aren't with "us" are with "them." Disagree with the way we're prosecuting the "war on terror"? You must be with the "terrorists.") President Bush has manipulated Americans' fear of and revulsion at the actions of terrorists into a blank check for extreme and disastrous policies.

But my worry is this: when Christian progressives apply the word "hijack" to their political opponents, aren't they engaging in the same kind of rhetorical manipulation?

Perhaps not intentionally. But the dangers of this manipulation are potentially the same. For one thing, when you call your opponents terrorists/hijackers, any discursive space for self-criticism immediately shrinks. If "they" are the hijackers, then "we" are the good guys. This was precisely why so many progressives criticized the Bush administration's immediate invocation of a "war on terror" in the fall of 2001. As soon as that term was used, we lost any opportunity for a substantive national discussion about our own foreign policy and how it might contribute to the rise of anti-Americanism in the Middle East.

This was the point that Rowan Williams, the Archbishop of Canterbury, made in an essay written shortly after September 11 and collected in this excellent anthology. "So much of this seems to oblige us to think about language," Williams began. Once "evil" became the dominant adjective applied to the hijackers, and later to Iraq, certain options for American action and reaction were irrevocably lost. As Williams says:
... bombast about evil individuals doesn't help in understanding anything. Even vile and murderous actions tend to come from somewhere, and if they are extreme in character we are not wrong to look for extreme situations. It does not mean that those who do them had no choice, are not answerable, far from it. But there is sentimentality too in ascribing what we don't understand to "evil"; it lets us off the hook, it allows us to avoid the question of what, if anything, we can recognise in the destructive act of another. If we react without that self-questioning, we change nothing.
Williams warned, presciently, that we should be "very suspicious of any action that brings a sense of release, irrespective of what it achieves; very wary of doing something so that it looks as if something is getting done." That is what the language of "evil" and the wars in Afghanistan and Iraq did. They gave Americans an outlet for anger, a feeling that something was being done. But that language also closed down the possibility of self-questioning and the recognition of ourselves in the Other.

What bothers me about calling religious conservatives "hijackers" is that, in today's public sphere, this kind of talk does the same thing that Williams warned the language of "evil" would do. First, it makes Christian progressives feel good; it makes it look "as if something is getting done" to restore the authentic social vision of the gospel. It brings a "sense of release." But it also "lets us [progressives] off the hook." If those other Christians stole our religion, then we progressives are just the victims here. Thank God that we are not like these other men.

Finally, accusing evangelicals of a hijacking "allows us to avoid the question of what, if anything, we can recognise" in them. We disclaim responsibility for the church's sorry state; our responsibility only becomes the virtuous one of vanquishing the foe. And so, just as President Bush's "axis of evil" immediately polarized issues and set us on a path away from real understanding, the concept of "hijacked faith" robs us of common ground with our enemies and clothes us in a holier-than-thou hauteur. If we actually begin to use the same linguistic weapons as the Bush administration, are we really allowing our faith to be "hijacked," or are we simply handing it over?

As dangerous as I believe it is to speak of our faith being "hijacked" by conservatives, it may be even more dangerous to speak of "our faith." Wallis, Moyers, and Campolo refer to their project as though it were a rescue mission, as if by "taking back our faith," we--progressive Christians--are somehow saving God. Notice how Wallis says that "many of us feel that our faith has been stolen," as though Christianity were our property, and "many of us" are miffed that it has disappeared from the lock box we had it in. And imagine the ludicrous scene that Moyers implicitly describes, of a thuggish gang of evangelicals "hijacking" Jesus, until--like a holy A-Team--we have determined to round up a posse and go "get Jesus back."

This kind of talk tastes sour in my mouth. As though any of us can claim that Christianity is "ours," that Jesus is depending on us to save him from his enemies. My criticism may seem like a semantic quibble, but it is more than that. For when we speak of Christianity as "ours" and describe it as being stolen by "them," we are falling into the same self-righteous patterns of speech that the Religious Right employs. Just as to use the word "hijacker" is to mimic the corrupted speech of Republican hawks, to use this possessive pronoun is to echo those whom we oppose.

Please do not mistake me: We do need to be concerned by the political power of pro-war and pro-wealth Christians. But we need to adopt a rhetoric that confesses to our own failures even as it criticizes. We need to give up a posture of holiness for a posture of humility, to see within ourselves the same evils we denounce in others--the same partisanship, the same self-righteousness, the same conformity to the world's way. Instead of expressing our wrath, we need to do more mourning, for only those who mourn will be comforted and changed. Instead of venting anger, we need to act. The best way to convince people that not all Christians are right-wing politicians is to act like Christians. "To-day there is rather too much than too little said about the Church," said Karl Barth, years ago. "There is something better: let us be the church!"

What we don't need are aggressive and counter-productive words like "hijacking." "Let's get Jesus back" sounds to me a little too much like "Let's roll." And the record on that phrase is not good.

UPDATE: Carlos Stouffer has some similar worries.

Sunday, August 22, 2004

 

Online cheating

I've been very pleased with the new Blogger NavBar that appears at the top of this page, in lieu of Google Ads. Before Blogger made the switch, I was becoming increasingly dismayed by the number of times that online cheating services were turning up in my Ads box. Because of the prevalence of the word "dissertation" on my blog, the Google spider calculated that visitors to this site would be interested in links to "Custom Dissertations" and the like. Thanks to the NavBar, I no longer feel like I am giving countenance to such services, even unintentionally.

Today I find that there is an article in the New York Times Sunday book review on online plagiarism: "Dear Plagiarists: You Get What You Pay For." Early on, the piece includes some astonishing statistics:
Each site appeals to a different type of student. There's the sleek and cocky Geniuspapers.com; the modest and amiable Superior-Termpapers.com; and the outsider CheatHouse.com, to name a few. While 10 percent of college students admitted to Internet plagiarism in 1999, that number rose to around 40 percent in 2003, Donald L. McCabe, the founder of the Center for Academic Integrity (C.A.I.) at Duke University, said in a telephone interview. Many students simply crib what Google dredges up free, but McCabe estimates that 2 percent of students purchase papers online. That's how many admit it, anyway.

The sheer ubiquity of the sites, and what is now almost a lifetime of habitual Internet accessibility, might explain why the majority of college students tell McCabe they don't think copying a sentence or two from the Web is a big deal. Students are fuzzy on what's cheating and what's not. ''A lot of students will tell us, 'It's out there, it's on the Internet,' '' Diane M. Waryold, the executive director of C.A.I., said in a telephone interview. ''They say, 'Isn't it for public consumption?' ''
I'm probably not the only reader who was surprised by the "40 percent" figure for 2003. But I hope I am also not alone in finding the explanations given in the second paragraph slightly unsatisfying.

The "ubiquity of the sites" and "habitual Internet accessibility" do not a cheater make. The Internet may facilitate cheating for students who already have a proclivity towards dishonesty, but I doubt that the Internet creates that proclivity. I also doubt that students who patronize a site called "Cheat House" are "fuzzy" about what they are doing, or that they cannot tell the difference between "copying a sentence or two from the Web" and ordering a custom written essay for $18 to $20 a page. I believe many students are unclear about what plagiarism is, but even the confused are surely not dazed enough to believe that it is okay to buy an assignment and submit it for a grade. Cheaters knew that was cheating before there was ever such a thing as the Internet.

It's important to distinguish clearly between those students who are genuinely uninformed about the proper conventions for using sources, and those students who are knowingly cheating. Why? Because if those two groups are conflated, as I think they sometimes are, teachers might unwisely succumb to the temptation to prohibit the Internet as a source, or to spread the sky-is-falling fear that online sources are inherently unreliable.

A natural reaction to articles like the one in the Times is to rage against the machine--to rail against Google for the deterioration of academic standards. (The article's association of verbs like "crib" and "dredges" with Google might unwittingly encourage this impulse.) But that natural reaction must be resisted. Google is here to stay, and students are going to use it for research. Any solution to the scourge of plagiarism has to start by accepting that as the new reality.

(And it's not a bad new reality. Google may make it easier for students to find information to plagiarize, but it also makes it easier for teachers to hold cheaters accountable. My wife's a high school teacher, and whenever she detects a whiff of cheating in a student's work, she can usually sniff it out by typing the first sentence into Google and--voila! True, Google can't help teachers find custom-written papers, but this leaves them no more powerless than they have always been against students determined to have someone else do their work for them.)

Solutions to plagiarism will also have to go beyond the paradigmatic language of crime and detection. Talking about the Internet only as a tool for cheating, or a tool for stopping it, diverts teachers from talking to their students about how to use the Internet as a legitimate tool for research. The Internet is for "public consumption," and the right strategy is not to give students the idea that it isn't. All published material (literally, by definition) is for public consumption, whether online or off. The key is to teach students how to be efficient and ethical consumers of that information. It may be obvious, but it needs to be reiterated, that being a consumer of online information is not equivalent to being a customer of Geniuspapers.com. Showing students how to be the former does not encourage them to be the latter. On the contrary, the more proactive teachers can be in educating students about online research, the less likely they will be to plagiarize.

Saturday, August 21, 2004

 

Is Houston worth it?

My wife and I have a running debate about which of our native cities is better: San Antonio or Houston. I'm in San Antonio's corner, and I usually joke to my wife that it's hardly a fair fight to pit my fair city against the sprawling tangle of interstates and smokestacks known as Houston.

This is the kind of thing that current or former Texans can get pretty worked up about. Even my wife will admit that Houstonians bristle easily on behalf of their hometown. For example, to tell a Houstonian you are even thinking of visiting Dallas-Fort Worth is a sure way to give offense. (Consider yourself warned, and remember that concealed handguns are legal in Texas.)

Both of us were therefore interested to discover this new website: "Houston. It's Worth It." (Flash required.) The site takes a reverse-psychology approach to city boosterism by confessing to Houston's many afflictions, only then to declare with defiant braggadocio that it's worth it to live there. The site is sort of like a community blog; it allows Houstonians to complain about the city's problems but still defend its unique charms. Here are some samples:
Methinks these people protest too much, but decide for yourself. And you can read about the program in the New York Times.

My wife has just learned that I am writing a post about Houston and has made veiled threats about what will happen if I "say mean things." I should publish before I chicken out.

 

White Cedar (Arbor Vitae)

The cedar’s roots, straddling an Allegheny stone,
Know not to slurp: thus the tree’s diameter grows
An inch per thirty years.

When the cedar was a seedling, five centuries
Ago, Protestants did not exist; Virginia
Was still not on a map.

The first leaf fell in an unknown autumn during
The Ottoman Empire’s spring. Decomposed, fleeting,
It became another

Vaguely musty smell, rustling on the forest floor,
An ottoman where a passing centipede might
Have curled its hundred feet.

As the red seasons browned, how many leaves fell and
Carpeted the senseless ground? The Thirty Years’ War
Only meant an inch more.

From century one to two (inch three to inch four),
How many belly buttons closed? How many breaths
Expired whispering

Rumors of wars? Meanwhile, the tree bequeathed its leaves
To centipedes below, radii growing at
A most lethargic speed.

Belittled by the patient cedar, my hand runs
Along the trunk. It says I too will be gone in
A matter of inches.

Tuesday, August 17, 2004

 

The honeymoon continues

My wife and I are leaving town for a few days to celebrate four years of marital bliss. There won't be any new posts until at least Friday.

Sunday, August 15, 2004

 

Comments

I didn't realize until recently that Blogger's default settings only allow registered Blogger users to comment. I've changed my settings so that anyone can comment here. Sorry for the previous inconvenience!

 

The Olympic rings

In my last post, I wrote that "the Olympics place in stark relief how intertwined cosmopolitanism and patriotism can be." What better symbol of this entanglement than the official Olympic symbol?

The symbol suggests a kind of internationalism in which nations remain autonomous and distinct. There are five discernible rings, each a different color, and yet they cannot be pulled apart. In this picture, national distinction and interdependence are not incompatible. (FYI, according to this, the rings represent continents.) More pedantically, the Olympic symbol is a Kantian picture of cosmopolitanism--instead of subverting the very existence of the Westphalian state system, it suggests that states can maintain their sovereignty and nonetheless be inseparably linked.

The vision of internationalism represented by the Olympic rings can be contrasted with other cosmopolitan visions. If you want to suggest the idea of "one world," undivided by nationalistic animosities, why not simply draw one big circle, instead of five different ones? The picture's meaning would also be very different if the rings symbolized not continents or nations, but individual people; the image would then suggest "citizens of the world" joined at the hip and yet retaining personality. Or one could view the rings as spheres of responsibility and then draw increasingly wider circles around an individual, representing a person's duties to family, then to local neighborhood, then to country, and then to the world. You can convey very different worldviews by arranging the rings differently.

To understand how different, consider some quotations from antebellum reformers. First, here is Charles Sumner, a Radical Republican in Congress, in a speech before the American Peace Society in 1849:
Let me not seem too confident. I know not, that the nations will, in any brief period, like kindred drops, commingle into one; that, like the banyan-trees of the East, they will interlace and interlock, until there is no longer a single tree, but one forest ... but I am assured, that, without renouncing any essential qualities of individuality or independence, they shall yet, even in our own day, arrange themselves in harmony, as magnetized iron rings--from which Plato once borrowed an image--under the influence of the potent, unseen attraction, while preserving each its own peculiar form, all cohere in a united chain of independent circles.
Compare this with George Thompson, a radical British abolitionist, in an 1837 speech:
There is a law of mutual influence, by which the conduct of one man may affect his fellow-men, to the extremities of the world, and the end of time. Every individual may be considered as the centre of a moral circle, which is connected by links of more or less power, and includes the whole of mankind. Do any ask, 'Who is my neighbour?' I answer, every human being. 'What do I owe him?' Love. 'Thou shalt love thy neighbour as thyself.'
And compare this image again with that of Elihu Burritt, an eclectic peace reformer, in 1846:
[Slavery] cannot co-exist with the centralizing idea and tendency of human brotherhood. That and all the lesser concentric circles of the new solar system of humanity, will repudiate all sympathy with slavery.
Contained within these arrangements of circles--interlocking circles, concentric circles, moral circles in which individuals stand at the center--are slightly but significantly different ideas. Another way of putting my last post is this: the Olympics are closest to Sumner's vision of internationalism, both in symbolism and in spirit.

Saturday, August 14, 2004

 

Olympics

We watched parts of the opening ceremonies of the Olympics last night. Quite a show, but it was overshadowed by a dense cloud of hot air emanating from Bob Costas: "Wait until you see this get-up, Katie." "Look at the pleats on those skirts; I'd hate to see the dry cleaning bill for those." "When you're a Greek god, you can pretty much do what you please." "They're giving props to each of the official languages of the games. But now they have switched back to English, so we can all breathe a sigh of relief."

And my personal anti-favorite: "Guess what country is coming up next, Katie. You bet Djibouti!" Argh. (And yet the Times asserts that Costas and Couric avoided offensive comments. I guess they aren't counting the jokes about funny names and "traditional garb" as offensive.)

The opening ceremonies are very redolent of turn-of-the-century World's Fairs like the ones held in Chicago in 1893 and in St. Louis in 1904. It may not be coincidental that the modern games were revived in the same era (1896). Since the demise of the Fairs, in fact, the opening ceremonies have been in a league of their own as panoramic spectacles with an international audience of millions. For recent historians, the World's Fairs have served as illuminating windows onto contemporary views about race, empire, gender and technology. If this historiography is any indication, I'm willing to bet that future historians will have a field day (no pun intended) with the Olympics. They really are fascinating snapshots of how their organizers saw the world at the time.

I was impressed by the climactic scene, when the Easter-Island-like sculpture fell apart to reveal a man trying to stay balanced on a rotating cube, a kind of inverted Atlas. I also liked the double helix displayed in laser lights. Pretty amazing technically. Without the constant stream of commentary from Costas, it might even have been profound. I found myself wondering whether the tangled mass of wires by which the sculptures and various dancers were suspended was deliberately made to be as visible as it was. Either way, it made for a stunning scene.

Part of being a graduate student is seeing your thesis everywhere. Preoccupied as I am with the themes in my dissertation, I was struck by the keynote speeches, which made me think about the interesting coexistence of internationalism and nationalism at the Olympic Games. On the one hand, the Games represent a kind of peaceful internationalism, and they are ritually praised every four years as a reprieve from war. Gianna Angelopoulos-Daskalaki, the chief organizer in Athens, began her speech by addressing the "citizens of the world." She concluded by suggesting that in competitive sports, national, linguistic, and racial barriers are broken down. The IOC president sounded similar themes about the world's need for peace, echoing Angelopoulos-Daskalaki's claim that national divisions can be overcome in the nonviolent arena of international sports.

But are the Olympics really cosmopolitan? They could equally be read as unparalleled celebrations of national pride and achievement. The opening ceremonies themselves were panegyrics to Greece and occidental civilizations. National barriers may be broken down by sports, but as Brian and his various respondents point out over at Crooked Timber, national divisions are as impermeable as ever when it comes to broadcasting them. And over the coming weeks, the media will keep us endlessly reminded of which country is ahead in the "medal count." The Olympics place in stark relief how intertwined cosmopolitanism and patriotism can be.

Nevertheless, I have to confess that I am still moved by the possibility that sports can channel human competitiveness into peaceful venues. The Olympics occasion their fair share of nationalistic chest-beating, but who can be disappointed that this is a more peaceful kind of patriotism? Of course, patriotism that is peaceful in one guise can be murderous in another. But if national competition is not going away anytime soon (and I don't think it is, despite the prognostications of globalization theorists), should we not be glad that there are peaceful outlets for such competition?

I remember thinking the same thing when I was in Italy during the summer of 2000, when the Palio was being run in Siena. The Palio is a fierce horse race between the seventeen contrade--or city wards--of Siena. In medieval times, the contrade used to engage in regular civil wars. Now they race horses. I am not so cynical that I cannot say, without qualification, that horse races are better than wars. And so are the Olympics.

UPDATE: I learned something new: that the Olympics were also held in St. Louis in 1904 as part of the World's Fair. And Ralph Luker has this interesting post at Cliopatria about "Anthropology Days" at the Olympics. (This is the kind of thing I had in mind when I suggested "the World's Fairs have served as illuminating windows onto contemporary views about race, empire, gender and technology," and that the Olympics are similar kinds of windows onto how we view the world and the human species today.)

Friday, August 13, 2004

 

You know you're a blogger when ...

... you've only been blogging for three weeks, but you feel guilty for skipping a day without posting. And when you have a queue of "post ideas" that you can't seem to find the time to write out.

I have not done much reading or writing for the past couple of days because we've been enjoying hosting out-of-town guests. In the meantime, though, I've been thankful to those who have noticed this blog and recommended it, including Ralph Luker at Cliopatria and Paul Musgrave.

Yesterday I did start reading Charles Tilly's new book, Social Movements, 1768-2004 (see link in the sidebar). Tilly has spent a distinguished career studying "contentious politics" and the history of social movements, and this book looks to be a capstone to his life's work. A couple of points in the first few chapters caught my eye, and I thought I would pass them along.

One of Tilly's major questions concerns timing: when did modern "social movements," as we know them, appear? As the years in the title suggest, Tilly believes that social movements date back only to the late eighteenth century. It was then, he suggests, that political contenders (people on the margins of political institutions who wish to contest what they perceive to be unjust arrangements of power) developed a now familiar "repertoire" of collective actions. By asking when and how social movement "repertoires" of action developed, Tilly is trying to understand (for instance) when activists learned to organize public demonstrations like street marches. Sometime in the eighteenth century, he says, public protests like marches started to evolve into their present day incarnations.

This is an interesting question: how did activists come to believe that mass marches could be efficacious modes of collective action? Recently I read this story at Salon (free day pass required) about how Cheri Honkala, a well-known advocate for the homeless, has organized a march across New Jersey scheduled to culminate in New York City at the Republican Convention. The story documents the modern activist's sense that the longer and larger the march, the better. But what historical circumstances combined to make people think that walking across New Jersey en masse could effect fundamental social change? That's the kind of question Tilly is asking, and I'm interested to read more about his answers.

Tilly has also piqued my interest by arguing (rightly, I think) that the term "social movement" acquired an almost universally positive connotation in the late twentieth century. "By the turn of the twenty-first century," he writes, "people all over the world recognized the term 'social movement' as a trumpet call, as a counterweight to oppressive power, as a summons to popular action against a wide range of scourges." In America, "movements" have been valorized primarily by the successes of the Civil Rights Movement, and today all kinds of political causes don the mantle of the newest "movement."

The Salon article, for instance, briefly spotlights Honkala's son, an aspiring actor who is marching across New Jersey with his mother's group: "This, he says, is the center of his life. 'It's not like I'm taking time out when I'm in [New York] city,' he says. 'When I'm auditioning and trying to get the next role, it's all for the movement.'" Calling something "the movement" (definite article required) gives off good vibes in our political culture. Tilly says this book will explore how this came about.

Another titillating argument in the book's early pages is that the social movement as we know it may have already reached its zenith and may be heading towards its nadir.
The social movement, as an invented institution, could disappear or mutate into some quite different form of politics. Just as many forms of popular justice and rebellion that once prevailed have quite vanished, we have no guarantee that the social movement as it has prevailed for two centuries will continue forever. Since the social movement spread with the growth of centralized, relatively democratic states, for example, either governmental decentralization, extensive privatization of governmental activities, eclipse of the state by transnational powers, or widespread democratization could all put the social movement as we know it out of business. Indeed, with the set of changes that people loosely call 'globalization' occurring, citizens who count on social movements to make their voices heard must look very hard at the future. (p. 14)
I won't be sure what I think of this argument until I read more of the book. But anecdotal evidence in my own experience seems to confirm Tilly's suggestion that popular opinions about "social movements" may be shifting.

Granted, massive public demonstrations still command media attention and serve as powerful collective actions. I'm thinking, for instance, of the millions who crowded the streets of Europe in protest during the build-up to the Iraq War. But we do have to "look very hard at the future" when we consider that these massive protests did not stop the war from taking place. Consider President Bush's reaction to the protests outside the White House prior to the invasion of Iraq--that he did not read the newspapers and did not make decisions based on polls. More to the point may be his reaction to the massive street demonstrations staged in London during his November 2003 visit to the United Kingdom.
"I can understand people not liking war, if that's what they're there to protest," Bush said. "I fully understand not everybody is going to agree with the decisions I've made. I don't expect everybody to agree."

He added: "I admire a country which welcomes people to express their opinion. I'm proud of Great Britain's tradition of free speech."
The repertoire of contemporary movements may now be so familiar that its cutting edge has been dulled. Politicians can tip their hats to public protests while simultaneously thumbing their noses.

The American Left blames Bush's avowed indifference to movement actions on this administration, as though it were merely characteristic of this particular president. I hope so. But Bush's fawn-and-sneer reaction to free speech may be symptomatic of a broader cultural impatience with the social movement forms that reached their apogee in the 1960s.

Today's dissenters continue to organize massive marches as the primary expression of collective contention. But I wonder whether Tilly is right that the returns of these marches are now diminishing, not only in the halls of power but also among the populace. The "target population" that activists are trying to persuade may have residual good feelings about "social movements," but perhaps they increasingly view mass actions as annoying or anarchic. (Consider mainstream media representations of anti-globalization protests. Doesn't the public mood tend to greet these events now with weariness bordering on rage?)

I'm not suggesting that mass marches are a bad thing--only that I'm interested to watch Tilly develop the argument that "social movements" may be undergoing a broad transformation in the present. The last year of his title (2004) is ominous, for it suggests a kind of impending closure to "social movements" as we now understand them. Tilly's judgment may be premature, but I think he's right that activists need to think hard about the possibility that old forms may no longer work the way they once did; real creativity is called for if social movements are to evolve into equally effective kinds of collective political action.

In the spring of 2003 I was teaching an undergraduate class on twentieth-century American history. The week that the war in Iraq began, I gave students an entire class period to discuss their feelings about what was happening. I opened by referring to a recent demonstration that had been held downtown by anti-war groups, an event in which I had participated in a meager way. At the protest, one contingent of activists moved into the street to block traffic, inviting a cacophony of honking horns from rush-hour motorists. As I described this scene to my students (without mentioning my presence), I asked what they thought.

One particularly vocal student (whose political views I previously had no inkling about) said--and I paraphrase--"Those people should be glad that I wasn't driving downtown, because I probably wouldn't have stopped." Caught off guard by her utter seriousness, I stumblingly raised the question of whether causing minor disorder was required for social movements to be successful. I referred to Civil Rights demonstrations, which we had recently discussed in class, and pointed out that civil disobedience was a crucial part of those demonstrations' success. The student replied--and I paraphrase again--that "they have the right to say what they want, but they don't have to inconvenience the rest of us to do it."

This is admittedly unscientific evidence of Tilly's point. But I wonder ... Will the next generation of Americans, raised in a time of economic prosperity and rising technological comforts, have little patience for the "inconvenience" that has made modern social movements so effective in the past? And does this mean that radical activists need to expand and modify their repertoire of dissent? I'm not sure, but I will definitely keep turning the pages of this book.

Tuesday, August 10, 2004

 

Silence of the Past

I find it difficult to imagine the past, to give it dimension, texture, and color in my mind. This surprises me in some ways; as an aspiring historian I spend more time than most people trying to imagine the past, and surely practice helps. Yet as much as I read historical texts and historical narratives, my mental images of times gone by often play like silent movies, or appear to me like scattered collages of sepia photos, or in the best case present themselves to my mind like a disjointed dream.

I've been thinking about this over the past few days, ever since I watched Carl Theodor Dreyer's silent masterpiece, The Passion of Joan of Arc. Dreyer's movie, I've been told by my friend the film expert, is in a league of its own in film history, and it's easy to see why. It is incredible to think that this truly visionary movie was shot in 1928 (and even more incredible to learn that the negatives were lost until 1980, when they were discovered under a pile of rags in a janitor's closet at a Norwegian mental hospital). The film--which tells the story of Joan's trial by using dialogue from the actual transcript--is most famous for its use of intense close-ups on the actors' faces, and for its narrow camera angles that disconcert the viewer's sense of space and perspective. The movie is also without sound. The quietude of the film, combined with its disquieting facial expressions and framing, makes for an incredibly emotional and intimate viewing experience.

Who knows if this was Dreyer's intent, but the film also made sense to me as a representation of how we usually envision the past. The opening frame shows some hands opening a dusty book meant to be the trial transcript and flipping through the pages, as if to say that what follows is how a reader of the transcript might imagine the events contained within it. The viewer, as the anonymous reader, sees close-ups of the speakers' faces but does not "hear" them speak. Intertitles provide the dialogue.

This is how I often imagine history as I read it. Sitting in a silent archive, leaning over a yellowed letter, I "see" figures in my mind's eye but rarely "hear" them speak. Aside from the occasional interjection in a newspaper's transcript of a speech ("Loud cheers." "Continued applause." "Boos and hisses."), I rarely imagine sound. For that matter, it is difficult to imagine color. Like Dreyer's film, my historical imagination is usually gray and slightly out of focus. When I make the mental leap to translate a text into my imagination, what I see is something like what Dreyer "saw."

This means that an incredible amount is lost in translation--color, ambient sound, a wide-angle view of all the people in a room or on a street. Like Dreyer, I rarely imagine historical speakers and their audiences in a single mental frame; if I am reading a speech, my mental image is of the speaker alone, until some reference to the audience ("Hurrahs.") makes me move the "camera" onto the crowd. And I find it difficult to imagine a speaker's voice. As I am reading the words of an oration, it is as if I am reading intertitles. The speaker is speechless in my mind even if he is in motion.

This does not mean that my imagination of the past is not intense; Dreyer's film shows that the silent method can be intensely vivid. Yet Joan of Arc also spotlights the "goneness" of the long ago; it represents for me the real difficulty of resurrecting the past in an act of imagination. This is especially true because so much of our access to the past is textual. Some recent historians, like Mark Smith and Richard Cullen Rath, have been experimenting with "aural history" in an attempt to remind us that the past was not a silent film--it was as "real" and full-bodied as the present. But even in trying to get us to use our historical "ears," the only tools aural historians have are texts--verbal descriptions of sounds that historical actors heard. Even attempts to expand our sensory imagination of the past fall back on the stubborn fact that the past is forever lost to our full faculties. Whereas lived experience is a "blooming, buzzing confusion," as William James put it, historical experience is necessarily attenuated and often mute. Or so it seems to me.

"Our hill has made its submission and the green
Swept on into the north: around me,
From morning to night, flowers duel incessantly
Color against color, in combats

Which they all win, and at any hour from some point else
May come another tribal outcry
Of a new generation of birds who chirp
Not for effect but because chirping

Is the thing to do. More lives than I perceive
Are aware of mine this May morning
As I sit reading a book, sharper senses
Keep watch on an inedible patch

Of unsatisfactory smell, unsafe as
So many areas are: to observation
My book is dead, and by observations they live
In space, as unaware of silence

As Provocative Aphrodite or her twin,
Virago Artemis, the Tall Sisters
Whose subjects they are. That is why, in their Dual Realm,
Banalities can be beautiful,

Why nothing is too big or too small or the wrong
Color, and the roar of an earthquake
Rearranging the whispers of streams a loud sound
Not a din: but we, at haphazard

And unseasonably, are brought face to face
By ones, Clio, with your silence. ..."

W.H. Auden, "Homage to Clio"

Monday, August 09, 2004

 

George W. Rush

Interesting tidbit from a Rolling Stone interview with Garry Trudeau, who was President Bush's classmate at Yale. (Via The Morning News.) Make of it what you will.
We both served on the Armour Council, which was the social committee for our residential college. Nobody in my freshman dorm knew what the council was. But I apparently had shown some leadership qualities in the first three or four days of school, so I was elected unanimously. George Bush was chairman. Our duties consisted of ordering beer kegs and choosing from among the most popular bands to be at our mixers. He certainly knew his stuff -- he was on top of it [laughs].

Even then he had clearly awesome social skills. Legend has it that he knew the names of all forty-five of his fellow pledges when he rushed Deke. He later became rush chairman of Deke -- I do believe he has the soul of a rush chairman. He has that ability to connect with people. Not in the empathetic way that Clinton was so good at, but in the way of making people feel comfortable.

He could also make you feel extremely uncomfortable. He was very good at all the tools for survival that people developed in prep school -- sarcasm, and the giving of nicknames. He was extremely skilled at controlling people and outcomes in that way. Little bits of perfectly placed humiliation.

Saturday, August 07, 2004

 

Hector and son

I just got around to reading last Sunday's article by Mark Edmundson on why we should read. After arguing that reading can play an important role in the socialization of readers, Edmundson claims that "the effects of reading major authors are almost always good ones. It is virtually impossible to be a consequential literary artist without infusing your work with sympathy. This understanding dates at least as far back as Homer, who makes it a point to depict the Trojans nearly as humanely as he does his fellow Greeks."

One might quibble (or more than quibble) with the idea that all great literature is a source of human sympathy, or that Homer can help people play well with others. (I seem to remember a lot of cracking skulls, sulking jocks, gangrenous greed, and murderous egotism in the Iliad.)

But Edmundson closes by pointing out that even the Iliad has its tender moments, citing the famous scene in which the doomed Hector says goodbye to his infant son on the walls of Troy. It's a scene that stands out in my memory from the undergrad course in which I first read Homer, and it's poignant enough to be worth an excerpt before going to bed tonight:

In the same breath, shining Hector reached down
for his son--but the boy recoiled,
cringing against his nurse's full breast,
screaming out at the sight of his own father,
terrified by the flashing bronze, the horsehair crest,
the great ridge of the helmet nodding, bristling terror--
so it struck his eyes. And his loving father laughed,
his mother laughed as well, and glorious Hector,
quickly lifting the helmet from his head,
set it down on the ground, fiery in the sunlight,
and raising his son he kissed him, tossed him in his arms,
lifting a prayer to Zeus and the other deathless gods:
"Zeus, all you immortals! Grant this boy, my son,
may be like me, first in glory among the Trojans,
strong and brave like me, and rule all Troy in power
and one day let them say, 'He is a better man than his father!' ...

From The Iliad, translated by Robert Fagles

 

Getting to the point

I've been struggling recently with some decisions about how to structure the early sections of my dissertation. The perennial problem is this: how much time should a writer spend setting the stage, providing context, and introducing themes, before actually getting to the point?

As things stand, I have a completed draft of the second chapter. It's the one I wrote first because it's the first major "episode" of the story I want to tell--the first chapter that takes a fairly conventional narrative form ("this happened, then this happened, this is why it happened and what it meant"). But before this Chapter 2 narrative can make sense, I've determined that I need a fairly abstract and non-linear Chapter 1 (laying out concepts, giving a synchronic survey of background ideas), as well as an "Introduction" and a brief "Prologue." That's three dissertation "units" before the main action really starts. Should I be worried about this?

I put that question to my advisor yesterday. He was reassuring, reminding me that at this stage the most important thing is to produce pages. Then, other readers can help me decide whether the structure needs to be revamped. It could be, he said, that I'm too close to the project to judge whether there is too much "front matter" at the beginning. He's probably right. My fear, he told me, is a normal one: as a writer, you do not want to tax the reader's patience by waiting too long to get to the point. (I know from recent personal experience that a reader's patience can be thin.) But you also want to lead the reader towards the point, instead of revealing it before it will make sense.

It's hard to decide whether I want the dissertation to be a "point-early" or a "point-late" document. Those are terms used in Joseph M. Williams' helpful writing manual, Style: Toward Clarity and Grace. Williams points out that most readers are trained to look for a document's main "point" early--somewhere in the introduction. But academic readers (strange birds that they are) tend to have a higher tolerance for "point-late" writing, especially in the humanities. Part of the reason for this conventional acceptance of "point-late" writing is that "point-early" organization would seem "too crude, too flatfooted." Williams says, perhaps with his tongue partly lodged in his cheek:
In some fields outside the sciences, it is typical for a writer first to announce (some would say invent) a problem that no one suspected until the writer pointed it out. In this kind of writing, obviously enough, the writer is under no pressure to answer a question that no one except the writer has asked. But once the writer has convinced us of an unsuspected problem with, say, gender roles in the third book of Milton's Paradise Lost, she then sets to working through the problem, demonstrating how inventively she is solving it, how much more complex the problem is than we might have thought even from her early account of it. Only after we have accompanied the writer through her argument do we begin to catch sight of her main POINT.
I like reading writing like this, where the "Aha!" slowly dawns on the reader. But in "point-late" documents, it takes great skill on the part of the writer to keep the reader's attention. Otherwise, the gradual working-through of the problem can seem tedious at best or incoherent at worst. The "point-early" organization preached by high school English teachers everywhere is easier to pull off: put that "topic sentence" right up front. But it can also produce essays that sound like high school writing assignments.

It's a matter of timing. A "point-late" document, while often more interesting than a blunt "point-early" piece, is a tricky thing to produce, and it usually takes a critical reader to say whether the effect has been achieved. I'm hoping that having an introduction, a mini-prologue, and a first chapter will make the "Aha!" of Chapter 2 seem like an even bigger and more interesting punchline. But as in comedy, so in history: if you take too long getting to the point, the joke will almost certainly be on you.

SEE ALSO: Timothy Burke's recent post on what constitutes originality in humanistic scholarship.

 

Oubliette

Word of the Day: "oubliette, n. -- a dungeon whose only exit is a trapdoor in the ceiling." As in, "I keep thinking that my estimation of the administration’s competence and good will has reached rock bottom, when a new trapdoor opens and I fall into some yet ranker underground oubliette." (Belle Waring)

Friday, August 06, 2004

 

Burning ears

Many thanks to my friend and fellow graduate student, Jason Kuznicki, for a very generous post about my blog, as well as to GreenGourd's Garden and No Fancy Name. I'm grateful to all of them for laying out a welcome mat on behalf of the blogosphere.

Jason points out that the title of my blog is somewhat "enigmatic." Actually, the title was selected in a fit of creative laziness. When I started to set up a Blogger account, I did not have a name on hand. Feeling the pressure to be creative but utterly uninspired, I had recourse to my jazz collection, a trusted source of sage advice. Scanning the titles of my CDs and hoping for a miracle, where should my eye chance to fall but on Joe Henderson's 1966 album, Mode for Joe? Eureka!

There you have the uneventful story of how this space became "Mode for Caleb." But maybe I can add some enigma, ex post facto. A "mode" is a musical term, often associated with modal jazz. I don't pretend to be fully informed on the technical details of the genre: my enthusiasm for jazz has never fully evolved into expertise. But in contrast to standard bebop tunes, in which the band more or less repeats predetermined chord progressions while soloists improvise on those chords, "modal" tunes are organized around a certain set of diatonic scales, or modes.

Modal jazz gives preference to melodic improvisation. It frees soloists to do more than to run through chords, and it frees accompanists to improvise by interacting more flexibly with the solos. A modal tune is the very definition of ariose: melodic without being recitative. Moreover, if bebop was often about a soloist's virtuosity, modal jazz is a thoroughly collective enterprise. It provides freedom to extemporize, but always with the musical responsibility to empathize. Soloists can break away from the scale and from the accompaniment to create momentary tension in a song, but without creating as much dissonance as this free improvisation would in the structure of a traditional bop tune. Some might say that because it places an emphasis on both melody and interaction, modal music exemplifies what jazz is all about.

But I've never been the kind of listener who thinks that words can do justice to the music: if you want to hear a paradigmatic example of "modes" at work, listen to Miles Davis' Kind of Blue and discover all you need to know. The album, and the form of music, might as well serve as suggestions for the shape I hope this blog will take: improvisatory, unhurried, a solo without the solipsism.

Thursday, August 05, 2004

 

I, Corporation

On Monday, we went to see I, Robot. What follows are some rambling thoughts about it.

Most of the reviews I read about the movie complained that it does not do justice to Isaac Asimov, whose science-fiction stories about robots "suggested" the storyline, according to the credits. Asimov apparently had a somewhat triumphalist view about the ability of artificial intelligence to solve social ills. As a thoroughgoing rationalist, he believed that robots, with their superior logic, would ultimately be able to make more rational (and thus more ethical and beneficial) choices than humans. The movie, as reviewers have pointed out, nearly turns this triumphalist vision on its head. It undercuts the idea that foolproof logic is necessarily a sure guide in morals. If, in Asimov's stories, the head wins out over the heart, the movie's message seems to be that the heart, not the head, is what makes us truly human.

Will Smith's robot-hating character mistrusts robots because they seem cold, soulless, slaves to logic. Smith has learned the hard way that robots cannot feel; they are simply walking calculators, always crunching numbers, always figuring percentages: all Q.E.D. and no T.L.C. The exception is a highly advanced robot named Sonny, who has been equipped not only with a positronic brain but also with an artificial "heart." (The movie makes me wonder, not for the first time, about the nearly universal cultural idea that the "heart," which is just a muscle for pumping blood, is actually the seat of our emotions. Sonny's positronic brain is in his "head," but where do you think his "heart" is? You guessed it--smack dab in the middle of his chest.) In the end, Sonny saves the day because he is able to resist what seems to be irresistible logic. Contra Bonnie Tyler, Sonny has a total eclipse of the head.

Not having read Asimov, I'm in no position to judge how dramatically the movie departs from his intentions. But the complaint seems to me beside the point. So what if a movie engages with its source material in a dialectical way, even drawing different conclusions from the same basic premises? That's interesting in and of itself. The other reason I think the pro-Asimov naysayers miss the point is that the movie is not only about technology.

In fact, I thought the movie was most interesting as an allegory about corporations. Sometimes the allegory is a little transparent, but most of the time the movie successfully uses the complex science-fiction questions that Asimov raised as a way to look at powerful corporations and our fear of them.

Others have noticed that I, Robot is one of a recent rash of anti-corporate films. In the movie, U.S. Robotics, which is about to achieve its goal of placing one of its robots in every home, has become a virtually omnipotent private entity. Throughout the movie the viewer gets sweeping shots of USR's headquarters, a huge skyscraper that hyperbolically towers over the Chicago skyline. The CEO, we find out, is the richest man in the world. (See if you can count the number of veiled Microsoft references in the film.) Throughout the movie, we get the sense that this corporation is covering something up (think Big Tobacco), that the CEO is probably crooked (think Martha Stewart and Ken Lay), and that USR has become a de facto government just by virtue of its technological innovation. At a key moment in the film, for instance, when it seems like the Marines ought to be showing up in droves, we learn that USR owns all of the Defense Department's contracts (think Halliburton).

The movie, then, is less a parable about technology and more a meditation on the interface between technology and corporate power. This makes for some rich ironies, like the fact that there are product placements scattered throughout the film--for Converse, for FedEx, for JVC, etc. (Mel talks about this over at In Favor of Thinking. And by the way, remember ten years ago when it was actually a big deal that a character in Sleepless in Seattle was shown drinking Snapple? Remember when that was news? Now products get thirty-second close-up shots and gratuitous scenes just to fit them in. Some of the shots in this film are so lingering that you can't help but wonder whether the filmmakers are trying to comment on this.) It is also possible to view Sonny and his "Father" as whistleblowers of a sort, and to read the film as a reflection on the complex emotional dynamics of "outing" company wrongs.

The movie even subtly suggests that corporations are like robots, which means that all of the themes having to do with head versus heart, logic versus emotion, are meant to make us think about how corporations work as much as about how robots work. I was more convinced of this interpretation when it turned out that USR really was being run by a kind of highly evolved mainframe computer, VIKI, a big-brother supermachine with a truly frightening degree of "total information awareness." VIKI, even more than the individual robots in the film, is logic incarnate. But she also represents the idea that corporations develop a mind of their own. Corporations have a kind of collective artificial intelligence, a logic based on the bottom line. And once this logic kicks in, even individual members of the corporation lose their agency, their ability to override the cold decision-making process of the corporation with emotional or ethical objections. The truly horrifying thing about the movie is that the CEO, in the end, is not responsible for the corporation's crimes. Even he is eventually at the mercy of VIKI's logic.

I think the film does some interesting things with these questions--it's dealing with some of the same issues as the recent film The Corporation, reviewed here by The Chutry Experiment. It's not just about the "personhood" of robots; it's about the "personhood" of corporations and the insidious form that personality can take. So if the movie departs from Asimov's blithe hope in artificial intelligence, it does so not because it wants to fearmonger about technology. Rather, the film departs from Asimov in understanding technology as inextricably tied up with the corporate logic of markets and profits. Technological innovation can no longer be isolated from the machinery of corporate capitalism; those who now talk about technology as a panacea are usually those who also see a fortune to be made.

And I guess, in reality, it's always been this way. We tend to forget that the Luddites did not destroy machines because of a hatred of technology, but because the machines represented a concentration of wealth in the hands of the few. If this film is "Luddite" or anti-Asimov, its critique of robotics is more complex than a mere valorization of emotion over reason. The big question of the movie is not whether robots are "human," but whether corporations are.

Wednesday, August 04, 2004

 

Jimmy and Blue

In an earlier post, I mentioned that I had used some jazz CDs to bribe myself into finishing a chapter draft for my dissertation. Here they are, along with my completely partial reviews:

Jimmy Smith/Cool Blues. Jimmy Smith is the undisputed king of the jazz organ. This live session, recorded at the now-closed Small's Paradise in New York City, was not released at all until 1980, having been lost somewhere in the cavernous vaults at Blue Note. Fortunately, it has been rescued and resurrected as part of the RVG series. This is vintage Smith from his peak period; you can find him here still dancing around the borders between hard bop and soul jazz. To my ear, Smith is best when he swings at a rate somewhere between "cooking" and "grooving" (to use the technical terms). The first track, "Dark Eyes," is the best of the bunch, but the soloing is uniformly excellent throughout.

I was a bit disappointed with the rendition of "A Night in Tunisia," if only because it sounds so much like the Jazz Messengers version recorded two years later in 1960 (especially the way the tenor sax comes in slowly during the intro; compare Tina Brooks here with Wayne Shorter on the later date). Perhaps this similarity isn't coincidental, since Art Blakey, founder and perennial leader of the Messengers, is on the drums behind Smith. The band takes the tune at such a high tempo that Smith's organ lines get a little lost in the mix; this may be the one point in the evening when there was too much "cooking" and too little of Smith's signature "groove." But as a Blakey showcase, it's excellent, as is the 1960 version.

One of my favorite things about this CD reissue is the included announcement of the band by Babs Gonzales, who introduces the "volcanic interpretations of James Smith." Gonzales introduces the band members as being "of" their instruments: " ... Lou Donaldson of the alto sax, Tina Brooks of the tenor sax, and Mr. here [Eddie] McFadden of the guitar." That's just cool (at least if you're a jazz nerd). Makes it sound like the tenor sax is a country somewhere, where Tina Brooks hangs out with Hawk and Trane and Dexter. And when you show up to play the tenor at Small's or wherever, they say you're "of" the tenor, just like they'd say "Tina Brooks of New York City" or "Coleman Hawkins of the U-nited States." Let me reiterate that if the organ is a country somewhere, Jimmy Smith just owns the place. After you've listened to this album and seen what I mean, listen to House Party and see what I mean again.

Blue Mitchell/The Thing to Do. The Thing to Do is go out and buy it. This is one of the most recent RVG releases, and it's a good thing it's out because the (relatively) few albums Blue Mitchell recorded as a leader are hard to come by these days. "Fungii Mama" is a well-known track, but "Step Lightly," written by Joe Henderson, is the real treat. The album's also a good one to pick up if you like the acoustic Chick Corea on piano, before the 1970s happened to him. Mitchell and Junior Cook (tenor sax) worked together as Horace Silver sidemen, but you can tell the difference that Corea makes in the group's sound.

Tuesday, August 03, 2004

 

Warning: Satire

SENATOR KERRY ACCUSED OF FLIP-FLOPS

WASHINGTON, D.C.--In an off-the-cuff remark to the White House press corps today, President Bush accused Democratic presidential candidate Senator John Kerry of "wearing flip-flops."

"Look at what he's done in the Senate," Bush said. "My opponent is a flip-flopper. The American people need to know that as their second President, I will never wear flip-flops."

When asked by a reporter whether he meant to say that Senator Kerry "flip-flopped" by changing his mind on policy issues, a flustered Bush replied, "The world needs to know that I mean what I say."

Senior administration officials later declined to say whether the President had made a mistake. Instead, a press release issued by the White House declared that "President Bush strongly opposes open-toe footwear. In these dangerous times, the American people need a leader who can fill the President's shoes. Clearly, the only man who can is the President."

Another senior administration official, speaking on condition of anonymity, later said that President Bush wears loafers without socks around the office and cowboy boots while on vacation at his ranch in Crawford, Texas, but "never, ever flip-flops." When asked which kind of shoe the President prefers at home, the official replied "loafers." But he later revised his statement and changed Bush's preference to "boots." Again speaking on condition of anonymity, another source close to the President's campaign said that in a quickly assembled focus group, "loafers" had not polled well.

Responding to President Bush's criticisms, Senator Kerry said, "The President has had a problem with shoes for some time now. First it's loafers, then it's boots. Which is it going to be? There is no Booted States of America, or Flip-Flopped States of America. There is only the United States of America."

A campaign advisor for Kerry also told reporters that "Senator Kerry is a strong man who wears strong shoes."

It was too early to say how the flap over flip-flops would affect the polls, but political experts did not rule out the possibility that the incident could affect the critical "Beach Bum vote" in the November elections.

Monday, August 02, 2004

 

Bleak House and Book Guilt

I have decided not to finish Charles Dickens' Bleak House, even though I only have 200 of 700 pages left to go. I've simply lost interest, and there are too many other books knocking at my door.

It's not that the book is terrible, but it's not great either. I think the tipping point for me was when Esther received a treacly proposal from her guardian, Mr. Jarndyce. I should have seen it coming, I know, but I thought Dickens might do better. My tolerance for Victorian novels is relatively high, I suppose, yet there is only so much you can take of the cult of domesticity before you start to taste the poison in the Kool-Aid. At one point, Esther actually kisses her housekeeping keys. That was another point in favor of giving up on the book. The final blow came when I flipped back to read the "Introduction" to my edition, by Doreen Roberts, and found it more interesting than the book itself. "That's it," I told myself last night. "Move on."

This morning, though, I feel the spiky pangs of "book guilt," the iron maiden of graduate students everywhere. It is hard for me to have unfinished books on the shelf--they seem to grow an eye for every hundred pages left unturned, eyes which turn and follow me like paintings in a haunted house. I have an answer ready for my "book guilt" whenever it rears its ugly head: "Life is too short, and there are too many books, to spend another minute on a book that doesn't interest you." But my "book guilt" has an answer ready in return: "How do you know that the book is bad when you haven't finished it? Didn't Aristotle say that a life cannot be judged good or bad until its owner is on a deathbed? Don't characters deserve the same longsuffering at least until their epitaph, 'The End'?"

I'm also hounded by my feeling that books ought to be a final refuge from the idea that language and art are, like so much else, primarily for consumption--that you should eat until you're full, snack when you want, graze here and there, click the link, change the channel, skip those pages, skim the headline ... close the book? Reading seems to serve a higher purpose, to play a role in my personal development. Shouldn't I grit my teeth and entertain the possibility that the difficult books are actually the ones I most need to finish? I guess that anything good for you can become a breeding ground for regret.

I think I know how to predict when to expect "book guilt"--it's when my reasons for reading the book were unclear in the first place, when my motives were a mixture of work and recreation. It's not that I mind reading books because I "have to"--I'm in the fortunate position of enjoying what I do for work (reading, writing, thinking, teaching), but work is still work even when you enjoy it. The problem is not when a book is clearly defined as a "work" book: it's when a book presents itself to me as both a "work" book and not a "work" book, or when it presses itself upon me out of a vague sense that "this is something I should read." I can see that Bleak House was one of these books: I picked it up partly for "work," since someone had recommended to me that Dickens' portrayal of Mrs. Jellyby's "Telescopic Philanthropy" might be related to my research. It was, and I'm glad I read those portions of the book. But what did this mean for the book's fate when Mrs. Jellyby promptly disappeared midway through the plot? What was left to propel me forward, other than the voices in my head that variously whispered, "This is great literature. This is historically important. This is an unfinished book." I can handle these voices so long as they are joined by other musings: "This is beautiful. This is right. This is illuminating." With Bleak House, as it turns out, there was only obligation and no sense of satisfaction.

This has happened with other books, who now glare at me from darkened corners of my shelves: Don Quixote, Les Miserables (I've tried twice, but the chapter on Waterloo has always proven to be mine), The Sound and the Fury. The death knell of a book is when you notice how many pages you have left. The proof that a book is worth finishing is that you are hardly aware that the end is approaching; indeed, you are aware that the book might never end, since it will warrant second readings and will impinge on your mind long after the final page (e.g., The Power and the Glory, Invisible Man). Bleak House simply wasn't one of those novels; maybe the next one will be.

(Incidentally, Julie at No Fancy Name says today that she's making a fresh attempt to "get through" Middlemarch and confesses, three chapters in, that she's already bored. I think I know how she feels--has "book guilt" struck again? But she also admits that she has "ulterior motives" for taking on Eliot again; stay tuned, she says, for more details.)

UPDATE: Julie's reasons for reading Middlemarch now available.
