Sunday, October 31, 2004

 

Barbarians

In partial reply to Paul Musgrave and Jason Kuznicki, who have both recently posted C.P. Cavafy's poem "Waiting for the Barbarians" as "the poem of the presidential election season" (in Jason's words), I give you an excerpt from one of W.H. Auden's greatest longer poems, "The Age of Anxiety" (1947).

The whole thing deserves a careful read, especially if the title seems current to you. You should also read it in the original so that you know who "Rosetta," "Malin," "Quant," and "Emble" are, and so that you can see Auden's original indentations for some of the lines. A fair warning: it gets somewhat bleak, but it's the good kind of bleakness.

* * *

ROSETTA spoke first:
Numbers and nightmares have news value.

Then MALIN:
A crime has occurred, accusing all.

Then QUANT:
The world needs a wash and a week's rest.

To which EMBLE said:
Better this than barbarian misrule.
History tells more often than not
Of wickedness with will, wisdom but
An interjection without a verb,
And the godless growing like green cedars
On righteous ruins. The reticent earth,
Exposed by the spade, speaks its warning
With successive layers of sacked temples
And dead civilians. They dwelt at ease
In their sown centres, sunny their minds,
Fine their features; their flesh was carried
On beautiful bones; they bore themselves
Lightly through life; they loved their children
And entertained with all their senses
A world of detail. Wave and pebble,
Boar and butterfly, birch and carp, they
Painted as persons, portraits that seem
Neighbours with names; one knows from them what
A leaf must feel. By lakes at twilight
They sang of swans and separations,
Mild, unmilitant, as the moon rose
And reeds rustled; ritual appointed
Tastes and textures; their touch preferred the
Spectrum of scents to Spartan morals,
Art to action. But, unexpected, as
Bells babbled in a blossoming month,
Near-sighted scholars on canal paths
Defined their terms, and fans made public
The hopes of young hearts, out of the north, from
Black tundras, from basalt and lichen,
Peripheral people, rancid ones
Stocky on horses, stomachs in need of
Game and grazing, by grass corridors
Coursed down on their concatenation
Of smiling cities. Swords and arrows
Accosted their calm; their climate knew
Fire and fear; they fell, they bled, not an
Eye was left open; all disappeared:
Utter oblivion they had after that.

MALIN said:
But the new barbarian is no uncouth
Desert-dweller; he does not emerge
From fir forests; factories bred him;
Corporate companies, college towns
Mothered his mind, and many journals
Backed his beliefs. He was born here. The
Bravura of revolvers in vogue now
And the cult of death are quite at home
Inside the city.

QUANT said:
The soldiers' fear
And the shots will cease in a short while,
More ruined regions surrender to less,
Prominent persons be put to death
For mass-murder, and what moves us now,
The defence of friends against foes' hate,
Be over for ever. Then, after that,
What shall we will? Why shall we practise
Vice or virtue when victory comes?
The celebrations are suddenly hushed,
The coarse crowds uncomfortably still,
For, arm-in-arm now, behind the festooned
Conqueror's car there come his heirs, the
Public hangman, the private wastrel.

ROSETTA said:
Lies and lethargies police the world
In its periods of peace. What pain taught
Is soon forgotten; we celebrate
What ought to happen as if it were done,
Are blinded by our boasts. Then back they come,
The fears that we fear. We fall asleep
Only to meet the idiot children of
Our revels and wrongs; farouche they appear,
Reluctant look-behinds, loitering through
The mooing gate, menacing or smiling,
Nocturnal trivia, torts and dramas,
Wrecks, arrivals, rose-bushes, armies,
Leopards and laughs, alarming growths of
Moulds and monsters on memories stuffed
With dead men's doodles, dossiers written
In lost lingos, too long an account
To take out in trade, no time either,
Since we wake up. We are warm, our active
Universe is young; yet we shiver:
For athwart our thinking the threat looms,
Huge and awful as the hump of Saturn
Over modest Mimas, of more deaths
And worse wars, a winter of distaste
To last a lifetime. Our lips are dry, our
Knees numb; the enormous disappointment
With a smiling sigh softly flings her
Indolent apron over our lives
And sits down on our day. Damning us,
On our present purpose the past weighs
Heavy as the alps, for the absent are never
Mislaid or lost: as lawyers define
The grammar of our grief, their ghosts rise,
Hanged or headless, hosts who disputed
With good governors, their guilty flesh
Racked and raving but unreconciled,
The punished people to pass sentence
On the jolly and just; and joining these
Come worse warlocks, the wailing infants
Who know now they will never be born,
Refused a future. Our failings give
Their resentment seizin; our Zion is
A doomed Sodom dancing its heart out
To treacly tunes, a tired Gomorrah
Infatuated with her former self
Whose dear dreams though they dominate still
Are formal facts which refresh no more.

They fell silent and immediately became conscious again of the radio, now blandly inexorably bringing to all John Doakes and G.I. Joes tidings of great joy and saying

Definitely different. Has that democratic
Extra elegance. Easy to clean.
Will gladden grand-dad and your girl friend.
Lasts a lifetime. Leaves no odour.
American made. A modern product
Of nerve and know-how with a new thrill.
Patriotic to own. Is on its way
In a patent package. Pays to investigate.
Serves through science. Has something added
By skilled Scotchmen. Exclusively used
By upper classmen and Uncle Sam.
Tops in tests by teenagers.
Just ask for it always.


Thursday, October 28, 2004

 

Sobering numbers

... from the Program on International Policy Attitudes.
Even after the final report of Charles Duelfer to Congress saying that Iraq did not have a significant WMD program, 72% of Bush supporters continue to believe that Iraq had actual WMD (47%) or a major program for developing them (25%). Fifty-six percent assume that most experts believe Iraq had actual WMD and 57% also assume, incorrectly, that Duelfer concluded Iraq had at least a major WMD program. Kerry supporters hold opposite beliefs on all these points.

Similarly, 75% of Bush supporters continue to believe that Iraq was providing substantial support to al Qaeda, and 63% believe that clear evidence of this support has been found. Sixty percent of Bush supporters assume that this is also the conclusion of most experts, and 55% assume, incorrectly, that this was the conclusion of the 9/11 Commission. Here again, large majorities of Kerry supporters have exactly opposite perceptions.

These are some of the findings of a new study of the differing perceptions of Bush and Kerry supporters, conducted by the Program on International Policy Attitudes and Knowledge Networks, based on polls conducted in September and October. ...

Similarly, 57% of Bush supporters assume that the majority of people in the world would favor Bush's reelection; 33% assumed that views are evenly divided and only 9% assumed that Kerry would be preferred. A recent poll by GlobeScan and PIPA of 35 of the major countries around the world found that in 30, a majority or plurality favored Kerry, while in just 3 Bush was favored. On average, Kerry was preferred more than two to one.

Bush supporters also have numerous misperceptions about Bush's international policy positions. Majorities incorrectly assume that Bush supports multilateral approaches to various international issues--the Comprehensive Test Ban Treaty (69%), the treaty banning land mines (72%)--and for addressing the problem of global warming: 51% incorrectly assume he favors US participation in the Kyoto treaty. After he denounced the International Criminal Court in the debates, the perception that he favored it dropped from 66%, but still 53% continue to believe that he favors it. An overwhelming 74% incorrectly assumes that he favors including labor and environmental standards in trade agreements. In all these cases, majorities of Bush supporters favor the positions they impute to Bush. Kerry supporters are much more accurate in their perceptions of his positions on these issues.
 

Dead historians

Apparently, Francis Parkman is alive and well, but understandably cranky. AJ at No Great Matter has an exclusive interview.

 

Family values

"Christianity involves us in an inconveniently large family connextion." James Russell Lowell (1849)

Christianity is not a religion of family values. This fact, I know, remains contrary to popular opinion. Millions of American Christians today exhort one another to focus on the family, even though the earliest Christians primarily seem to have remembered Jesus telling them to love one another and to focus on God.

What the early Christians did remember Jesus saying about families would seem, at first glance, to give slight encouragement to "family values." When an anonymous character in Luke's gospel says, "I will follow you, Lord; but let me first say farewell to those at my home," Jesus literally rebukes him for, well, focusing on his family. There are other inconvenient passages. Matthew suggests that Jesus was not unequivocally in favor of stable families. (His own family had some instability, after all.) And if these are not evidence enough that Jesus might have looked askance at "family values," the clincher is the story, attested in all three synoptics, of his family reunion gone awry. (Yes, Jesus had family reunions, having been tested in every respect as we are.) Each text paints the same scene: Jesus is surrounded by a crowd of disciples. His family tries to get in to see him. Informed of this, Jesus points to the disciples around him and says, "Here are my mother and my brothers!"

And here's the trouble with that. If we view that story in the context of other stories about Jesus, then many of those disciples whom Jesus called his mother and brothers were not exactly the virgin Mary. Even the Marcus Borgs of the world -- those who believe that we can know very little about the historical figure of Jesus -- tend to agree that he associated with marginal people. You know, prostitutes and the like. People who were not exactly poster children for the two-parent household.

In short, according to his earliest acolytes, Jesus redefined "family" itself. "Family," in his vocabulary, no longer referred to biological, far less traditional, kinship ties. His mothers and fathers, brothers and sisters, were those who gathered around him. And rather than enjoining his followers to love only each other, he encouraged them to brazenly widen the family circle even more than that. They were to love not only family, but neighbors, and not only neighbors, but enemies. Someone has said that Matthew 5:43-45 is the most admired and least practiced passage in all of the gospels: "You have heard that it was said, 'You shall love your neighbor and hate your enemy.' But I say to you, Love your enemies and pray for those who persecute you, so that you may be children of your Father in heaven." See what he did there with "children" and "father"? He took "family value" words and then exploded their meaning -- your neighbors are your enemies, the exemplary Israelites are Romans and Samaritans, your abba is God. And God's family is a non-family.

Yet American Christians routinely rally under the banner of the "traditional family." It's no wonder that the unconventional fellowship practices of Jesus -- drinking with sinners, eating with tax collectors, speaking to prostitutes, and so on -- are valued less and less. That's exactly the danger of equating Christianity with "the family," or with any other false cognate like "civilization" or "nation." For each of those words substitutes a narrower definition of "we" for the community that the earliest Christians envisioned. How many Christians since, in the name of "family values," have cast stones at "home wreckers," or used the idea of family as an excuse for disowning one's own sons and daughters? That's what happens when "family" comes before "Christ" -- Christians start to look less like Christ, and more like the fallen families to which Christ's "family" represents a radical alternative.

I have been deliberately provocative in suggesting that Jesus was against "family values." Now it is time for some qualification. I admit that Jesus, according to the gospels, approved many of the things that some would classify today as "family values." But if so, then all that should matter to a Christian is showing that Jesus approved them. Why must Christians appeal to "family" for their ethics? Should Christians believe that Jesus's teachings need some additional stamp of approval? That Christian values need to be validated by "family research"? To subsume Christianity under "family values" is to get the whole religion wrong.

Perhaps this seems to you like an issue of semantics. Perhaps. But you should know that I believe spirituality is often about semantics. Further, I do not find this to be a mark against the power of either. It matters whether people define themselves as "Christians" or as "pro-family." Call me a Romantic, if you will, like the German philosopher Novalis:
To signify through sounds and tones is a remarkable abstraction. With three letters I signify God, and with a few strokes a million things. How easy is the manipulation of the universe, how vivid the concentration of the spiritual world! ... A word of command moves armies, the word 'freedom' nations. [1798]
Words are powerful, and especially the word "family." It takes only a few strokes more than "God," after all, and over time it can come to take a God-like shape. I suspect that's why Jesus was adamant about not confusing his "family" with one's natural or traditional kin. Such confusion has allowed the word "family" to move nations, and even to send armies to war. Many Christians thus accept "collateral damage" abroad as the price we must pay to keep "our children" safe. We strangely rejoice that we can fight terrorists "where they live" so that we will not have to fight them "here," ignoring the fact that this just makes "their children" unsafe instead. American Christianity has almost reverted to the very thing it should consider pagan: instead of worshipping "family gods," families have become our gods. It is almost as if for hearth and home, for love of kin and country, you can set aside those pesky commands to love your enemies as yourself.

"Family values" consequently justify anti-Christian practices -- things like refusing to eat with the alienated, or resisting evil with evil. "Family" and "nation" are conflated in order to excuse militarism abroad. And when necessary, "family" and "nation" can be separated again for the sake of conservative policies at home. Why not privatize social security? As long as you can provide for your family, why value the families of others? And why worry too much about the fact that millions of "their children" are without health insurance? Putting "family values" first inevitably reduces the sphere of personal responsibility to smaller and smaller circles. By contrast, the Christian's family is supposed to be inconveniently large.

To love this larger family, does the Christian therefore have to desert smaller families? Not necessarily. If we're still referring to stories about Jesus as a touchstone for these matters, then remember that, according to John, he was able to consider the welfare of his mother, even while in the act of dying. If you start from Jesus' principle of love for neighbor and enemy alike, you have no reason not to love your natural family dearly too. In fact, it helps to start from Jesus' principle when you eventually discover -- as many people unfortunately do -- that you are sleeping in the same house with your enemy. But if you start from the principle that nuclear families are somehow sacred, if you focus on the family, then it is easy to lose sight of Jesus' principle of equally loving those who hate you. That's why the danger for Christians is not supposed to be in loving family. Rather, as Jesus put it, the danger is in loving family more than him.

Wednesday, October 27, 2004

 

Secret ballots

When in the second half of the [nineteenth] century peasants began to secure the vote -- first in France in 1848, in Germany in 1871, in Britain in 1884, in Spain in 1890, in the Habsburg Monarchy in 1907, in Italy in 1912 -- landlords in local and national politics expected support at the polls. Where necessary, it was enforced by bribery or coercion: the ballot was public in Austria and Prussia, for example, and not effectively secret in Britain or France. Peasants were often mustered by priest or bailiff to vote in a body. One German landlord distributed completed ballot papers to his peasants in sealed envelopes. A curious voter started to open his to see how he was voting, and received a smack on the head from the outraged bailiff: "It's a secret ballot, you swine!"
From Robert Tombs, "Politics," in The Nineteenth Century, p. 19.

P.S. At least the peasants had envelopes to open.

Tuesday, October 26, 2004

 

Essays, pieces, and posts

While watching the World Series tonight, I've been revisiting Perry Miller. For all of Miller's antiquated emphasis on "the uniqueness of the American experience," he can make a lot of sense, especially while watching this World Series. With all the brooding talk of "curses" and the apocalyptic surrounding the Red Sox (that blood spot on Curt Schilling's stocking looked a lot like the moon, if you ask me), not to mention the national anxiety and accompanying jeremiads swirling around the elections, the Puritans do not seem all that distant these days.

Miller is playful in the preface to his Errand into the Wilderness, in which he describes having an almost mystical "calling" to intellectual history while sitting on the banks of the Congo. There's a lot going on in that story (see Amy Kaplan for a must-read piece on Miller's epiphany), but what caught my attention tonight were Miller's mischievous jabs at the well-known publishing trick that Errand represents -- take a bunch of journal articles by a distinguished scholar, and then release them as an anthology with introductory comments by the author.
If for twenty-five years a man writes out of steady application to a single theme -- if, that is, the theme itself be sufficiently spacious -- he discovers that he has wrought out a consistency he could hardly have formulated at the beginning. Not that I have avoided publishing articles, and many sentences, which I wish I had not. I have failed myself much more often than have the scholars I emulate. A few of my more egregious lapses I have silently expunged for this edition. However, certain of my gaffes are so much a part of the record that, assuming the record be worth preserving, I let them stand, with prefatory warnings that readers may fully profit by my mistakes.

Omitting, for reasons both of space and policy, works of which I am downright ashamed, along with others that I recast into chapters for either volume of The New England Mind (The Seventeenth Century, 1939, 1953; From Colony to Province, 1952), I here put together those that seem to add up to a rank of spotlights on the massive narrative of the movement of European culture into the vacant wilderness of America.
At the conclusion of this second paragraph, an asterisk directs a reader to a footnote that wryly begins: "American scholarship is prone to idolize the footnote."* The last paragraph of the preface continues this pattern of ironic self-reflection:
There is a disposition among modern publishers, which extends even to university presses, to shy away from the word "essay." To call a collection like this a volume of essays is to curse it with the remembered pomposity of Emerson, the ponderousness of Macaulay. In the world of journalism, the approved noun is "piece." A piece is confessedly a mere exercise, not pretending to pronounce upon the universe. Yet in this usage there is a double implication: while a piece is unpretentious, it secretly prides itself on being workmanlike. And it meets a deadline. Though I have generally manufactured these studies for some sort of deadline, I still enjoy the luxury of revision. Even so, they are not transformed into essays. Wherefore, I am content to offer them, employing a few editorial comments to plead for their general coherence, as a compilation of pieces.
Miller settles for the word "piece" instead of "essay" to describe his errands from European culture into the "vacant wilderness" of America. But he breaks his own rule in the introduction for Chapter 7, because deep down he much prefers the dynamism of errands to the static solidity of "pieces." "This essay -- let me for once call a piece by that name, using it here in the original sense of an endeavor or an exertion that does not quite reach its goal -- has been unhappily construed by many readers ..."

Sometimes I share Miller's frustration that the genre of the "essay" has so largely disappeared from academe. Much could be gained if scholars, drawing on accumulated moments of instruction and reflection, could feel free to venture forth without the fear of loss. Let me venture, with no scientific proof, that academics rarely refer to their shorter works as "essays" any longer. While passing each other in the hallway, colleagues are more likely to refer, alas, to this or that "piece." They are even more likely to refer to an "article," which like "piece" is a reifying noun. Both names make scholarship sound like an article/piece of clothing, rather than the nervous but exhilarating process of dressing for a safari.

Like "essay" and "piece," "post" is both a noun and a verb. One of the reasons I began posting on this blog was because I wanted to "essay" -- to have a forum for publishing half-baked ideas, wild speculations, and meandering meditations, all of which are discouraged in "pieces." Yet I find that, in a very short amount of time, I have allowed my blogging habits to conform to the logic of the "piece." I have created for myself artificial deadlines, and veered close to thinking of blogging as workmanlike. You would not be aware of this, of course, because thinking that way actually makes me less likely to blog. It has created a bottleneck of "posts" that I do not write because they do not seem fully developed. I thus remind myself here that a blog "post" is a developing, a venturing forth, an errand into the wilderness.

Maybe the words "posted by" are part of the reason why I psych myself out of blogging, just as the word "piece" was problematic for Miller. I resolve, therefore, to rename the "pieces" on this blog "improvisations." Let me call "posts" by that name, even at the risk of their being misconstrued.

* Made you look, idolater!

 

Who knew?

Marsilius of Inghen (c.1330-96), Dutch philosopher and theologian. Born near Nijmegen, Marsilius studied under Buridan, taught at Paris for thirty years, then, in 1383, moved to the newly founded University of Heidelberg, where he and Albert of Saxony established nominalism in Germany. In logic, he produced an Ockhamist revision of the Tractatus of Peter of Spain, often published as Textus dialectices in early sixteenth-century Germany, and a commentary on Aristotle's Prior Analytics. He developed Buridan's theory of impetus in his own way, accepted Bradwardine's account of the proportions of velocities, and adopted Nicholas of Oresme's doctrine of intension and remission of forms, applying the new physics in his commentaries on Aristotle's physical works. In theology he followed Ockham's skeptical emphasis on faith, allowing that one might prove the existence of God along Scotistic lines, but insisting that, since natural philosophy could not accommodate the creation of the universe ex nihilo, God's omnipotence was known only through faith.
This entry, from my copy of The Cambridge Dictionary of Philosophy, is signed "J.Lo." Who knew? (Apologies to John Longeway.)

Wednesday, October 20, 2004

 

History wiki

For reasons similar to the ones Jason Kuznicki gives, things have been slow at Mode for Caleb lately. I'm hoping to rectify the situation soon, but I've been walking through the valley of the shadow of deadlines, fearing all kinds of evil. Blogging has thus seemed like more of a rod than a comforting staff.

Speaking of Jason's blog, though, I highly recommend Positive Liberty if you are not reading it already. Jason is a friend and fellow graduate student here at Hopkins, and although I disagree from time to time with what I read at PL, it has always been an agreeable disagreement. Jason values pluralism, and for that reason he takes differences of opinion seriously and respectfully, which is more than you can say for much of the blogosphere, or indeed for the public sphere in general.

Jason recently posted some interesting thoughts on fact-checking among those two rare birds: bloggers and historians. He points out that while famous blogs are scrutinized carefully for errors within minutes of being published, it often takes years to check the facts in a historical monograph. Sometimes historians' facts are never checked, because the evidence for them is buried in archives that only a few have the ability to visit.

Moreover, fact-checking in history is constrained by institutional and practical realities. Many original historical claims are "the product of an afternoon's exhausted work, by a scared 20-something, in a foreign city, in a language he does not understand, in an ocean of lost cultural signifiers." That work is checked by a dissertation committee that has to take the word of the "scared 20-something." And the result is that many errors--large and small--are allowed to stand, even in the work of established and brilliant scholars, for many moons.

Jason believes an ideal historiographical "wiki" might help:
Could historians learn from bloggers? You bet. It's possible to imagine history conducted along radically different lines, yet still doing the same work as today--or possibly better.

Imagine, for instance, a historians' wiki, modeled on Wikipedia, that would include within it the full texts of major monographs in history. Each passage could be noted and commented by anyone who wished--or, if you want to keep the authority of the academy sufficiently strong, only let the advanced graduate students and higher do it.

But either way, we could all be asked to vote on the veracity of different assertions, to check off whether we personally had seen the evidence on which the claims were made, and to state as specifically as possible where the errors were to be found. No longer would a historian go into the archives, get something wrong, and let it stand for twenty to thirty years. In the wiki future, historians would be rewarded--this part is crucial--on the basis of their fact-checking, not merely on how many articles they manage to turn out in a given time.

Annotation would grow on annotation; digressions and duplications would no doubt be common. But a wiki-based approach to history would break the authority of the published text in precisely the way that bloggers have done so convincingly for current events.

It would be an enormous task, of course. But I suspect it would prevent a lot of errors from cropping up in history to begin with and would mercilessly prune out the ones that are already there.
Most historians, I think, share Jason's frustration with the difficulty in pruning errors from historical scholarship. That frustration is in part a kind of sublimated self-loathing, since I'm sure most historians remember being "scared 20-somethings." That memory, whether of long ago or of yesterday, makes us all worry about being prone to error. If you read Jason's full post, you'll see that his dismay over recently finding serious errors in a senior historian's work almost made him disserticidal. "This individual is among the most respected in my field. What errors will one day be found in my own work? What errors have I already made?" (Step away from the edge, Jason ...)

The idea of a huge historians' "wiki" resurrects the old question of whether history is primarily an art or a science, a dilemma that vexed many of the nineteenth-century forerunners to professional historians. The will-to-science in history always makes small errors seem large; the scientist in every historian wants to get every last detail right, wants to make sure that his fact-finding and fact-checking tools are faultless and finely calibrated.

But the will-to-artistry in history is less affected by the occasional error, because the historian as artist realizes that if the devil is in the details, the divine muse is in the big picture. In his 1828 essay on "History," Lord Macaulay pointed out that "a history in which every particular incident may be true may on the whole be false." Even if we could check all the facts, and correct all the errors, of historical scholarship, we would not thereby have made good works of history.

If we did have a history "wiki," it might well serve the scientific impulse in historiography -- to get the facts right, to get back to the archives, to make no mistake about it. But it might do so at the cost of historical craftsmanship. Jason points out that his ideal wiki would likely have all sorts of digressions and annotations upon annotations. But as such debates and detours multiplied, they would take on an importance out of proportion to their actual worth. As discussions on "facts" proliferated, they might distort the larger story that a historian has to tell. And thus, for the sake of abolishing errors of fact, the "wiki" might lead to errors of emphasis and interpretive weight.

This is, I would argue, one of the general disadvantages of "wiki" models of information sharing. By allowing anyone to add anything, a sense of balance, of narrative, of personal artistry is inevitably lost. Small matters, simply by virtue of their inclusion in a "wiki" entry, take on significance. In the Wikipedia entry on William Lloyd Garrison, for instance, one of the first pieces of information is the pseudonym he used as a young editor. I do not know any scholar of abolitionism who would deem this significant enough to appear at the top of the article. Of course, the "wiki" allows me to "correct" this, but it's not a simple matter of fixing an error. It's a matter of how I would paint the picture, the perspectives and palettes I would use. It's a question of art, not science, and while "wikis" are conducive to the scientific checking of information, they are not as conducive to artistry.

When scholars feel the urge to zoom in on the details -- as Jason, I, and many other historians no doubt do from time to time -- it might be good to consider Macaulay's cautionary advice:
Diversity, it is said, implies error: truth is one, and admits of no degrees. We answer, that this principle holds good only in abstract reasonings. When we talk of the truth of imitation in the fine arts, we mean an imperfect and a graduated truth. No picture is exactly like the original; nor is a picture good in proportion as it is like the original. When Sir Thomas Lawrence paints a handsome peeress, he does not contemplate her through a powerful microscope, and transfer to the canvas the pores of the skin, the blood-vessels of the eye, and all the other beauties which Gulliver discovered in the Brobdignaggian maids of honour. If he were to do this, the effect would not merely be unpleasant, but, unless the scale of the picture were proportionably enlarged, would be absolutely false. And, after all, a microscope of greater power than that which he had employed would convict him of innumerable omissions. The same may be said of history. Perfectly and absolutely true it cannot be: for, to be perfectly and absolutely true, it ought to record all the slightest particulars of the slightest transactions -- all the things done and all the words uttered during the time of which it treats. The omission of any circumstance, however insignificant, would be a defect. If history were written thus, the Bodleian library would not contain the occurrences of a week.
Part of me likes Jason's idea of a great big historians' "wiki," because it would usefully criticize the authority of published books in the way that blogging is starting to keep the mainstream media honest. But authority and honesty, while certainly important, are not the only things we look for in good history books. We also turn to them for context, for narrative, for symmetry, for color. Those are the things a "wiki" might militate against.

Indeed, if historians can learn from bloggers about checking facts, bloggers might also take some lessons from historians and mainstream journalists about crafting "fiction," by which I do not just mean "made-up stories." In that case, the blogosphere might be treated to more portraits of handsome peeresses, and fewer close-ups of pimples and pores.

Tuesday, October 19, 2004

 

Autumn leaves

I. "As is the generation of leaves, so is that of humanity. The wind scatters the leaves on the ground, but the live timber burgeons with leaves again in the season of spring returning. So one generation of men will grow while another dies." Homer, The Iliad.

II. Pay attention.

III. "Autumn Leaves" immortalized.

Friday, October 15, 2004

 

Out of town

I am currently in San Antonio, my hometown, with irregular access to the Internet. Move along, folks, there's nothing to see here. At least until Monday.

Wednesday, October 13, 2004

 

Dissertation glaucoma

Human beings are stuck, for better or for worse, with binocular vision. Our field of view encompasses only the 180 degrees in front of us. Plus, our peripheral vision is relatively weak: out of the corner of your eye, you can barely see colors or sharply distinguish shapes. This situation could be worse. We could have, like many mammals, monocular vision, with one eye on either side of our heads. We could have, like the Cyclops of legend, only one eye squarely set between our temples, which would provide our greedy brains with half as much visual data. But our field of view could also be better. We could have, like many mothers, eyes in the back of our heads. Even better, we could have, like many birds, 360 degrees of vision. Then there would be no such thing as peripheral vision. Where is the periphery on a sphere?

I was thinking about these things not because I am pondering a career in optometry, but because I am pursuing a career as an historian. And writing a dissertation in history makes me acutely aware of how limited human vision is.

From the moment that I began writing my dissertation, as the central themes of the work began to take shape, I also began to notice a large but blurry mass of indistinct ideas, just visible out of the corner of my eye. As I began with my archival research, and familiarized myself with the secondary literature, this blurry mass grew larger still. Now, huge bodies of literature stretch out indefinitely on the peripheries of my field of view. I know they are there, just as I know there are books on the coffee table to the left of where I am sitting right now, even though I cannot read the titles on the spines or identify their colors. Important questions -- about complex social categories like race, class, and gender -- remain in my peripheral vision. They resolve into focus only when I turn to pay them close attention. I know that all of these questions, all of these literatures, are worthy of my full attention. But the Latin roots of the word "attention" tell a tale: the word stems from attendere, literally "to stretch." And human vision can stretch only so far to the right or to the left, without inducing strain.

Speaking of a stretch, you might think the metaphor I am drawing is one. (This is the third entry in what is becoming a series of posts that discuss dissertation writing with extended metaphors, some more extended than others.) But consider how often we speak about writing and thinking by using visual language. Writers promise, "I will show," or they assert, "We can see," or they remind, "As we have seen." Opinions are "points of view"; ideas "appear" differently from different "perspectives." (Out of the corner of my eye, I've been watching the presidential debate on PBS. I just heard Mark Shields and David Brooks talking about how "visionary" the candidates were. Or were not.) Even the way we "see" thought is shaped by the way we see. See what I mean? (In a mailing I recently received from the University of California Press, this book caught my eye, but I have not read it.)

All thinking, like all seeing, becomes blurry at the peripheries. Writing history requires the historian to focus the reader's eye. The past is an almost unfathomably brilliant kaleidoscope of shiny things. To take it all in would require the kind of rapid eye movement available only in the world of dreams. So I remind myself of this when that cloud of blurry ideas starts to bother me. How will I move that historiography into the center of my frame? When will that huge question about class identities come into sharper focus? It is when those questions seem urgent that I remind myself (yes, writers and historians are inward-looking enough to need reminding) that my lines of sight are limited. You cannot see it all at once.

When eye doctors evaluate vision, they do not raise their eyebrows if patients see what is in front of them better than they see peripherally. They do not ask you to read the eye chart on the wall to your left, without turning your head. If your peripheral vision is weaker than your frontal vision, that is not abnormal. Your vision is not defective. At least, a standard degree of peripheral weakness is normal. But there is such a thing as glaucoma. Some blurriness in your peripheral vision is normal; peripheral blindness is not. And in the worst cases of glaucoma, blind spots that appear first in the corners of your eyes converge gradually on the center, until total blindness results. A good ophthalmologist has to be able to tell the difference between the normal limitations of the human eye and the abnormality of diseases like glaucoma.

So too does the historian have to distinguish between normal and abnormal peripheral vision. If the blurry mass of ideas in the corner of my mind's eye becomes too big, and gradually shrinks my field of view, then I have a problem. The key is to remain aware of what and how much is there, to be able to focus on those things when they impinge on your central frame, to make sure that what is out of sight is never out of mind. This is a long (but hopefully not "obscure," another visual thinking word) way of showing you that having a long list of things you still have to get to in your dissertation does not mean you are blind. But you do have to be on guard for signs of glaucoma. From time to time, you'll have to turn your head from side to side. The wrong way to deal with the problem of peripheral vision is to put on blinders. The right way is to accept your natural limitations, but patiently work around them. It might help, too, to remember that in history, as in art, "perfect" vision might not always be to your advantage, since perfect human vision, after all, is in some ways less than perfect.

Tuesday, October 12, 2004

 

Sage advice

Blogging has been light over the last few days, and probably will be for a few days more. The reasons: a lengthy to-do list, plus an upcoming out-of-town trip this weekend. I am flattered and honored that people seem to keep stopping by this blog. It pains me to think of your pointing your browser here, only to find the same pontifications you've already read (or already decided not to read), still sitting at the top of the page.

May I take this opportunity to recommend Sage? On the advice of Jason, who warned me that blogging would become addictive, I've been using this Firefox extension to read RSS/Atom feeds from many of the blogs I frequent. It helpfully tells me when blogs have been updated, so I do not feel the let-down of heading over to regular reads like Hoarded Ordinaries, or The Parish, or Sharp Sand, or Paul Musgrave, and finding nothing new. Of course, Sage will only work for you if you already have Firefox. But if you do not already have Firefox, why not?

Saturday, October 09, 2004

 

Confessions of a coffee drinker

In the spirit of The Weblog's Friday Afternoon Confessional, I have a confession to make. (I started this post on Friday afternoon, and the confessional mood has carried over to today.) I am a regular Starbucks patron. And I have been for some time.

I remember when the first Starbucks I ever knew opened, next door to a new Barnes and Noble (also the first in my experience) at the intersection of Interstate 10 and De Zavala Road in San Antonio. Since I lived just down the street, and since books, music and coffee are a large part of my joie de vivre, I became a frequent customer. In high school, my biggest complaint about Starbucks was the feeling of indignation I got when the employees started locking the door and mopping up before the posted closing time, making it impossible for me to get a late-night mocha fix before studying Calculus into the wee hours. I don't remember for sure, but it's quite possible that I even wrote a letter to corporate HQ. It might have been filled with vehement outrage about my right to buy expensive coffee right up to the 59th second of the 59th minute of the ten o'clock hour. But again, I don't remember for sure. In fact, maybe it was a friend of mine.

College, of course, has a way of complicating and redirecting high-school indignation. I acquired, for one thing, a healthy skepticism about corporate transparency. Perhaps that skepticism is one reason why I feel the need to label this post as a confession. I learned in college that Starbucks has become a favorite flogging horse for the radical Left, which has demonized the company as a purveyor of cultural and economic imperialism.

Many contemporary activists make the ubiquitous Green Siren sound like the Whore of Babylon--a strumpet whose songs woo impressionable coffee-drinkers towards the Scylla of suburbia or the Charybdis of consumerism. The activists can protect themselves from the Siren only by lashing themselves to the mast, like Odysseus, or by plugging their ears with wax, like his crew. If you believe anti-globalization activists are right about Starbucks' siren song, you probably find it appropriate that the Battle of Seattle, an anti-WTO demonstration that turned violent and destructive in 1999, was fought in the streets of the company's hometown.

Postmodern critics often refer to the way globalization rips cultural signifiers away from what they signify. The global marketplace becomes a bazaar of free-floating symbols--like the Coca-Cola trademark, for instance, or the Nike "swoosh"--which conceal the material conditions under which they were created. Ironically, however, anti-globalization activists often use free-floating signifiers to their advantage. Rather than mounting a careful critique of Starbucks, they also tend to detach the Green Siren from its contexts, the better to throw darts in her general direction. Any one of these corporate symbols can be used as short-hand for unfair trade and bad business ethics, without needing to point carefully to specific evidence of guilt. The Siren becomes the token for a multitude of sins, much like the White Whale that Starbucks' namesake helped pursue. To rally his crew, after all, Captain Ahab had to turn his quarry into a formless mass, a creature whose villainy grew larger with every rehearsal of its sins. In truth as in fiction, the objects of monomania are often oversimplified by those who hurl the harpoons.

What are the sins for which Starbucks and customers like myself need to confess? The answers are various, and they are hard to specify. Often, criticisms of Starbucks are primarily anecdotal. A frequent charge is that the company has driven independent, locally owned coffeehouses out of business, thus covering the urban landscape with Starbucks franchises. Counter-charges are also frequently anecdotal, but at least as compelling: after all, the market share for specialty coffee has grown since Starbucks came on the scene, arguably allowing more independent houses to thrive. The fact that there are so many Starbucks stores is not in itself evidence that more independent coffeehouses are failing. I did some quick Googling on the subject and was unable to turn up knockdown arguments for either side, though I'd be interested if anyone knows of some statistics that can be massaged in one direction or another.

As for the aesthetic blight of cookie-cutter coffeehouses, I'm inclined to say that there are worse ways to dot the nation's landscape. Moreover, for many communities, Starbucks does play the role that local coffeehouses otherwise would. When the De Zavala Starbucks opened, it was a new thing under the Central Texas sun. I'm aware that this defense is also anecdotal and sentimental, but Starbucks represents for me memories of getting coffee with my dad on Saturday mornings when I drove home from college. Might we have forged those memories at a local coffeehouse? Yes, if we had had one in the first place. Perhaps for a city like Seattle or San Francisco, with fully developed coffeehouse scenes, Starbucks has had the net effect of closing down coffee options. But in cities like San Antonio or Phoenix, with hot summers and short winters, it's hard to see how coffeehouse options would have been possible before the Starbucks phenomenon.

Nonetheless, in spite of my warm feelings for the place, I'm self-aware enough to see how my attachment to Starbucks is evidence of just how effective their branding strategies have become. McDonalds wants me to believe that they provide happy community spaces, too. Coca-Cola would like to teach the world to sing in perfect harmony. I'm aware that these claims are deliberate efforts to persuade me to buy a product. They often mask the deleterious effects of our "fast food nation" economy. But for whatever reason, I feel assured that Starbucks makes a more concerted effort to be socially responsible than many corporations in its weight-class.

This is what one progressive magazine I read a few months ago calls the Starbucks Paradox. Despite Starbucks' role as the whipping boy of anti-WTO activists, "the employees and habitués of Starbucks [seem] far more diverse by race and class than the American anti-globalization movement." And "moreover, progressives have tended to romanticize small businesses; yet many sweatshops in this country have been small, family-owned enterprises, and that didn’t benefit those who worked there. As a rule, racial minorities have fared better in larger institutions." It's hard to view the labor practices of the company as primarily regressive. Yes, they engage in the same kinds of union-busting intimidation that other corporations do, but they also start from a higher moral ground by offering unusually generous benefit packages, even for part-time workers. As the "Paradox" article argues, this does not mean Starbucks should be given carte blanche. Rather, it demonstrates that the company has signed its name to laudable principles to which they can be held.

Starbucks now issues an annual corporate social responsibility report, and while this does not make them unique, I find their report more convincing than most. While skimming through the McDonalds reports and the Starbucks reports, for instance, I had the feeling that the difference between them is much like that between the Bush and Kerry campaigns. While McDonalds has a long and disastrous record of irresponsibility on which to run, they talk mainly about what the company will be doing in the future, the initiatives it is starting to undertake with the help of its expert advisors. They "continually seek to learn," their initiatives are "ongoing," they have a "vision"--trust us, our people are looking into it. To my way of thinking, which is admittedly prejudiced, the Starbucks reports seem to offer more concrete evidence of real change.

At the same time, when I read "corporate social responsibility" reports, I cannot silence either my skepticism or my optimism. My skepticism tells me that even these reports are marketing ploys. But my optimism tells me that this is an encouraging sign that consumers do have the power to influence corporate behavior. Is the fact that corporations now feel a greater need to demonstrate their responsibility worthy of unmitigated cynicism? I know in these reports we see corporations as through a glass, darkly, but this opacity does not definitively tell in favor of either my inner skeptic or my inner optimist.

How can one possibly disaggregate all of the local causes and consequences that explain modern transnational corporations? Grant me all of my skepticism about McDonalds' good will, and I still must deal with facts like these: Every time I speak on the phone with my 27-year-old cousin, who has mental disabilities, she is buoyant and glowing about her job at McDonalds, which has an exceptional record of hiring handicapped employees. I don't need a corporate report to tell me about the effect that her employment has had on her family--it is not exaggerating to say that it reaffirms my cousin's sense of dignity and helps her family to encourage her aspirations for independence. And when my other cousin--in the same family--had open-heart surgery at the age of five in Seattle, my aunt and uncle would not have been able to afford lodging in the city without the Ronald McDonald House there. In the complicated world in which we live, whales are never simply black or white.

* * * * *

The impetus for this post was the recent announcement that Starbucks is raising its prices for coffee, which has been one of many subjects of conversation at this highly caffeinated blog. I noticed the price hike a couple of weeks ago while I was in Philadelphia, when I paid $1.50 for a tall cup of drip coffee instead of the usual $1.40. Then, this week, when I paid $1.70 for a grande coffee instead of $1.60, I mentioned to the barista that the prices had gone up. "Yep" was his simple reply. Most of the drinks are going up by an average of 11 cents.

The issue of coffee prices is my greatest source of liberal guilt as a Starbucks patron. I'm speaking, of course, of the fact that many local coffee producers in the world are still not being paid a fair price for their crops. But this is a complex problem, too. Just as it is hard to pin Starbucks down with the charge of closing local coffeehouses, it's hard to say definitively that Starbucks itself has had a negative global impact on coffee prices for growers. The demand for quality specialty coffees, which has increased in Starbucks' wake, will in the long run help raise coffee prices from previously record-low levels. And while Starbucks still does not certify most of its coffee brands as fair-trade, their interest in quality standards serves as an incentive towards longer, pre-negotiated contracts with local farmers, and encourages them to invest in more sustainable farming practices.

Does this mean that when I buy a cup of Starbucks coffee, I can rest assured that I'm not ripping off a farmer in Kenya? Absolutely not. But here is a more difficult question: would I rest easier about that farmer's prices in a pre-Starbucks world? Absolutely not. The fact that Starbucks raises its consumer prices can again be interpreted either optimistically or cynically. Cynically: They are trying to increase market share and profit margins. Optimistically: They could be trying to pass the cost of paying fairer prices on to us, the consumers, instead of simply short-shrifting the coffee grower. Though I am no economist or mathematician (there's a reason I needed a mocha fix to study for Calculus) it seems to me that when prices for coffee rise--unlike prices for oil, perhaps the only world commodity of greater importance than coffee--the local grower is more likely to benefit than not. The simple fact remains, however, that full transparency is hard to achieve here. I can strive to be an informed consumer. I can purchase fair-trade certified coffee as often as I can. But I am still enmeshed in global processes that attenuate my agency and shorten my sight. I see through a mug, darkly.

In flagellating myself for drinking coffee, I am railing against my complicity in the gigantic systems that have created so much wealth disparity in our world. But I am aware, in a way that the average Battler of Seattle might not have been, that my self-flagellation does not by itself change that disparity. I am further aware that the actions I can take to be a more responsible consumer, while to some degree effective, do not solve the problem of my complicity in inscrutable and far-reaching patterns of change. So if you've continued reading this post in the expectation that I was coming to The Answer to the dilemmas I've raised, I am sorry to disappoint you. I warned you that I was in a confessional mood, and the discipline of confession is often directly opposed to our human impulse to explain and understand. The Answer, if there is one, is this: There is no such thing as an Immaculate Consumer, or an Immaculate Corporation. The world is one in which good and bad mix together as inseparably as sugar in coffee.

When I think about the complicated issues of fair trade and consumer or corporate responsibility, the historian in me often thinks back to my particular historical subjects, the abolitionists. The crusade against Atlantic slavery was in many ways similar to the anti-globalization cause. Abolitionists in Britain knew that the sugar trade was stained by the blood of Caribbean slaves, a stain they tried to publicize and document with scads of information. Their information gathering efforts were often blocked by the well-situated and powerful sugar interests in London, who embarked on disinformation campaigns. They willfully dissembled about the horrors of panopticons and plantations, all the while issuing social responsibility reports of their own, which portrayed themselves as enlightened and paternalistic capitalists.

Faced with the knowledge that sugar was a hybrid of "sweetness and power," many abolitionists believed that boycotting the product was morally obligatory. The historian David Brion Davis reports in his magnum opus that one prominent Quaker abolitionist, William Allen, "resolved to abstain from sugar until its West Indian cultivators had been emancipated--a vow he kept for forty-three years." There is an appealing simplicity to Allen's "quest for purity," the same appealing simplicity in calls to only buy organic shade-grown coffee, or beans that have been certified as "fair trade."

I have a great respect for the kind of conviction represented by Allen's vow, especially when I reflect on how difficult it would be for me even to give up something as luxurious and non-essential as specialty coffee. But the moral absoluteness of the vow also oversimplifies; it holds out the idea that one can extricate oneself completely from oppressive power relationships, that one can make a consumer choice with full knowledge about the origins of a product. If anything, obtaining that knowledge has become even more difficult now. I have no idea whether the components that make up the computer I'm using were at some point supplied by seedy warlords using child labor. I have no way of knowing whether the wood in this desk was harvested in an environmentally responsible way. I can agitate for greater responsibility, or be agitated myself, but that agitation does not secure for me a conscience unburdened by nagging doubts.

In 1843, these issues came up at the second World's Anti-Slavery Convention, when a group of Anglo-American abolitionists met to discuss strategies for ending slavery in the American south. Some delegates viewed the problem in a way similar to William Allen's. They believed that England and Europe should create protectionist obstacles for the importation of slave-grown cotton. Some thought England could encourage the growth of cotton in its imperial holdings in India as a competitor for slave-grown American crops. But free-traders at the meeting barely concealed their disdain for these measures. One of the most famous free-trade activists of the time, Richard Cobden, pointed out how difficult it would be to avoid all contact with slavery's evils:
There is not one of our friends here from America, who did not come in a ship laden with slave-grown cotton or tobacco; and when you send the tidings of this meeting to all parts of the world, the very paper will be the produce of slave labour, for the greater part consists of cotton. Will you tell me that by isolating yourselves, and preventing inter-communication with your species, and shutting out men from the social communion because they have slaves, that that is the way to reform mankind? It was not the way in which the great Propounder of our religion went to teach mankind. He mixed with the bad and the good. Do you mix with the bad as well as the good; and your example will be more infectious than that of the bad. That is the way to reclaim the world.
The first part of Cobden's speech still rings true today. Who is to say that "the very paper" on which anti-WTO activists print flyers is not tainted by some evil somewhere along the chain of its production? On the other hand, in defense of the Seattle Battlers, I don't share Cobden's sanguine hope that free trade by itself will make good business practices "more infectious" than the bad. The world cannot so easily be reclaimed. Cobden was right, though, that the anti-free-traders were naive: to produce free-grown cotton in India was to subserve imperial and oppressive labor practices there. When it came to slave-grown cotton, you were damned if you bought it and damned if you didn't. But surely Cobden too was wrong to suggest that buying the "bad" is a surer road to salvation.

To borrow from something "the great Propounder" said, I guess we have to do the best we can in handling our "unrighteous Mammon." (See the aforementioned sermon by Tony Price.) With our limited knowledge, and given the limited transparency of global corporations, we have to try to do the right thing. But I have to confess (literally) that the question of how to make the right consumer decisions is incredibly complex, and it cannot be taken lightly. The world does mix the good and the bad, as Cobden implied, and simply being in the world means being mixed up with both. At the end of this post, I'm disappointed to find myself calling for little more than an ironic and critical distance from one's own choices. I'm concerned it might come across as a call for inaction. But cultivating critical distance, I guess, is what confession is primarily about, and it is at least a prerequisite for prudential action in the world.

"And thus, through the serene tranquillities of the tropical sea, among waves whose hand-clappings were suspended by exceeding rapture, Moby Dick moved on, still withholding from sight the full terrors of his submerged trunk, entirely hiding the wrenched hideousness of his jaw. But soon the fore part of him slowly rose from the water; for an instant his whole marbleized body formed a high arch, like Virginia's Natural Bridge, and warningly waving his bannered flukes in the air, the grand god revealed himself, sounded, and went out of sight. Hoveringly halting, and dipping on the wing, the white sea-fowls longingly lingered over the agitated pool that he left."

Wednesday, October 06, 2004

 

Dose of Kant

Since in their endeavors men proceed neither merely instinctually, like animals, nor yet according to a fixed plan, like rational citizens of the world, it appears that no systematic [planmäßig] history of man is possible (as perhaps it might be with bees or beavers). One cannot resist a certain [feeling of] indignation when one sees men's actions placed on the great stage of the world and finds that, despite some individuals' seeming wisdom, in the large everything is finally woven together from folly and childish vanity and often even childish malice and destructiveness. In the end, one does not know what concept one should have of a species so taken with its own superiority.
From Immanuel Kant's "Idea for a Universal History with a Cosmopolitan Intent" (1784), in Perpetual Peace and Other Essays

 

Global tests

"It is my express wish that in awarding the prizes no consideration be given to the nationality of the candidates, but that the most worthy shall receive the prize, whether he be Scandinavian or not." -- Alfred Nobel (1895)

Try to imagine one of the recent Nobel Prize winners in chemistry--two Israelis and one American--ever arguing thus: "As a scientist, I never gave another country a veto over my findings." It is hard to imagine, no? That's because science advances only on the premise that conclusions have to pass a "global test." In fact, if the reasons for a scientific theory pass muster only in particular countries, that's a good indication that it's bad science.

Perhaps I am naive for wishing that both of our presidential candidates were more like scientists in this regard. Theories about global security are surely even more important than theories about the subatomic structure of the universe, and in our nuclear age the two subjects are increasingly related. When our national leaders make decisions about whether to deal in death and destruction, surely they should abide by the same standards of reasoning that constrain chemists when they talk about how certain proteins receive the "kiss of death." Reasons for attacking terrorist cells, just like reasons for assertions about biological cells, should abide by the global tests of logic, truth-telling, and empirical proof. That's what John Kerry meant when he said on Thursday night that the case for any pre-emptive war has to pass a global test. He meant that it has to be backed by "legitimate reasons," reasons that seem legitimate to any fair-minded person, "whether he be Scandinavian or not," as Alfred Nobel put it.

But the Bush campaign's spin has encouraged Kerry to back away from the strength of his claim. Last night, when asked about the "global test," John Edwards basically gave the Bush line on Kerry's behalf--that he will never allow other countries to veto our national security. That response only validates Bush's spurious opposition between our national security and a "global test." In fact, these things are not opposed at all. Passing a "global test" is as essential to our security as it is to the verification of scientific theories.

Here is what Kerry should say, but what he won't say because it is deemed too politically risky. When we wage preemptive wars, we are making a decision that affects not only our national security, but the security of countries besides our own. So if we can't give legitimate reasons to the world for destabilizing its security, then our wars deserve to be vetoed, in the same way that the global scientific community ought to veto a theory that is clearly wrong.

Perhaps some see that position as a betrayal of our national interests. But there is a hypocrisy in that objection. President Bush's vision for national security has no problem "vetoing" the national interests of other countries. When he insists that no country should ever be able to veto our national security, he means that only we should have such veto power over the views of our allies. And in order to make political gains, Kerry is beginning to sound like he's saying the same thing.

My plea to Kerry is to stick with his original statements about the "global test." Senator Kerry, don't be boxed into President Bush's categories. Reassure us that you understand how our security is inextricably bound up with the security of other nations. Persuade people who don't see it already that we live in an interdependent world. Convey your belief that President Bush's go-it-alone rhetoric is antiquated and therefore dangerous. Give us a President who says: "No country--including our own--should be able to veto the considered judgment of the world."

Tuesday, October 05, 2004

 

Book review

My book review, which I mentioned earlier, came out today on H-SHEAR, a discussion list for historians of the early republic. I reviewed Stanley Harrold's new book, The Rise of Aggressive Abolitionism: Addresses to the Slaves. One of the great things about the H-NET lists is that they allow reviewers to write extended pieces, instead of short synopses, and they also allow authors to respond to reviews. You can read my review here and Professor Harrold's reply here.

 

West on Kerry

I support the neo-liberal Kerry. I wish he would find his voice. He probably needs to listen to a little John Coltrane, Sarah Vaughan or Nina Simone, and actually dig deep in his soul to find out who he is, you know what I mean?
From an interview with Cornel West. (Hat tip: Jesus Politics.)

Sunday, October 03, 2004

 

Book marks

between pages 232 and 233
in this Moviegoer, by Walker Percy,
there is a piece of paper, folded three times,
right before Binx, in stage whisper, speaks,

"There is only one thing I can do: listen to people, see how they stick themselves into the world, hand them along a ways in their dark journey and be handed along, and for good and selfish reasons. It only remains to decide whether this vocation is best pursued in a service station or--"

stuck there by some Other Reader,
perhaps the one who wrote the name
"Browning" on page 91, or "both luna"
on 193, circling "You're like me,"

it might have been the one who left
a smudge on 218, who wrote on page
22, at an angle of forty-five degrees
the word "menage"--nothing else,

who underlined (page 31) "Uncle Jules
is the only man I know whose victory
in the world is total and unqualified"
and "the City of Man is so pleasant."

"Not in a thousand years could I explain it to Uncle Jules, but it is no small thing for me to make a trip, travel hundreds of miles across the country by night to a strange place and come out where there is a different smell in the air and people have a different way of sticking themselves into the world."

the book is heavy by the end
with all the weight of long ago,
burdened by the "Date Due" labels
with an "X" on all but one,

the ponderous sticker that declares
"3 1151 02157 7782," the edges
seared, stamped, possessed, by
"Milton S. Eisenhower Library."

when the cover was replaced
with a gray and bumpy binding,
this paper, left by Someone else,
was glued by chance into the spine.

this paper, signed by a machine,
"Charles Simpson"--"CS:bbn"--
with letter-head in seriffed caps,
foresees the end of the beginning:

"By now you ought to have received your first issue of Architectural Digest. We hope you've had a chance to browse through it, to let your eye and imagination wander unhurriedly through the dazzling environments people have created for themselves. But as you enjoy the wide variety of fascinating features and articles on interiors, gardening, art, antiques, and celebrity lifestyles in upcoming issues, please don't overlook the matter of your subscription payment."

between 98 and 99, I leave my mark:
a third of a postcard, Sienese, depicting
a sibyl, with sandaled feet, and a dog who
gazes off the mark, and on the back,

"Printed In Italy," "Riproduzione Vi ...",
my scissored card might as well be
stitched there by specificity,
handed along, stuck in the world,
a clue in an Other Reader's search.

Friday, October 01, 2004

 

Globalization versus "globalization"

"Some claim that the world is gradually becoming united, that it will grow into a brotherly community as distances shrink and ideas are transmitted through the air. Alas, you must not believe that men can be united in this way." -- Fyodor Dostoevsky (1880)

[This post is a long-winded continuation of this one. For a complete list of my posts on transnational history, see here.]

Many manifestoes for transnational history begin by stating that our world is uniquely "global." Here's how the story goes: The world's landscape is now criss-crossed by information flows, digital technology, multinational corporations, international NGOs, cheap transportation, "flexible citizenship," [fill in the blank here with your favorite sign that it's a small world after all].

Factors like these, according to globalization gurus, show that the sovereignty of the nation-state is eroding, that a new "Internetional" is forming over and against the international system of Westphalian states. And, so the argument goes, now that we can see how fragile nations are, and how interdependent the world is, we should apply that understanding to our studies of the past.

This seems to be the implication of a recent anthology on "transnational history," edited by Thomas Bender. It is titled Rethinking American History in a Global Age, and the "in a Global Age" is crucial to the general thrust of the book. In the introduction, Bender writes that nation-centered narratives of American history no longer satisfy because the world is globalizing:
We are intensely aware today of the extraterritorial aspects of contemporary national life. The inherited framing of American national history does not seem to fit or connect us to these transnational and global developments. Inevitably, contemporary historiography is being inflected by a new awareness of subnational, transnational, and global political, economic, social, and cultural processes. These circumstances invite, even demand, a reconsideration of the American past from a perspective less tightly bound to perceptions of the nation as the container of American history. One can no longer believe in the nation as hermetically sealed, territorially self-contained, or internally undifferentiated. (3)
I have said before that I am a self-described transnational historian (and describing yourself as a transnational historian is the first thing that transnational historians do). So I'm preaching to myself here. But I'm telling myself to be wary of claims that globalization "demands" a transnational reconsideration of history.

My complaint is not necessarily that this demand is "presentist." I'm not lamenting that our beliefs about the world inform the historical questions that we ask. That's unavoidable, in my view. During the 1960s, historians of abolitionism were influenced by the Civil Rights movement in choosing questions to ask about abolitionists. And I can see how, writing the history of abolitionism in the twenty-first century, I've been influenced by contemporary debates on globalization, patriotism and cosmopolitanism in my choice of what to study.

But my complaint is that transnational historians often take the globalization of the present for granted, instead of interrogating the concept, instead of exposing it to skepticism. There is a difference between being aware of how the present always intrudes into the stories we tell about the past, and uncritically accepting "now" as the end-point towards which all of "then" has been moving.

Consider the interesting contribution to the Bender volume by Akira Iriye. It's useful to look closely at something Iriye says in his essay, "Internationalizing International History":
Historians need neither embrace the concept of globalization uncritically nor suppose that it is the only framework in which to understand international history. But they are uniquely equipped by training to historicize such a concept, if for no other reason than that globalization is a historical phenomenon. David Held and Anthony McGrew, two leading students of globalization, write that this term "refers to ... entrenched and enduring patterns of world-wide interconnectedness ... [and] suggests a growing magnitude or intensity of global flows such that states and societies become increasingly enmeshed in worldwide systems and networks of interaction." Words like "become" and "increasingly" are part of the historical vocabulary, and so historians are in a good position to make a contribution to the literature. (53)
This paragraph starts out exactly right. Historians should not uncritically "embrace the concept of globalization." They need to "historicize such a concept." But there are two ways to do that. One is to tell the history of Globalization with a capital "G," by showing how the world changed over time to become more global. But the other option is to tell the history of "globalization" (notice the scare quotes). A history of "globalization," unlike the history of Globalization, would try to understand why and when people think of the world as more global, and would bracket the question of whether the world actually is globalizing.

The last part of Iriye's paragraph seems to favor the history of Globalization. Why are historians "in a good position to make a contribution to the literature" on Globalization? Because we can ask, in Gershwinesque fashion, how long has this been going on? But this is not the only service that historians can provide. In fact, if we really want to "historicize" "globalization," we won't just go about testing how or why It has occurred. We'll view the phenomenon itself as a cultural construction--as a concept--and not just an objective historical fact. To put this another way, "increasingly" and "become" are part of the historian's vocabulary, but so are "global," "states," "systems," "interaction," and "interconnectedness." All of those words need to be taken apart, not just the ones that deal with change over time.

The distinction I'm drawing here--between Globalization and "globalization"--might seem like splitting hairs. I reserve the right to do that on this blog, but I don't think I'm doing it now. For one of history's tasks is to put quotes around words whose meanings we think we understand. When we speak of Globalization, we might think we have a clear idea of what constitutes this phenomenon. We might mean that in today's world, so much money and information flows across borders in the blink of an eye, that the world is more connected than ever before. But in fact, that statement is loaded with unspoken cultural assumptions--first, that the people involved in those information flows represent, in some sense, the world, and second, that those flows do more to connect than to divide. There are other lacunae in this logic as well. When we say that technologies like the Internet are shrinking the world, for instance, we pass lightly over the fact that only a minority of the world's population enjoys regular Internet access. Historians can help shine a light on some of these logical leaps by demonstrating that generations long before ours thought they were living "in a global age," even before air travel, Internet outsourcing, and the United Nations.

On the first day of the class I'm teaching this semester, I put this quote on the board: "No nation can now shut itself up from the surrounding world. ... Space is comparatively annihilated." I asked students whether they agreed; they did, and for many legitimate reasons--the world economy, the growth of international law, the nature of worldwide environmental problems, digital communications, and so on. But then, in the grand trickster tradition, I revealed that this was not a quotation from someone like Anthony Giddens, David Held or Anthony McGrew. It was said by Frederick Douglass in 1852. He was referring to transatlantic steamships, which shortened the trip from New York to Liverpool from four weeks to two: "Oceans no longer divide," Douglass said, "but link nations together. From Boston to London is now a holiday excursion. Space is comparatively annihilated."

When confronted by the fact that earlier generations thought of their world as "global," we can do one of two things. We can chuckle at the quaintness of thinking that steamships annihilate space. Or, we can learn to laugh at ourselves, too. We can look with greater skepticism at the idea that the Internet annihilates space, or by itself links nations together. Another way of putting our historiographical choices is this: We could suggest that steamships were the beginning of Globalization, or we could say that they were the beginning of "globalization." I'm not opposed absolutely to the first disjunct in each of these disjunctions, but I think the second disjunct has been neglected in manifestoes for transnational history.

The search for origins in history often means going as far back in time as your expertise runs, and then surmising that everything before that time was completely different. My advisor is rightly annoyed whenever he reads in my drafts some variation of, "And nothing was ever the same again ..." Perhaps we can find the origins of Globalization at some point in the past, but whenever we locate that point, we are likely to find that people before then thought their age was global too.

Many globalization theorists, including Giddens, talk a lot about how the electric telegraph, which spanned the Atlantic in the 1860s, was the beginning of modern global connectivity. But before the telegraph, people thought steamships annihilated space. And before steamships, people thought clippers annihilated space. And before clippers, people thought ships with sails made the world smaller, and before that ships with galley oars, and before that ... By emphasizing how contingent "global" world views are, historians fully historicize the past and help us put the present in better perspective.

To conclude, there are two possible contributions transnational historians might make to debates over Globalization. Neither has to be the One True Way to write transnational history, but they should be distinguished, and they ought to correct each other. On the one hand, we can write the history of Globalization, though that concedes at the outset that It has happened. On the other hand, in addition to investigating how the world is "increasingly" "becoming" global, we should also write transnational histories that put quotes around "global."

As I see it, the latter is one of the major tasks of transnational historians. We should not simply go and fetch evidence that nations are on their way out, and have been for some time. Our contribution is not just to say that Globalization theorists are right, and that we'll tell you where it all began. Our contribution is also to historicize the concept of "globalization" itself, and to show how Globalization theorists themselves have come about.

In sum, if part of our task as transnational historians is to show that "nations" are constructed, contingent, and imagined communities, the other half of our task is to show that the "globe" is constructed, contingent, and imagined, too.
