Thursday, May 26, 2011

On Ms. Judith McHale, Written Two Years Ago

Can America change hearts and minds? Obama may be popular abroad, but it won't be so easy for his new public diplomacy secretary to improve America's image - John Brown, The Guardian (April 22, 2009)

Few US government activities have been more maligned in recent years than public diplomacy, defined by the US state department as "engaging, informing and influencing key international audiences". Dozens of reports, from all sides of the political fence, have argued that the US had failed to make its case overseas.

Months after the president's inauguration, the Obama administration has finally selected an under-secretary of state for public diplomacy and public affairs: Judith McHale, a media and communications executive – close to the Clintons for years – who is little known to the general public. In the 2008 campaign cycle, the Washington Post reported, she contributed $109,600 to Democratic politicians and campaign committees.

McHale, the daughter of a US foreign service officer and raised in England and South Africa, faces many challenges in her new job (if she is confirmed by the Senate). First among these is the negative effect of the "Hughes legacy". Karen Hughes, a media-savvy George Bush confidante who ran the state department's public diplomacy from 2005 until her resignation in late 2007, was criticised from all sides of the political spectrum for her ignorance of foreign affairs and maladroitness in dealing with the Muslim world.

Despite her "new initiatives", Hughes came to epitomise the previous administration's failure to improve America's overseas image. With a background not dissimilar to Hughes's – including having political connections rather than diplomatic expertise – McHale will have to convince sceptics the world over that she is not a Democratic clone of Hurricane Karen.

Second, McHale – again, like Hughes before her selection – has no previous experience working within the state department bureaucracy. And yet she'll be dealing with an organisation, considered by some to be dysfunctional, that has its own, often arcane, way of doing things.

With the consolidation of the agency that handled public diplomacy during the cold war – the United States Information Agency – into the state department in 1999, the role of public diplomacy practitioners at Foggy Bottom has been problematic. It's no secret that PD officers are often considered by their co-workers in other career paths to be second-class citizens who don't really count.

McHale will have to demonstrate to her state department colleagues – and to the White House as well – that public diplomacy is an integral part of the foreign policy process and smart power, not just PR or using internet social networks ("public diplomacy 2.0").

McHale's third challenge will be the defence department, which during the Bush administration supported some widely criticised "public diplomacy" initiatives, including having one of its contractors, the Lincoln Group, covertly pay off Iraqi newspapers to print articles composed by the US military but published as straight news items.

To be sure, Pentagon officials recently announced that the position of deputy assistant secretary of defence for support to public diplomacy had been eliminated, in an effort, according to the New York Times, "by the Obama administration to distance itself from past practices that some military officers called propaganda".

But McHale may face an uphill battle in making it crystal clear that she – and not the "strategic communications" and psyops chiefs at DoD – is the public diplomacy boss. Many military officers, following the lead of secretary of defence Robert Gates, do welcome more aggressive civilian "soft-power" programmes, at least in theory. McHale, however, should be ready for bureaucratic turf wars with those in uniform who feel they, and not somebody at state, should be in charge of the battle for hearts and minds.

McHale faces a final challenge: making the state department work harmoniously and productively with the growing number of non-governmental organisations involved in public diplomacy.

This expanding engagement of the private sector in PD resulted, in part, from the frustration of citizens concerned with foreign affairs over how the Bush administration was handling its relations with the outside world. In truly American style, US NGOs decided to take matters into their own hands. This certainly was the case with the prominent organisation Business for Diplomatic Action, which concluded that the US government's message was no longer credible overseas. Competition between the government and private sector is, of course, not necessarily antagonistic, but it needs much care and attention.

As she prepares for her new job, McHale can take some comfort in the current popularity of Obama overseas. But honeymoons don't last forever, and anti-Americanism will not disappear overnight (if ever). So, if she is in fact confirmed, which appears likely, the new under-secretary of state for public diplomacy and public affairs may eventually regret having left Discovery Communications – unless she can pull off some minor miracles.

Tuesday, May 24, 2011

Aimless, Misled and in Debt

New York Times, updated May 25, 2011, 01:39 AM
Richard Arum is professor of sociology and education at New York University and co-author of "Academically Adrift: Limited Learning on College Campuses."

Current research tracking the fate of newly minted college graduates has now repeatedly demonstrated that large numbers are experiencing very difficult adult transitions. School-to-work transitions in the U.S. have always been comparatively difficult because the institutional ties between firms and schools are relatively weak; and much of a student’s educational preparation does not focus on occupationally specific skills.

There are many advantages to this loose coupling between education and work, but the consequences for student labor market transitions have long been clear [...] for many college graduates. Young adult years are typically filled with many false starts, job shifts and extended periods of under-employment. Given current economic conditions, it is not surprising that many graduates today are experiencing pronounced and acute difficulties trying to make these transitions.

But this in no way implies that nothing is new about the phenomenon, or that colleges are not implicated in these outcomes.

First, the amount of indebtedness many graduates have is pronounced and unprecedented.

Second, young adults today as a group are highly motivated, but often directionless. The sociologists Barbara Schneider and David Stevenson have astutely described recent cohorts as “drifting dreamers” with “high ambitions, but no clear life plan for reaching them.” Indeed, more than a third of college graduates in our study reported that they aspired to own their own businesses, even though there was little evidence that entrepreneurial skills were being developed.

Many colleges and universities are implicated in the difficulties that graduates are facing, since not only did they fail to ensure that college students experienced rigorous academic coursework associated with the development of higher order cognitive skills, but, more troubling, they typically have abandoned responsibility for shaping and developing the attitudes and dispositions necessary for adult success.

In my research with Josipa Roksa, more than a third of students reported studying alone less than five hours per week, but these students were able to graduate on time and with high grades [JB - my highlight]. If students have learned in higher education that success was possible with such little effort, colleges have done them a great disservice. Career, family and community achievements are not so easily attained.

Evidence that a deeper problem is at play here – one that cannot simply be dismissed with reference to “rites of passage” or current economic cycles – is suggested by examining college graduates’ non-economic behavior in areas where one would expect to observe individuals assuming adult responsibility.

Josipa Roksa and I have found that 30 percent of individuals in our study reported a year after graduating from college that they read the newspaper (in print or online) either monthly or never [JB-my highlight]. Regardless of whether young adults can any longer find employment listings from perusing such outlets, as a democratic society we will likely not be able to confront and overcome our country's difficult challenges with an educated class that fails to even bother regularly consulting a newspaper.

It might not be reasonable to expect recent college graduates to be gainfully employed, but we should be able to expect them to read about the world around them and to think critically and in complex ways about the political rhetoric they hear, the information they encounter, and the economic, political and social problems that we collectively face. If graduates are not doing that, colleges should share some of the blame.

Sunday, May 22, 2011

Apostrophe Catastrophe, or the Consolations of the Internet

Having just finished correcting (or, to be politically correct, I should say "reviewing") undergraduate written examinations, I was again struck -- as I have been for years -- by the inability of students, including the best among them, to use apostrophes "correctly." Among numerous examples: "Nazi's," when the plural "Nazis" is meant; "parents," when the possessive "parents'" -- referring to what is loved by parents (including children) -- is called for; and, a recurrent mistake, using "it's" (contraction for "it is") instead of "its" (possessive pronoun) (and vice versa).

The one, admittedly minor, Apostrophe Catastrophe (AC) "tipping point" that tempted me to throw all of the final exams out of the window (if not to jump out of the window myself): September 11'th. How, I asked myself, could its author, a student who was quite diligent, have put an apostrophe between "11" and "th"? Just kidding about the defenestration, but patience with the young does have its limits.

In a society that prides itself on its accuracy, I have long tried to understand this failure on the part of the US younger generation to use the apostrophe "correctly." I've come up with the following reasons, of course more hunches than the result of serious research:

Perhaps it is a deep-rooted, subliminal suspicion, in an American culture still formed to a considerable extent by Anglo-Saxon linguistic traditions, toward French grammatical tendencies: "The apostrophe," Wikipedia tells us in a footnoted assertion, "was introduced into English in the sixteenth century in imitation of French practice ... [,] used in place of a vowel letter to indicate elision (as in l'heure in place of la heure)."

"By the 18th century, apostrophe + s was regularly used for all possessive singular forms, even when the letter e was not omitted (as in the gate's height)," Wikipedia goes on to say.

And I recall, from my pleasurable reading some forty years ago of original manuscripts penned by Thomas Jefferson stored at the National Archives (I was then an editor on a documentary volume on early Russian-American relations, which included writings by Jefferson), that he would use "it's" and "its" interchangeably (I hope Jefferson scholars will correct me if my memory has failed me). So, Jefferson may have been a francophile, but his lack of concern, as a Founding Father, regarding the use of the apostrophe suggests that there is a long tradition in the United States of resisting the tyrannical imposition of a "regulated" apostrophe. Let the apostrophe blow freely in the wind (need I cite the song?), rather than being officially placed "where it belongs" on a page. Is that not in the "American" spirit?

It's quite clear to me, after years of "teaching" college students, that "proper punctuation" (not to speak of grammar) is not systematically, if at all, taught in many grade schools/high schools in the United States. I suspect, perhaps unfairly, that all too many English composition teachers, no matter how dedicated they are, themselves can't really explain to their fifth-grade students why "its" is not "it's" (or have simply given up trying, God bless their souls). The emphasis in our schools seems to be on "creativity of expression," rather than accuracy in grammar, with the result that when an 18-year-old goes to college, he/she has never been taught how to use the apostrophe correctly.

Is that the end of the world? No, except when a "teacher" has to read final exams for people who supposedly aspire to a "college" degree. And, in our competitive society, a misplaced apostrophe may not be helpful on a résumé -- assuming those reading it care about such a matter. Indeed, most don't. So, ultimately, "What, me worry?" But beware, dear young people, of anal Human Resources specialists who'll justify your not making the grade for a job because you can't spell as a college graduate is expected to.

But not to get too solemn: as anyone who reads American publications can attest, the "correct" use of the apostrophe in the United States remains problematical. Why, for example, do so many media, when referring to a decade, put an apostrophe between the last number and an "s"? (e.g., 1990's). Does not sanity suggest that it should be 1990s?

And then, of course, we have the Internet and the new social media, where the apostrophe has been lost in a sea of what some consider illiteracy rather than a new form of communication. When you text-message or even send an email (some people still do send emails; and some still even write letters), why worry about apostrophes? You should want to send your message, no matter how clear or incomprehensible, subito! (instant gratification takes too long!). It's the message itself, not its content, that counts. Narrative, not accuracy ...

Not to speak of spell-check. Students have become dependent on (should I say addicted to?) this program. (I was amazed to see the difference between two forms of students' written work: an in-classroom exam, where they were to write their answers by hand, sans computer, and the final, a take-home to be sent by email to their instructor. There were far fewer spelling mistakes on the final, although one student -- and spell-check -- evidently could not tell the difference between "boarders" and "borders.") Maybe a computer genius can find a way to avoid the AC syndrome through the Perfect Spell-Check Program. If one in fact does exist, please let me know. Long live technology, if it can abolish illiteracy!
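For the technically inclined, here is why I doubt ordinary spell-check can ever cure the AC syndrome: "its" and "it's" are both real words, so telling them apart requires looking at the surrounding context, which a mere spelling dictionary never does. The little Python sketch below -- purely a hypothetical toy of my own devising, not how any actual spell-checker works -- guesses from crude context rules; a genuine grammar checker would need real part-of-speech analysis.

# A toy "its"/"it's" checker -- a hypothetical sketch, not a real
# spell-check engine; it flags suspicious pairs using crude context rules.
import re

# Words that commonly follow the contraction "it's" (i.e., "it is ...").
CONTRACTION_FOLLOWERS = {"a", "an", "the", "not", "been", "time",
                         "quite", "true", "clear", "hard", "easy"}

def flag_apostrophe_catastrophes(text):
    """Return warnings for suspicious uses of "its" and "it's"."""
    warnings = []
    tokens = re.findall(r"[A-Za-z']+", text.lower())
    for word, nxt in zip(tokens, tokens[1:]):
        if word == "its" and nxt in CONTRACTION_FOLLOWERS:
            # e.g., "its not fair": probably a missing apostrophe
            warnings.append('"its {0}": did you mean "it\'s {0}"?'.format(nxt))
        elif word == "it's" and nxt.endswith("s"):
            # e.g., "it's limits": probably a stray apostrophe before a noun
            warnings.append('"it\'s {0}": did you mean "its {0}"?'.format(nxt))
    return warnings

print(flag_apostrophe_catastrophes("Its not fair that patience has it's limits."))

Run on that sample sentence, the toy catches both errors; run on real prose, it would misfire often enough to prove my point that the problem is grammar, not spelling.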

Indeed, in the process of writing this apostrophe-caused outburst (aggravated by seeing yet another "its" rather than "it's" on an exam -- I simply couldn't take it anymore), I found interesting sites/articles in cyberspace dealing with the apostrophe "issue" (an interesting linguistic development, by the way: today we don't have problems -- which might suggest something is actually going wrong -- but simply "issues" -- which suggests that no one is actually doing anything wrong). See, for example: Apostrophe Catastrophes; Apostrophe catastrophe; Apostrophe catastrophe! The rogue apostrophe is spreading like measles. It's time to fight back ... Avoid an apostrophe catastrophe!

A final, perhaps apocalyptic comment (why didn't the world end as it should have the other day?). In our age, when we are all, supposedly, so "happily communicating" (and here I feel fully confident that I am using apostrophes correctly) and "interacting" online, there has been a near-total breakdown of supposedly clear (I won't say "traditional") means of verbal expression, especially among the young. Indeed, what is perhaps humankind's greatest gift -- language -- is being degraded by the very technologies supposedly improving communications. No wonder Plato was against writing (and here I am, writing about this). More to the point -- how many "great speeches" do you still hear today? Compare BlackBerry user Obama to Bible-reader Martin Luther King.

Is that degradation of the spoken language the end of the world, which, I repeat, was to have happened the other day? No. But are our so-called communicators of the future -- the twitterers and Facebookers -- really communicating in any humanly meaningful way? Twitter me on this question, in 140 characters or less (and of course using no apostrophes) ...

The Very Violent Road to America June 9, 2011
J.H. Elliott, New York Review of Books

Before the Revolution: America’s Ancient Pasts
by Daniel K. Richter
Belknap Press/Harvard University Press, 502 pp., $35.00

Over the last fifty years the writing of North American colonial history has undergone a great transformation. During the nineteenth century and a substantial part of the twentieth there was not much doubt about its scope or its purpose. Essentially the colonial period was seen as a prelude—a prelude to the achievement of independence by the thirteen mainland colonies from British imperial domination, and to the creation of the God-blessed nation that was to become a model and an inspiration to the peoples of the world. The challenge facing historians of this period was to trace the origins and early manifestations of those elements—political and religious liberty, individual self-fulfillment, innovation and enterprise—that grounded the new nation on a set of fundamental principles, and to explore the processes that would enable the United States to win its rendezvous with destiny.

The resulting story, as told to generations of Americans, was relatively simple and straightforward. Its origins were located in England, the England of Magna Carta, the Protestant Reformation, and the seventeenth-century struggle to save liberty from the grasp of arbitrary power. It was thus an essentially English story, which was then carried across the Atlantic by English emigrants, and was in due course replayed on the soil of America, and primarily of New England. Naturally it acquired new elements along the way. In particular, Frederick Jackson Turner added a fresh dimension to the origins of American individualism with his arguments for the impact of the frontier experience on American society.

The story, however, continued to be shaped by three defining elements. It was Anglocentric, in the sense that it placed the weight of its emphasis on the contribution of British settlers, with some assistance from continental Europeans, primarily those of Teutonic origin, who were granted a kind of honorary Anglo status. It was teleological, in the sense that everything in the story built up to a logical conclusion in the winning of independence. And it was exceptionalist, in the sense that it was a story like no other about a nation that itself was like no other. As William Findley wrote, even before the eighteenth century was over, Americans had “formed a character peculiar to themselves, and in some respects distinct from that of other nations.”1

Over the past few decades all three pillars supporting the structure of colonial history have come to look increasingly insecure, partly as a consequence of changes in the discipline of history, but also because of the enormous political, social, and cultural changes that have transformed the world itself. As far as teleology is concerned, the Whig approach to history, with its retrospective selection of those features of the past that are held to explain a distinctive, and equally selective, interpretation of the present, has fallen out of favor. While it lingers on, more frequently in covert than in open form, it has given ground before a contingent view of events that has no use for teleology. Against a determinist reading of the past, whether Whig or Marxist, historians are now more likely to see it as embracing a range of possibilities, and have become more aware of the need to keep their eyes open for the paths not taken.

American exceptionalism, too, has come to look out of joint with the times. To some extent this is a reflection of the ever-widening scope of historical inquiry. The move by many scholars into social, cultural, and gender history during the past few decades has encouraged historians to look at North American colonial history in the wider setting of the history of the Western world as a whole during the early modern period. Witchcraft, after all, was not a phenomenon confined solely to Salem. Comparative history, too, has helped to identify similarities as well as differences, for example in the colonizing process. On examination, the early settlers of Jamestown do not look so very different in their aspirations and methods from the Spanish conquistadores hunting after gold and Indian laborers in Mexico and Peru.

But perhaps most important of all, the world has changed, and, with it, the United States’ sense of itself. National self-confidence, which once took for granted a manifest destiny deriving from a set of exceptional national qualities and characteristics, has taken some hard knocks since the 1960s. If the destiny is less manifest and some of the characteristics are less positive than they once appeared, then perhaps, after all, the United States does not have all the answers.

What, then, of the third supporting column of the traditional structure of American colonial history, its Anglocentric pillar? This, surely, is the one that has crumbled most dramatically. The greatest discovery made by the United States in the twentieth century was the discovery of its own diversity. If E pluribus unum remains, in its widest sense, an abiding aspiration, the country has been brought face to face with the fact that it contains within its borders a multiplicity of ethnicities and ethnic inheritances that the Founding Fathers could never have envisaged. In seeking to secure their own place in the sun, the different ethnic groups of which today’s United States is composed—groups all too easily categorized into Native American, African-American, Hispanic-American—have also sought to claim their share of the past. Historians have responded by attempting to incorporate their stories into the traditional grand narrative and, in doing so, have broken it wide open.

The dramatic development of the history of slavery, in particular, has made a mockery of any narrative confined to a discussion of the achievements of a white settler population. With some 1.5 million Africans—over three times the number of free emigrants—transported to the British American colonies in the eighteenth century, the African contribution to the construction of the British colonial world, and subsequently of the United States, has rightly assumed its proper place in the story. Similarly, the central place of slave labor in the development of the British Caribbean islands, which in the seventeenth century received some 91 percent of the slaves transported to British America, has helped to bring home the deficiencies of a mainland-based narrative.

The plantation societies of the West Indies were integral to the shaping and expansion of Britain’s American empire, as contemporaries were well aware. So, also, was that great historian of British colonial America Charles McLean Andrews.2 But Andrews’s approach to imperial history went out of favor in the postwar period. It would be some time before the work of a new generation of historians firmly restored the West Indies to the agenda of North American colonial history. An empire divided in the late eighteenth century, when the mainland colonies and the Caribbean islands went their separate ways, is no justification for a historiography divided.3

If the new colonial history has been extended to embrace the Caribbean, it has also been extended to embrace the North American West. There has, of course, been a long tradition of borderlands history, as the names of Hubert Howe Bancroft and Herbert Eugene Bolton remind us. Bolton, whose The Spanish Borderlands: A Chronicle of Old Florida and the Southwest was published in 1921,4 worked hard to persuade colonial historians of the importance of incorporating the Spanish borderlands into their view of the past, and of recognizing the extent of the Hispanic contribution to the shaping of the future United States. His views, and those of his followers, however, were Turnerian in their approach to the frontier, and, like Turner, they saw it as a dividing line between civilization and savagery.5

The civil rights movement and the impressive achievements of ethnohistorians in uncovering the past of “the people without history” have combined to discredit this approach.6 Since Francis Jennings launched his assault on historians of “frontier semantics and mythology” in 1975, there has been an enormous increase in the number of studies devoted to Indian societies and to the interaction of Europeans and Native Americans in the colonial period.7 Many of these studies have attempted to see the process of European intrusion and settlement through Indian eyes, and an outstanding contribution along these lines was made by Daniel K. Richter, the author of the book under review, whose Facing East from Indian Country is subtitled “A Native History of Early America.”8 Thanks to the work of Richter and others like him who have set out to break with the traditional Eurocentric narrative, “the people without history” have been given back their voice.

The widened reach of colonial history over the past few decades, however, and the complexity introduced by an awareness of the need to listen to many different voices have led to the realization that the old framework is no longer fit for purpose. In an age of globalization, parochialism is at a discount. One way in which historians of colonial North America have responded to the challenge is to extend their range of vision by looking out from the colonial seaboard, both eastward and westward. To the east, they have discovered, or rediscovered, the Atlantic. To the west they have lifted their eyes to see a continent.

In recent years Atlantic history has become one of the most dynamic of historical subdisciplines. Developed and encouraged in particular by Philip Curtin and Jack P. Greene at Johns Hopkins and by Bernard Bailyn at Harvard, it has sought to show the interconnectedness of the societies—British and European, African and American colonial—that border the Atlantic. In tracing and analyzing the movement of people, commodities, ideas, and cultural practices across and around the Atlantic, which these historians tend to treat largely as a British Atlantic, it has helped to counter any tendency to view the American colonies in isolation. The history of the slave trade and slavery, and of migratory movements and the peopling of America, have been particular beneficiaries of the new Atlantic perspective, but it has also done much to underline the limitations of old-style imperial and nationalist history.9

For all its contribution to the widening of horizons, the practice of Atlantic history has thrown up a number of problems, and one of the most intractable of these has been that of deciding how far it extends geographically. The Indian peoples of the eastern seaboard were early, and tragically, caught up in the turbulence of an Atlantic world in expansion, but how far westward does Atlantic history go? There was clearly a ripple effect as the peoples of the interior were exposed one after another to the presence of settlers moving inland, and of European traders bearing coveted goods. Richter, in his Facing East from Indian Country, takes his western boundary as the Mississippi River, and looks eastward from St. Louis. The land he surveys was indeed Indian country in the eighteenth century, although shrinking Indian country; but westward there stretched a vast continent that was also Indian country, and that hardly seems a serious candidate for inclusion in Atlantic history.

As historians and ethnohistorians have turned their attention to these peoples of the interior, like the Osages, the Comanches, and the Pueblo Indians—peoples who themselves were undergoing great transformations while living out their own histories—it is not surprising that Atlantic history has been finding a counterpoint in the developing field of continental history. This has not yet been subject to the kind of historiographical surveys that Atlantic history has generated, but it has produced some impressive works, like Pekka Hämäläinen’s Comanche Empire, that are revolutionizing our knowledge and understanding of Native American history.10 Such books reinforce the current trend toward seeing Indians as actors rather than victims. They also show how apparently remote events occurring deep in the American interior impinged on and shaped the history of the developing colonial societies. As Paul W. Mapp has shown in an important new book, The Elusive West and the Contest for Empire, 1713–1763, colonial history by the eighteenth century requires a perspective that spans the continent, from the Atlantic to the Pacific oceans.11

These many changes in our perception and understanding of the colonial past have confronted historians who aspire to write accessible surveys with a number of dilemmas. How do they find space for so much that is new without jettisoning too much of the old that is both valuable and important? How wide should be their geographical range, and whose pasts, among the many possible pasts, are they relating? One of the surveys that has most successfully faced up to these challenges is Alan Taylor’s American Colonies, first published in 2001.12 It has now been joined by Daniel Richter’s Before the Revolution.

Alan Taylor’s solution to these various problems was to move forward through time by dividing his book into three chronological sections, entitled “Encounters,” “Colonies,” and “Empires,” and then dividing them into regional subsections, such as “The Atlantic, 1700–80” and “The Great Plains, 1680–1800.” This made his survey widely inclusive, although at some cost to inner coherence. Daniel Richter, who is director of the McNeil Center for Early American Studies at the University of Pennsylvania, adopts a different strategy. He, too, follows a chronological approach, but one that presents the history of North America as consisting of a succession of layers, superimposed one on another. Thus we start with “Progenitors,” covering in two chapters medieval North America and medieval Europe, and then move on to “Conquistadores,” both Roman Catholic and Protestant, and then to “Traders,” “Planters,” “Imperialists,” and finally to what he calls “Atlanteans,” by which he means the peoples of Britain’s Atlantic Empire or those within its orbit. He ends his book, as he begins it, on the eve of the Revolution, with the figure of Thomas Paine.

Paine is important to him because, as Richter points out, even as Paine talked about beginning the world over again, he remained well aware of the presence, and the weight, of the past. It is this presence of the past at each successive stage of the North American story that Richter seeks to demonstrate in his layered history. Inevitably there is something rather artificial about this layering device, as if conquistadores, traders, and planters can be separated into neat substrata, but it has the advantage of presenting the continuities in North American history, as against its ruptures. There is no better antidote to the tendency to see the Revolution as beginning the world again than to take the North American story back to the Indian settlements at Chaco Canyon (in present-day New Mexico) and Cahokia (in present-day Illinois), and show how their inhabitants, and their descendants, contributed to the shaping of the world that the Founding Fathers inherited and wanted to remake.

As is to be expected, Richter is particularly strong whenever he turns to Native Americans and their interaction with those of European descent. He has much less to say about Africans, perhaps feeling that in recent years they have had a substantial share of the limelight. Much of his book, however, is devoted to the telling, or the retelling, of the story of European, and primarily English settlement—of how the English grabbed the land, and ultimately contended successfully for the control of North America. To many readers, therefore, his story will have a familiar ring, although he tells it well, and with many touches of freshness.

He is good, for instance, at drawing on contemporary sources, like the House of Lords Journal for 1710. He uses this to show how contemporaries viewed the Glorious Revolution of 1688, before going on to suggest—not, I think, entirely convincingly—that imperialism looked much the same after as before it. He is also alive to the importance of environmental history, and at various points dwells on the impact on peoples on both sides of the Atlantic of the Little Ice Age that began in the 1300s and reached its peak in the seventeenth century.

Yet as they read his recapitulations of British history and colonial politics, or his excellent brief discussion of such bloody events as Bacon’s rebellion in Virginia in 1676, some of Richter’s readers will wonder how far, if at all, his new layered history differs from its predecessors. The history of English colonization consumes a considerable amount of space. So, too, does the development of British imperial policy and the domestic events that lay behind it. In many respects this is to be welcomed, and reflects another trend of the times—a rethinking of imperial history, with a particular emphasis on the struggle of the European powers for control of the continent.13 Francis Parkman, indeed, should have been living at this hour. Richter’s book does, however, raise the question of just whose past a survey of North American colonial history should embrace, and of the relative amount of space to be accorded to each of the many peoples who figure in its story.

Richter certainly gives Native Americans and the English their due. He also has a welcome comparative chapter, entitled “Dutch, French, Spanish, and English Counterpoints,” designed to illustrate what he calls “the oddity of the English model.” That the English settlements in North America differed in important ways from those of their European rivals is undoubted, although one of the great historical questions remains the extent to which English attitudes toward land, patriarchy, and religion, on all of which Richter places much emphasis, account for these national differences, as against the American environment into which respective European peoples moved, and the character of the Native American peoples with whom they came into contact. This is not, however, a question with which the author can be expected to grapple in a survey as compact and wide-ranging as this.

Yet if Richter often seems to be treading well-trodden ground, even though successfully including many features of the new Atlantic and continental history on the way, his story has an ending that is rather less familiar. There is no triumphant build-up here to a successful Revolution. The final chapter, ominously called “Gloomy and Dark Days,” covers the Seven Years’ War and its aftermath. The words chosen for the chapter title are those of Teedyuscung, a Delaware chief, both warrior and peacemaker, who was murdered in his sleep. Those words convey “the violence that ripped North America’s Atlantean peoples apart,” in Richter’s reading of the period, far better than that “antiseptic term” the Seven Years’ War. In his summation, “Native American traditions of property, land, trade, and power smashed against those of Europeans, in turn setting land-grabbing creole planters against imperial officials. The final victim was the fragile unity of the Atlantean world.”

Ultimately, his history is a history of violence, of violence perpetrated by Europeans against Native Americans, by Native Americans against Europeans, and by both peoples against their own kith and kin. It is a dark and brutal story, although one in which the Native Americans are shown as for long holding their own, manipulating Europeans as trading partners and playing off one set of Europeans against another until the overwhelming British victory of 1763 no longer made this possible. There is precious little uplift here, and little sense of the more constructive characteristics of the brave new world that was rising amid the wreckage of the old. But, in patiently uncovering the layers beneath the rubble, Richter forcefully brings home to us that the American past belongs to many peoples, and that none should be forgotten.

1. Cited by Michael Kammen, "The Problem of American Exceptionalism: A Reconsideration," American Quarterly, Vol. 45, No. 1 (March 1993), p. 7, from William Findley, History of the Insurrection, in the Four Western Counties of Pennsylvania (Philadelphia, 1796), p. vi.

2. Charles McLean Andrews, The Colonial Period of American History, 4 vols. (Yale University Press, 1934–1938).

3. A point well made by Andrew J. O'Shaughnessy in the preface to his An Empire Divided: The American Revolution and the British Caribbean (University of Pennsylvania Press, 2000).

4. Yale University Press.

5. See David J. Weber, "A New Borderlands Historiography: Constructing and Negotiating the Boundaries of Identity," in Alta California: Peoples in Motion, Identities in Formation, 1769–1850, edited by Steven W. Hackel (University of California Press, 2010), pp. 215–234, at p. 216.

6. Eric R. Wolf, Europe and the People Without History (University of California Press, 1982).

7. Francis Jennings, The Invasion of America: Indians, Colonialism, and the Cant of Conquest (University of North Carolina Press, 1975), p. 13.

8. Harvard University Press, 2001.

9. Atlantic history has already produced an extensive literature. For useful surveys of the field, see The British Atlantic World, 1500–1800, edited by David Armitage and Michael J. Braddick (Palgrave Macmillan, 2002; second edition, 2009); Bernard Bailyn, Atlantic History: Concept and Contours (Harvard University Press, 2005); Atlantic History: A Critical Appraisal, edited by Jack P. Greene and Philip D. Morgan (Oxford University Press, 2009); and The Oxford Handbook of the Atlantic World: 1450–1850, edited by Nicholas Canny and Philip Morgan (Oxford University Press, 2011).

10. Yale University Press, 2008.

11. University of North Carolina Press, 2011. See my review in The London Review of Books (forthcoming).

12. Viking.

13. As reflected, for instance, in the proliferating literature on the Seven Years' War and its impact since the publication of Fred Anderson's Crucible of War: The Seven Years' War and the Fate of Empire in British North America, 1754–1766 (Knopf, 2000).

If the dog ate your homework, read this

latimes.com, Op-Ed
A community college English teacher tries to impart a life lesson to students who can, but don't, try hard enough.

By Jaime O'Neill

May 22, 2011

Dear Students,

I taught my first freshman composition class more than 40 years ago. Your class is my last.

We began the semester with 36 students. I predicted on the first day that I would probably wind up giving grades to half that many. Had I been more strict about dropping people whose attendance was erratic and whose assignments weren't coming in, I would have been right. But I let lots of students slide. I didn't drop people who weren't showing up, nor did I drop the people who weren't doing the work. That was no favor because now I'm forced to give grades that will narrow future options for people who might have gone further, had they only tried. If you were one of the students who missed more than five or six classes, or who failed to turn in most of the assignments, you need to ask yourself if you're making good use of your time. There are always excuses for not showing up, or not turning work in. I've heard them all. But lives built on excuses generally don't turn out well.

There were a handful of people in this class who made it here every day, always with the assigned writing completed. If I were an employer, these are the people I would want as employees.

But I have never liked to think of myself as working to provide a screening process for your future bosses. I like to think I'm working for you, and helping build your futures as more fully realized human beings. In that light, some of you have failed this semester. You've failed yourselves. As a result, some of you learned very little and showed no discernible improvement in your writing, wasting your time and mine.

I never find it pleasant or productive to guilt-trip students. But if just one of you reads these words and decides to take your education a bit more seriously, it was worth writing them.

Few people care whether you succeed or fail. You are not showing up to class for your teachers or even your parents. You're not doing these assignments for anyone but yourselves. If you cut classes because your teachers bore you, then you should be dropping those classes, not piddling away your GPA.

I went to a community college too. I screwed up in high school, graduating in the bottom third of my class. But I married and became a father not long thereafter. Those responsibilities made me quite serious about the second chance offered by the community college system. It's difficult to maintain a slacker attitude when you're up nightly with 2 o'clock feedings of an infant daughter whose vulnerability and dependence on you are impossible to overlook. Had I not shown up regularly and done the work conscientiously, I would have blown that second chance. I would have had a much different life, a much poorer one, not only materially but intellectually and even spiritually. And my children would have had poorer lives too, because what I learned in college was shared with them in ways too numerous to count. I've never regretted the portion of my youth that I devoted to study.

And I've never regretted spending so much of my adult life teaching in community colleges. I'm glad I was able to help some of my students get their own second chances. Most of the people who attend community colleges have very little handed to them. We are not favored by wealth or connections. Unlike the Donald Trumps of the world, those born to the mansion, we do not have the way made easy for us. So it is something of a crime against our very selves to squander the second chance when it is offered.

Some of you did just that this semester, throwing away time and opportunity. If next semester provides another opportunity, I hope you will seize it. Life has a way of getting serious with us well before some of us decide to get serious with it. By that time, it may be too late to build the life you might have wanted.

And if you don't know just what it is you do want, drop out of school until you figure it out. If you misuse your time here, you will erode the chance you have for a more hopeful future. In the papers you wrote, I occasionally pointed out cliches in your prose. In this note to you, however, I have turned myself into a living cliche, an old teacher scolding the young for lack of seriousness. But ignore the hectoring of an old man who has traveled the road that lies ahead of you and you could become your own living cliche — the loser who squandered opportunity. My hope is that you do not.

Jaime O'Neill is a writer in Northern California.


Friday, May 20, 2011

Social Media: Preparedness 101: Zombie Apocalypse

http://emergency.cdc.gov/socialmedia/zombies_blog.asp

Monday, May 16, 2011

Osama Bin Laden, All-American Porno-Loving Male?




Given the prevalence of pornography in American society -- and I will not pass moral judgment on this phenomenon -- I wonder if, by "leaking" (in part, doubtless, for public diplomacy reasons) the news that the mastermind behind 9/11 viewed porno tapes in the comfort of his suburban Pak McMansion, the US government is, perhaps inadvertently, depicting this odious character as an "all-American" porno-addicted male?

Dare I say, in addition, that OBL's fascination with violence is also not without a certain similarity to a traditional US trend that to many defines the American "homeland"; indeed 9/11 was, so tragically, a Hollywood disaster movie brought from the reel to the real.

Also, and I hate to present such a nightmarish vision, but I can think of no one who would have "enjoyed" sending Predators to their intended targets more than OBL (as we know, he loved the power of new technology like the internet, thanks -- I would say -- to his religious fundamentalism), wishing perhaps he could do so with the cold, inhuman efficiency of the staff of a secret US military base near Las Vegas, the US world entertainment capital.

Oh, and let's not forget a National Enquirer issue I saw in my local supermarket the other day, waiting at the check-out counter to pay for my discount chicken legs, with an article saying that Osama used heroin. How much more all-American can you get than by using drugs? We have been happily drugged since the very first days of the Republic; George Washington (our very own first president, with bad teeth, as I suspect Osama had, although his "sheik" Saudi friends may have taken care of his molars), as is well known, "was the largest distiller of his time, producing almost 11,000 gallons of rye whiskey in 1799."

Also, OBL had, according to the NE, a "nagging wife." Now, let's be honest, we men between us girls: Isn't a "nagging wife" part of the American macho folklore, full of self-importance about one out of many organs of the human body, which helps American men justify their so-called sexual frustrations? "Male concerns" about a "nagging wife" (i.e., a lady who finds her husband not such a macho man anymore, if ever at all) have led to a very profitable Viagra industry to recreate that "special moment." Rumors ("news") about OBL using Viagra abound.

While we are on the subject of matrimony, let's turn to OBL and his many wives, a fact repeated ad nauseam by the US news media. But lo and behold! OBL and his multiple partners bring to mind that uniquely American sectarian phenomenon (as often observed by foreign commentators on religious life in the U.S.), Mormonism (see Thomas Albert Howard, God and the Atlantic: America, Europe, and the Religious Divide [2011]), a phenomenon now accentuated by a leading Republican Mormon running for the presidency, who, granted, "condemns polygamy and its prior practice by his Mormon church [although] the Republican presidential candidate's great-grandfather had five wives and at least one of his great-great grandfathers had 12."

Maybe Romney is a closet Obama Muslim? Just kidding. Let FOX News deal with that one in its fair, unbiased way.

And how about Arnold? I'll let you judge about that one.

As an aside, one reads in the Wall Street Journal that the British do not approve of the colonials "taking out" the Sheik. But here are the bons mots (if such they can be called, with some justification) that a UK citizen of the British Conservative persuasion sent by email to an Anglophile correspondent of mine, which he kindly shared:

Ever had a whisky bin laden? Two shots drowned in water.

Belongs on the Jay Leno show. No wonder the special relationship continues to endure.

Saturday, May 14, 2011

Ras’s Web Gems: Cowboys in Kabul

http://www.317am.net/2011/05/ras%e2%80%99s-web-gems-cowboys-in-kabul.html#more-8086

via GC on facebook

Thursday, May 12, 2011

What would the Greek philosophers make of P.J. Crowley?


Truth to Tell: What would the Greek philosophers make of P.J. Crowley? - Jonathan Lear, New Republic

On March 10, State Department spokesman P.J. Crowley committed the sin of speaking frankly. During a talk at MIT, he was asked by a researcher to explain the treatment of Bradley Manning. Though he did not think Manning’s treatment amounted to torture, as the questioner had alleged, and though he thought the commander at Quantico was acting within his legal authority, Crowley nevertheless said that the conditions of Manning’s detention were “ridiculous, counterproductive, and stupid.” Three days later, Crowley was out of a job.

As a philosopher, I found his remarks fascinating. The ancient Greeks had a term for Crowley’s actions. They called it parrhesia, the ability to speak one’s mind even when doing so involves social risk. Crowley’s remarks were striking because parrhesia is rarely practiced in American politics, and almost never practiced, at least on the record, by government spokesmen. I wanted to know more about what Crowley had done, so I arranged to meet him for lunch in Washington.

Going into our conversation, I realized there was a possibility that Crowley had simply spoken impulsively. But as our lunch progressed, it became clear to me that Crowley understood what he was doing. When I am interviewing someone, I do not just listen to what they say (and, in any case, I did not expect to extract a backstory from a seasoned spokesman). Instead, I listen for the moment when I can hear their life’s energy enter their words. One of the words that I heard Crowley say with this kind of energy was “credibility.” The Vietnam war was a formative experience for Crowley, as it was for so many other members of his generation. At Holy Cross, he was ROTC corps commander, and he entered the Air Force at a time, as he put it, “of great tumult.” “Our country was being torn apart, but that was rooted in a loss of credibility and, in the case of Watergate, a loss of nobility.” Of Vietnam he said, “The media didn’t lose Vietnam. What lost Vietnam was the loss of credibility because of a gap between what we were saying and what we were doing, and what people saw. ... Having come into the government at the tail end of the Vietnam war, I thought if ever I was in the position I was in, I would try to keep the gap as narrow as possible between what we would say and what we were doing.”

Crowley did not talk about parrhesia, and I did not want to play the role of philosophy professor and raise it with him. But he was clearly saying that he had spoken frankly on purpose. He emphasized that if the MIT researcher’s question had been different—something more anodyne, like, “What about Bradley Manning?”—he would have given a different answer. “But,” he explained, “the question posed by an American citizen was, ‘Why are we torturing Bradley Manning?’ It was a question that, in my judgment, I needed to answer. If I ducked the question, I would have left at least him—if not a larger group—disillusioned. Going back to this relationship between the American people and its government: I thought it merited an honest answer.”

With his talk of credibility and the relationship between people and government, Crowley was not just saying that he had meant to speak frankly. He was making a point about the relationship of parrhesia to government—about the importance of frank speech in politics. One of the clichés we have inherited from the Vietnam era is the concept of speaking truth to power. The paradigm is protest. But Crowley, as I understand him, was working from a different paradigm: speaking truth as power. For diplomatic speech to be successful, he was saying, it must be persuasive; and it cannot be persuasive if it isn’t frank. Frankness requires a willingness to answer difficult questions on the spot, sometimes manifesting the obvious truth that bureaucracies do not really speak with a monolithic voice. Sometimes, as in the case of Crowley’s comments, it is simply a matter of saying what everyone already knows.

We have entered an era in which persuasive political speech is going to have to be frank speech [JB highlight]. In the age of blogs, Facebook, Twitter, and 24-hour news coverage, we all know too much for things to be otherwise. My point is not that new media technologies are inevitably taking us in the direction of truth. At the moment, 25 percent of the American people think Barack Obama was not born in the United States, while another 18 percent say they don’t know, and the Internet has played a crucial role in sustaining this nonsense. Because of its openness and pervasiveness, the Web will continue to be a source of gossip, misinformation, and prejudice. But it is also true that, for all their drawbacks, these technologies have the capacity to cast a glaring light on discrepancies between what our diplomats say and what our country is seen to be doing. And when the discrepancies grow too large, diplomats’ words are emptied of meaning. Parrhesia in this new century is going to be a diplomatic requirement.

This is the context in which to understand Crowley’s remark. It is not simply that he was speaking frankly. It is that he was heard as speaking frankly; and that at least opened up the possibility for him to speak persuasively on other subjects. Parrhesia created a space of trust.

President Obama does not seem to realize this; he seems to be sticking to a worn-out paradigm about the nature of political speech. Asked about Manning’s treatment, he said this: “With respect to Private Manning, I have actually asked the Pentagon whether or not the procedures that have been taken in terms of confinement are appropriate and are meeting our basic standards. They assure me that they are. I can’t go into details about some of their concerns, but some of this has to do with Private Manning’s safety as well.”

This is an attempt at judicious speech that fails because the evasion is simply too obvious. Obama does not say that he has looked into the charges and found them baseless—only that he asked the Pentagon and they gave him assurances. In the moment, Obama’s sights seem to be set on maintaining protocol, protecting the Pentagon from embarrassment, and projecting the image of a president who stands with the military. This is not the remark of someone whose sights are set on our core value of presumed innocence; it is not the remark of someone concerned that an individual is being mistreated on his watch—and it is obvious that this is so. In this era of instant scrutiny from all angles, the avoidance of parrhesia comes across not as judicious, but as evasive and untrusting.

Imagine what a remarkable moment it would have been if Obama had asked Crowley to stay in his job. Suppose, in response to a question about why he was not fired, Obama had said, “We are a country of competing views, with honest and honorable disagreements. Our government has room to express those differences openly. Mr. Crowley may have spoken more bluntly than is to my taste, but it is certainly true that in pre-trial confinement a person charged with a crime must be treated as innocent until proven guilty.” Here, Obama would have been invoking our highest ideals while also acknowledging that parrhesia is tolerated, even encouraged, in the United States government. It is that kind of moment that would make us the envy of the world.

Jonathan Lear is a professor in the Committee on Social Thought and the Department of Philosophy at the University of Chicago. This article originally ran in the May 26, 2011, issue of the magazine.

Monday, May 9, 2011

Losing Superman: If the Man of Steel renounces his U.S. citizenship, he'll gain new arch-enemies.


Losing Superman: If the Man of Steel renounces his U.S. citizenship, he'll gain new arch-enemies - Ariel Dorfman, latimes.com

May 6, 2011

Can it be a mere coincidence that the world heard that Superman would renounce his U.S. citizenship just days before Al Qaeda's sinister and lugubrious leader was killed in his Pakistani compound? Or are the two events secretly related?

The news of U.S. commandos killing Osama bin Laden came just five days after word arrived that the Man of Steel, in Action Comics #900, was flying to the United Nations to declare his independence from America.

Such a drastic decision came in response to the U.S. national security advisor reproaching him for going to Tehran to show solidarity with Iran's Green Revolution and its protest against President Mahmoud Ahmadinejad and his cronies. Although the superhero had done nothing more than silently, and non-violently, support the rebels, the Iranian government took Superman's presence as an act of war instigated by the United States, the Great Satan.

In spite of having utter distaste for the autocratic mullahs, I'll admit that their identification of Superman and America is not illogical. I doubt that they spend much time reading foreign comics, but even they must know that this superhero stands for "truth, justice and the American way."

Superman decided that in an increasingly global world, it was counterproductive for him to be branded as an instrument of U.S. policy. He came from another planet, after all, which gave him a "larger picture."

It is difficult to exaggerate the indignation that this risky act of renunciation of citizenship caused among the U.S. public, which saw it as a slap in the face. I have read bloggers (I'm not making this up!) who propose deporting Superman to the planet Krypton, from whence he came (echoes of "America — love it or leave it"), as if he were an illegal alien. Well, he is indeed an illegal alien. (Did he go through customs to enter Kansas? Did he fill out papers for citizenship?)

Petitions have started to circulate threatening a boycott of Time Warner (the parent company of DC Comics) if there is no retraction of such a dire resolution — and indeed there are reports that DC Comics may be rethinking the story line. And a number of conservative pundits claim that this insult attested to America's decline. The icon of Americanism, the ideal self-made man, the son of faraway strangers with unpronounceable names, who assimilates and blends in (goodbye Kal-El, hello Clark Kent), the most representative figure of U.S. goodness and might was turning his back on the land of the free and the home of the brave.

President Obama may not follow the adventures of Superman assiduously, but someone in his entourage must have alerted him to the significance of such an iconic figure dissing the United States and going cosmopolitan.

What would happen, for instance, if the Man of Steel, champion of the dispossessed, were to decide that it was his task to close Guantanamo or to use his X-ray vision to expose secret documents that not even WikiLeaks' Julian Assange has been able to uncover? What if the erstwhile American demigod were to offer his services to China? (Though, thinking it over, he would probably never do anything of the sort, given his enthusiasm for truth and justice).

At any rate, Obama's counselors must have explained that Superman's desertion should be treated as an immense cultural and ideological crisis that could conceivably cost the president his reelection, given that Republicans were presumably spreading rumors about how Obama "lost" Superman (like Vietnam or Cuba were "lost").

Obama's response was a sheer act of political genius. By killing Osama bin Laden, he proved that the United States did not need a muscular man who can fly and destroy walls with the flick of a wrist. For that, the U.S. has helicopters and Navy SEALs and intelligence on the ground and in the air and weapons made of — yes — steel. A dashing way of restoring national confidence when it was wilting.

Of course, before Obama could order that clandestine operation in Pakistan, he had to take care of a matter that had been haunting him for the last few years. How could he reveal that Bin Laden had been slain in the name of the United States if an incredibly large percentage of Americans believed that this president is not, in fact, American at all? How to create a contrast with that renegade Superman, if Obama himself is accused of having been born abroad, in Kenya — which, as every American knows, is much farther away from Kansas than Krypton, even if all three places share the Kafkaesque letter K?

And that's why Obama finally produced his long-form birth certificate, just two days before he ordered Bin Laden hunted down, as a way of silencing the "birthers" who deny him legitimacy, who see him as "other" and alien and far more extraterrestrial than Kal-El. Naturally, there appears to be a portion of the citizenry that still doesn't believe he was born in Hawaii. But the vast majority of Americans now do.

So what comes next?

Now is when a really heroic task can be accomplished. President George W. Bush originally invaded Afghanistan because the Taliban refused to give up Bin Laden. So has the moment not come to withdraw all American forces from that country?

I am sure that Superman, along with the United Nations, would be delighted to proffer help in bringing the troops home. It would be wonderful to read of these exploits in the next issues dedicated to the Man of Steel, a story of how Obama and Superman — both with remote origins in Kansas, both despised as the "other" and alien — collaborated to create at least one small oasis of peace in a world that, alas, seems to be lacking truth and justice.

That would be a real homage to the many victims of the murderous Bin Laden.

Ariel Dorfman, a Chilean American writer, is a professor at Duke University. He is the author of "Death and the Maiden" and the forthcoming memoir "Feeding on Dreams: Confessions of an Unrepentant Exile." http://www.adorfman.duke.edu.

Copyright © 2011, Los Angeles Times

Saturday, May 7, 2011

Big Brother's Face on Facebook


Bahrain urges loyalty pledge on Facebook, Twitter: Government begins social media propaganda, thinq.co.uk

07 May, 2011

Government officials in Bahrain are asking citizens to post pledges of loyalty on Facebook and Twitter in the latest of a series of propaganda moves.

Demonstrations by Shiites in the country, who seek political reform, were crushed by the ruling Sunni monarchy, which is now looking to social networking as a way to affirm its dominance over the country.

Other countries in the region banned social networks or closed down Internet service providers altogether to crack down on uprisings, but Bahrain's approach is designed to put a positive PR spin on things and highlight to the world that it has a large following.

The country launched an online campaign called “We Are All Hamad,” asking people to post pictures of Hamad bin Isa Al Khalifa, the country's king, on their Facebook and Twitter pages, along with other websites. The country's state-run news agency claimed that 10,000 people have already done this in the first week alone [my highlight and italics: I wonder what will happen to you if you don't].

Bahrain's government has also been making use of YouTube, where it posted videos about four people accused of killing two policemen last month. The government labelled the four as “traitors” and “beasts”, issuing death sentences for them, while completely glossing over the fact that at least 30 citizens have died in violent clashes with government forces during the uprising.

Tensions between Bahrain and Iran are a major reason for the move. Shiite-led Iran has been criticising the country for some time now and Bahrain has answered by telling investors to cut their Iranian ties and “buy Bahrain”.

European misgivings about American religious dynamics

"European misgivings about American religious dynamics reflect both a traditionalist-right and a secularist-left genealogy and corresponding historical perspectives. While these perspectives are at odds with one another on many levels, their respective narrations of the relationship between modernity and religion, willy-nilly, cast the religio-political character of the United States as ill-conceived, producing an erroneously religious society (Right) or an overly religious one (Left). These narratives owe their formation to the far ends of a distinctly European political spectrum that first took clear shape in the decades following the French Revolution. The traditionalist critique of America derives from a nostalgia for established churches and a host of attendant pre-democratic, culturally organicist sentiments; the United States, in this view, represents a disquieting departure from a more salutary ecclesiastical establishmentariasnim, and a robust exercise of religious freedom has been met warily as indulgence in subjectivism and individualism. On the other hand, European skepticism of American religious life evinces a progressive-secularist mien, insofar as the French tradition of laïcité and various comparable ideologies and intellectual currents informing leftist and socialist thought and, later, secularization theory are invoked, or unconsciously assumed, as the benchmarks for appropriate historical development. In both cases, Right and Left, critics of American religiosity have often regarded the United States simply as the absence of certain specifically European conditions that are regarded as normative, or at least highly desirable. Both views, one might further argue, bear witness to an irrepressible nineteenth-century European mission civilisatrice, with the emerging United States serving Europe at once as poor learner, oafish foil, and didactic counterexample."

--Thomas Albert Howard, God and the Atlantic: America, Europe, and the Religious Divide (Oxford University Press, 2011), p. 23

Friday, May 6, 2011

Use of 'Geronimo' code for Osama bin Laden irks tribes


In connection with the articles below, readers might be interested in my 2006 piece for TomDispatch, "'Our Indian Wars Are Not Over Yet': Ten Ways to Interpret the War on Terror as a Frontier Conflict."


Use of 'Geronimo' code for Osama bin Laden irks tribes - Jim Meyers, Tulsa World

WASHINGTON - American Indians testifying at a U.S. Senate hearing took turns expressing shock Thursday at their own military's apparent decision to link the name of Geronimo, viewed not only as an iconic figure in their history but a role model for their youth, to the operation that took out Osama bin Laden.

Tex Hall, chairman of the Mandan, Hidatsa and Arikara Nation and the Great Plains Tribal Chairman's Association, said the incident must be rectified.

Suzan Shown Harjo, president of the Morning Star Institute, said Geronimo, who suffered many indignities during his lifetime, has been slurred once again by being compared to a terrorist.

According to the witnesses, the U.S. military's use of Geronimo's name is but the latest insult American Indians have suffered through the years.

Initially, it was that history that drew the attention of the Senate Indian Affairs Committee, which had scheduled the hearing before the weekend's deadly raid, the one that took the lives of bin Laden and others and that included the contentious reference to Geronimo when the news was shared with U.S. officials.

Mascots for sports teams, common caricatures and negative images of American Indians put forward in movies originally were listed as the hearing's topics.

Geronimo and the link between his name and the bin Laden operation, however, dominated the first phase of the hearing.

Sen. Tom Udall, D-N.M., who chaired the hearing, said his office has tried to get the Defense Department to clarify exactly how Geronimo's name came to be associated with the raid in Pakistan.

"Their protocol prohibits the release of information regarding operation names," Udall said, adding as a result the details of how Geronimo's name was used remain unclear.

"I find the association of Geronimo with bin Laden to be highly inappropriate and culturally insensitive. It highlights a serious issue and the very issue we have come here to discuss today."

To Harjo, who also is a member of the Cheyenne & Arapaho Tribes of Oklahoma, the Pentagon's explanation does not seem to matter.

"How awful in either case," she said, adding Geronimo was being treated as an enemy of the United States.

For the witnesses, history has just been repeated.

"Our names are not our own. They are stolen," Harjo said.

She said that is part of an effort to not only control American Indians but also ban their religious practices and force their traditions underground.

Many of those traditions, she said, never really re-emerged.

Chaske Spencer, an actor and producer who appeared in the "Twilight" movies, spoke of his own childhood struggles to understand the images he saw of his own people. Spencer said that impact can last a lifetime.

Stephanie Fryberg, an enrolled member of the Tulalip Tribes in Washington state and an associate professor of Social and Cultural Psychology at the University of Arizona, testified on behalf of the American Psychological Association.

Fryberg said research shows that American Indian mascots have a variety of negative psychological consequences for American Indians such as decreased self-esteem.

Despite such testimony, it appeared unclear exactly what the committee or Congress can do.

Udall said following the hearing that he still hoped the military will respond to his request for a clarification on how Geronimo's name was used.

"We've had a couple of different stories out there," he said."I think officially they could weigh in and clarify that."

Image from article, with caption: The famed Indian warrior Geronimo, a Chiricahua Apache, poses with a rifle. The leader of the Fort Sill Apache Tribe is looking for a formal apology from President Barack Obama for the government's use of the code name "Geronimo" for Osama bin Laden. National Archives/ Associated Press File

II.

The truth about Geronimo . . . and Osama bin Laden - Benjamin Runkle, Washington Post

“Geronimo!” That was the call that went over the command net on May 1, indicating that Navy SEALs had found their man. And that code name for Osama bin Laden has angered some Native Americans, who have demanded a formal apology from the Obama administration.

Their complaints are understandable, but misguided. The code name doesn’t denigrate the Apache war captain, a hero to some students of Native American history, through comparison to the Saudi terrorist leader. The similarities are not in the men themselves but in the military campaigns that targeted them.

In May 1885, Geronimo led the breakout of 120 Chiricahua Apache from the San Carlos Reservation in what is now Arizona, creating mass hysteria in the American Southwest. The Chiricahua had legitimate grievances: Civilian “Indian agents” were corrupt and consistently cheated the Apache on their rations, while the land the tribe had been given was almost worthless for farming but still encroached upon by miners.

The Apaches were such fierce adversaries that even as hardened a soldier as William Tecumseh Sherman, in an 1870 letter, recommended abandoning the Arizona territory altogether. As Geronimo biographer Angie Debo notes, the fugitives, after a previous breakout, “killed everyone they encountered.”

So, taking advantage of a reciprocal pursuit treaty, Gen. George Crook, commander of the Department of Arizona, ordered his troops into Mexico to capture or kill Geronimo. Eventually, 5,000 troops — one-quarter of the entire U.S. Army — were deployed on the border and into Mexico in pursuit of Geronimo.

The 16-month campaign was the first of nearly a dozen strategic manhunts in U.S. military history in which forces were deployed abroad with the objective of killing or capturing one individual. Among those targeted were Pancho Villa, Che Guevara, Manuel Noriega and Saddam Hussein.

The original Geronimo campaign and the hunt for bin Laden share plenty of similarities. On May 3, 1886, more than a century before a $25 million reward was offered for information on bin Laden’s whereabouts, and almost 125 years to the day before the al-Qaeda leader’s death, the U.S. House of Representatives introduced a joint resolution “Authorizing the President to offer a reward of twenty-five thousand dollars for the killing or capture of Geronimo.”

In both operations, the United States deployed its most advanced technology. Whereas a vast array of satellite and airborne sensors was utilized in the search for bin Laden, Gen. Nelson Miles directed his commanders to erect heliograph stations on prominent mountain peaks, using sunlight and mirrors to transmit news of the hostiles. Neither system helped anyone actually catch sight of the man who was sought.

Small raiding forces proved more decisive than large troop formations in both cases. In 1886, Lt. Charles Gatewood was able to approach the 40 Apache warriors still at large with a party of just five — himself, two Apache scouts, an interpreter and a mule-packer. He convinced Geronimo and the renegades to surrender on Sept. 4, with a deftness that would have been impossible with 5,000 soldiers. Similarly, the United States could never have deployed the thousands of troops necessary to block all escape routes out of Tora Bora — the deployment of 3,000 troops three months later to Afghanistan’s Shahikot Valley in Operation Anaconda failed to prevent the escape of the targeted individuals from similar terrain — but a lightning strike by a few dozen commandos was successful.

Both campaigns also demonstrated the importance of human intelligence to manhunting. Gatewood was alerted to Geronimo’s location near Fronteras, Mexico, by a group of Mexican farmers tired of the threat of Apache raids, but he also needed the assistance of Apache scouts familiar with the terrain and with Geronimo’s warriors to close in on his quarry. So, too, according to administration officials, did the success in finding bin Laden depend upon the interrogation of his former confederates in al-Qaeda and upon the efforts of local agents in Pakistan to track the courier who led U.S. intelligence officers to the Abbottabad compound.

The parallels between Geronimo and bin Laden may extend to the strategic effect of the campaigns that targeted them. Geronimo’s surrender to U.S. forces at Skeleton Canyon was important and symbolic — but the effective end of Apache resistance to the settlement of the Southwest came through Gen. Miles’s cruel policy of exile, under which even those Chiricahua who resisted Geronimo and stayed on the reservation were sent to Florida, where many died because of the change in climate. Today, most terrorism analysts do not believe that bin Laden’s killing will end the struggle against al-Qaeda, noting that in the decade since he has gone to ground, affiliated groups such as al-Qaeda in the Arabian Peninsula have become an increasing threat to America’s safety.

Almost from the moment of his surrender, Geronimo began the transformation from monster to legend, participating in “Wild West” shows, marching at Theodore Roosevelt’s 1905 inauguration, and selling souvenir bows and arrows and autographed pictures of himself wherever he traveled. Bin Laden’s legacy, on the other hand, will always be defined by the 3,000 innocent lives lost on Sept. 11, 2001, and the thousands of innocent Muslims killed by extremists inspired by the ideology of “bin Ladenism.”

No linkage of their names, whether intentional or incidental, will change how they are remembered.

outlook@washpost.com

Benjamin Runkle, a veteran of Operation Iraqi Freedom and a former Defense Department and National Security Council official, is author of the forthcoming “Wanted Dead or Alive: Manhunts from Geronimo to bin Laden.”

Sunday, May 1, 2011

Why is royal propaganda/public diplomacy so successful?


Billions of people the world over watched on the tube the union of a balding, pushing-thirty, anonymous-looking, rather tall blond young man in uniform who's never had a real job with an uptight, somewhat icy-looking, overly weight-watching, upper-bourgeoisie merchant-Brit gal (also of a certain age), a kind of updated Margaret Thatcher (who is really a Brit male's vision of a real man; please, British friends, allow me, as a colonial, some kidding).

Well, at least they -- die couple (my reference is to the British monarchy's Germanic roots, not to Lady Di) -- both have good dentist(s). Most Americans have "an issue" with the British mouth, marked by its horrible teeth: Why can't they, the Brits, smile (show their oral stuff) properly, ask we, the ever-smiling, in the USA?

(An Englishman surmises that the reason Americans smile so much is that they suspect the stranger they see before them is carrying a gun. Or so says a wise American who has lived on the sea-walled island.)

A "socialized" British medical system doesn't take proper care of their citizens' molars?  Obamacare: take note. We don't want anyone in the US paying for our molars through our dollars.

But I digress.

Why should these two uber-bland, and I really do mean totally uninteresting, intellectually and even emotionally uninspiring not-so-young people, straight out of a US commercial for toothpaste, who could really be found in any American mall (or on any campus, what's the difference?), be able to grab the attention of our globe when he's in a Michael-Jackson/Gaddafi uniform and she's in a wedding gown no card-carrying women's-libber could afford (or, more to the point, even say she would want)?

Surely it's not solely because they have good dentists. No, think propaganda.

The British Empire is over. Like all arguably benevolent empires before it -- let's go back to Rome -- the British Empire ended when it overextended, i.e., tried to dominate the world without functioning at home.

When an empire is over, in my view, it becomes focused on what Joseph Nye, the admirable former Clinton administration Defense Department official who wrote a sexy novel, calls "soft power," a term which reminds me of the male organ after orgasm.

I constantly ask myself, without reaching a definitive answer, following events such as the royal wedding (and before it), what soft power actually is. My thought at this moment, with the USG essentially bankrupt while fighting three senseless wars, is that soft power is what fatigued empires do (mostly unintentionally), as they decline, to try to influence the world in their favor, without having the interest, energy, or money for it, with "public diplomacy" as their Viagra, a cheap pill to fix what they perceive as their global "communications" problems (instead of the failure of their own domestic/foreign policies).

Well, OK, Louis XIV did not take penile dysfunction "medicine," perhaps because French "culture" is more "serious" than Lady Gaga, but he was Mr. SP par excellence.

Let me wander a bit more if you will. Maybe, regarding the made-for-Tee-Vee royal wedding, the proper, "diplomatic" word for this event is marketing. We have two bland (brand?) Brits, with good teeth, and they are "royal" (or should I say, the lady is becoming royal, which is perfect for a television reality show).

Looking at them, so the PR guys/gals think, will lead the world into buying "British" products, including, allow me to suggest, Cadbury chocolate, not exactly good for your teeth. Of course, by "British products," we are often referring to international conglomerates selling their wares through a "British brand." Who owns Jaguar? The Indian company Tata Motors Ltd.

Give the handlers of the British wedding (how much did it cost at a time when most of us in the rest of the non-royal world are struggling to get a real job?) credit for their prop/PR savvy.

My main point: Savvy royal handlers doubtless realize that, while revolutionaries love propaganda as an agent for change in their favor, propaganda -- making people do what you want them to do without killing them (or, in past ages, eating them), since ultimately it cannot be to your advantage if you have no interlocutor -- is often most effective as a confirmation of tradition, or, in this particular somewhat vulgar royal wedding case (cake?), the packaging (at great cost to the British people themselves) of a largely invented past to influence people in Africa, Asia, Australia, New Zealand, and North America (and the U.K. itself), who were once under the Empire's arguably benign control, to buy Cadbury chocolate, now owned by Kraft Foods.