Thursday, January 18, 2018

Writers gonna write


TOM RACHMAN, The Times Literary Supplement
[Original article is marked by italics]

Image from article

Books on proper English do strangely well – perhaps because those who
buy books are those who mind about apostrophes. Such volumes are
supposed to improve your prose. But the true appeal seems to lie in a
tasty kind of misanthropy: the reader gets to froth over dimwits who say
things like, “She literally disappeared off the face of the earth”. But the
language scold always loses in the end. Those kids who, thirty years ago,
irked adults by saying “like” all the time are today saying “like” at board
meetings, on national broadcasts, and to their own teenagers.

Whether ours is a time of butchered English or of flourishing invention
isn’t obvious. Does online writing strip English of pomposity and
outmoded rules? When Emmy J. Favilla turned up for her job interview at
BuzzFeed five years ago, the media company already enjoyed notoriety as
the leading trawler of click-bait, filling its webpages with enticing posts
such as “Cat Enjoys Being Vacuumed”. The site had millions of views a
day, and not a single copy editor. At first, BuzzFeed had been a mere side
project for Jonah Peretti, one of the creators of the Huffington Post. His
passion was to understand how ideas and information spread online, and
this enterprise was his lab to track viral content. But he was onto
something bigger than that. Financing poured in, humans were hired to
oversee the algorithms, and he finally left the Huffington Post in 2011 to
dedicate himself to the listicle capital of the world.

Peretti had discerned that social media were becoming a dominant
reading source. He plotted the BuzzFeed expansion accordingly. “A big
part of that is scoops and exclusives and original content,” he told the
New York Times in 2012, “and it’s also about cute kittens in an
entertaining cultural context.” Peretti employed a much-respected
newshound, Ben Smith of Politico, as BuzzFeed Editor-in-Chief, and
more appointments followed, including that of its first copy editor,
Favilla. Almost at once, she had decisions to make, beginning with how
you abbreviate “macaroni and cheese”. Her ruling: “mac ’n’ cheese”,
which she deemed cuter than “mac & cheese”. With that, she was off. In a
mere two months, Favilla had drafted an entire style guide, thousands of
words on preferred spellings and the like. When BuzzFeed posted a
version online in 2014, old-school media sources treated the occasion as a
milestone: the internet was growing up. Or, at least, someone could
finally pronounce on whether you should say “de-friend” or “unfriend”
(it’s the latter). All the attention surprised Favilla, and prompted her
manifesto, A World Without “Whom”: The essential guide to language in
the BuzzFeed age.

As if to immunize herself against criticism, she begins by announcing her
paucity of qualifications; she is neither a lexicographer nor an expert in
linguistics. Previously, she worked at Teen Vogue. “I am constantly
looking up words for fear of using them incorrectly and everyone in my
office and my life discovering that I am a fraud”, she says. But despite the
tone of chirpy self-satire, what follows is a small revolution. “Today
everyone is a writer – a bad, unedited, unapologetic writer”, she says.
“There’s no hiding our collective incompetence anymore.” Unlike the
language scolds of yore, Favilla embraces the new ways, punctuating her
writing with emoji, inserting screen-grabs of instant messages, using
texting shortcuts such as “amirite?” Hers is a rule book with fewer rules
than orders to ignore them. Humans are gushing out words at such a pace,
they can’t be expected to bother with grammar, she says. More important
is to be entertaining, on trend, popular (neatly matching the corporate
goals of BuzzFeed). “It’s often more personal and more plain-languagey,
and so it resonates immediately and more widely.”

Many of her judgements will chill traditionalists. She delights in the use
of “literally” to mean its opposite. As her book title declares, she’d
abolish “whom”, given how few people use it correctly. Other matters that
have long rattled copy-editors don’t concern her: variations in spelling,
comma precision, full stops in acronyms. Often, when pondering a style
ruling, she offers no firm guidance, as if mistrusting authority to such a
degree that she can’t grant it even to herself, the author of “the essential
guide to language”.

“Use your judgment, friends”, she says. And: “Don’t sweat it too much”.
And: “In the end, who cares?” Her chapter “Getting Things as Right as
You Can: The stuff that kinda-sorta matters” features an instant-message
exchange in which she corrects a BuzzFeed colleague on a style point.
When questioned, Favilla lifts the rule, then admits to being drunk – “so
whatever”. In this merry free-for-all, her scorn is reserved for those who
scorn. A person who resists current usage is “stodgy and miserable and
irrelevant”, prefers “a stagnant, miserable world”, and will be “sitting
motionless in a puddle of his own tears”. She claims to want only to
describe language, not prescribe its correct use. But her preference is
clear, to raze what she deems pedantic and elevate the verbal etiquette of
millennials. At times, she sounds like an activist: “We’ve come a long
way, but we’ve still got some work left to do”.

Immersion in memes makes Favilla a handy guide for the perplexed – by
which I mean people old enough to remember the twentieth century.
However, she seems unsure where to pitch her book, leery of appearing
uncool to her peers but needing to address those miserable geriatrics who
somehow missed out on “cash me ousside, howbow dah”. (This line was
spoken by a thirteen-year-old girl on the daytime television talk show Dr.
Phil as a threat to a derisive studio audience: “Catch me outside – how
about that?” A clip of her saying those words became a viral hit, watched
more than 100 million times on YouTube. The girl, Danielle Bregoli, is
now a celebrity.) Another meme Favilla explains for the web-blind is
“Doge-speak”: a photo of a dog is superimposed onto other illustrations,
then overlaid with phrases in broken English, as if to reflect the inane
thoughts of the animal. In one, the dog appears at the Last Supper,
thinking, “Such delicious” and “wow”.

BuzzFeed and its rivals dine on this sort of material, which is intended to
be silly, often ironically. Fixing grammar in slapstick would be absurd, so
Favilla’s practical rule for editing is this: “I ask myself, How would I
write this in an email to a friend, or in a Facebook status?” What Favilla
circles around is a striking proposal: eliminate formal English. If
professional writing should read like an online message, and messaging is
akin to conversation, there’s only one register. “Repeat after me”, she
commands. “If we speak that way, it’s okay to write that way.”

But how to write sarcasm? Social-messaging fever has exposed a
weakness in English. We can write facts and transcribe speech. But to
convey the subtle emotions of talking – this is vexingly hard. The
avalanche of online exchanges amounts to a crowd laboratory, where
hundreds of millions seek to get their meaning across to prospective
dates, business contacts, gaming rivals. A few of the tweaks – “idk” (I
don’t know) or “fwiw” (for what it’s worth) – are shortcuts for weary
digits. But other abbreviations – “jk” (just kidding) or “lmfao” (laughing
my fucking ass off) – emote where the writing hasn’t. Punctuation too is
transformed, no longer there just to organize sentences. A full stop is
redundant in texting, so the concluding full stop online becomes a drill-hole
of hostility. Alternatively, full stops mimic emphatic oration: “You.
Are. Insane”.

New punctuation has arisen, such as a tilde on either side of a phrase to
saturate the bracketed remark in irony. Also, messages regularly include
GIFs to elaborate on the writing, so that someone might message,
“omg!!!” and insert a looped video of a cute child slapping his cheeks in
amazement. If you can’t find a fitting GIF, you may insert an asterisk on
either side of a phrase to evoke a video clip, as Favilla does in the book,
approving of one dictionary’s ruling with the line “*twirls out of the room
and into a party full of double-duty words wearing a skirt made of
shredded dictionary pages*”. But her favourite new tool is the emoji,
which she calls “the most evolved form of punctuation we have at our
disposal”. “I mean, what a time to be alive, seriously”, she writes.

Favilla turns grave only when considering offensive speech. “Language
has the impressive ability to craft social construct, and if the result is
negative, then we learn and we listen and we phrase things better the next
time.” She opposes the term “sex change operation” in favour of “gender
affirmation surgery”. She mentions her copydesk’s attempt to find “a non-gendered
term for pads, tampons, and menstrual cups”. She endorses the
gender-neutral pronouns “ze”, “zir”, “hir”, “xe”, “xem” and “xyrs”. She
also frets – sincerely, I think – about not stereotyping dogs, calling this a
“crucial matter to address”.

Yes, these are the kind of worries that delight right-wing firebrands. In her
defence, what can seem like trendy quibbling may be a drive for accuracy.
And to stand at the vanguard of language change always earns you
contempt. It is when revolt sounds like adolescent rebellion that Favilla
harms her case; she tells the reader, “Don’t be a hater just because you’re
old and uncool”.

Whereas Favilla is a digital native resolutely of the twenty-first century,
eighty-nine-year-old Harold Evans is an inky newspaperman of the
twentieth, growling in the on-deadline voice of a chap accustomed to
being in charge. Even the title of his book on writing, Do I Make Myself
Clear?, has an irascible twang.

Favilla seeks to deflect criticism with self-deprecation, but Evans has no
such compunction. He is a man with accomplishments to be proud of, and
he is proud of them. “A fair question – I am glad you asked – is what do I
bring to the picnic? The short answer is that I have spent my life editing
thousands of writers, from the urgent files of reporters on the front lines
to the complex thought processes of Henry Kissinger in his memoirs and
history of China.” Evans was reporting at sixteen. He wrote a journalism
manual forty years ago. He remembers typewriters and hot lead, back
when there was “no meandering in cyberspace”. When Favilla was in
diapers, Evans was Editor of The Times. “It was the pinnacle of the
profession”, he says modestly.

Evans too has a case to make about language in the Digital Age, one far
less jubilant. Ghastly writing is the reason people assert more today and
reason less, he contends, citing muddled wording among the causes of
terrorism, the financial crisis and the struggles of Obamacare. Climate-change
deniers, he points out, conceal lobbying organizations with
misleading titles. Meanwhile, the social media beloved by Favilla have
helped to propagate fake news.

Regrettably, Evans fails to delve much deeper into the problem than by
sniffing out excessive word counts and clichés – hardly a satisfying
explanation for what ails the common tongue. He doesn’t help himself
with cursory attempts to sound tech-savvy. Overwhelmingly, his sources
are the legacy media: the New York Times, the Wall Street Journal, the
Washington Post, the Associated Press, ABC News, CNN. Sometimes he
comes off as a snorty oldster of the kind Favilla derides, as when he
digresses about his gratitude “when finally I track down someone who
sorts out the problems with Apple’s obsession with passwords”.

Evans may err in his prescription, but is correct to diagnose trouble.
Public opinion is frighteningly confused today, with many citizens
opposing what they support. They’re for health care, but against the
policy providing it. Bewilderment also warps discussion of gun control
and Brexit and global warming, leaving those without scruples to spin,
while earnest news sources mount their factual cases – and are snubbed.
Manipulative language has been around as long as public debate. But
today’s lies linger because the internet has scuttled credibility, placing
heaps of alluring junk beside small piles of dry honesty.

BuzzFeed, to its credit, has invested in serious journalism. The company’s
growth during Favilla’s time has been staggering. She recalls about 150
people on the payroll when she started in 2012. Today, it boasts 1,500
employees in eighteen offices from New York and Sydney to London and
Mumbai, with an entertainment studio in Los Angeles. (BuzzFeed
recently announced a round of layoffs, after reportedly missing its
revenue target for the year. Still, the company brought in more than $250
million in 2017, according to the Wall Street Journal.) Around the time of
her arrival, BuzzFeed generated 100 million views a month; three years
later, it was 5 billion. Amid this boom, the Editor-in-Chief, Ben Smith,
expanded news coverage, built a team of dogged reporters, conducted an
exclusive interview with President Obama in 2015. He gained attention –
and disdain in some quarters – for posting a dossier of salacious,
unverified claims about President Trump. His staff were also producing
undeniable scoops, such as a piece revealing that United States funding to
Afghanistan had been diverted to schools “that have never seen a single
student”.

In the meantime, one of BuzzFeed’s most successful ventures was the
live-streaming of two employees placing rubber bands around a
watermelon until it exploded. At one point, 800,000 people were
watching that live on Facebook. Comments from the public included,
“what am i doing with my life”.

I can’t refute Favilla’s fundamental claim. Her populist, who-cares
approach probably will prevail, and only a fool would play the scold.
After all, outrage is just the feeling of losing control, right? Yet there is
something dispiriting about A World Without ‘Whom’. Her credo of
triumphal sloppiness retains so little of what is inspiring about writing:
precision, brilliance of intent. She cites George Orwell’s unmatched essay
on writing, “Politics and the English Language”, yet applies little from it.
Instead, she chokes out sentences such as: “So, sure, the ratio of
prescriptivism-leaning to descriptivism-leaning molecules just hanging
out in your DNA, waiting to pounce like a lion stalking its prey on the
writer who asks if languagey is a ‘real’ word, fluctuates across the human
population as much as our propensity for doing household chores does”.

Goofball giggles are a treat, but not the summit of language – and
language remains our best vessel for complex ideas. Failure to master our
tongues, to allow others to direct them hither and thither – that is what
Orwell was warning against.

Consider the catchphrases that ooze with cynicism and passivity – “It is
what it is” or “Whatever” or “You do you”. Or “Haters gonna hate”,
which means nothing much except that, if someone opposes your view,
you should write them off. Don’t argue, simply shut them out, hold to
your comfy in-group, close your eyes, plug your ears, and hum loudly till
they’ve left. But first, take a peep at your news feed. There’s another
meme waiting. And it. Is. Hilarious.

How Sex Trumped Race - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


Ross Douthat JAN. 17, 2018, New York Times
(Original article contains links.)

Image from article, with caption: A demonstrator at the Women's March in Washington last year

Suppose that you were asked to assess the state of American society under Donald
Trump, the essence of our problems and divisions [JB emphasis], without any access to the
president’s own words or the media coverage thereof. Suppose, instead, that you had
to cobble together your assessment based only on the way the electorate and the
culture have responded to his ascent and presidency — by looking at the changes
wrought in our partisan landscape, the new sociological and political fissures that
have opened, and the protests and mass movements, social trends and cultural
expressions that have defined his strange first year in office.

I suspect this exercise might lead to the conclusion that both race and class, the
two tangled areas that so many commentators — myself included — have written
about endlessly for the last two years, are less important to our moment than the
scale of the media attention paid to them suggests, and that divisions and anxieties
around sex and gender are where the essential cultural action of the Trump era really
lies.

This possibility seems like a deliberate provocation in a week when Trump’s
outburst about countries that resemble outhouses has made the president’s racism a
headline topic for the umpteenth time. And I’m not denying the reality of that
racism, which has been apparent since Trump embraced birtherism and which
plainly informs his views on immigration more than the commitment to merit-based
migration policy that his minders and managers have been trying to advance.

But the same week that reminded us of Trump’s bigotry also brought a striking
analysis of how Americans are reacting to his presidency, generated by a large and
deep survey conducted by Survey Monkey and written up for The Atlantic by Ron
Brownstein. To some extent the survey shows what smaller polls have also shown:
Trump’s coalition depends on working-class whites, evangelicals and older white
men; he’s opposed by minorities and women and the young; and he has lost ground
just about everywhere since his election last November.

But the way he’s lost ground is interesting. The press coverage often makes it
seem as if Trump is using racial provocations to hold his white blue-collar base while
assuming the minorities will never vote for him, and indeed I suspect that — to the
extent his provocations have any cunning behind them — he may think about them
in those terms. Yet racial polarization in the electorate hasn’t actually increased over
the last year: Relative to where he stood last November, Trump has lost white
support, including working-class-white support, while either holding his own or
actually gaining ground with blacks and Hispanics.

His lost support has been heavily concentrated among the female of the species:
“From February through December,” writes Brownstein, “Trump’s approval rating
fell more with middle-aged blue-collar white women than any other group.”
Meanwhile among minorities he’s made gains or held his own by appealing primarily
to men, while remaining extraordinarily unpopular with black and Hispanic women.
“In every age group,” Brownstein notes, “and at every level of education, about twice
as many African-American men as women gave Trump positive marks,” and “among
Hispanic men older than 50, Trump’s approval — strikingly — exceeded 40 percent.”

Relative to where American politics stood before his rise, Trump’s campaign
polarized America more by class and gender than it did by race. And then, by
jettisoning much of the populist economic agenda he campaigned on, Trump’s actual
presidency has made class less important and gender more essential to
understanding how Americans divide.

This doesn’t mean that race isn’t enduringly important to these divisions; the
fact that a minority of minority men seem more blasé about his bigotry than you
might expect does not mean that Trump is actually building a pan-racial coalition.
But if you’re looking at what Trump has directly changed, who seems distinctively
offended and energized by his provocations, white-brown-black differences aren’t
where the action is; instead, it’s with the large female backlash that may be poised to
swamp the male backlash that helped make him president.

The last year has offered ample confirmation of this point. The heart of the anti-Trump
resistance movement is middle-aged white women, while the Black Lives
Matter movement has receded despite Trump’s own attempts to elevate it via his
campaign against Colin Kaepernick. In terms of the numbers involved, the white
nationalist-antifa collisions have been sideshows compared with the Women’s March
and its various imitators. And in the culture, the clearest Trump-driven convulsion
has been the #MeToo movement, which intersects with race and class but is
fundamentally about the relations between the sexes.

There are various conclusions one could draw from this reality. Someone
focused on building anti-Trump solidarity might argue that one defining effect of
Trump’s rise has been to make more white women feel the sense of marginalization
and disempowerment that minorities already feel — which is why the female
reaction is so much more notable than the reaction among groups more inured to
prejudice.

Alternatively, someone focused on the primacy of race might complain that
white women have effectively hijacked anti-Trumpism, using their positions of
influence in the media and elsewhere to turn what should be a “Get Out!” moment
into a “Handmaid’s Tale” moment, depriving Black Lives Matter, immigrants’ rights
groups and other like-minded movements of media oxygen in order to focus on their
own more intimate sufferings at the hands of Trump-like elite men.

My own suggestion would be that the surprising gender-over-race dynamic
might also reflect some underappreciated social shifts that could modestly
depolarize racial issues even as the war over the sexes gets a little worse.

In particular, on two of the issues that drove racial polarization in the late
Obama years — the justice system’s seeming racial bias, which spurred so much
minority activism, and elite support for ever-increasing immigration, which spurred
populist backlash on the right — the underlying numbers have actually been moving
in the direction desired by both sides’ activists.

Mass incarceration isn’t just in retreat (with prison populations falling 13
percent from their 2007-08 peak), it’s retreated in a very race-specific way:
Imprisonment rates for black men plunged by 24 percent in the 2000s even as the white
imprisonment rate slightly rose. Meanwhile, the immigration rate, legal and illegal,
has also fallen quite dramatically since 2005. Neither issue is about to disappear, but
it’s still notable that trends feeding black disillusionment and white-identity politics
were improving in the years leading up to Trump … even as trends related to sex,
marriage and family continued to show a growing social divide between the
sexes, with fewer marriages, fewer children and less sex all around. If “less sex”
just means “less for Harvey Weinstein,” of course, that’s good news for
everyone else. But what’s being exposed in the Trump era is more than just
a few pigs and their crimes. Something is badly out of joint with male-female
relations, our ability to woo and be wooed, our capacity to successfully
and happily pair off.

It may be too much to hope that recent racial polarization has been driven by
trends that are destined to improve. (We don’t know, for instance, what’s happening
with the crime rate after the late-Obama-era spike.) But at the very least our race
problems might not, the presidency’s bigotry notwithstanding, be necessarily getting
worse. Even Trump’s recent “what, me, racist?” tweet noting an all-time low in the
black unemployment rate was not wrong: These are the best economic times for
African-Americans in a decade.

But there is strong evidence that our problems with sex and gender and male-female
relations are worsening — which is why it’s understandable that they’re at the
heart of how the country has reacted to the Trump presidency, and fitting that this
year of public protests and intimate revelations has thrown them into sharp relief.

Monday, January 15, 2018

In Trump’s Remarks, Black Churches See a Nation Backsliding: Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


By SABRINA TAVERNISE, New York Times

image (not from article) from
Excerpt:
On the day before Martin Luther King’s Birthday, African-American churchgoers gathered as they always do, to pray, give thanks and reflect on the state of race in America. But after a disheartening week and an even more disheartening year, black Americans interviewed on Sunday said they were struggling to comprehend what was happening in a country that so recently had an African-American president. ...
“Donald Trump is America’s id,” said the Rev. William H. Lamar IV, pastor of the 180-year-old Metropolitan African Methodist Episcopal Church, which is five blocks from the White House. “He is as American as baseball and apple pie.” ...
Pastor Lamar agreed that there was a problem with the American story. “The narrative that held America together has been fractured,” he said. “The ground is shifting underneath us. You have to tell a truthful story about how America got to where it is. The factories are not gone because of immigration.” ...

Saturday, January 13, 2018

The presidential verbal tic "like"


image from

The President of the United States has made "like" linguistically respectable (sorry, I meant "likeable").

"Like" as a verbal tic, used especially by young people.

Until now.

Until now: thanks to President Trump, "like" as a verbal tic is now part of the USA gov "official" lingo.

An updated government statement?: "Like, of the People, by the People, and for the People."

Well, ok, I like Ike ...
image from

Friday, January 12, 2018

Why am I interested in Russia? - a personal note ...


Why am I interested in Russia? Because of the multi-ethnic, semi-"European" space (far too vast for "Europe") that it occupies. See also.

image from

So:  Why am I interested in Russia? Because Russia reminds me, in some ways, of my very own beloved but recently misnamed [?] homeland :)

Monday, January 8, 2018

Why Would the President of the United States, Like, Tweet This Way?


Ben Zimmer, theatlantic.com; see also (1) (2)

image from

Donald Trump triggered yet another round of furious Twitterology this weekend when, in the midst of a tweetstorm defending himself against Michael Wolff’s blockbuster book, Fire and Fury, Trump declared that “throughout my life, my two greatest assets have been mental stability and being, like, really smart.”

That, plus the follow-up that he is in fact a “very stable genius,” sent the Twitterverse into a tizzy. And just like the December tweet from the @realDonaldTrump account stating that Michael Flynn “pled guilty” to lying to the FBI, many observers picked up on the use of a single word. Last time, that word was pled (which I wrote about here and here), but this time it was another four-letter item: like, set off with commas.

The use of pled in last month’s tweet unleashed speculation about its true authorship—Trump’s lawyer John Dowd ended up taking the blame for a statement that seemed to add an incriminating bit of evidence to an obstruction-of-justice case. This time around, no one seems to deny that the sentiment is Trumpian. Even the machines agree: the site Did Trump Tweet It? puts the machine-learning probability that Trump wrote the tweet at over 99 percent. But those fussy commas around the like suggested to many that Trump must have dictated the tweet—perhaps to his communications director Hope Hicks, who is painted in Wolff’s book as “the ultimate facilitator of unmediated behavior,” on Twitter and elsewhere.

Unfortunately, the forensic linguists whom I spoke to in December haven’t yet worked out a model to help determine when Trump might be dictating his tweets and when they come from his very own thumbs. But his use of the word like is still worthy of some armchair analysis.

The commas around like are an anomaly in the president’s tweeting history: Of the tens of thousands of tweets in the Trump Twitter archive, this is the first time like has appeared this way, except for in two manual retweets. One was from a follower in 2013 who, upon hearing that Trump was speaking at the CPAC conference, exclaimed, “This conference just became, like, a hundred times more awesome!” The other, from 2014, was from Peter King of Sports Illustrated, who tweeted, “Why do baseball players slide headfirst? Are they just, like, not smart?” (Trump thought that was a “great point.”)

But while tweeting-Trump may never have used like in this way before yesterday, speaking-Trump uses it all the time—at least when he’s touting his own intelligence. Back on July 11, 2015, less than a month after Trump declared that he was running for president, he told a rally in Phoenix, Arizona, “I’m, like, a really smart person.” (That inspired MSNBC’s Rachel Maddow to run a segment with “Like, a Really Smart Person!” displayed on the screen behind her.) In December 2015, he told CNN’s Don Lemon, “But I’m, like, a really smart person. You know, I went to Ivy League schools.”

He continued the “like, smart” theme all through 2016. “I’m conservative, folks, but I’m also, like, smart,” he said at an Arizona rally in March excoriating Jeb Bush. “I’m, like, a really smart person, like a lot of you people,” he said in Connecticut in April, before explaining that “it’s very easy to be presidential” if he wants to be. And in December, responding to a question from Chris Wallace on Fox News Sunday about why as president-elect Trump was eschewing the daily intelligence briefing, he said, “you know, I’m, like, a smart person.” That led Seth Meyers to crack on his late-night show that “‘I’m, like, a smart person’ is a sentence that disproves itself. It’s like getting a back tattoo that says ‘I make good decisions.’” Despite the ribbing, Trump doubled down when he spoke at the CIA headquarters on his first official day in office, saying, “Trust me, I’m, like, a smart person.”


So this is clearly a thing for Trump, even if it’s only now carrying over to his Twitter feed. Why, we may ask, does this self-declared “very stable genius” keep doing it?

First, let’s consider the role of like in “I’m, like, really smart.” The word like serves a number of functions in casual American speech, all of which tend to be stigmatized—just as Seth Meyers implied—as not exactly sounding “smart.” Alexandra D’Arcy, a sociolinguist at the University of Victoria, takes issue with many of the common characterizations of like that have been attached to it ever since it became associated with beatnik types (think Bob Denver as Maynard G. Krebs on the old Dobie Gillis show), and then with hippies, stoners, and surfers. Later, it was taken as typical of young women, particularly fitting the “Valley Girl” stereotype. (Indeed, many commenters on Trump’s tweet said his “like” made him sound like a Valley Girl.)

In a new book, D’Arcy lays out four different flavors of modern “like.” There’s the quotative use, as in, “I was like, ‘No way!’” There’s the approximative use, as in, “It’s like thirty degrees below out there.” There’s the discourse marker, used to connect clauses as a kind of discursive glue, as in, “… like, you know what I’m saying?” And finally there’s the discourse particle, which can get stuck into the middle of a clause. That last one is what Trump is doing when he inserts like before really smart or a really smart person.

As D’Arcy explains, when like is used as a discourse particle, it can serve a range of communicative purposes, even if it can’t be assigned a concrete definition. It can draw focus to a topic of discourse, indicating to those listening that they should pay attention to what comes next. It can also be used as a kind of hedge—or as the linguist Lawrence Schourup puts it, “like” can express “a possible unspecified minor nonequivalence of what is said and what is meant.” (Interestingly, despite the “Valley Girl” stereotype, D’Arcy finds that men actually use like as a discourse particle slightly more often than women.)

So why does Trump continue to use this hedging device, in his running self-evaluation of his own “smartness”? Of the various commenters expounding on Trump’s tweet yesterday, I think Katie Rosman of The New York Times did the best job of explaining the like. The word “hedges the claim before it, adds an element of ‘I know it’s immodesty but I’m not going to be afraid to say it,’” Rosman tweeted, adding that Trump “has never shown such concern of seeming immodest.” I’m reminded of how Trump can preface a boastful statement by saying, “I will tell you this in a nonbraggadocious way…” as he did when promoting the GOP tax plan last November.

Moreover,  the like allows Trump to have his cake and eat it too: He can brag about his “very good brain” and his Ivy League education without coming off as intellectual, exemplifying what Jonathan Chait has called his “oddly snobbish anti-intellectualism.” Because the discourse particle doesn’t “sound smart,” he evidently thinks it disarms the ludicrous self-puffery of someone incessantly crowing about his own intelligence. He may not have a Valley Girl stereotype in mind—given Trump’s age, Maynard G. Krebs would be a more likely model—but he seems to treat like as something humorously not smart that makes his proclamations of smartness somehow more palatable. At least that’s how it must sound in the mind of a “very stable genius.”

Sunday, January 7, 2018

How Trump Is Making Us Rethink American Exceptionalism


Joshua Zeitz, Politico [Original article contains additional links.]

image from article

This past year has shown that the U.S. is far from immune to the forces shaping the rest of the world.


This article is the second in a series on how President Donald Trump changed history—reviving historical debates that have simmered on low heat for years, and altering how historians think about them. See the first in the series here.

Americans have always thought their country was exceptional. They thought it even as early as 1630, when John Winthrop delivered a now-famous sermon in which he called the Puritan community a “city on a hill”—long before there even was an American country.

In more recent years, the idea of American exceptionalism has become tainted by politics—a rhetorical cudgel that politicians, particularly conservatives, wield to bludgeon their opponents. During President Barack Obama’s tenure, Republican leaders expressed concern that, in Newt Gingrich’s words, there was “a determined group of radicals in the United States who outright oppose American Exceptionalism.” Mitt Romney claimed that Obama didn’t “have the same feeling about American exceptionalism that we do.” Former New York Mayor Rudolph Giuliani went a step further. “I do not believe that the president loves America,” he declared. Unlike his predecessors, Obama didn’t seem to appreciate “what an exceptional country we are.” Obama ultimately felt compelled to correct the record. On July 4, 2012, he paid tribute to a group of newly naturalized citizens, celebrating their diversity and service to country as “one of the reasons that America is exceptional.”

It’s unusual that the Republican Party’s most recent standard-bearer, President Donald Trump, has disavowed the very idea of “American exceptionalism.” “I don’t think it’s a very nice term,” he said. “I think you’re insulting the world.” But that doesn’t mean that Trump has chucked this dearly held principle. When most conservative politicians invoke the term “exceptionalism” they use it as shorthand for raw national chauvinism—the assertion that the United States is not just different, but better. Trump has replaced it, at least temporarily, with an angrier tag line that conveys the same sense of national power and entitlement—America First, itself a term ripped from history and freighted with dark meaning. When America is first, it owes little to everyone else. It’s a more Trumpian way of saying what other politicians often mean.

When they use the term “exceptional” to connote pure superiority, though, politicians generally betray a facile grasp of history. In its original formation, American exceptionalism was a much more complicated theory. It conveyed the idea that the United States was immune from social, political and economic forces that governed other countries—specifically, that it was invulnerable to communism and fascism, and to violent political convulsions of the sort that jolted Europe throughout the long 19th and 20th centuries. It also implied that Americans bore a providential obligation to be exemplars of virtue in a sinful world.

Exceptionalism was for many decades a hotly contested topic among historians and social scientists. Could arbitrary borders really render an entire country exempt from broader social, economic and political forces, particularly in an age when these borders became more porous to the movement of capital and labor? Or did patterns of political development in fact create unique forms of national “character”?

In more recent years, the debate cooled. While some political scientists continued to explore potential variants of American exceptionalism, most historians concluded that the idea was meaningless and the very conversation itself stale.

Then came Trump.

His election and the conditions that accompanied it—a growing rejection of science and evidentiary fact, extreme political tribalism [JB emphasis], the rise of conservative nationalist movements around the world, a popular reaction to immigration and free trade—may offer final and conclusive proof that there is nothing at all exceptional about the United States. We are fully susceptible to the same forces, good and ill, that drive politics around the globe.

But before we sound a death knell for the idea, it would help to remember what it actually means.

***

Ironically, the phrase that so many conservatives traditionally claimed as their own first entered the popular lexicon in 1929, when Josef Stalin censured what he called the “heresy of American exceptionalism.” At issue was the insistence of American communists, under the leadership of Jay Lovestone, that their country’s economy was developing on a timeline divergent from that of Europe, thus necessitating an intermediary period before outright revolution.

This seemingly benign idea set American communists on a collision course with Moscow. Marxist orthodoxy held that the laws of political economy were universal and immutable. History—in this case, the various stages of capitalist development—operated the same way everywhere. No one country was immune to its universal principles. The Soviet leadership purged Lovestone and his supporters and replaced them with more conforming enthusiasts.

If the phrase was new, the underlying sentiment—that America was somehow different, or special—was not. Since the earliest days of colonial settlement, the Puritan settler Winthrop conceived of America as a “city on a hill,” a distinct place with a heaven-sent obligation to build a new and pure world. In the aftermath of the War for Independence, many citizens agreed with William Findley, a farmer from western Pennsylvania who would later serve in Congress, that Americans had “formed a character peculiar to themselves, and in some respects distinct from that of other nations.” During his travels across the young country in 1831 and 1832, the French statesman Alexis de Tocqueville concluded that “the position of America is therefore quite exceptional,” for indeed, many of the people whom he consulted believed, in the words of one of his Boston informants, that “there are no precedents for our history.”

There were two intertwining strains of this popular but still nameless idea. The first was rooted in Puritan theology and held that America was divinely selected for greatness and mission. The second presumed that the United States was unique in the character of its people, economy and politics.

With the advent of modern universities in the late 19th century, scholars attempted to explain this historical uniqueness. Historian Frederick Jackson Turner offered the most lasting theory in 1893, with his essay, “The Significance of the Frontier in American History.” By Turner’s reckoning, the American frontier had been a great leveler where pioneers shed all semblance of status and heritage and worked together to tame a fierce wilderness. It was a cauldron of democracy—the “gate of escape from the bondage of the past.”

Turner’s thesis created more despair than pride. While the country was in many ways brimming with confidence—“the greatest destiny the world ever knew was ours,” said future Secretary of State John Hay in a typical display of national chauvinism—alongside this spirit of optimism ran a sense of parallel dread. In 1890, the U.S. census declared the western frontier officially “closed.” If the frontier had been the greatest source of America’s unique political economy, now, with no more territory to conquer and “civilize,” the country’s raw ambitions and talent would surely dissipate.

The essay struck a resonant chord, for already many people saw signs of weakness in the culture. As more men worked as clerks and professionals, they let atrophy the muscle, brawn and sheer courage that it took to break the land. Even manual laborers were now more likely to be employees of other men, rather than self-sufficient yeoman farmers or shop owners, whom earlier generations of Americans regarded as the foundation of the republic. Worse still, millions of new immigrants from Southern and Eastern Europe threatened to dilute the heritage that many old-stock Americans viewed as central to the nation’s past success. The United States was losing its edge, or so people worried. It would not be the last time that concerns about “exceptionalism” ran parallel to doubts about the country’s future prospects.

The frontier thesis was just one of many attempts to identify and explain points of distinction. Many historians and political scientists sought to answer a question that Werner Sombart, the German sociologist, posed in 1906: “Why is there no Socialism in America?” It seemed to take root everywhere else, but not in the United States. Was it America’s ethnic and linguistic diversity? The inability to forge a working-class identity that transcended race? A surfeit of open land to absorb the working class? These possibilities animated social scientists then, and to a large degree, even now.

Though exceptionalism claimed deep roots in American tradition, the concept truly came into its own in the 1950s, as a generation of historians, writing in the wake of World War II, sought to make sense of why their country, alone, had escaped the violent disruptions that beset Europe over the previous 150 years: revolution, regicide, class uprising, total war and genocide. None of it had happened in the United States, these historians noted—provided one conveniently glossed over the violently repressive regimes of chattel slavery, Redemption, war on Indian nations, and Jim Crow, which, of course, most historians writing in these years blithely did.

Some, like David Potter, believed that material abundance freed America from the economics of scarcity that drove the trajectory of modern political development in Europe. It created an environment in which America’s “distinctive national character” could take root and grow. Others, like Louis Hartz, proposed that the absence of a feudal heritage set Americans on a uniquely tranquil path. Drawing on Tocqueville—who observed more than a century earlier that the “great advantage of Americans is that they have arrived at a state of democracy without having to endure a democratic revolution; and that they are born equal, instead of becoming so”—Hartz posited a longstanding “liberal tradition” common to the vast majority of Americans. His theory fit neatly inside the mid-century “consensus” school of history, which held that despite tactical differences over economic and political questions, most political actors in American history—Federalists and Republican-Democrats, Whigs and Democrats, Democrats and Republicans—shared a broad common belief system that rejected doctrinal extremes of the far right and far left.

The common thread binding these diverse interpretations was a belief that the rest of the world (and certainly Europe)—but not the United States—operated according to a singular collection of economic and historical rules.

The idea of American exceptionalism fell into intellectual disrepute by the early 1970s. After a decade of political turbulence over civil rights and Vietnam—police violence against peaceful black protesters, urban riots, political assassinations, pitched anti-war protests, acts of political terrorism by left-wing extremists—it seemed hardly certain that the country was any less fractured and unstable than its European cousins. Exceptionalism no longer seemed like a sure thing.

It wasn’t just external events that inspired a reconsideration. As successive generations of scholars grew interested in such phenomena as transnational immigration and borderlands conflict—not just between Mexico and the U.S., but between nation-states far and wide—it became clearer that American history is inextricable from global events. The field of comparative history also inspired researchers to recognize that America may be different from other countries, but it is not governed by a unique set of rules regarding state-building, economics, religion, ideology and even socialism. Nor are all other countries driven by the exact same global forces. All countries experience some degree of differentiated political and economic development, and, as Princeton University historian Daniel Rodgers observed, to ask, “Is America different?” is no more instructive a question than “‘Is Argentina different?’ Or Afghanistan.”

In short, many historians today argue that it was always folly to assume that invisible forces drove all of global history, but greater folly, still, to assert that those forces were somehow inoperable within America’s ever-changing borders and among its perpetually evolving citizenry.

***

If you still need convincing, look no further than recent events. It’s become axiomatic that Trump’s victory was of a piece with populist insurgencies as far and wide as France and the Netherlands, Britain and Greece. Driven by backlash against open borders, free trade and common markets, they share an angry ethno-nationalism, a rejection of elite institutions and actors, and a wispy nostalgia for an imaginary past. These movements reject sexual and cultural pluralism and, in some cases, claim a common patron in Moscow.

Nor is it clear that America’s democratic norms and institutions are fundamentally any more rock solid than those of, say, Poland, Hungary or Venezuela. As bad as? No. But any more unassailable? To observe Republican North Carolina legislators' attempt to abrogate the results of a gubernatorial election—to say nothing of GOP congressional leaders Paul Ryan and Mitch McConnell, who have all but abandoned the legislative committee process in their feverish effort to cram the courts with conservative judges and pass deeply unpopular class legislation by cover of night—one can’t help but worry that the United States has slipped into what sociologist Larry Diamond calls a “democratic recession.” It’s a global, not a uniquely American, phenomenon.

If we’ve learned anything in recent months, it’s that the United States isn’t immune to the forces of history. It is, on the contrary, swept up in them. That fact set seems the very repudiation of American exceptionalism, if we’re to use the term correctly.

Or is it?

In a slim but prescient volume published over 10 years ago, historian Eric Rauchway examined an earlier era in which “globalization”—the worldwide movement of capital, labor, information and ideas—generated political and cultural populist backlash in the United States. From the mid-19th century through World War I, America absorbed tens of millions of immigrants and trillions of dollars in foreign investment capital (in current-day money), and launched a massive colonization program on par with those of European nations like Russia, England and France, though, in the case of the United States, colonization took the form of western expansion on American soil. We were, in effect, deeply caught up in global currents.

By his own admission, Rauchway neither “cheers nor jeers” for the concept of American exceptionalism (“nor am I even especially interested in it,” he continued). Instead, he wanted to explain “discernible degrees of difference in the impact of world systems and the extent to which they appear to have mattered in American national development.” In this endeavor, he did detect much about the United States that was different.

Whereas other countries raised large armies and state bureaucracies to subdue and govern colonial land and people, the United States didn’t have to, so didn’t. These other world powers drained their treasuries to fund increasingly influential state institutions that worked to draw indigenous populations into a permanent but subordinate relationship; the United States, by contrast, maintained a small central government and standing army. This difference carried political consequences.

Whereas other central states funded the lion’s share of infrastructure and capitalist development, America enjoyed such abundant access to foreign capital that its railroads, telegraph systems, extractive industries and agricultural industry grew up around private investment. All of these points of differentiation contributed to a very different pattern of political development.

Working men and women who faced labor competition from new immigrants migrated en masse to destinations west—before 1900, usually to new farms; after, to cities where they found factory or service employment. Roughly 20 percent of native-born Americans were picking up stakes and moving each year, according to the census, and not always willingly. In an age of growing wealth and economic inequality, many such native-born Americans grew to resent immigrants, whom they blamed for their condition. But they also nursed intense hatred toward banks, railroads, grain operators, mine owners and financial elites, who (they believed) kept them in a state of economic privation and dependency.

The result was a particular brand of American populism (which I’ll explore in greater depth in the third article in this series). It was often viciously nativist. It was anti-statist; unlike socialists in Europe, political radicals in the United States tended to embrace punitive regulatory policies that would rein in large corporations, rather than large-scale social welfare policies that extended government health care and pensions to working people.

If all of this sounds eerily familiar, it should. Substitute Mexican immigrants for Europeans, Facebook and Google for Standard Oil and Union Pacific, JPMorgan Chase & Co. for … well, J.P. Morgan & Co. and Chase Manhattan Corp., and the story is similar.

As was the case one hundred years ago, America today is very much part—if not at the center—of history. It isn’t immune to the same currents of immigration, free trade, population aging and technological change that are upending political and economic systems around the globe. To believe that we’re “exceptional,” in the way that historians understand the term, is to reject reality for national theology.

But if we’re not exceptional, we might still be different, and the key to understanding how we make it out on the other side of any political storm is probing the strengths and vulnerabilities that flow from this difference. Tocqueville’s Boston informant posited that “there are no precedents for our history.” He was wrong. The trick is knowing which ones to look for.