Archive for July, 2011

No, not this. Rather, this. I’m entirely used to folks falling prey to the homophone problem, confusing loose with lose, and a host of other abuses of the English language. Errors are too ubiquitous to get twisted in knots over. But considering how this blog is all about finger wagging, mostly complaining about things that are wrong with the culture but on rare occasion praising something worthwhile, I feel compelled to observe that misused reflexive structures in English are among the worst lightweight offenses (if such an oxymoron can be said to exist) against my particular sensibilities, all the more so when made by people who ought to know better. It grates up there with saying or writing something as moronic as “it don’t ….”

The simple rule with reflexive pronouns (which end with the suffix -self) is that the subject must be the pronoun’s referent. You can hurt yourself (you and yourself match) but I cannot admire yourself (I is the subject and doesn’t match the object yourself). See how simple that is? In a more idiomatic use, one might reply to the question “How are you doing?” with “Very well, and yourself?” This is merely shorthand for “Very well, thank you, and how are you doing yourself?” The reflexive pronoun is arguably superfluous in this example, but it’s idiom, so don’t puzzle too long over it. Throwing the question back at the questioner is a little like another irritating and unnecessary reiteration often heard: “to return something back” (the word back being redundant).

Now, I’m not really a major grammar Nazi. I make mistakes, too, and am generally happy to forgive and forget after itching inwardly a little. However, a new blogger (Lou Tafisk — his nom de plume — of Necrotic Hijinks, now added to my blogroll) appeared a couple of months ago who, like me, is all about the finger wagging, or in his words, putting “everyone’s stupidity on display.” His targets tend to be college students as a class. By all accounts, he quit his teaching job rather than continue to handhold the cretins. He includes within the scope of his vitriol other bloggers, who happily offer themselves up for demolition. I’ve yet to see him really demolish anyone’s blog; he really just uses them as launchpads to crack wise. I threw my own hat in the ring.

Just to give dear ole Lou something to ponder, since I doubt he will take the time to familiarize himself with over six years of blogging on my part if he chooses this blog for public humiliation (doubtful, since I’m not a humorist), I thought I’d return the favor preemptively. I don’t crack wise nearly so well as he, but I’m good at finger wagging. Just look at this:

I couldn’t care less about the coarse sexual repartee, which others have been quick to join, but what gives with a college professor, albeit an out-of-work one, and self-proclaimed holder of a Ph.D. using the reflexive pronoun yourself when you is not the subject of the sentence but the object? The blurb above the big Become a Victim button uses yourself correctly, though the subject is only implied with the imperative (one of the three moods in English, though the Wikipedia article lists another five from other languages just for completeness and perhaps unnecessary obfuscation), so I know he can use them properly. Yeah, sure, it’s just a brief reply lost in the comments, but do we really want to entrust the minds of students, or for that matter the comic tearing down of others’ blogs, to someone who distinguishes himself with crimes against grammar?

Scratch That

Posted: July 25, 2011 in History, Idealism

“So, Timmy, tell me what you want to be when you grow up.”

“Gee whiz, Mr. Principal, maybe I’ll be a race car driver, or a fireman, or even … [pie-eyed inhalation] … an astronaut!”

Well, scratch that last one from the list. The last space shuttle has been retired. Does that mean manned space flight has also ended? I doubt it. The current crew of the International Space Station can’t just be stranded in orbit. But considering that it’s now mostly a Russian effort using rocket ships rather than space shuttles, change astronaut to cosmonaut and maybe kids still have license to dream.

The end of the shuttle program closes a troublesome though extended chapter in U.S. space flight. A little wistful emotion is hard to avoid, but even harder is an honest assessment of the failure of the program. Such an assessment has appeared in Discover, written by Amos Zeeberg:

The most important thing to realize about the space shuttle program is that it is objectively a failure. The shuttle was billed as a reusable craft that could frequently, safely, and cheaply bring people and payloads to low Earth orbit. NASA originally said the shuttles could handle 65 launches per year; the most launches it actually did in a year was nine; over the life of the program, it averaged five per year. NASA predicted each shuttle launch would cost $50 million; they actually averaged $450 million. NASA administrators said the risk of catastrophic failure was around one in 100,000; NASA engineers put the number closer to one in a hundred; a more recent report from NASA said the risk on early flights was one in nine. The failure rate was two out of 135 in the tests that matter most.

[T]here may well be no way NASA could have known that the shuttle would flop back in the ’70s when it was being planned and built, or possibly even while it was flying in the early ’80s, before its bubble of innocence was pricked by disaster. But it would soon become clear to anyone that the shuttle program was deeply troubled — at least, to anyone who bothered to look.
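The quoted figures speak for themselves, but the gap between promise and delivery is easier to feel as ratios. Here is a back-of-the-envelope sketch using only the numbers from the Discover excerpt above; the “1 in N” conversion of the failure count is my own arithmetic, not the article’s.

```python
# Figures quoted in the Discover excerpt.
predicted_launches_per_year = 65
actual_avg_launches_per_year = 5

predicted_cost_per_launch = 50e6   # dollars
actual_cost_per_launch = 450e6

flights = 135
catastrophic_failures = 2          # Challenger and Columbia

# Ratios of promise to performance.
launch_shortfall = predicted_launches_per_year / actual_avg_launches_per_year
cost_overrun = actual_cost_per_launch / predicted_cost_per_launch
failure_odds = flights / catastrophic_failures  # one loss per N flights

print(f"Flew 1/{launch_shortfall:.0f} of the promised flight rate")
print(f"Cost {cost_overrun:.0f}x the promised price per launch")
print(f"Lost 1 in {failure_odds:.1f} flights")
```

That is a thirteenfold shortfall in flight rate, a ninefold cost overrun, and a loss rate of roughly one in 68 — three orders of magnitude worse than the one-in-100,000 figure administrators once cited.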

The article wags its finger not only at the public for its complicity and/or shared responsibility in this escapade (as with all others — we never learn) but also at the bureaucracies that couldn’t admit their own failures, denying and covering them up for decades while they were plainly obvious “to anyone who bothered to look,” as the article says. Yet the author recognizes that the space program represents something more than just hurling men and technology into orbit, and the title of the article, “How to Avoid Repeating the Debacle That Was the Space Shuttle,” suggests hope for continued American presence in space under a renewed program. Perhaps if the author had fully adopted his own recommendations — to abandon fantasy and judge projects according to reality — he would have grounded his hopes right here, on the ground, rather than still being starstruck.

The Watergate break-in may have inadvertently opened the floodgates. Every subsequent scandal got the suffix -gate to designate something tawdry, illegal, and controversial. It’s an example of the endless flexibility of English usage and the American fascination with clever coinage, goofy respelling, and double entendres. The newest phenomenon of this sort is to call something “[blank] porn” to denote something explicit and base, or appealing to prurient interests. There were scandals before Watergate, and there were shameful, voyeuristic desires and morbid curiosities before porn. But now we have handy suffixes to collect them all within categories.

Torture porn has entered the lexicon not just in reference to Abu Ghraib atrocities but in its more popular use in connection with splatter films and gorno (gore + porno) such as the Final Destination, Scream, and Saw franchises, which specialize in depictions of victims dispatched in some particularly gruesome or grisly manner. I may have seen the term first used in reference to The Passion of the Christ, which is only one installment of Mel Gibson’s fetish for torture porn (which includes Braveheart and Apocalypto among other lesser examples). Graphic novels (and their movie counterparts) such as Sin City and 300 probably fall within the category as well.

Drone porn (or more generally, war porn) isn’t a ritualized, stylized, fictional version of the snuff film like torture porn. It’s the real deal: video shot from Predator drones and other war machines showing people being blown to bits. The loss of detail (blood, guts, body parts) is made up for by knowledge that it’s for real, even if all one can see in the aftermath of a big explosion is a cloud of dust, sand, and/or smoke. Which is more deplorable, the appetite for viewing such things or the leaking of such video (by the Dept. of Defense?), is open to debate. Winning that argument is silly, in fact, as both supply and demand are pretty awful.

Food porn is glamor photography applied to foods, typically accomplished with saturated colors, lighting effects, application of shellacs, and use of hidden forms to hold shapes. This style of photography is found in advertisements, recipe books, and on menu boards but rarely corresponds to the look of food as it is actually prepared and/or served. In some instances, food porn is overtly suggestive of female anatomy or recommends food as a superior substitute for actual sex.

Antique porn is actual pornography and erotica, though vintage in origin. Considering how readily available porn with high production value is over the Internet, I have to wonder why some crave the vintage stuff. No doubt someone has also used the term in connection with antiquing.

Skill porn is a less typical use of the porn suffix referring to one fictional character after another so impossibly good at what they do that it just numbs you. Think of it as superhero creep, where skill levels (fighting, sharpshooting, strategizing, dodging hails of bullets) are so absurdly high they don’t matter anymore. That’s the problem with omnipotence and omniscience in narrative. They don’t really sustain interest because their possessors are granted unrealistic and meaningless power and license. There’s no real conflict.

Other uses are too many to name or remember, but as they proliferate, they share one principal characteristic: they embody the race to the bottom. Although race to the bottom is an economics term, its applicability to culture is obvious and thus well within the province of this blog. As access to instances of [blank] porn becomes ever more available through the Internet and the democratization of production (e.g., everybody making and distributing their own porn videos, such as bride porn, now almost a rite of passage for new brides), it becomes increasingly difficult not to wallow in low standards on display everywhere.

Update: I knew I was forgetting an important one. So let me add to the list doomer porn. This is an appropriate addition because this blog is at times what many would regard as a doomer blog. Not always, mind you, as I can’t bear to look at that unavoidable future all the time, but often enough. Doomer porn is for those who revel in dancing on the not-yet-dead ruins of industrial civilization, champing at the bit to enter the next phase, which is prophesied to entail a die-off of perhaps 80–90% of the human population (animal and plant populations have already suffered horribly), a war of all against all for the remaining scraps, and a good chance at preemptive nuclear Armageddon before either of the other predictions comes to pass.

I’ve struggled for a little while to know quite what to do with this quote from The Master and His Emissary, which I’m currently reading (albeit very slowly) and blogging as I go:

Language enables the left hemisphere to represent the world ‘off-line’, a conceptual version, distinct from the world of experience, and shielded from the immediate environment, with its insistent impressions, feelings, and demands, abstracted from the body, no longer dealing with what is concrete, specific, individual, unrepeatable, and constantly changing, but with a disembodied representation of the world, abstracted, central, not particularised in time and place, generally applicable, clear and fixed. Isolating things artificially from their context brings the advantage of enabling us to focus intently on a particular aspect of reality and how it can be modelled, so that it can be grasped and controlled.

After this, McGilchrist launches into a wider discussion of metaphor and symbol, which gets a little heady for the uninitiated. I say uninitiated because the insight that language is a human technology, like writing, number systems, clock time, and others, that enables us to construe reality according to certain inherently limiting principles is not altogether obvious or intuitive to most of us precisely because we are inside the bubble, working and thinking from within those limitations. For instance, the inner voice everyone hears in the mind’s ear is language based, and to think in other terms — without words — is closer in experience to feeling than thinking.

I sensed something profound in McGilchrist’s analysis, but I didn’t know how to structure my thinking. Then I saw in the comments at a recent post at kulturCritic this remarkable observation, which Sandy Krolick appears to have simply tossed off:

… the transformation of language from an oral to written traditions wrought incalculable damage not only to the fullness of words, but to the fullness of experience as well. Univocality replaced polysemy. And the power of the spoken word was emptied out in the interests of clarity, disambiguation and legalistic adjudication. Scientific control of nature and people took precedence over everything else. And life became similarly emptied as a result. Specialization in how we interacted with one another was a further qualification on this specialization in language — at the semantic, syntactic and logistic levels of communication. This was the ground work for the curriculum of the West.

Krolick’s comment meshes extremely well with McGilchrist and reminded me of a blog post I wrote more than two years ago citing Neil Postman’s discussion of “The Judgment of Thamus,” as well as another more recent blog post about language in decline. It’s all probably too much to read and absorb, so I’ll summarize.

Krolick and McGilchrist are both arguing that words get in the way of a more immediate connection with (not “to”) the world by creating screens and abstractions that allow us to understand, manipulate, and control things and ideas. Language and writing thus represent a fundamental departure from a much older, primal identification with reality. In contrast, I’ve been in the rather unfortunate position of defending language and deploring the decreasing facility with which most people use words in both speech and writing. These perspectives share a fundamental concern: the loss of meaning. But the lost meanings are quite different from each other. Preverbal cognition is experienced in the body but is notoriously difficult to access, as in the controversy over infant amnesia, precisely because it isn’t fixed in memory through language. Meaning is felt through empathetic identification, but it’s a constantly moving target. Verbal cognition is experienced in the head and essentially amounts to a powerful virtual reality that blocks or at least dominates other cognitive states. Meaning is imposed and rationalized but ultimately fails to be very convincing because it is largely fictive.

This presents a puzzle: what type of beings should we really be? Slobbering, grunting brutes who share the world (modestly) with other animals or sophisticated, thinking men who possess power to create wonders and even more immense power to destroy? History has gotten us to this second state, but it’s clear that many of us are deeply dissatisfied with our labors and wish for something more immediate and primitive. The trend toward ever greater erudition and understanding has probably only just reversed, but it’s apparent to the cognoscenti that, for the masses, being a know-nothing is preferable to being a know-something. Otherwise, we wouldn’t need nearly so many silly fops in entertainment and government to keep us enthralled with their vapid stupidity. Whether this is an expression of the cultural mind destroying itself is a good question.

Several things stand out in this article in the NY Times, among them the tiresome attempt to tease behavioral characteristics out of hunter-gatherer cultures and apply them or at least draw parallels to modern First Worlders. I see this all the time in online discussions of evolutionarily advantaged cultural practices, a questionable concept itself worth distinguishing from immediate survivability. Most of the time, the author appears to be some yahoo with a high school understanding of biology and no real understanding of anthropology trying to combine the two. Or is the writing merely dumbed down for the audience? (My own expertise does not extend to these fields, so maybe I’m in a poor position to judge, but I’m extremely skeptical nonetheless of unsupported claims, hypothetical explanations, and cherry-picked details torn from context.)

Natalie Angier (the author linked to above) has numerous articles in the NY Times science section, including these recent titles:

  • Opossums: A Fast Life and Success That Starts in the Pouch
  • Much More to Jellyfish Than Plasma and Poison
  • Serotonin, Our Utility Hormone, Still Surprises
  • Humans and Animals: An Ancient and Complex Bond
  • New Caledonian Crows Owe Their Toolmaking Skills to a Nourishing Nest
  • Searching for the Source of Our Fountains of Courage
  • Musk Oxen (Ovibos moschatus) Tell a Tale of Survivors

I suspect a substantial discontinuity exists between behavioral data aggregated over diverse populations and long stretches of time and the responses we exhibit to the short-term, rapidly shifting incentives of modern life (not those dealing with actual survival pressure on the savanna but more typically those about maximizing profit). But Angier doesn’t let that stop her from a whiplash-inducing shift from snapshots of Ache hunter-gatherers in eastern Paraguay, !Kung bushmen of the Kalahari in Africa, and Hadza foragers of northern Tanzania to “top American executives,” who represent the peaks of hierarchies that are far less pronounced in, shall we say, less developed cultures.

Subsistence cultures often adopt behaviors such as insulting the meat of a successful hunter to discourage pride and to encourage sharing. (The urban poor have similar coping mechanisms.) But this is not the case in cultures of abundance, where power hierarchies emerge and those who manage to scramble to the top quickly pull up the ropes and ladders behind them. Obviously, incentives differ between haves and have-nots. History has shown the former are fairly uniformly prone to corruption and cravenness. The latter would probably go there, too, but for their limited resources.

Angier traces a line, perhaps unintentionally, from childish egalitarianism and zealous devotion to fairness (read: the limbic or reptile brain) to adult acceptance of the necessity of unfairness. Here’s the really nasty part:

When given a mild anti-anxiety drug that suppressed the amygdala response, subjects still said they viewed an 80-20 split as unjust, but their willingness to reject it outright dropped in half.

So what portion of the public is being prescribed antidepressants? The most recent numbers I could find were reported in USA Today in 2009:

About 10% of Americans — or 27 million people — were taking antidepressants in 2005, the last year for which data were available at the time the study was written. That’s about twice the number in 1996, according to the study of nearly 50,000 children and adults in today’s Archives of General Psychiatry. Yet the majority weren’t being treated for depression. Half of those taking antidepressants used them for back pain, nerve pain, fatigue, sleep difficulties or other problems, the study says.

How easy is it to connect the dots? If the rate of usage doubled from 1996 to 2005, what must it be now, six years later? If I were a fascist or other controlling type and sought ways of getting the public to acquiesce to authority, accept injustice unblinkingly, and forgo civil rights, what could possibly be better than medicating the public into submission? Some egalitarian instinct we have.
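The rhetorical question above can be answered, at least crudely. This sketch assumes the 1996–2005 doubling reflects steady exponential growth and simply carries the trend forward to 2011 — a guess for illustration, not data from the study.

```python
# Extrapolate antidepressant usage, assuming (hypothetically) that the
# doubling reported between 1996 and 2005 was smooth exponential growth.

rate_2005 = 0.10                 # ~10% of Americans, per the USA Today figure
doubling_span = 2005 - 1996      # the rate doubled over these 9 years
annual_growth = 2 ** (1 / doubling_span)

# Carry the same growth rate forward six more years.
rate_2011 = rate_2005 * annual_growth ** (2011 - 2005)
print(f"Implied 2011 rate: {rate_2011:.1%}")
```

If the trend held, that works out to roughly 16% of the population — though nothing guarantees growth stayed exponential, so treat it as an upper-bound guess rather than a finding.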

I saw the documentary What’s the Matter with Kansas? recently and felt an odd mixture of empathy and disdain toward the people profiled in the film. It’s hard not to feel some empathy for folks who are witnessing their way of life slipping away. But that’s true of all of us now, as the Age of Oil winds down, so there is nothing especially notable about Kansans, who are presented in caricature as though everyone in the state is a farmer. The troubles of Kansas farmers are not really of their own making, but their questionable understanding of modern dilemmas and subsequent voting for policies and politicians who protect one set of interests while undermining another have led to worsened outcomes and films that ask WTF? Feeling disdain for them is a little like blaming the victim or even Schadenfreude, which are unwholesome sentiments to harbor.

The film focuses on two intertwined institutions: the family farm and the nuclear family. Both had their heyday during living memory and became idealized notions of how living arrangements might best be constituted. Yet they were in actuality confined to a relatively narrow band of mid-20th-century history. (Farther back in history, Kansas had a reputation as a hotspot of radicalism. Go figure.) Kansans apparently believe they are voting for those institutions in all their faded glory, but demographic and technological change has already nearly wiped out the economic conditions that made them possible.

Some of us who learned civics in high school, back when it was still fashionable to teach and learn such things, came to believe that a healthy socioeconomic system includes informed voters making rational decisions and voting accordingly. It was considered a sign of civic virtue to vote for public interests, especially if that meant voting against personal interests, because society would degrade if everyone voted selfishly. The conflict, as one can observe in Kansas, is that people vote for conservative social issues while unwittingly ceding their economic interests to agribusiness. Revised economics dictate that small family farms either merge and grow into corporate behemoths or eventually be put out of business. The cause and effect is not so simple as the voting record, but folks are nonetheless given the mistaken notion that, for example, voting in defense of marriage helps to stem the decay of that once-hallowed American institution and the nuclear family that flows from it.

It would probably have been better if voters had acted instead to protect their economic interests rather than vote their emotions and hates, as many do now. Shifting from principled community interests to purely private economic interests sounds like market fundamentalism, an economic policy shared by many conservatives, but in rich irony, voting against the latter and in support of the former has left the field open for corporations and the monied elite to consolidate their economic positions without effective opposition or resistance.