Archive for May, 2014

I am, as usual, late getting to the latest controversy in academe, which has been argued to death before I even became aware of it. Inside Higher Ed appears to have gotten there first, followed by editorials at The New York Times, The Los Angeles Times, and The Washington Post. At issue are trigger warnings, a neologism for what might otherwise be called parental advisories (thinking in loco parentis here), to be placed in syllabi and on classroom materials (at first readings of fiction, but potentially history lessons and, frankly, just about anything else) that might trigger a panic attack or some other dolorous response from a student with a phobia or memory of a traumatic experience. The opinion articles linked above (Inside Higher Ed is more purely reporting) are all in agreement that trigger warnings are a bad idea.

Although articles in news organs are more nearly broadcasting and thus lack discussion (unless one ventures into the swamp of the comments section, which I rarely do), I indulged in a long discussion of the subject with fellow alumni of one of the institutions named in the reports. As with other issues, it developed so many facets that a snapshot determination became impossible if one attempted to accommodate or address all perspectives. Therein lies the problem: accommodation. Left-leaning liberals are especially prone to hypersensitivity to identity politics, which gained prominence in the late 1970s or early 80s. I quickly run afoul of anyone who takes such a perspective because I am notoriously white, male, well-educated, and middle class, so I must constantly “check my privilege.” When someone like me refuses others accommodation, it looks to them like raising the ladder behind me after I’ve safely ascended. I can appreciate, I think, how frustrating it must be to have one’s earnestness thwarted, and yet I admit I just don’t get it. At the risk of offending (trigger warning here), let me blunder ahead anyway.

The world (or as I’m beginning to call it more simply, reality) is a messy place, and each of us inevitably carries some phobia, trauma, or history that is unsavory. From one celebrated perspective, what doesn’t kill us makes us stronger; from another, we are trained to request accommodation. Accommodation used to be primarily for physical disabilities; now it’s for feelings, which some argue are just as debilitating. This is the province of every self-selected minority and special interest group, which has spawned predictable backlashes among various majority groups (e.g., the men’s movement, resurgent white supremacy). Naturally, any lobby, whether part of a minority or majority, will seek to promote its agenda, but I regard the brouhaha over trigger warnings as an example of the growing incidence of what’s been called the Strawberry Generation. It’s remarkable that students now regard themselves as dainty flowers in need of special protection lest they be trampled by, well, reality. So trigger warnings are being requested by students themselves, not by others on their behalf. With so many examples throughout even recent history of flagrant social injustice and oppression, it’s clear that everyone wants to proclaim their special combination of disadvantages and receive accommodation, all the better if multiplied by inclusion in several protected classes. It’s a claim of victimhood before the fact, or perhaps permanent victimhood if one is a survivor of some nastiness. (Disclaimer: real oppression and victimhood do exist, which I don’t intend to minimize, but they don’t arise from reading fiction or learning history, scandalous as those may be.)

In addition, what exactly is accomplished by virtue of warnings that one is about to encounter — what should it be called — messy material? Does one steel oneself against impact and thus limit its educational value, or does one expect to be excused from facing reality and receive an alternative assignment minus the offending material? Both are the antithesis of higher education. Arguments in the abstract are easy to ignore, so here are two specific examples: substitution or elimination of the words nigger and injun in modernized editions of Mark Twain’s Adventures of Huckleberry Finn, and biology textbooks that give consideration to (literally) unscientific accounts of creation and evolution. If one’s racial or religious background gives rise to excess discomfort over the use of one or another egregious trigger word (nigger in particular having now been reclaimed and repurposed for all sorts of uses, though with caveats) born of its ripe historical context, or over what science (as opposed to religion) teaches, well, that’s the world (reality) we live in. Sanitizing education to avoid discomfort (or worse) does no one any favors. Discomfort and earnest questioning are inevitable if one is to learn anything worthwhile in the course of getting an education.

In his defense of the canon of English literature published in Harper’s (March 2014), Arthur Krystal wrote that traditionalists argue “its contents were an expression of the human condition: the joy of love, the pain of duty, the horror of war, and the recognition of the self and soul.” I would add to these the exhilaration of understanding, the transcendence of beauty, the bitterness of injustice, and the foreknowledge of death. Ranking or ordering elements of the human condition is a fool’s errand, but I contend that none possesses the power to transfix and motivate as much as knowing that one day, each of us must die.

Thus, we develop narratives of a supposed afterlife, in effect to achieve immortality. The most typical are religious dogmas regarding eternity spent in heaven, hell, purgatory, or limbo. Another way to cheat death, or more simply, to be remembered, is passing one’s genes to another generation through procreation, achieving a small measure of proxy immortality. Other examples include acquiring fame and wealth to make a mark on history, such as having one’s name on buildings (like the $100 million presidential library and museum being discussed for siting in Chicago’s Hyde Park neighborhood honoring Barack Obama), having one’s name inscribed in one of many sports record books, or being preserved on celluloid (increasingly archaic now that everything is going digital). For the creative arts, the earliest works of literature to have achieved immortality, meaning that they are still widely known, read, and performed today, are the plays of William Shakespeare. For musicians, it would probably be J.S. Bach. I discount the works of the ancient Greeks or those of the Middle Ages throughout the rest of Europe because, despite passing familiarity with their names, their legacies lie buried deep below the surface and are penetrated only by scholars.

And therein lies the rub: for posterity to supply meaning to an earthly afterlife by proxy, culture must retain historical continuity or at least some living memory. Yet wide swathes of history have been rendered both mute and moot, as Shelley makes clear in his sonnet Ozymandias, with its memorable injunction, “Look on my works, ye Mighty, and despair!” Who among us can claim to know much if anything about ancient Egyptian or Chinese dynasties, or indeed any of the other major civilizations now collapsed? Our own civilization, grown like a behemoth to the size of the globe, now faces its own collapse for a host of reasons. Even worse, civilizational collapse, ecological collapse, and depopulation present the very real possibility of near-term human extinction (NTE). All the assiduous work to assure one’s place in history won’t amount to much if history leaves us behind.


Peter Van Buren has a new book out and is flogging it at TomDispatch. He’s a good enough writer, so I have no objection to the promotional aspect of disseminating his own work. But as I read his article describing an America gone to seed, I realized that for all his writerly skill, he misses the point. As a former State Dept. administrator (charged with assisting Iraqi reconstruction) turned whistle-blower, Van Buren is clearly outside the mainstream media and somewhat outside mainstream opinion, yet he appears to be well within the dominant paradigm. His new spin on regime change takes as implicit all the teachings of economics and politics as systems ideally suited to engineering an equitable social contract where everyone benefits. But as cycles of history have shown, those systems are even more prone to manipulation by a power elite who care little about the people they pretend to serve. Whether that carelessness is learned or ingrained in the kleptocracy is open to debate.

Van Buren’s article offers a few interesting tidbits, including a couple of neologisms (I’m always on the lookout for new coin):

dirt shadow = the faint but legible image left behind by an uninstalled sign on the exterior of a closed storefront or building

street gravy = the dirt and grunge that collects over time on a homeless person

Neither is too picaresque. The second is obviously a (sad because it’s too hip) euphemism, since gravy suggests richness whereas the actuality is downright unpleasant. As Van Buren surveys, similar unpleasantness is currently experienced all across America in towns and niche economies that have imploded. Interestingly, his counterexample is a U.S. Marine Corps base, Camp Lejeune in North Carolina, that functions as a gated community, with the added irony that it is supported by public funds. Van Buren also notes that, according to the Congressional Budget Office, an average active-duty service member receives a pay and benefits compensation package estimated to be worth $99,000, some 60 percent of it in noncash compensation.

As to why our regime is in disarray, however, Van Buren busies himself with standard economic and political (one might even say military-industrial) explanations, demonstrating an inability to frame the decline of empire as the beginning of an epochal shift away from plentiful energy resources, famously termed The Long Emergency by James Howard Kunstler. (We ought to resurrect that phrase.) Other frames of reference are certainly not without their impacts, but the inability to connect all the dots to see the underlying cause is commonplace in the mainstream.

In contrast, consider this passage from Harvesting the Biosphere by Vaclav Smil:

There are two simple explanations why food production in traditional agricultural societies — despite its relatively high need for claiming new arable land — had a limited impact on natural ecosystems: very low population growth rates and very slow improvements in prevailing diets. Population growth rates averaged no more than 0.05% during the antiquity and they reached maxima of just 0.07% in medieval Eurasia — resulting in very slow expansion of premodern societies: it took Europe nearly 1,500 years to double the population it had when Rome became an empire, and Asian doubling was only a bit faster, from the time of China’s Han dynasty to the late Ming period. [pp. 118–119]

Smil goes on to provide exhaustive detail, much of it measurement (with acknowledged ranges of error), showing how modern mechanisms and energy exploitation have enabled rapid population growth. Although population has (apparently) not yet peaked, we are already sliding back down the energy gradient we climbed over the past 250 years and will soon enough face widespread food shortages (among other things) as productivity plummets due to diminishing energy inputs and accumulated environmental destruction (including climate change). Economics and politics do not possess solutions to that prospect. That’s the real dirt in the historical narrative, which remains largely uncovered (unreported) by paradigmatic thinking.