Before continuing with my series on “Pre-Extinction Follies,” I want to divert to an idea I’ve struggled with for some time, namely, that by virtue of socialization and education (and especially higher education), we train our minds to think according to a variety of different filters. Which filter is most powerful and for what objectives is a question that leads to many potential answers, such as, just for example, (1) the scientific worldview and its follow-on power to manipulate (and pollute) the land, sky, and oceans of the planet, (2) the spiritual worldview and its power to transfix the human psyche, (3) the artistic worldview and its power to resonate with emotion and intuition, or (4) the sportsman’s worldview and its power to construe the world in terms of pointless, endless cycles of competitions, games, and championships. As I observed here, there is considerable overlap that makes distinguishing between competing worldviews somewhat questionable, but considering how depth and nuance are driven out of most points of view, keywords, soundbites, and dogma function just fine to separate and define people according to classes, races, demographic groups, etc.

The idea of twisted minds, never far from my thinking, came to the fore again recently because of two experiences: reading (at long last) Joe Bageant’s Rainbow Pie and getting HBO, which granted access to comedy shows (Real Time with Bill Maher and Last Week Tonight with John Oliver) that rework political and cultural news to make it palatable to and digestible by the masses. Having been a viewer of The Daily Show for some time and long before that a variety of Bill Maher’s exploits, I appreciate the acumen it takes to transmit (some of) the news comically. That particular filter is precisely why I go there. Along the way, I get exposure to lots of ideas I normally avoid (yes, I practice a form of information aversion at the same time I’m an information sponge, though not a political junkie or news hound), but I don’t kid myself as hosts of those shows sometimes chide their own audiences that I’m getting all of the news there.

Still, I can’t help but feel frustration at the way various media folks twist the news. In unscripted interviews and panel discussions in particular, ask a question of an economist and an economics answer results. The same is true, respectively, of news anchors, magazine and blog writers, and celebrities (mostly actors). They may have excellent command of the issues, but their minds reshape issues according to their training and/or vocation, which makes me want to hurl things at the screen because opinions and policy are frequently so constrained and twisted they become idiotic. An economist who promotes growth is a good example (more of what’s destroying us, please!). The worst, though, are politicians. Career politicians (is there any other kind?) are conditioned to distort issues beyond recognition and to deal with people (and their issues) as demographics to be shuffled in the abstract around the imaginary surface of some playing field. Dedication to service of the commonweal is long gone, replaced by theater, spin doctoring, and perpetual campaigning.

In contrast, someone comes along infrequently who has the wit and god’s eye view necessary to contextualize and synthesize modern information glut effectively and then tell the truth, the latter of which carries a very high value for me. That would be Joe Bageant, whose writing and perspective are fundamentally alien to me yet communicate with power and clarity, cutting through all the manufactured bullshit of trained and twisted minds. Writing about literacy, Bageant has this to say about the redneck folks (the white underclass) he knew and was part of growing up:

  1. They do not have the necessary basic skills, and don’t give a rat’s ass about getting them;
  2. Reading is not arresting enough to compete with the electronic stimulation in which their society is immersed;
  3. They cannot envisage any possible advantage in reading, because the advantages stem from extended personal involvement, which they have never experienced, are conditioned away from, and is understandably beyond their comprehension; and
  4. Their peers do not read as a serious matter, thereby socially reinforcing their early conclusion that it’s obviously not worth the time and effort ….

Elsewhere, Bageant writes about the unacknowledged lessons of class warfare that his brethren knew as a matter of intuition from living through it rather than through abstract instructions of some sociology text or teacher. We all possess that intuition about a wide array of issues, but we suppress most of it as a result of educational conditioning and conformity (the rightthink or political correctness for which we congratulate ourselves on issues such as sexism, racism, and LGBT rights). So we prefer the happy lies and fables of politicians, entertainers, and educators to the awful truth of what’s really happening all around us, plain to see. It’s a systemic fraud in which we all participate.

What strikes me, too, is that education (or literacy) does not function as a panacea for the masses. Over-educated Icelanders made that clear roughly a decade ago. Bageant decries the ignorance (“ignernce”) of his social stratum and their continuous knuckling under to their supposed betters, yet he admits they flee into the middle and upper classes when opportunity arises, social mobility usually resulting from educational accomplishment. The unspoken conclusion, however, is that the educated elite conspire (albeit sometimes unwittingly) to perpetuate and intensify class warfare and to preserve their enhanced position on the scale. And they do so with the armature of education.

This is the second of four parts discussing approaches to the prospect of NTE, specifically “Consume, Screw, Kill” by Daniel Smith in Harper’s Magazine (behind a paywall), which is a review of The Sixth Extinction by Elizabeth Kolbert. Part one is found here.

I don’t plan to read The Sixth Extinction any more than Guy McPherson’s book Going Dark, primarily because I need no further convincing from either author how badly things are going in the world (not in the economic or geopolitical sense, though those aspects are also in the protracted process of cratering). Also, I have other titles with stronger claims on my (remaining) time. But Smith’s review of Kolbert’s book and this review of McPherson’s came across my path, both of which I read with some interest. Professional critics often sound false to me, their assessments frequently being bogged down with competing motivations that interfere with honesty and objectivity, including careerism and what’s allowed to be said in mainstream publications.

Smith recounts what’s in Kolbert’s book with considerable detail and worthwhile context. This part is valuable. But the logical emotional response is stuffed down like choked-back vomitus. (Is there such a thing as a logical emotion, or is that an oxymoron?) Instead, Smith’s principal critique is to wave away Kolbert’s indictment of humanity, who have wrecked things not just for ourselves but for the rest of creation. Smith notes that over evolutionary and geological time, the Earth has always recovered from extinction events:

There are moments reading The Sixth Extinction when one is positively cheered by the geologic perspective on display. A giant rock smashed into Earth, baking it to a crisp — and still the planet recovered. More than recovered, it thrived! So profligate and inventive is the process of evolution, and so resourceful and fecund are the planet’s life-forms, that even now we can’t say how many species live here … None of this is to say that the collapse of biodiversity is not a tragedy. It is simply to say that tragedy is a human problem, inevitably defined in human terms.

Maybe this sixth time is little different, other than being anthropogenic in origin, but considering that ionizing radiation from unattended nuclear power plants is certain to irradiate the entire planet once we’re gone, I have a hard time chalking this one up as merely another in a series of catastrophic geological events. True, by now we have no control over foreseeable events, if ever we did, but we have culpability this time. The telescope / microscope argument carries no weight with me, either. To zoom in or zoom out far enough to alter one’s perspective so radically there is no longer any reason to fret is just a cheap rhetorical trick. Extinction won’t be a comic episode of “Honey, I Shrunk the Kids” or B-movie fare such as “Attack of the 50 Foot Woman” (both cautionary if fanciful tales of science run amok). Instead, it will be “Oops, we killed almost everything, including ourselves” — a nearly lifeless Earth. The word oops hardly expresses the response most have once they have processed the awful news. Smith goes on:

It is also to say that it is difficult to know just how to respond to the sixth mass extinction — or to The Sixth Extinction.

Really? The response any decent person ought to have is any one of, say, abject horror, galling self-recrimination, existential despair, desperate conservation, or resigned leave-taking, not some vague wondering. Individuals typically wrestle with the eventuality of their own demise at some point; that’s part of the human condition. The prospect of no future for one’s progeny, or anything else for that matter, well, that’s a novelty. Compartmentalization in the face of such a threat is the mark of an academic or journalistic approach, such as this further remark:

Kolbert nevertheless maintains a conspicuous ethical distance from her subject, no doubt both because she is one of the 7 billion defendants in the dock (and one who, I don’t think it’s churlish to note, has been racking up a disproportionate number of airline miles) and because of the likelihood that the sixth extinction is a foregone conclusion.

No, churlish is exactly what it is. How exactly does Smith think scientific reports, news stories, and books come into being except by going out to gather information, synthesize it, and draw conclusions from it? The risible charge that airline miles make Kolbert or any of us a worse contributor than the next is idiotic in the extreme. No one has ideological purity or behavioral innocence. We all have a participatory hand, but we only happen to be here at the end stage. Indeed, the fix we’re in has been centuries if not millennia in the making. Along the way, the drift of culture and history has shaped our options broadly and narrowly, but only a very few have ever been in a position to influence outcomes with any significance, and they appear to have been routinely corrupted by that uniqueness.

Smith’s approach to pre-extinction contrasts with others I will blog about in parts 3 and 4 of this series.

I’ve been working my way (as always, slowly) through three different approaches and a fair number of hyperlinks found therein to the prospect of Near-Term Extinction (NTE — modifying that as Near-Term Human Extinction, or NTHE, is an unnecessary and self-absorbed embellishment). The three are these:

  1. “Consume, Screw, Kill” by Daniel Smith in Harper’s Magazine (behind a paywall), which is a review of Elizabeth Kolbert’s book The Sixth Extinction,
  2. “The Last of Everything” by Daniel Drumright, which is a blog essay at Nature Bats Last, and
  3. “Digital Humanities in the Anthropocene” by Bethany Nowviskie, which is a transcript of a talk given at the Digital Humanities 2014 conference in Lausanne, Switzerland.

Each of these is rather long and involved. Drumright has been in the vanguard for decades, but the others may be relative latecomers (hard to know whether this is accurate) to the complex of ideas I’ll call “pre-extinction follies.” That complex is basically a response to the recognition that we humans are very likely not long for this world due to a variety of factors well beyond our control but delayed in their effect. That delay provides opportunity for quite a bit of introspection while the lights are still burning and store shelves are still stocked. From an only slightly longer perspective, such responses are arguably the province of what some call the chattering classes: those many pundits and commentators with time, education, and media resources available to ponder issues that lie largely beyond the ken of the masses. That would include me, obviously.

The Chicago Reader has a feature article on something I have blogged about repeatedly, namely, infiltration of abandoned structures to take photographs and video(s) in the interest of documenting modern ruins and establishing an aesthetic I called “post-industrial chic.” The Reader article provides new nomenclature for this behavior and sensibility: urban exploration, or urbex for short. The article cites Detroit, Chicago, and Gary (IN) as urbex hubs, but my previous surfing around the Internet revealed plenty of other sites, including those on other continents, though perhaps none so concentrated as the American rust belt. The idea is proliferating, perhaps even faster than abandonment of structures built to house our more enterprising endeavors, with Facebook pages, Meetup groups, and an already defunct zine/blog/book complex called Infiltration, which is/was devoted to penetrating places where one is not supposed to be. It would be suitably ironic if Infiltration had itself been abandoned, but instead, its founder and chief instigator passed away.

It’s impossible to know what may be going on inside of the minds of those who are, by turns, documentarians, aesthetes and artists, thrill-seekers, and voyeurs. Have they pieced together the puzzle yet, using their travels to observe that so many of these crumbling structures represent the ephemeral and illusory might of our economic and technical achievements, often and unexpectedly from the Depression Era with its art deco ornamentation? Is there really beauty to be found in squalor?

Answers to those questions are not altogether apparent from urbex sources. Whereas artistic statements are de rigueur in galleries and on artists’ websites, urbex purveyors tend to be uncharacteristically silent about their drive to document. There are frequent paeans to the faded, former glory of the abandoned sites, but what resonates is the suggestion of human activity and optimism no longer enjoyed but held over in the broken fibers of the structures, rather than a recognition that, by not even being worth the bother of tearing down, these ruins are close reminders of our own uselessness in old age, impermanence, and mortality.

To those more doom-aware, if I can be so presumptuous, another deeper significance flows from late-modern ruins: our self-defeat. The Pyrrhic victory of human success (in demographic terms) over the rest of creation has lasted long enough to span entire lifetimes, a victory enjoyed innocently by those born at the propitious historical moment (if, indeed, they managed to survive various 20th-century genocides and wars). Those of us born only a little later, however, are already witness to the few decayed bits (thus far) of the far more expansive human-built world we will leave behind.

This fate was explored by the History Channel film Life After People, which omits the obvious reasons for our disappearance but simply leaps ahead in time to contemplate how the natural world reacts to our absence. The film, as it turns out, became the pilot for a series that appears to have run for two seasons, largely on its own recycled bits. Invented imagery of this eventuality is echoed in all manner of cinematic demolition derbies, with New York City and the White House among the most iconic locations to undergo ritual destruction for our, um, what? Enjoyment?

Something caught my eye this week as I was surfing around, this time from a mostly abandoned classical music criticism blog I used to read (with some frustration). I reproduce in full a post called “Top Ten Music School Rankings” because it’s content-lite (perhaps not original to the blog):

10. The school where you did your undergrad.
9. The school where you got your Master’s, and to which you are indebted for the gigs it helped you get to pay off the student loans for the school where you did your undergrad.
8. The place where you wrote your DMA dissertation on your teacher’s teacher’s teacher’s pedagogical methods (or lack thereof).
7. Juellerd. Julleard? Julliard. Jewelyard? Whatever.
6. Harvard.
5. The place you wanted to go for undergrad, but you fracked one single note in one single excerpt and then you panicked and broke down and called the trumpet professor “Dad” and then Dave got in even though he couldn’t play Petrushka in time and he’s always been kind of a dick about it and now he’s subbing like every weekend in the fucking BSO.
4. Royal Something of Great British Academy I think? I hear they never let Americans in. Or maybe that’s the other one?
3. The school that everybody knows isn’t as good as the school where you did your undergrad, but is “up-and-coming.” Featuring a lauded entrepreneurship initiative that trains barista skills at one of the three coffee shops housed in its new state-of-the-art building, named for an alumnus of the university’s business school currently facing indictment for fraud.
2. University of Phoenix.
1. The school that has paid to have this list promoted on Facebook.

It’s funny (I guess) in ways that register mostly on music school grads, whose experiences and concerns over musical minutiae diverge from the mass of college graduates who majored in business, English, or any number of professional studies (premed, prelaw, journalism) that lead more consistently to employment in those professions. (Music school, BTW, is an unacknowledged type of trade school.) But the jokes are also somewhat ghoulish in ways that are becoming increasingly familiar to everyone seeking employment after completion of the formal phase of education. Mentions of Univ. of Phoenix and Facebook ought to be struck from this particular list except that they’re too emblematic of the systemic fraud that now passes for higher education. So it was curious to read, after the hooting and self-recognition in the comments section, a veritable cry in the wilderness:

I graduated from Oberlin, Michigan and Wisconsin and am currently a custodian in an apartment complex. I even won the concerto competition at 2 of the 3 schools and am in debt up to my eyeballs. I wish music schools would emphasize alternatives in the field of music, offer apprenticeships and internships and even require students to double major or double on a secondary “gig” instrument, so they could do well in the field.

Despite robust demand for education in performance fields (e.g., music, dance, acting) and other fine arts, there have never been waiting jobs anywhere close to the number of (presumably skilled) graduates churned out by schools. Obviously, one can invert the supply-demand nomenclature: an oversupply of skilled performance labor vs. minimal market demand for those skills. The offering of such degrees by every second- and third-tier school is undoubtedly a money-making enterprise but is nonetheless tantamount to intellectual dishonesty of a type distinct from what I blogged about here. Faculty and administrators are certainly hip to the fact that they’re often selling a bill of goods. After all, they’re paid for that expertise. This is why some parents (and some professors, too) do everything in their power to discourage students from pursuing performance studies, but to little avail as enrollments and selectivity continue to rise even if skill levels and accomplishment don’t.

As the “debt up to my eyeballs” comment above exemplifies, the cost of higher education has mounted far faster than inflation, and crushing student debt (unlikely ever to be repaid) now accompanies attendance at most top-tier schools except perhaps for trust-fund students. And even those top-tier schools find it difficult to deliver graduates into waiting jobs. It’s not that no one gets employed, mind you; it’s just that majoring in performance studies of one sort or another is akin to (essentially) majoring in football or basketball with dreams of joining the NFL or NBA after school. The numbers don’t bode well for graduates without extraordinary talent and/or connections, and unlike sports franchises, the arts don’t operate as pure meritocracies. Scoring ability is far more measurable than artistic expression, making it worthwhile to tolerate the misbehavior of thugs and cretins with speed, power, and attitude. I’m still waiting for the meme to establish itself that perhaps the considerable risk of tens of thousands of dollars in debt to attend music school is not worth the reward. This would clearly be a case of “do as I say, not as I do,” as careful readers of this blog must surmise by now that I, too, went to music school, though some while back, before tuition hikes put it out of reach for me.

I don’t normally concern myself overly much with B movies. I may watch one while munching my popcorn, but they hardly warrant consideration beyond the time spent plopped in front of the screen. My first thought about World War Z is that there hasn’t been another case of a special effect in search of a story since, well, any of the films from the Transformers franchise (new one due out in a couple weeks). WWZ is a zombie film — the kind with fast zombies (running, jumping, and busting their heads through glass instead of just lumbering around) who transform from the living into the undead in under 20 seconds. None of this works without the genre being well established for viewers. Yet World War Z doesn’t hew to the implicit understanding that it should essentially be a snuff film, concocting all manner of never-before-seen gore from dispatching them-no-longer-us. Instead, its main visual play is distant CGI crowd scenes (from helicopters — how exciting!) of self-building Jenga piles of zombies.

Two intertwined stories run behind the ostensible zombie dreck: (1) an investigation into the origin of the viral outbreak that made the zombies, leading to a pseudo-resolution (not quite a happy ending) Hollywood writers apparently find obligatory, and (2) reuniting the investigator with his family, from whom he has been separated because he’s the kind of reluctant hero with such special, unique skills that he’s extorted into service by his former employer. Why an A-list actor such as Brad Pitt agrees to associate himself with such moronic fare is beyond me. The character could have been played by any number of action stars aging past their ass-kicking usefulness as we watch: Bruce Willis, John Travolta, Nicolas Cage, Pierce Brosnan, Mel Gibson, Liam Neeson, Wesley Snipes, Keanu Reeves (who can at least project problem-solving acumen), and Sylvester Stallone, just to name a few. This list could actually go on quite a bit further.

This is the kind of film for which the term suspension of disbelief was coined. The implausibly fortunate survival of the hero through a variety of threats is assured even as everyone around him perishes, tying the story together from front to back, a cliché that drains dramatic tension out of the story. I was curious to read P.Z. Myers’ rant discussing the awful science of World War Z, which also observes plot holes and strategic WTFs. The bad science doesn’t stick in my craw quite like it does for Myers, but then, my science background is pretty modest. Like so many fight scenes in action movies where the hero is never really injured, I just sorta go with it.

What really interests me about WWZ, however, is that it presents yet another scenario (rather uninspired, actually) of what might happen when society breaks apart. Since the film features a fast crash where everything goes utterly haywire within hours — yet the electrical grid stays up — the first harrowing scene is the family fleeing, first in a car and then a commandeered mobile home, before seeking temporary refuge in a tenement. The main character states early on that people on the move survive and people who hunker down are lost. That may be true in a theater of war, but I can’t judge whether it’s also true with a virulent contagion scenario. In any case, the investigator alternates between movement and refuge as his situation changes.

Because the zombie horde is a functionally external threat, survivors (however temporary) automatically unite and cooperate. This behavior is borne out in various real-world responses to fast-developing events. However, slow-mo threats without the convenient external enemy, such as we’re now experiencing in the real world with protracted industrial collapse, provide a different picture: dog eating dog and fighting to survive another day. Such alternatives cause many who foresee extraordinary difficulties in the decades ahead to wish for events to break over civilization like a tsunami, taking many all at once and uniting those unlucky enough to survive. But until that happens, we’re faced with slow death by a thousand cuts.

Fools Rush In

Posted: July 1, 2014 in Culture, Economics, Education, Literacy

Several highly publicized inventories of OECD Skills Outlook 2013 hit the media last fall and then promptly fell off the radar. They stayed on my radar, waiting for the propitious time to sort my thinking and develop a blog post. (I’m always late to the party.) The full report is 466 pp., including blank pages, extensive front- and back-matter, and a writing style that positively discourages reading except to pluck quotes or statistics, as I do below. Such reports (e.g., the report of the Intergovernmental Panel on Climate Change (IPCC), also released in Fall 2013, which I also blogged about here) take considerable effort to compile, but they always leave me wondering whether any of them are actionable or worth going to such lengths to assess, compile, and report. Even the executive summaries expend more effort saying what the reports are than offering a cogent conclusion and/or recommendation. This style may well be a requirement of advanced bureaucracy.

Skills assessed by the OECD Skills Outlook are described here:

The Survey of Adult Skills, a product of the OECD Programme for the International Assessment of Adult Competencies (PIAAC), was designed to provide insights into the availability of some of these key skills in society and how they are used at work and at home. The first survey of its kind, it directly measures proficiency in several information-processing skills — namely literacy, numeracy and problem solving in technology-rich environments.


Any given species has its unique behaviors and preferred habitat, inevitably overlapping with others that are predator or prey. The human species has spread geographically to make nearly the entire world its habitat and every species its prey (sometimes unintentionally). But it’s a Pyrrhic success, because for the ecosystem to work as our habitat as well as theirs, diversity and abundance are needed. As our numbers have expanded to over 7 billion, nonhuman populations have often declined precipitously (when we don’t farm them for food). When we humans are not otherwise busy hunting, harvesting, and exterminating, we harass them and claim their habitats as uniquely our own. Our unwillingness to share space and/or tolerate their presence except on our own terms is audacious, to say the least.


I am, as usual, late getting to the latest controversy in academe, which has been argued to death before I even became aware of it. Inside Higher Ed appears to have gotten there first, followed by editorials at The New York Times, The Los Angeles Times, and The Washington Post. At issue are trigger warnings, a neologism for what might otherwise be called parental advisories (thinking in loco parentis here), to be placed in syllabi and on classroom materials, at first for fiction readings but potentially for history lessons (and, frankly, just about anything else), that might trigger a panic attack or some other dolorous response from a student with a phobia or memory of a traumatic experience. The opinion articles linked above (Inside Higher Ed is more purely reporting) are all in agreement that trigger warnings are a bad idea.

Although articles in news organs are more nearly broadcasting and thus lack discussion (unless one ventures into the swamp of the comments section, which I rarely do), I indulged in a long discussion of the subject with fellow alumni of one of the institutions named in the reports. As with other issues, it developed so many facets that a snapshot determination became impossible if one attempted to accommodate or address all perspectives. Therein lies the problem: accommodation. Left-leaning liberals are especially prone to hypersensitivity to identity politics, which gained prominence in the late 1970s or early 80s. I quickly run afoul of anyone who takes such a perspective because I am notoriously white, male, well-educated, and middle class, so I must constantly “check my privilege.” When someone like me refuses others accommodation, it looks to others like raising the ladder behind me after I’ve safely ascended. I can appreciate, I think, how frustrating it must be to have one’s earnestness thwarted, and yet I admit I just don’t get it. At the risk of offending (trigger warning here), let me blunder ahead anyway.

The world (or as I’m beginning to call it more simply, reality) is a messy place, and each of us inevitably carries some phobia, trauma, or history that is unsavory. From one celebrated perspective, what doesn’t kill us makes us stronger; from another, we are trained to request accommodation. Accommodation used to be primarily for physical disabilities; now it’s for feelings, which some argue are just as debilitating. This is the province of every self-selected minority and special interest group, which has spawned predictable backlashes among various majority groups (e.g., the men’s movement, resurgent white supremacy). Naturally, any lobby, whether part of a minority or majority, will seek to promote its agenda, but I regard the brouhaha over trigger warnings as an example of the growing incidence of what’s been called the Strawberry Generation. It’s remarkable that students now regard themselves as dainty flowers in need of special protection lest they be trampled by, well, reality. So trigger warnings are being requested by students, not on their behalf. With so many examples throughout even recent history of flagrant social injustice and oppression, it’s clear that everyone wants to proclaim their special combination of disadvantages and receive accommodation, all the better if multiplied by inclusion in several protected classes. It’s a claim of victimhood before the fact or perhaps permanent victimhood if one is a survivor of some nastiness. (Disclaimer: real oppression and victimhood do exist, which I don’t intend to minimize, but they do not stem from reading fiction or learning history, scandalous as they may be.)

In addition, what exactly is accomplished by virtue of warnings that one is about to encounter — what should it be called — messy material? Does one steel oneself against impact and thus limit its educational value, or does one expect to be excused from facing reality and receive an alternative assignment minus the offending material? Both are the antithesis of higher education. Arguments in the abstract are easy to ignore, so here are two specific examples: substitution or elimination of the words nigger and injun in modernized editions of Mark Twain’s Adventures of Huckleberry Finn and biology textbooks that give consideration to (literally) unscientific accounts of creation and evolution. If one’s racial or religious background gives rise to excess discomfort over the use of one or another egregious trigger word (nigger in particular now having been reclaimed and repurposed for all sorts of uses but with caveats) borne out of ripe historical context or what science (as opposed to religion) teaches, well, that’s the world (reality) we live in. Sanitizing education to avoid discomfort (or worse) does no one any favors. Discomfort and earnest questioning are inevitable if one is to learn anything worthwhile in the course of getting an education.

In his defense of the canon of English literature published in Harper’s (March 2014), Arthur Krystal wrote that traditionalists argue “its contents were an expression of the human condition: the joy of love, the pain of duty, the horror of war, and the recognition of the self and soul.” I would add to these the exhilaration of understanding, the transcendence of beauty, the bitterness of injustice, and the foreknowledge of death. Ranking or ordering elements of the human condition is a fool’s errand, but I contend that none possesses the power to transfix and motivate as much as knowing that one day, each of us must die.

Thus, we develop narratives of a supposed afterlife, in effect to achieve immortality. The most typical are religious dogmas regarding eternity spent in heaven, hell, purgatory, or limbo. Another way to cheat death, or more simply, to be remembered, is passing one’s genes to another generation through procreation and achieving a small measure of proxy immortality. Other examples include acquiring fame and wealth to make a mark on history, such as having one’s name on buildings (like the $100 million presidential library and museum being discussed for siting in Chicago’s Hyde Park neighborhood honoring Barack Obama), or having one’s name inscribed in one of many sports record books, or being preserved on celluloid (which is now increasingly archaic, since everything is going digital). For creative arts, the earliest works of literature to have achieved immortality, meaning that they are still widely known, read, and performed today, are the plays of William Shakespeare. For musicians, it would probably be J.S. Bach. I discount the works of the ancient Greeks or those of the Middle Ages throughout the rest of Europe because, despite passing familiarity with their names, their legacies lie buried deep below the surface and are penetrated only by scholars.

And therein lies the rub: for posterity to supply meaning to an earthly afterlife by proxy, culture must retain historical continuity or at least some living memory. Yet wide swathes of history have been rendered both mute and moot, as Shelley makes clear in his sonnet Ozymandias, with its memorable interdiction, “Look on my works, ye Mighty, and despair!” Who among us can claim to know much if anything about ancient Egyptian or Chinese dynasties, or indeed any of the other major civilizations now collapsed? Our own civilization, grown like a behemoth to the size of the globe, now faces its own collapse for a host of reasons. Even worse, civilizational collapse, ecological collapse, and depopulation present the very real possibility of near-term human extinction (NTE). All the assiduous work to assure one’s place in history won’t amount to much if history leaves us behind.
