Posts Tagged ‘Education’

A long while back, I blogged about things I just don’t get, including on that list the awful specter of identity politics. As I was finishing my undergraduate education some decades ago, the favored term was “political correctness.” That impulse now looks positively tame in comparison to what occurs regularly in the public sphere. It’s no longer merely about adopting what consensus would have one believe is a correct political outlook. Now it’s a broad referendum centered on the issue of identity, construed through the lens of ethnicity, sexual orientation, gender identification, lifestyle, religion, nationality, political orientation, etc.

One frequent charge levied against offenders is cultural appropriation, which is the adoption of an attribute or attributes of a culture by someone belonging to a different culture. Here, the term “culture” is a stand-in for any feature of one’s identity. Thus, wearing a Halloween costume from another culture, say, a bandido, is not merely in poor taste but is understood to be offensive if one is not authentically Mexican. Those who are infected with the meme are often called social justice warriors (SJWs), and policing (of others, natch) is especially vehement on campus. For example, I’ve read of menu items at the school cafeteria being criticized for not being authentic enough. Really? The won ton soup offends Chinese students?

In an op-ed in the NY Times entitled “Will the Left Survive the Millennials?” Lionel Shriver described being sanctioned for suggesting that fiction writers not be too concerned about creating characters from backgrounds different from their own. She contextualizes the motivation of SJWs this way:

I pause periodically to contemplate deep time, ancient history, and other subjects that lie beyond most human conceptual abilities. Sure, we sorta get the idea of a very long ago past out there in the recesses or on the margins, just like we get the idea of U.S. sovereign debt now approaching $20 trillion. Problem is, numbers lose coherence when they mount up too high. Scales differ widely with respect to time and currency. Thus, we can still think reasonably about human history back to roughly 6,000 years ago, but 20,000 years ago or more draws a blank. We can also think about how $1 million might have utility, but $1 billion and $1 trillion are phantoms that appear only on ledgers and contracts and in the news (typically mergers and acquisitions). If deep time or deep debt feel like they don’t exist except as conceptual categories, try wrapping your head around the deep state, which in the U.S. is understood to be a surprisingly large rogues’ gallery of plutocrats, kleptocrats, and oligarchs drawn from the military-industrial-corporate complex, the intelligence community, and Wall Street. It exists but does so far enough outside the frame of reference most of us share that it effectively functions in the shadow of daylight where it can’t be seen for all the glare. Players are plain enough to the eye as they board their private jets to attend annual meetings of the World Economic Forum in Davos-Klosters, Switzerland, or two years ago the Jackson Hole [Economic] Summit in Jackson Hole, WY, in connection with the American Principles Project, whatever that is. They also enjoy plausible deniability precisely because most of us don’t really believe self-appointed masters of the universe can or should exist.
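
Since these magnitudes resist intuition, here is a minimal, purely illustrative aside (the counting-by-seconds analogy and the Python snippet are mine, not anything from the original post) that makes the point concrete:

# Back-of-the-envelope sketch: counting one unit per second, how long do a
# million, a billion, and a trillion take?
SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = 365.25 * SECONDS_PER_DAY  # average year, leap days included

for label, n in [("million", 1e6), ("billion", 1e9), ("trillion", 1e12)]:
    days = n / SECONDS_PER_DAY
    years = n / SECONDS_PER_YEAR
    print(f"one {label} seconds ≈ {days:,.0f} days ≈ {years:,.0f} years")

A million seconds passes in under two weeks; a billion takes about thirty-two years; a trillion runs to roughly 31,700 years, deeper than the deepest history most of us can hold in mind.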

Another example of a really bad trip down the rabbit hole, what I might call deep cynicism (and a place I rarely allow myself to go), appeared earlier this month at Gin and Tacos (on my blogroll):

The way they [conservatives] see it, half the kids coming out of public schools today are basically illiterate. To them, this is fine. We have enough competition for the kinds of jobs a college degree is supposed to qualify one for as it is. Our options are to pump a ton of money into public schools and maybe see some incremental improvement in outcomes, or we can just create a system that selects out the half-decent students for a real education and future and then warehouse the rest until they’re no longer minors and they’re ready for the prison-poverty-violence cycle [add military] to Hoover them up. Vouchers and Charter Schools are not, to the conservative mind, a better way to educate kids well. They are a cheaper way to educate them poorly. What matters is that it costs less to people like six-figure income earners and home owners. Those people can afford to send their kids to a decent school anyway. Public education, to their way of thinking, used to be about educating people just enough that they could provide blue collar or service industry labor. Now that we have too much of that, a public high school is just a waiting room for prison. So why throw money into it? They don’t think education “works” anyway; people are born Good or Bad, Talented or Useless. So it only makes sense to find the cheapest possible way to process the students who were written off before they reached middle school. If charter schools manage to save 1% of them, great. If not, well, then they’re no worse than public schools. And they’re cheaper! Did I mention that they’re cheaper?

There’s more. I provided only the main paragraph. I wish I could reveal that the author is being arch or ironic, but there is no evidence of that. I also wish I could refute him, but there is similarly no useful evidence for that. Rather, the explanation he provides is a reality check that fits the experience of wide swaths of the American public, namely, that “public high school is just a waiting room for prison” (soon and again, debtors’ prison) and that it’s designed to be just that because it’s cheaper than actually educating people. Those truly interested in being educated will take care of it themselves. Plus, there’s additional money to be made operating prisons.

Deep cynicism is a sort of radical awareness that stares balefully at the truth and refuses to blink or pretend. A psychologist might call it the reality principle; a scientist might aver that it relies unflinchingly on objective evidence; a philosopher might call it strict epistemology. To get through life, however, most of us deny abundant evidence presented to us daily in favor of dreams and fantasies that assemble into the dominant paradigm. That paradigm includes the notions that evil doesn’t really exist, that we’re basically good people who care about each other, and that our opportunities and fates are not, on the whole, established long before we begin the journey.

I attended a fundraiser a short while back. It’s familiar territory for me, filled with gifts culled from local businesses and corporations to be resold at auction, portable kitchens and bars to feed and libate guests to break down their inhibitions to giving, and lots of high heels and party dresses (with ample cleavage). Men rarely strut and parade the way the women do; tuxedos are the rare exception. Secondary and tertiary activities are typical, often a DJ or live band that plays so loudly sensible people would flee the room rather than slowly go deaf. But monstrous volume in the era of amplified everything has dulled that reflex to nothingness. Or people are by now already deaf from habitual exposure to arena-rock volume filtered down to small venues. Folks simply, stupidly tough it out, ending the night with their ears ringing and their voices hoarse from screaming over the noise just to be heard.

Beneficiaries of fundraisers usually fall into two categories that are poorly served by American institutions: those seeking quality educations (including public schools that ought to be better funded through taxation) and folks suffering from catastrophic illness or disease that is ideally meant to be covered by health insurance but in practice is not. Auctioneers do pretty well enticing people to part with their money. It’s a true skill. But then, who goes to a fundraiser determined to hold tightly to their hard-earned cash? (Don’t answer that question.) Silent auctions may accompany the live auction, but the group vibe definitely contributes to some competition to outbid the next person (a wallet- or dick-measuring exercise?). Auction items are mostly luxury items, things normal Americans wouldn’t consider buying except when associated with charitable giving. Getting something for one’s charity (bought under or over its presumed market value) also shifts some portion of the philanthropic burden to those entities donating gifts.

All this is preliminary to the most appallingly tone-deaf item offered for auction: a 4-person safari to a game preserve in South Africa to hunt and kill a wildebeest. When the auctioneer described the item, everyone in my vicinity looked at each other as if to say “what the fuck?” Certainly, humans have a long history of hunting game purely for sport (which is to say, not for food), and from the perspective of a South African safari outfitter, wild animals are a natural resource to be exploited the same way that, for instance, mining and agriculture are conducted throughout the world. But the last few years have seen a notable change of heart about killing animals, especially so-called romance animals (mostly large mammals, including whales, less so large fish), without need or admirable purpose. The outcry over an American dentist killing Cecil the Lion was an expression of that sentiment. So, too, was the killing of a gorilla at the Cincinnati Zoo after a child fell into the enclosure. (Personally, considering how few of them exist, I would privilege the life of the gorilla over the child, but that’s a minefield.) Pictures of Donald Trump’s sons standing over their trophy kills have also elicited significant disapproval. We are now acutely aware that wild animals are not an inexhaustible resource (and never were — consider the passenger pigeon).

I judged that bidding on the safari was no more or less robust than for other auction items, but I mentioned aloud that if I were to bid on it, I would probably go on the safari but would also insist on merely paintballing the poor wildebeest, a relatively harmless proxy for killing it needlessly. Admittedly, the wildebeest would experience the same existential terror as if it were being hunted to death, but at least it would live. Or it would live until the next safari came round. Hunting and killing a wildebeest or other large game has never been on my bucket list, and its appearance at auction would not suddenly inspire me to add it to the list. That is the province of a class of fools rich and insulated enough to still regard the world as their playground, with no thought of responsibility, stewardship, or consequences.

Continuing from my previous post: writing for MTV News, Brian Phillips has an article entitled “Shirtless Trump Saves Drowning Kitten: Facebook’s fake-news problem and the rise of the postmodern right.” (Funny title, that.) I navigated to the article via Alan Jacobs’s post at Text Patterns (on my blogroll). Let me consider each in turn.

After chuckling that Phillips is directing his analysis to the wrong audience, an admittedly elitist response on my part, I must further admit that the article is awfully well-written and nails the blithe attitude accompanying epistemological destruction carried out, perhaps unwittingly but too well-established now to ignore, by developers of social media as distinguished from traditional news media. Which would be considered more mainstream today is up for debate. Maybe Phillips has the right audience after all. He certainly gets the importance of controlling the narrative:

Confusion is an authoritarian tool; life under a strongman means not simply being lied to but being beset by contradiction and uncertainty until the line between truth and falsehood blurs and a kind of exhaustion settles over questions of fact. Politically speaking, precision is freedom. It’s telling, in that regard, that Trump supporters, the voters most furiously suspicious of journalism, also proved to be the most receptive audience for fictions that looked journalism-like. Authoritarianism doesn’t really want to convince its supporters that their fantasies are true, because truth claims are subject to verification, and thus to the possible discrediting of authority. Authoritarianism wants to convince its supporters that nothing is true, that the whole machinery of truth is an intolerable imposition on their psyches, and thus that they might as well give free rein to their fantasies.

But Phillips is too clever by half, burying the issue in scholarly style that speaks successfully only to a narrow class of academics and intellectuals, much like the language and memes employed by the alt-right are said to be dog whistles perceptible only to rabid, mouth-breathing bigots. Both charges are probably unfair reductions, though with kernels of truth. Here’s some of Phillips’ overripe language:

Often the battleground for this idea [virtue and respect] was the integrity of language itself. The conservative idea, at that time [20 years ago], was that liberalism had gone insane for political correctness and continental theory, and that the way to resist the encroachment of Derrida was through fortifying summaries of Emerson … What had really happened was that the left had become sensitized to the ways in which conventional moral language tended to shore up existing privilege and power, and had embarked on a critique of this tendency that the right interpreted, with some justification, as an attack on the very concept of meaning.

More plainly, Phillips’ suggestion is that the radical right learned the lessons of Postmodernism (PoMo) even better than did the avant-garde left, the latter having outwitted themselves by giving the right subtle tools used later to outmaneuver everyone. Like other mildly irritating analyses I have read, it’s a statement of inversion: an idea bringing into existence its antithesis that unironically proves and undermines the original, though with a dose of Schadenfreude. This was (partially) the subject of a 4-part blog I wrote called “Dissolving Reality” back in Aug. and Sept. 2015. (Maybe half a dozen read the series; almost no one commented.)

So what does Alan Jacobs add to the discussion? He exhibits his own scholarly flourishes. Indeed, I admire the writing but find myself distracted by the writerly nature, which ejects readers from the flow of ideas to contemplate the writing itself. For instance, this:

It turns out that the children of the ruling classes learned their lessons well, so when they inherited positions in their fathers’ law firms they had some extra, and very useful, weapons in their rhetorical armory.

In precisely the same way, when, somewhat later, academic leftists preached that race and gender were the determinative categories of social analysis, members of the future alt-right were slouching in the back rows of their classrooms, baseball caps pulled down over their eyes, making no external motions but in their dark little hearts twitching with fervent agreement.

Terrific capture of the classroom culture in which teachers are steeped. Drawing identity politics more manifestly into the mix is a fairly obvious extrapolation beyond Phillips and may reflect the results of the presidential election, where pundits, wheeling around to reinterpret results that should not have so surprised them, now suggest Republican victories are a repudiation of leftist moral instruction. The depth of Phillips’ and Jacobs’ remarks is not so typical of most pundits, however, and their follow-up analysis at some point becomes just more PoMo flagellation. Here, Jacobs is even more clearly having some fun:

No longer did we have to fear being brought before the bar of Rational Evidence, that hanging judge of the Enlightenment who had sent so many believers to the gallows! You have your constructs and we have our constructs, and who’s to say which are better, right? O brave new world that hath such a sociology of knowledge in it!

This goes back to the heart of the issue, our epistemological crisis, but I dispute that race and gender are the determinative categories of social analysis, no matter how fashionable they may be in the academy. A simpler and more obvious big picture controls: it’s about life and death. My previous post was about geopolitics, where death is rained down upon foreign peoples and justifying rhetoric is spread domestically. Motivations may be complex and varied, but the destruction of people and truth affects everyone, albeit unevenly, without regard to race, gender, religion, nationality, etc. All are caught in the dragnet.

Moreover, since the advent of Western civilization, intellectuals have always been sensitive to the sociology of knowledge. It’s a foundation of philosophy. That it’s grown sclerotic long precedes PoMo theory. In fact, the gradual breaking apart and dismantling of meaning is visible across all expressive genres, not just literature. In painting, it was Impressionism, Cubism, Dada and Surrealism, and Abstract Expressionism. In architecture, it was Art Deco, the International Style, Modernism, Brutalism, and Deconstructivism. In music, it was the Post-Romantic, the Second Viennese School, Modernism, Serialism, and Minimalism. In scientific paradigms, it was electromagnetism, relativity, quantum mechanics, the Nuclear Era, and semiconductors. The most essential characteristics in each case are increasingly dogmatic abstraction and drilling down to minutiae that betray meaningful essences. Factoring in economic and political perversions, we arrive at our current epistemological phase, where truth and consequences matter little (though death and destruction still do) so long as deceits, projections, and distractions hold minds in thrall. In effect, gravity is turned off and historical narratives levitate until reality finally, inevitably comes crashing down in a monstrous Jenga pile, as it does periodically.

In the meantime, I suppose Phillips and Jacobs can issue more gaseous noise into the fog bank the information environment has become. They can’t get much traction (nor can I) considering how most of the affluent West thinks at the level of a TV sitcom. In addition, steps being considered to rein in the worst excesses of fake news would have corporations and traditional news media appointed as watchers and censors. Beyond any free speech objections, which are significant, expecting culprits to police themselves only awards them greater power to dominate, much like bailouts rewarded the banks. More fog, more lies, more levitation.

I’m not paying close attention to the RNC in Cleveland. Actually, I’m ignoring it completely, still hoping that it doesn’t erupt in violence before the closing curtain. Yet I can’t help but hear some relevant news, and I have read a few commentaries. Ultimately, the RNC sounds like a sad, sad nonevent put on by amateurs, with many party members avoiding coming anywhere near. What’s actually transpiring is worthy of out-loud laughter at the embarrassment and indignities suffered by participants. The particular gaffe that caught my attention is cribbing from Michelle Obama in the speech delivered on Monday by Melania Trump. The speech writer, Meredith McIver, has accepted blame for it and characterized it as an innocent mistake.

Maybe someone else has already said or written this, but I suspect the plagiarism probably was innocent precisely because that’s the standard in quite a lot of academe these days. Students get away with it all the time, just not on a national stage. Reworking another’s ideas is far easier than coming up with one’s own original ideas, and Melania Trump has no reputation (near as I can tell) as an original thinker. The article linked to above indicates she admires Michelle Obama, so the plagiarism is, from a twisted perspective, an encomium.

The failure of Trump campaign officials to review the speech (or if they did review it, then do so effectively) is another LOL gaffe. It doesn’t rise to the level of the McCain campaign’s failure to vet Sarah Palin properly and won’t have any enduring effects, but it does reflect upon the Trump campaign’s ineptitude. My secret fear is that ineptitude is precisely why a lot of folks intend to vote for Trump: so that he can accelerate America’s self-destruction. It’s a negative view, and somewhat devil-may-care, to say “let’s make it a quick crash and get things over with already.” Or maybe it’s darkly funny only until suffering ramps up.

Something I wrote ten years ago at Creative Destruction. Probably worth an update.

Creative Destruction

We’ve all seen the reports. U.S. high schoolers rank at or near the bottom in math and science. Admittedly, that link is to a story eight years old, but I doubt rankings have changed significantly. A new study and report are due out next year. See this link.

What interests me is that we live in an era of unprecedented technological advancement. While the U.S. may still be in the vanguard, I wonder how long that can last when the source of inspiration and creativity — human knowledge and understanding — is dying at the roots in American schools. It’s a sad joke, really, that follow-the-directions instructions for setting the clock on a VCR (remember those?) proved so formidable for most end users that a time-setting function is built into more recent recording systems such as TIVO. Technical workarounds may actually enable ever-increasing levels of disability working with our…


According to Jean-Paul Sartre, the act of negation (a/k/a nihilation) is a necessary step in distinguishing foreground objects from background. A plethora of definitions and formal logic ensure that his philosophical formulations are of only academic interest to us nowadays, since philosophy in general has dropped out of currency in the public sphere and below awareness or concern even among most educated adults. With that in mind, I thought perhaps I should reinforce the idea of negation in my own modest (albeit insignificant) way. Negation, resistance, and dissent have familial relations, but they are undoubtedly distinct in some ways, too. However, I have no interest in offering formal treatments of terminology and so will gloss over the point and decline to offer definitions. Lump ’em all in together, I say. Still, I will make a distinction between passive and active negation, which is the point of this blog post.

Although the information environment and the corporations that provide electronic access through hardware and connectivity would have us all jacked into the so-called Information Superhighway unceasingly, and many people do just that with enormous relish, I am of a different mind. I recognize that electronic media are especially potent in commanding attention and providing distraction. Stowed away or smuggled in with most messaging is a great deal of perception and opinion shaping that is worse than just unsavory; it’s damaging. So I go beyond passively not wanting handheld (thus nonstop) access to actively wanting not to be connected. Whereas others share excitement about the latest smartphone or tablet and the speed, cost, and capacity of the service provider for the data line on their devices, I don’t merely demur but insist instead, “keep that nonsense away from me.” I must negate those prerogatives, refuse their claims on my attention, and be undisturbed in my private thoughts while away from the computer, the Internet, and the constant flow of information aimed indiscriminately at me and everyone.

Of course, I win no converts with such refusals. When I was shopping for a new phone recently, the default assumption by the sales clerk was that I wanted bells and whistles. She simply could not conceive of my desire to have a phone that is merely a phone, and the store didn’t sell one anyway. Even worse, since all phones are now by default smartphones, I had a data block put on my account to avoid inadvertently connecting to anything that would require a data line. That just blew her mind, like I was forgoing oxygen. But I’m quite clear that any vulnerability to information either tempting me or forced on me is worth avoiding and that time away from the computer and its analogues is absolutely necessary.

Personal anecdote: I was recently shopping at an appliance retailer (went to look at refrigerators) that had an embedded Apple store. At the counter with three models of the iPhone 6, the latest designations, were three kids roughly 8 to 11 years old (I estimate). They were unattended by parents, who must have believed that if the kids were not causing problems, they were a-okay. The kids themselves were absolutely transfixed — enthralled, really — by the screens, standing silent and motionless (very unlike most kids) with either a fierce concentration or utterly empty heads as they examined the gadgets. They were so zoomed in they took no notice at all of passersby. Parents of earlier generations used the TV as a pacifier or baby sitter the same way, but these kids seemed even more hollow than typical, dull-eyed TV viewers. Only a few days later, at a concert I attended, I watched a child who apparently could not pry his eyes away from the tablet he was carrying — even as he struggled to climb the stairs to his seat. The blue glare of the screen was all over his face.

Both scenes were unspeakably sad, though I might be hard-pressed to convince anyone of that assessment had I intervened. These scenes play out again and again, demonstrating that the youngest among us are the most vulnerable and least able to judge when to turn away, to disconnect. Adults fare no better. Schools and device makers alike have succeeded in selling electronics as “educational devices,” but the reality is that instead of exploring the world around us, people get sucked into a virtual world and the glossy fictions displayed on screens. They ultimately become slaves to their own devices. I mourn for their corrupted mindscapes, distorted and ruined by parents and teachers who ought to be wiser but who themselves have been coopted and hollowed out by mass media.

The phrase enlightened self-interest has been used to describe and justify supposed positive results arising over time from individuals acting competitively, as opposed to cooperatively, using the best information and strategies available. One of the most enduring examples is the prisoner’s dilemma. Several others have dominated news cycles lately.

Something for Nothing

At the Univ. of Maryland, a psychology professor has been offering extra credit on exams of either 2 or 6 points if no more than 10 percent of students elect to receive the higher amount. If more than 10% go for the higher amount, no one gets anything. The final test question, which fails as a marker of student learning or achievement and doesn’t really function so well as a psychology or object lesson, either, went viral when a student tweeted out the question, perplexed by the prof’s apparent cruelty. Journalists then polled average people and found divergence (duh!) between those who believe the obvious choice is 6 pts (or reluctantly, none) and those who righteously characterize 2 pts as “the right thing to do.” It’s unclear what conclusion to draw, but the prof reports that since 2008, only one class got any extra credit by not exceeding the 10% cutoff.
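
Out of curiosity, the professor’s result is easy to approximate with a rough simulation. The sketch below is mine, not the professor’s design: it assumes a hypothetical class of 100 students, each independently choosing the 6-point option with some probability p, and shows how quickly the 10% cutoff dooms the class once p creeps upward.

# Minimal sketch (assumptions mine): class of 100, each student independently
# picks 6 points with probability p, otherwise 2 points. If more than 10% of
# the class picks 6, nobody receives any extra credit.
import random

def run_class(p_pick_six, class_size=100, cutoff=0.10):
    """Return the total extra credit a simulated class actually receives."""
    choices = [6 if random.random() < p_pick_six else 2 for _ in range(class_size)]
    share_six = choices.count(6) / class_size
    return sum(choices) if share_six <= cutoff else 0

random.seed(42)
for p in (0.05, 0.10, 0.20, 0.50):
    trials = [run_class(p) for _ in range(1000)]
    success = sum(1 for t in trials if t > 0) / len(trials)
    print(f"p(choose 6) = {p:.2f}: classes earning any credit = {success:.0%}")

Even a 20 percent chance of any one student defecting leaves almost no simulated class under the cutoff, which tracks with the professor’s report of a single successful class since 2008.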

Roping One’s Eyeballs

This overlong opinion article found in the Religion and Ethics section of the Australian Broadcasting Corporation (ABC) website argues that advertising is a modern-day illustration of the tragedy of the commons:

Expensively trained human attention is the fuel of twenty-first century capitalism. We are allowing a single industry to slash and burn vast amounts of this productive resource in search of a quick buck.

I practice my own highly restrictive media ecology, defending against the fire hose of information and marketing aimed at me (and everyone else) constantly, machine-gun style. So in a sense, I treat my own limited time and attention as a resource not to be squandered on nonsense, but when the issue is scaled up to the level of society, the metaphor is inapt and breaks down. I assert that attention as an exploitable resource functions very differently when considering an individual vs. the masses, which have unique behavioral properties. Still, it’s an interesting idea to consider.

No One’s Bank Run

My last example is the entirely predictable bank runs in Greece that were forestalled when banks closed for three weeks and placed withdrawal limits (euphemism: capital controls) on what cash deposits are actually held in the vaults. Greek banks have appealed to depositors to trust them — that their deposits are guaranteed and there will be no “haircut” such as occurred in Cyprus — but appeals were met with laughter and derision. Intolerance of further risk is an entirely prudent response, and a complete and rapid flight of capital would no doubt have ensued if it weren’t disallowed.

What these three examples have in common is simple: it matters little what any individual may do, but it matters considerably what everyone does. Institutions and systems typically have enough resilience to weather a few outliers who exceed boundaries (opting for 6 pts, pushing media campaigns to the idiotic point of saturation, or withdrawing all of one’s money from a faltering bank), but when everyone acts according to enlightened self-interest, well, it’s obvious that something’s gotta give. In the examples above, no one gets extra points, no one pays heed to much of anything anymore (or perhaps more accurately, attention is debased and diluted to the point of worthlessness), and banks fail. In the professor’s example, the threshold for negative results is only 10%. Different operating environments probably vary, but the modesty of that number is instructive.

More than a few writers have interpreted the tragedy of the commons on a global level. As a power law, it probably functions better at a feudal level, where resources are predominantly local and society is centered around villages rather than megalopolises and/or nation-states. However, it’s plain to observe, if one pays any attention (good luck with that in our new age of distraction, where everyone is trained to hear only what our own culture instructs, ignoring what nature tells us), that interlocking biological systems worldwide are straining and failing under the impacts of anthropogenic climate change. Heating at the poles and deep in the oceans is where the worst effects are currently being felt, but climate chaos is certainly not limited to out-of-sight, out-of-mind locations. What’s happening in the natural world, however, is absolutely and scarily for real, unlike bogus stress tests banks undergo to buttress consumer sentiment (euphemism: keep calm and carry on). Our failure to listen to the right messages and heed warnings properly will be our downfall, but we will have lots of company because everyone is doing it.

This is the fourth of four parts discussing approaches to the prospect of NTE, specifically, “Digital Humanities in the Anthropocene” (DH in the A) by Bethany Nowviskie, which is a transcript of a talk given at the Digital Humanities 2014 conference in Lausanne, Switzerland. Part one is found here; part two is here; part three is here.

Of the three articles reviewed in this blog series, DH in the A is the most confounding. It offers what I thought might be the best approach to the prospect of NTE, which is to confront it openly and hash out some sort of meaningful action to take in the time remaining to us — but from a humanities perspective. However, as Nowviskie’s comments indicate, she is refraining from endorsing most of what she wrote about in favor of the measured, meaningless mumbles of empty academic speech. But before I get to that, let’s have a look at (some of) what Nowviskie covers in her lengthy article. The profusion of people cited and links littered throughout the transcript is pretty impressive, though I daresay few would bother to explore them in much detail. She begins by laying bare the stark reality of mass extinction:

To make plain the premise on which this talk rests: I take as given the scientific evidence that human beings have irrevocably altered conditions for life on our planet. I acknowledge, too, that our past actions have a forward motion: that we owe what ecologists like David Tilman call an “extinction debt” — and that this debt will be paid. As the frequency of disappearance of species leaps from its background rate by a hundred to a thousand times the average, I accept — despite certain unpredictabilities but with no uncertain horror — that we stand on the cusp of a global mass extinction of plants and animals, on the land and in our seas. We are here to live for a moment as best we can, to do our work, and to help our fellow-travelers muddle through their own short spans of time — but we are also possessed of a knowledge that is sobering and rare. We, and the several generations that follow us, will bear knowing witness to the 6th great extinction of life on Earth. This is an ending of things, a barring of doors, not seen since the colossal dying that closed the Mesozoic Era, 66 million years ago. [link in original; emphases mine]


This is the third of four parts discussing approaches to the prospect of NTE, specifically, “The Last of Everything” by Daniel Drumright, a blog essay for denizens of Nature Bats Last, which has recently narrowed its focus to discussion of NTE. Part one is found here; part two is here.

Despite having had the longest period of engagement (of the three authors reviewed in this blog series, I’m guessing) with industrial, economic, and ecological collapse that will precede population collapse and most likely extinction of the preponderance of life on Earth, Drumright still writes with the raw emotion of someone who has just become aware that we are all now staring inescapable death in the face — the death of our species. Indeed, the number of ways Drumright rephrases this damning realization, still waiting to dawn on the popular mind or public sphere, is exhausting. Here is one from the outset of the essay:

This is a commiserative thought experiment written ONLY for those whose lived experiences have afforded them the intellectual/emotional freedom to fully explore the dismal implications that virtually no one will survive near term global starvation.

Again and again, Drumright comes back to the stark reality of NTE using a variety of evocative phrases, none of which fully encapsulates the immensity of NTE because, frankly, humans have great difficulty trying to grok things so far outside standard frames of reference. He rightly points out, too, that no humans up to this point in history have had to contend with such awful conclusions, foreseeable and delayed or postponed in their effect but nonetheless unavoidable. Accordingly, his essay is written with none of the sober detachment and objectivity of the professional critic or academic. Instead, he writes as though he were a parent destroyed by the death of a child. That comparison fails, of course, because despite our familial responsibility for the fate of the living planet, we are all participants in this prospective death, referred to elsewhere on this blog as a megadeath pulse.

Whether because of this emotional overlay or mere sloppiness, Drumright’s essay suffers from unnecessary restatement and rambling, poor grammatical and paragraph structure, and excessive length. (To decipher the tortured language, I edited the copy I pasted into MS Word for safekeeping.) But maybe it’s just as well that the writing is choked and spluttering in places; its tone is what I actually expect even from those who have dealt with the NTE meme complex for some time. We’re just not culturally equipped to absorb the death of our species with much composure.

One of Drumright’s principal ideas is acceptance, as distinguished from (I guess) denial, resistance, desperation, or hope (now frequently renamed as the placebo drug hopium). Only with acceptance does the time remaining begin to offer forms of sanctuary for the soul. But this can only be an individual response and refuge. Conjecture by those with a lengthy period of collapse-awareness more typically envisages scenarios of anarchic chaos and destruction before it’s all over. Rather like the last tortured gasps of empire, an expectation of out-of-control crowds (mobs, really) and wanton, end-of-days benders (including some nasty jaunts toward revenge and victimization) seems a greater likelihood than calm, seasoned acceptance. Indeed, few achieve the peace of mind needed to go to one’s death with grace and resignation.

Drumright provides numerous insights hard won in his time fighting for environmental justice. Only on “this side of acceptance,” however, has he broken through the idols and illusions of political activity and recognized that the trajectory we’re on is not guided or controllable (if it ever was, in fact). Similarly, various impediments stand in the way of acceptance and continue to thwart the next, not-quite-final step:

Let’s start talking about how we’re all going to die, not vaguely, halfheartedly or sarcastically, but specifically so that we can actually begin to get beyond that specter, and start being creative in figuring out how we’re going to live through [the ongoing process of] extinction until that fateful day comes for each of us. Because if we’re talking about acceptance, it’s probably time we get around to actually talking about what IT is we’ve come to accept, beyond endlessly lamenting the loss of all the rest of life, and incessantly debating our legacy of agency which has nevertheless led us to where we are today irrespective of our personal opinions.

Drumright goes on to discuss at length, as a case in point, the suicide this past spring of whistle-blower and truth-teller Michael Ruppert. Somewhat surprisingly, Drumright is critical of Ruppert for having squandered an opportunity to lead the way with a meaningful or beautiful death, albeit by suicide. Part of Drumright’s indignation stems from Ruppert’s position in the vanguard among the collapse-aware (I’ve adopted the term doomer but for reasons yet unclear cannot stomach collapsitarian). Ruppert’s approach was never, best as I can assess, spiritual or therapeutic. Rather, he was always trying to break through the wall of denial, obfuscation, and ignorance that characterizes most people, some who know better and most who don’t. To deny him (after the fact) his own personal response, shocking though ultimately harmless to those of us who puzzled over his choice of exit strategy, seems to me niggardly.

From a wider perspective, Drumright appears to make a mistake of metonymy, lamenting that in his final act Ruppert failed to teach by example how we all might take our leave in the company of love and dignity. The discontinuity between individual and society, however, will not be bridged even by hundreds of prime examples. Irrational fear, as we recognize that death stalks each of us, simply cannot be undone or overcome at the level of society. Still, for those who can take instruction and choose consciously how to end things, Drumright offers thoughtful alternatives. Absent from those alternatives are the usual frothy, lofty incantations that soothe the faithful, which are not actually humane responses, considering how they rely on other idols and illusions. He does, however, offer a glimpse of beauty by reminding us that the bonds of community, whether shared digitally or in person, are to be embraced and cherished in our remaining time.