Posts Tagged ‘Futurism’

I’ve often thought that my father was born at just the right time in the United States: too young to remember much of World War II, too old to be drafted into either the Korean War or the Vietnam War, yet well positioned to enjoy the fruits of the postwar boom and the 1960s counterculture. He retired early with a pension from the same company for which he had worked nearly the entirety of his adult life. Thus, he enjoyed the so-called Happy Days of the 1950s (the Eisenhower era) and all of the boom years, including the Baby Boom, the demographer’s term for my cohort (I came at the tail end). Good for him, I suppose. I admit some envy at his good fortune as most of the doors open to him were closed by the time I reached adulthood. It was the older (by 10–15 years) siblings of Boomers who lodged themselves in positions of power and influence. Accordingly, I’ve always felt somewhat like the snotty little brother clamoring for attention but who was overshadowed by the older brother always in the way. Luckily, my late teens and early twenties also fell between wars, so I never served — not that I ever supported the American Empire’s foreign escapades, then or now.

Since things have turned decidedly for the worse and industrial civilization can’t simply keep creaking along but will fail and collapse soon enough, my perspective has changed. Despite some life options having been withdrawn and my never having risen to world-beater status (not that that was ever my ambition), I recognize that, similar to my father, I was born at the right time to enjoy the relative peace and tranquility of the second half of the otherwise bogus “American Century.” My good fortune allowed me to lead a quiet, respectable life and to reach a reasonable age (not yet retired) at which I now take stock. Mistakes were made, of course; that’s how we learn. But I’ve avoided the usual character deformations that spell disaster for lots of folks. (Never mind that some of those deformations are held up as admirable; the people who suffer them are in truth cretins of the first order, names withheld.)

Those born at the wrong time? Any of those drafted into war (conquest, revolutionary, civil, regional, or worldwide), and certainly anyone in the last twenty years or so. Millennials appeared at the twilight of empire, many of whom are now mature enough to witness its fading glory but generally unable to participate in its bounties meaningfully. They are aware of their own disenfranchisement the same way oppressed groups (religious, ethnic, gender, working class, etc.) have always known they’re getting the shaft. Thus, the window of time one might claim optimal to have been born extends from around 1935 to around 1995, and my father and I both slot in. Beyond that fortuitous window, well, them’s the shakes.

/rant on

Since deleting from my blogroll all doom links and turning my attention elsewhere, the lurking dread of looming collapse (all sorts) has been at low ebb at The Spiral Staircase. Despite many indicators of imminent collapse likewise purged from front-page and top-of-the-broadcast news, evidence continues to mount while citizens contend with other issues, some political and geopolitical, others day-to-day tribulations stemming from politics, economics, and the ongoing pandemic. For instance, I only just recently learned that the Intergovernmental Panel on Climate Change (IPCC — oh yeah … them) issued AR6 last month, the sixth periodic Assessment Report (maybe instead call it the State of the Planet Report, on the model of the State of the Union Address). It’s long, dense reading (the full report is nearly 4,000 pp., whereas the summary for policymakers is a mere 42 pp.) and subject to nearly continuous revision and error correction. The conclusion? Climate change is widespread, rapid, and intensifying. And although it’s true that mundane daily activities occupy center stage in the lives of average folks, there is simply no bigger story or concern for government leaders (I choke on that term) and journalists (that one, too) than climate change because it represents (oh, I dunno …) the collapse of industrial civilization and the early phase of mass extinction. Thus, all politics, geopolitics, economic warfare, class struggle, Wokeism, authoritarian seizure of power, and propaganda filling the minds of people at all levels as well as the institutions they serve amount to a serious misallocation of attention and effort. I will admit, though, that it’s both exhausting and by now futile to worry too much about collapse. Maybe that’s why the climate emergency (the new, improved term) is relegated to background noise easily tuned out.

It’s not just background noise, though, unlike the foreknowledge that death awaits decades from now if one is fortunate enough to persist into one’s 70s or beyond. No, it’s here now, outside (literally and figuratively), knocking on the door. Turn off your screens and pay attention! (Ironically, everyone now gets the lion’s share of information from screens, not print. So sue me.) Why am I returning to this yet again? Maybe I’ve been reviewing too many dystopian films and novels. The better answer is that those charged with managing and administering states and nations are failing so miserably. It’s not even clear that they’re trying, so pardon me, but I’m rather incensed. It’s not that there aren’t plenty of knowledgeable experts compiling data, writing scientific reports, publishing books, and offering not solutions exactly but at least better ways to manage our affairs. Among those experts, the inability to reverse the climate emergency is well enough understood though not widely acknowledged. (See Macro-Futilism on my blogroll for at least one truth teller who absolutely gets it.) Instead, some lame version of the same dire warning issues again and again: if action isn’t taken now (NOW, dammit!), it will be too late and all will be lost. The collective response is not, however, to pull back, rein in, or even prepare for something less awful than the worst imaginable hard landing where absolutely no one survives despite the existence of boltholes and underground bunkers. Instead, it’s a nearly gleeful acceleration toward doom, like a gambler happily forking over his last twenty at the blackjack table before losing and chucking himself off the top of the casino parking structure. Finally free (if fleetingly)!

Will festering public frustration over deteriorating social conditions tip over into outright revolt, revolution, civil war, and/or regime change? Doesn’t have to be just one. Why is the U.S. still developing and stockpiling armaments, maintaining hundreds of U.S. military bases abroad, and fighting costly, pointless wars of empire (defeat in withdrawal from Afghanistan notwithstanding)? Will destruction of purchasing power of the U.S. dollar continue to manifest as inflation of food and energy costs? Is the U.S. Dept. of Agriculture actually doing anything to secure food systems, or does it merely prepare reports like the AR6 that no one reads or acts upon? Will fragile supply lines be allowed to fail entirely, sparking desperation and unrest in the streets far worse than summer 2020? Famine is how some believe collapse will trigger a megadeath pulse, but I wouldn’t count out chaotic violence among the citizenry, probably exacerbated and escalated as regimes attempt (unsuccessfully) to restore social order. Are any meaningful steps being taken to stop sucking from the fossil fuel teat and return to small-scale agrarian social organization, establishing degrowth and effectively returning to the land (repatriation is my preferred term) instead of going under it? Greenwashing doesn’t count. This headline (“We Live In A World Without Consequences Where Everyone Is Corrupt”) demonstrates pretty well that garbage economics are what pass for governance, primarily preoccupied with maintaining the capitalist leviathan that has captured everything (capture ought to be the trending word of 2021 but sadly isn’t). Under such constraint, aged institutions are flatly unable to accomplish or even address their missions anymore.
And this headline (“Polls Show That The American People Are Extremely Angry – And They Are About To Get Even Angrier”) promises that things are about to get much, much worse (omitting the obvious-but-erroneous follow-on “before they get better”) — for the obvious reason that more and more people are at the ends of their ropes while the privileged few attend the Met Gala, virtue signal with their butts, and behave as though society isn’t in fact cracking up. Soon enough, we’ll get to truth-test Heinlein’s misunderstood aphorism “… an armed society is a polite society.”

Those who prophesy dates or deadlines for collapse have often been slightly embarrassed (but relieved) that collapse didn’t arrive on schedule. Against all odds, human history keeps trudging further into borrowed time, kicking cans down roads, blowing bubbles, spinning false narratives, insisting that all this is fine, and otherwise living in make-believe land. Civilization has not quite yet reached the end of all things, but developments over the last couple months feel ever more keenly like the equivalent of Frodo and Sam sitting atop Mount Doom, just outside the Cracks of Doom (a/k/a Sammath Naur), except that humanity is not on a noble, sacrificial mission to unmake the One Ring, whatever that might represent outside of fiction (for Tolkien, probably industrial machines capable of planetary destruction, either slowly and steadily or all at once; for 21st-century armchair social critics like me, capitalism). All former certainties, guarantees, sureties, continuities, and warranties are slipping away despite the current administration’s assurances that the status quo will be maintained. Or maybe it’s merely the transition of summer into fall, presaging the annual dormancy of winter looking suspiciously this year like the great dying. Whatever. From this moment on and in a fit of exuberant pique, I’m willing to call the contest: humanity is now decidedly on the down slope. The true end of history approaches, as no one will be left to tell the tale. When, precisely, the runaway train finally careens over the cliff remains unknown though entirely foreseeable. The concentration of goofy references, clichés, and catchphrases above — usually the mark of sophomoric writing — inspires me to indulge (further) in gallows humor. Consider these metaphors (some mixed) suggesting that time is running out:

  • the show’s not over til it’s over, but the credits are rolling
  • the chickens are coming home to roost
  • the canary in the coal mine is gasping its last breath
  • the fat lady is singing her swan song
  • borrowed time is nearly up
  • time to join the great majority (I see dead people …)
  • the West fades into the west
  • kiss your babies goodnight and kiss your ass goodbye

/rant off

Watched Soylent Green (1973) a few days ago for the first time since boyhood. The movie, directed by Richard Fleischer, is based on Harry Harrison’s novel Make Room! Make Room! (which I haven’t read) and oddly enough has not yet been remade. How to categorize the film within familiar genres is tricky. Science fiction? Disaster? Dystopia? Police procedural? It checks all those boxes. Chief messages, considering its early 70s origin, are pollution and overpopulation, though global warming is also mentioned less pressingly. The opening montage looks surprisingly like what Godfrey Reggio did much better with Koyaanisqatsi (1982).

Soylent Green is set in 2022 — only a few months away now but a relatively remote future in 1973 — and the Earth is badly overpopulated, environmentally degraded, overheated, and struggling to support teeming billions mostly jammed into cities. Details are sketchy, and only old people can remember a time when the biosphere remained intact; whatever disaster had occurred was already long past. Science fiction and futuristic films are often judged improperly by how correct their prophecies turn out, as though enjoyment hinged on predictive accuracy. Soylent Green fares well in that respect despite its clunky, dated, 70s production design. Vehicles, computer screens, phones, wardrobe, and décor are all, shall we say, quaintly vintage. But consider this: had collapse occurred in the 70s, who’s to say that cellphones, flat screens, and the Internet would ever have been developed? Maybe the U.S. (and the world) would have been stalled in the 70s much the way Cuba is stuck in the 50s (when the monumentally dumb, ongoing U.S. embargo commenced).

The film’s star is Charlton Heston, who had established himself as a handsomely bankable lead in science fiction, disaster, and dystopian films (e.g., The Omega Man and The Planet of the Apes series). Though serviceable, his portrayal is remarkably plain, revealing Heston as a poor man’s Sean Connery or John Wayne (both far more charismatic contemporaries of Heston’s even in lousy films). In Soylent Green, Heston plays Detective Robert Thorn, though he’s mostly called “Thorn” onscreen. Other characters below the age of 65 or so also go by only one name. They all grew up after real foodstuffs (the titular Soylent Green being a synthetic wafer reputedly made out of plankton — the most palatable of three colors) and creature comforts became exceedingly scarce and expensive. Oldsters are given the respect of first and last names. Thorn investigates the assassination of a high-ranking industrialist to its well-known conspiratorial conclusion (hardly a spoiler anymore) in that iconic line at the very end of the film: “Soylent Green is people!” Seems industrialists, to keep people fed, are making food of human corpses. That eventual revelation drives the investigation and the film forward, a device far tamer than today’s amped-up action thrillers where, for instance, a mere snap of the fingers can magically wipe out or restore half of the universe. Once the truth is proclaimed by Thorn (after first being teased in whispers into a couple of ears), the movie ends rather abruptly. That’s also what makes it a police procedural set in a disastrous, dystopic, science-fiction future stuck distinctively in the past: once the crime/riddle is solved, the story and film are over with no dénouement whatsoever.

Some of the details of the film, entirely pedestrian to modern audiences, are modestly enjoyable throwbacks. For instance, today’s penchant for memes and slang renaming of commonplace things is employed in Soylent Green. The catchphrase “Tuesday is Soylent Green Day” appears but is not overdriven. A jar of strawberries costs “150D,” which I first thought might be future currency in the form of debits or demerits but is probably just short for dollars. Front end loaders used for crowd control are called “scoops.” High-end apartment building rentals come furnished with live-in girls (prostitutes or gold-diggers, really) known as Furniture Girls. OTOH, decidedly 70s-era trash trucks (design hasn’t really changed much) are not emblazoned with the corporate name or logo of the Soylent Corporation (why not?). Similarly, (1) dressing the proles in dull, gray work clothes and brimless caps, (2) having them sleep on stairways or church refuges piled on top of each other so that characters have to step gingerly through them, (3) being so crammed together in protest when the Tuesday ration of Soylent Green runs short that they can’t avoid the scoops, (4) dripped blood clearly made of thick, oversaturated paint (at least on the DVD), and (5) a sepia haze covering daytime outdoor scenes are fairly lazy nods to world building on a low budget. None of this is particularly high-concept filmmaking, though the restraint is appreciated. The sole meme that should have been better deployed (the film leaves it entirely unprepared) is “going home,” a euphemism for reporting voluntarily to a processing plant (into Soylent Green, of course) at the end of one’s suffering life. Those who volunteer are shown 30 minutes of scenes, projected on a 360-degree theater that envelops the viewer, depicting the beauty and grandeur of nature before it had disappeared. This final grace offered to people (rather needlessly) serves the environmental message of the film well and could have been “driven home” a bit harder.

Like other aspects of the film’s back story, how agricultural systems collapsed is largely omitted. Perhaps such details (conjecture) are in the book. The film suggests persistent heat (no seasons), and accordingly, characters are made to look like they never stop sweating. Scientific projections of how global warming will manifest do in fact point to hothouse Earth, though seasons will still occur in temperate latitudes. Because such changes normally occur in geological time, it’s an exceedingly slow process compared to human history and activity. Expert inquiry into the subject prophesied long ago that human activity would trigger and accelerate the transition. How long it will take is still unknown, but industrial civilization is definitely on that trajectory and humans have done little since the 70s to curb self-destructive appetites or behaviors — except of course talk, which in the end is just more hot air. Moreover, dystopian science fiction has shifted over the decades away from self-recrimination to a long, seemingly endless stream of superheroes fighting crime (and sometimes aliens). Considering that film is entertainment meant to be enjoyed, the self-serious messages embedded in so many 70s-era disaster films warning us of human hubris are out of fashion. Instead, superpowers and supersuits rule cinema, transforming formerly uplifting science-fiction properties such as Star Trek into hypermilitaristic stories of interstellar social collapse. Soylent Green is a grim reminder that we once knew better, even in our entertainments.

“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signalling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but to human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently albeit temporarily live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations to funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coin. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are ineffective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. It used to be said that the Internet is forever, so one should wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) can be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

This article at Scientific American argues in support of a fundamental change to its style sheet. A style sheet, for the uninitiated, is a guide to how a publication presents its output, including formatting, commonly used spellings, and preferred grammar. For instance, should ordinals (e.g., 1st, 2nd, 3rd) be raised? Or should web be capitalized when referring to the World Wide Web? The change Scientific American just adopted is dropping the term climate change in favor of climate emergency. Well, good for Scientific American, I guess. My lack of enthusiasm or urgency — the very urgency signaled by the revised term now that the emergency is upon us (um, has been for decades already if one thinks in terms of geological or evolutionary time rather than mere decades of human history) — stems not from the truthfulness or effectiveness of the arguments but from my assessment that the problem is flatly unsolvable at this late date and that, as a global civilization, we’re doing almost nothing to combat it anyway. That’s been the case since the basic problem swung into public view in the 1970s, and it’s been the case since James Howard Kunstler published The Long Emergency in 2005.

Climate emergency deniers have claimed that recent volcanic eruptions in the Caribbean, Iceland, and Hawaii have erased or nullified all the efforts by humans to stem the release of greenhouse gases from industrial activity. According to this link, that’s comparing apples and oranges: peak volcanic activity vs. a sliver of human activity. Since 1750 (a conventional start date of the Industrial Revolution), it’s human activity driving the climate emergency, not volcanic activity. Moreover, industrial activity shows no signs of abating, at least until it all creaks to a halt when the infernal machine will no longer crank. The blocked Suez Canal and deep freeze in Texas both remind how fragile industrial infrastructure is; just wait for a Carrington Event to fry everything at once. This link explains human carbon emissions (also mentions volcanoes), which continue to increase in volume every single year. (This past year might (might!) be an anomaly due to the pandemic, but things are already ramping back up.) And besides, humans can’t control volcanoes (did someone suggest dropping nukes into them to “seal them up”?). We can’t even control ourselves.

Some while back, I purged from my blogroll all the doom links and determined that industrial civilization is in its death throes, so why bother blogging about it anymore? Similarly, the last time I cited the Doomsday Clock in January 2020, it was (metaphorically) 100 seconds to midnight. The Clock today still sits at that harrowing eve of destruction, and I didn’t link to the January 2021 statement, which includes discussions of the novel coronavirus, nuclear threats, and climate change (the older term), summarizing them together as a wake-up call. Really? So now it’s time to get serious? Nope, don’t think so. The proper time is long past due, the catastrophic future is already locked in, and we’ve been steadfastly pretending there is nothing to see (so that there will eventually be nothing to do — a self-fulfilling prophecy). It’s merely unknown when members of the human species will begin dropping like flies.

/rant on

Remember all those folks in the weeks and days preceding election day on November 3, 2020, who were buying guns, ammo, and other provisions in preparation for civil breakdown? (No one known personally, of course, and gawd no not actually any of us, either; just them other others who don’t read blogs or anything else.) Well, maybe they were correct in adopting the precautionary principle (notably absent from a host of other perils besetting us). But as of this writing, nothing remotely resembling widespread disruption — feared by some, hotly anticipated by others — has developed. But wait! There’s still time. Considering Americans were set up by both political parties to distrust the outcome of the presidential race no matter which candidate claimed to have prevailed, we now face weeks or months of legal challenges and impatient formations of agitators (again, both sides) demanding their candidate be declared the winner (now, dammit!) by the courts instead of either official ballot-counters or the liberal-biased MSM. To say our institutions have failed us, and further, that political operatives all the way up to the sitting president have been openly fomenting violence in the streets, is a statement of the obvious.

Among my concerns more pressing than who gets to sit in the big chair, however, is the whipsawing stock market. Although no longer an accurate proxy of overall economic health or asset valuation, the stock market reacts with thorough daily irrationality to every rumor of, say, a vaccine for the raging coronavirus, or of resumption of full economic activity and profitability despite widespread joblessness, renewed lockdowns, and a massive wave of homelessness in the offing due to bankruptcies, evictions, and foreclosures. None of this bodes well for the short-term future and maintenance of, oh, I dunno, supply lines to grocery stores. Indeed, I suspect we are rapidly approaching our very own Minsky Moment, which Wikipedia describes as “a sudden, major collapse of asset values which marks the end of the growth phase of a cycle in credit markets or business activity” [underlying links omitted]. This is another prospective event (overdue, actually) for which the set-up has been long prepared. Conspiratorial types call it “the great reset” — something quite different from a debt jubilee.

For lazy thinkers, rhyming comparisons with the past frequently resort to calling someone a Nazi (or the new Hitler) or reminding everyone of U.S. chattel slavery. At the risk of being accused of similar stupidity, I suggest that we’re not on the eve of a 1929-style market crash and ensuing second great depression (though those could well happen, too, bread lines having already formed in 2020) but are instead poised at the precipice of hyperinflation and intense humiliation akin to the Weimar Republic in 1923 or so. American humiliation will result from recognition that the U.S. is now a failed state and doesn’t even pretend anymore to look after its citizens or the commonweal. Look no further than the two preposterous presidential candidates, neither of whom made any campaign promises to improve the lives of average Americans. Rather, the state has been captured by kleptocrats. Accordingly, no more American exceptionalism and no more lying to ourselves how we’re the model for the rest of the world to admire and emulate.

Like Germany in the 1930s, the U.S. has also suffered military defeats and stagnation (perhaps by design) and currently demonstrates a marked inability to manage itself economically, politically, or culturally. Indeed, the American people may well be ungovernable at this point, nourished on a thin gruel of rugged individualism that forestalls our coming together to address adversity effectively. The possibility of another faux-populist savior arising out of necessity only to lead us over the edge (see the Great Man Theory of history) seems eerily likely, though the specific form that descent into madness would take is unclear. Recent history already indicates a deeply divided American citizenry having lost its collective mind but not yet having gone fully apeshit, flinging feces and destroying what remains of economically ravaged communities for the sheer sport of it. (I’ve never understood vandalism.) That’s what everyone was preparing for with emergency guns, ammo, and provisions. How narrowly we escaped catastrophe (or merely delayed it) should be clear in the fullness of time.

/rant off

I’ve been holding in mind for five months now the article at this link (an informal interview with neuroscientist and psychologist Oliver J. Robinson), waiting for conditions when I could return to forms of media consumption I prefer, namely, reading books, magazines, and long-form journalism. When I try to read something substantive these days, I find myself going over the same paragraph repeatedly, waiting in vain for it to register. Regrettably, the calm, composure, and concentration needed for deep reading have been effectively blocked since March 2020 as we wait (also in vain) for the pandemic to burn itself out. (I could argue that the soul-destroying prospect of industrial collapse and near-term human extinction is having the same effect for much longer.) So my attention and media habits have been resignedly diverted to crap news gathering, mostly via video, and cheap entertainments, mostly streaming TV (like everyone else, though others may complain less). The lack of nourishment is noticeable. Considering we’re only weeks away from the U.S. presidential election, stress levels are ratcheting up further, and civil authorities prepare for “election riots” (is that a new term?), which I can only assume means piling violence upon violence under the pretense of keeping-the-peace or law-and-order or some other word string rendered meaningless now that the police are widely acknowledged to be significant contributors to the very problems they are meant to address. These unresolved issues (pandemic, police violence, civil unrest) give rise to pathological anxiety, which explains (according to Robinson, disclaimers notwithstanding) why it’s so hard to read.

To say we live in unprecedented times is both obvious and banal. Unique stresses of modernity have led multiple times to widespread madness and conflict, as well as attempts to recapture things lost in previous shifts from other styles of social organization. Let me not mince words regarding what’s now happening: we’re in an era of repudiation of the Enlightenment, or a renewed Counter-Enlightenment. I’ve stated this before, and I’m not the only one making this diagnosis (just learned it’s a rather old idea — I’m always late to the party). For instance, Martin Jay’s essay “Dialectic of Counter-Enlightenment” appears to have been floating around in various forms since 2011. Correlation of this renewal of Counter-Enlightenment fervor with literacy seems clear. Although basic literacy has improved markedly worldwide over the past two centuries, especially in the developing world, deep literacy is eroding:

Beyond self-inflicted attention deficits, people who cannot deep read – or who do not use and hence lose the deep-reading skills they learned – typically suffer from an attenuated capability to comprehend and use abstract reasoning. In other words, if you can’t, or don’t, slow down sufficiently to focus quality attention – what Wolf calls “cognitive patience” – on a complex problem, you cannot effectively think about it.

Considering that deep literacy is absolutely critical to clear thinking (or critical thought, if you prefer, not to be confused with the Frankfurt School’s critical theory discussed in Jay’s essay), its erosion threatens fundamental institutions (e.g., liberal democracy and the scientific method) that constitute the West’s primary cultural inheritance from the Enlightenment. The reach of destruction wrought by reversing course via the Counter-Enlightenment cannot be overstated. Yet many among us, completely unable to construct coherent ideas, are rallying behind abandonment of Enlightenment traditions. They’re ideologues who actively want to return to the Dark Ages (while keeping modern tech, natch). As with many aspects of unavoidable cultural, social, environmental, and civilizational collapse, I have difficulty knowing quite what to hope for. So I won’t condemn retrograde thinking wholly. In fact, I feel empathy toward calls to return to simpler times, such as with German Romanticism or American Transcendentalism, both examples of cultural and aesthetic movements leading away from the Enlightenment.

Long before these ideas coalesced for me, I had noted (see here, here, and here) how literacy is under siege and a transition back toward a predominantly oral culture is underway. The Counter-Enlightenment is either a cause or an effect, I can’t assess which. At the risk of being a Cassandra, let me suggest that, if these times aren’t completely different from dark episodes of the past, we are now crossing the threshold of a new period of immense difficulty that makes pathological anxiety blocking the ability to read and think a minor concern. Indeed, that has been my basic assessment since crafting the About Brutus blurb way back in 2006. Indicators keep piling up. So far, I have a half dozen points of entry from other cultural commentators exploring this theme to process and digest, though they typically don’t adopt wide enough historical or cultural perspectives. Like the last time I failed to synthesize my ideas into a multipart blog series, I don’t have a snazzy title, and this time, I don’t even have planned installment titles. But I will do my best to roll out in greater detail over several blog posts some of the ways the Counter-Enlightenment is manifesting anew.

Supporting the Vietnam war was dumb. Supporting the Iraq invasion after being lied
to about Vietnam was an order of magnitude dumber. Supporting any US war agendas
after being lied to about Iraq is an order of magnitude even dumber than that.
—Caitlin Johnstone

Upon rereading, and with the advantage of modest hindsight, I think I got it exactly correct in this 5-year-old blog post. Even the two brief comments are correct. More specifically, the United States is understood to be the sole remaining military superpower following the collapse of the Soviet Union in 1991. Never mind that numerous countries count themselves members of the nuclear club (cue Groucho Marx joke) and thus possess sufficient power to destroy the world. Never mind that the U.S. failed to win the Korean War or the Vietnam War (the two major U.S. military involvements post-WWII), or in fact any of numerous 21st-century wars (undeclared, de facto, continuing). Never mind that the U.S. has been successful at multiple smaller regime-change actions, often on the back of a civil war instigated by the U.S. and purposefully designed to install a puppet leader. And never mind that the capitalist competition for control of economic resources and capture of perpetual growth is being won handily by China. Nope, the U.S. is no longer the only superpower but is instead busy transitioning from superpower (military and economic) to failed state. Or in the language of that old blog post, the U.S. is now a geopolitical Strong/Stupid hybrid but is actively deploying stupidity in a feverish play to be merely Stupid. The weirdest aspect, perhaps, is that it’s being done right in front of god and everybody, yet few bother to take notice.

It’s no stretch to assert that in the U.S. in particular (but also true of nearly every regime across the world), we’re piling stupidity upon stupidity. If I were inclined to go full conspiracy like some QAnon fool, I’d have to say that the power elite have adopted a deep, 3D-chess strategy that means one of two possible things using the Rock-Paper-Scissors power dynamic algorithm (which, unlike tic-tac-toe, produces a winner) modified and inverted to Strong-Stupid-Smart: it’s either (1) very Smart of them to appear so Stupid, granting victory (against all appearances) over Strong (but only Strong in a three-legged contest), or (2) they reject the algorithm entirely in the misguided belief that nuthin’ beats stoopid. That second option would indeed be entirely consistent with Stupid.

Take for instance three looming issues: the pandemic (and its follow-on effects), the U.S. presidential election (ugh, sorry, it’s unavoidable), and climate change. They loom threateningly despite being well underway already. But with each, we’ve acted and behaved very stupidly, stunningly so I would argue, boxing ourselves in and doing worse damage over time than if we had taken proper steps early on. But as suggested in a previous blog post, the truth is that decision-makers haven’t really even tried to address these issues with the purpose of solving, resolving, winning, remedying, or ameliorating entirely predictable outcomes. Rather, issues are being either swept under the rug (ignored with the futile hope that they will go away or resolve themselves on their own) or displaced in time for someone else to handle. This second option occurs quite a lot, which is also known as kicking the can down the road or stealing from the future (as with sovereign debt). What happens when there’s no more future (for humans and their institutions, anyway) because it’s been squandered in the present? You already know the answer(s) to that question.

Fantasies and delusions rush into the space
that reason has vacated in fear of its life.

—James Howard Kunstler

Since I first warned that this blog post was forthcoming, conditions of modern American life we might have hoped would be resolved by now remain intransigently with us. Most are scrambling to adjust to the new normal: no work (for tens of millions), no concerts, no sports (except for events staged for the camera to be broadcast later), little or no new cinema (but plenty of streaming TV), no school or church (except for abysmal substitutes via computer), no competent leadership, and no end in sight. The real economy swirls about the drain despite the fake economy (read: the stock market a/k/a the Richistan economy) having first shed value faster than ever before in history then staged a precipitous taxpayer-funded, debt-fueled recovery only to position itself for imminent resumption of its false-started implosion. The pandemic ebbed elsewhere then saw its own resumption, but not in the U.S., which scarcely ebbed at all and now leads the world in clownish mismanagement of the crisis. Throughout it all, we extend and pretend that the misguided modern age isn’t actually coming to a dismal close, based as it is on a consumption-and-growth paradigm that anyone even modestly numerically literate can recognize is, um, (euphemism alert) unsustainable.

Before full-on collapse (already rising over the horizon like those fires sweeping across the American West) hits, however, we’ve got unfinished business: getting our heads (and society) right regarding which of several competing ideologies can or should establish itself as the righteous path forward. That might sound like the proverbial arranging of deck chairs on the RMS Titanic, but in an uncharacteristically charitable moment, let me suggest that righting things before we’re done might be an earnest obligation even if we can’t admit openly just how close looms the end of (human) history. According to market fundamentalists, corporatists, and oligarchs, Socialism and Marxism, or more generally collectivism, must finally have a stake driven through its undead heart. According to radical progressives, Black Lives Matter, and Antifa, fascism and racism, or more generally intolerance, deserve to be finally stamped out, completing the long arc of history stalled after the Civil Rights Era. And according to barely-even-a-majority-anymore whites (or at least the conservative subset), benefits and advantages accrued over generations, or more generally privilege, must be leveraged, solidified, and maintained lest the status quo be irretrievably lost. Other factions no doubt exist. Thus, we are witnessing a battle royale among narratives and ideologies, none of which IMO crystallize the moment adequately.

Of those cited above, the first and third are easy to dismiss as moribund and self-serving. Only the second demonstrates any concern for the wellbeing of others. However, and despite its putative birthplace in the academy, it has twisted itself into pretzel logic and become every bit as intolerant as the scourges it rails against. Since I need a moniker for this loose, uncoordinated network of movements, I’ll refer to them as the Woke Left, which signifies waking up (i.e., being woke) to injustice and inequity. Sustained analysis of the Woke Left is available from James Lindsay through a variety of articles and interviews (do a search). Lindsay demonstrates handily how the Woke Left’s principal claims, often expressed through its specialized rhetoric called Critical Theory, are actually an inversion of everything it pretends to be. This body of thought has legitimate historical and academic lineage, so it’s arguable that only its most current incarnation in the Woke Left deserves scorn.

Two recently published books exemplify the rhetoric of the Woke Left: White Fragility (2018) by Robin DiAngelo and How to Be an Antiracist (2019) by Ibram Kendi. Although I’ve read neither book, I’m aware of numerous scathing reviews that point out fundamental problems with the books and their authors’ arguments. Foremost among them is what’s sometimes called a Kafka trap, a Catch-22 in which all avenues of argument lead inescapably toward guilt, typically some form of original sin. Convinced they are on the righteous right side of history, Woke Left protesters and agitators have been harassing and physically threatening strangers to demand support for the cause, i.e., compliance. What cause is a good question, considering a coherent program has yet to be articulated. Forcing others to choose either side of a false binary — with us or against us — is madness, but that’s the cultural moment at which we’ve arrived. Everyone must align their ideology with some irrational narrative while being put at risk of cancellation and/or destruction no matter what alignment is ventured.

If things go south badly on the heels of contested election results this fall as many expect — the pump already primed for such conflict — and a second civil war ensues, I rather expect the Woke Left to be the first to fail and the other two, each representing the status quo (though different kinds), to be in an extended battle for control of whatever remains of the union. I can’t align with any of them, since by my lights they’re all different kinds of crazy. Sorta makes ya wonder, taking history as an indicator, if a fourth or fifth faction won’t appear before it’s a wrap. I don’t hold out any hope for any faction steering us competently through this crisis.

I had at least two further ideas for this third part of a series, but frankly, given the precipitous turn of events over the past month or so, nothing feels appropriate to write about just yet other than the global pandemic that has staggered society, which now reels from people being forced apart from each other and from having the way of life to which we are adapted suddenly ripped out from beneath us. As the voiceover at the beginning of one of the Lord of the Rings movies intones rather soberly, “The world … has changed ….” That was my assessment here, though I was really thinking of the post-truth public sphere.

Many are already admitting that we will never be able to go back to what once was, that what broke will stay forever broken. And while the eventual response may be interpreted in sweet-lemon style as a reform opportunity or beckoning call to greatness, I daresay a far more likely result is that mass death, sickness, and ruin will create a critical mass of desperate people not so willing to stay hunkered down waiting for the extended crisis to pass. Indeed, the bunker mentality already imprinted on our minds as we cringe before the next in a series of injurious blows can’t be expected to endure. Folks will take to the streets with all their stockpiled guns and ammo, seeking something, anything to do, rather than dying quietly, meekly, alone, at home. The metaphor of being pummeled into submission or to death is probably incorrect. Right now, we’re still only partway up one of those parabolic curves that ultimately points skyward. Alternatively, it’s a crescendo of pain that overwhelms until nothing functions anymore.

If surviving historians are able to piece together the story some time hence, one possibility will be to observe that the abundance we sorta enjoyed during two centuries of cheap energy did not develop into anything resembling an enlightened style of social organization that could be sustained or indeed even prepare us adequately for inevitable black swan events. Such discontinuities are entirely predictable by virtue of their inevitability, though predicting their precise timing is a fool’s errand. Natural disasters are the obvious example, and despite organizations and agencies scattered throughout all levels of government, we’re caught flat-footed nearly every time disaster strikes. This global pandemic is no different, nor is the collapse of industrial civilization or runaway climate change. The current crisis is the first major kick in the teeth that may well cascade domino-style into full-on collapse.

As the crisis deepens, top leaders are often found to be worthless. Where is Pence, appointed more than a month ago to coordinate a coronavirus task force? It’s quite unlike a major political figure to do his or her work quietly and competently without media in tow. Even incompetence gets coverage, but Pence is nowhere to be seen. Must be self-quarantining. Some leaders are even worse than worthless; they actively add to the misery. Mainstream media may also have finally gotten hip to the idea that hanging on every insipid word uttered by that gaping chasm of stupidity that is our president is no longer a ratings bonanza to be tolerated in exchange for fruitless fact-checking missions. I fantasize about press events where correspondents heckle and laugh the fatuous gasbag (or his apologists) off the podium. Regrettably, there seems to be no bottom to the humiliation he can withstand so long as attention stays riveted on him. Perhaps the better response to his noisome nonsense would be stone silence — crickets.