Posts Tagged ‘Futurism’

What if everyone showed up to an event with an expectation of using all their tech and gadgets to facilitate the group objective only to discover that nothing worked? You go to a fireworks display but the fireworks won’t ignite. You go to a concert but the instruments and voices make no sound. You go to a sporting event but none of the athletes can move. Does everyone just go home? Come back another day to try again? Or better yet, you climb into your car to go somewhere but it won’t start. Does everyone just stay home and the event never happens?

Those questions relate to a new “soft” weapons system called Scorpius (no link). The device or devices are said to disrupt and disable enemy tech by issuing a narrowly focused electromagnetic beam. (Gawd, just call it a raygun or phaser. No embarrassment over on-the-nose naming of other things.) Does the beam fry the electronics of its target, like a targeted Carrington event, or simply scramble the signals, making the tech inoperable? Can tech be hardened against attack, such as being encased in a Faraday cage? Wouldn’t such isolation itself make tech nonfunctional, since electronic communication between locations is the essence of modern devices, especially for targeting and telemetry? These are a few more idle questions (unanswered, since the announcements of new weaponry I consulted read like advertising copy) about this latest advance (euphemism alert) in the arms race. Calling a device that can knock a plane (um, warplane) out of the sky (crashing somewhere, obviously) “soft protection” because the mechanism is a beam rather than a missile somewhat obfuscates the point. Sure, ground-based technologies might be disabled without damage, but would that require continuous beam-based defense?

I recall an old Star Trek episode, the one with the Gorn, where omnipotent aliens disabled all weapons systems of two spaceships postured for battle by superheating controls, making them too hot to handle. Guess no one thought of oven mitts or pencils to push the “Fire!” buttons. (Audiences were meant to think, considering Star Trek was a thinking person’s entertainment, but not too much.) Instead of mass carnage, the two captains were transported to the surface of a nearby planet to battle by proxy (human vs. reptile). In quintessential Star Trek fashion — imagining a hopeful future despite militaristic trappings — the human captain won not only the physical battle but the moral battle (with self) by refusing to dispatch the reptile captain after he/it was disabled. The episode posed interesting questions so long as no one searched in the weeds for plausibility.

We’re confronted now, and again, with many of these same questions, some technical, some strategic, but more importantly, others moral and ethical. Thousands of years of (human) history have already demonstrated the folly of war (but don’t forget profitability). It’s a perennial problem, and from my vantage point, combatants on all sides are no closer to Trekkie moral victory now than in the 1960s. For instance, the U.S. and its allies are responsible for millions of deaths in Iraq, Afghanistan, Syria, and elsewhere just in the last two decades. Go back further in time and imperial designs look more and more like sustained extermination campaigns. But hey, we came to play, and any strategic advantage must be developed and exploited, moral quandaries notwithstanding.

It’s worth pointing out that in the Gorn episode, the captains were deprived of their weapons and resorted to brute force before the human improvised a projectile weapon out of materials handily strewn about, suggesting perhaps that intelligence is the most deadly weapon. Turns out to have been just another arms race.

A quick search revealed that over 15 years of blog posts, the word macrohistory has been used only once. On reflection, macrohistory is something in which I’ve been involved for some time — mostly as a dilettante. Several book reviews and three book-blogging series (one complete, two either on hiatus or fully abandoned) concern macrohistory, and several of my own multipart blog posts connect disparate dots across broader topics (if not quite history in the narrow sense). My ambition, as with macrohistory, is to tease out better (if only slightly) understandings of ourselves (since humans and human culture are obviously the most captivating thing evar). Scientists direct similar fascination to the inner workings of nonhuman systems — or at least larger systems in which humans are embedded. Thus, macrohistory can be distinguished from natural history by their objects of study. Relatedly, World-Systems Theory associated with Immanuel Wallerstein and The Fourth Turning (1997 book by William Strauss and Neil Howe) take similarly broad perspectives and attempt to identify historical dynamics and patterns not readily apparent. Other examples undoubtedly exist.

This is all preliminary to discussing a rather depressing article from the December 2020 issue of Harper’s Magazine: Rana Dasgupta’s disquieting (ahem) essay “The Silenced Majority” (probably behind a paywall). The subtitle poses the question, “Can America still afford democracy?” This innocuous line begs the question of whether the U.S. (America and the United States of America [and its initialisms U.S. and U.S.A.] being sloppily equivalent almost everywhere, whereas useful distinctions describe the United Kingdom, Great Britain, and England) actually has or practices democracy anymore, to which many would answer flatly “nope.” The essay is an impressive exercise, short of book length, in macrohistory, though it’s limited to Western cultures, which is often the case with history told from inside the bubble. Indeed, if (as the aphorism goes) history is written/told primarily by the victors, one might expect to hear only of an ongoing series of victories and triumphs with all the setbacks, losses, and discontinuities excised like some censored, curated Twitter or Facebook (er, Meta) discussion. One might also wonder how that same history reads when told from the perspective of non-Western countries, especially those in transitional regions such as Poland, Ukraine, Turkey, and Iran or those with histories long predating the rise of the West roughly 500 years ago, e.g., China, Japan, Egypt, and the lost cultures of Central America. The resentments of the Islamic world, which has been eclipsed by the West, are a case in point. My grasp of world history is insufficient to entertain those perspectives. I note, however, that with globalism, the histories of all regions of the world are now intimately interconnected even while perspectives differ.

Dasgupta describes foundational Enlightenment innovations that animate Western thinking, even though the ideas are often poorly contextualized or understood. To wit:

In the seventeenth century, England was an emerging superpower. Supremacy would come from its invention of a world principle of property. This principle was developed following contact with the Americas, where it became possible to conjure vast new English properties “out of nothing”—in a way that was impracticable, for instance, in the militarized, mercantile societies of India. Such properties were created by a legal definition of ownership designed so that it could be applied only to the invaders. “As much land as a man tills, plants, improves, cultivates, and can use the product of,” John Locke wrote in 1689, “so much is his property.” When combined with other new legal categories such as “the savage” and “the state of nature,” this principle of property engendered societies such as Carolina, where Locke’s patron, the first earl of Shaftesbury, was a lord proprietor.

Obvious, isn’t it, that by imposing the notion of private property on indigenous inhabitants of North America, colonialists established ownership rights over territories where none had previously existed? Many consider that straightforward theft (again, begging the question) or at least fencing the commons. (Attempts to do the same in the open oceans and in space [orbit] will pick up as technology allows, I surmise.) In addition, extension of property ownership to human trafficking, i.e., slavery and its analogues still practiced today, has an exceptionally long history and was imported to the Americas, though indigenous populations proved to be poor candidates for subjugation. Accordingly, others were brought to North America in a slave trade that extended across four centuries.

Dasgupta goes on:

From their pitiless opposition to the will of the people, we might imagine that British elites were dogmatic and reactionary. (Period dramas depicting stuck-up aristocrats scandalized by eccentricity and innovation flatter this version of history.) The truth is that they were open-minded radicals. They had no sentimentality about the political order, cutting the head off one king and sending another into exile. They could invent financial and legal structures (such as the Bank of England, founded in 1694) capable of releasing unprecedented market energies. Even their decision to exploit American land with African labor demonstrated their world-bending pursuit of wealth. Their mines and plantations would eventually supply the capital for the first industrial revolution. They loved fashion and technology, they believed in rationality, progress, and transparency. They were the “founding fathers” of our modern world.

And yet they presided over a political system as brutal as it was exclusive. Why? The answer is simple. They could not afford democracy, but also, crucially, they did not need it. [emphasis in original]

So much for the awe and sacred respect in which Enlightenment philosophers and the Founders are held — or used to be. Statues of these dudes (always dudes, natch) are being pulled down all the time. Moreover, association of liberal democracy with the 17th century is a fundamental mistake, though neoliberalism (another poorly defined and understood term) aims to shift backwards to a former or hybrid state of human affairs some are beginning to call digital feudalism.

The article goes on to discuss the balancing act and deals struck over the course of centuries to maintain economic and political control by the ownership class. It wasn’t until the 1930s and the postwar economic boom in the U.S. that democracy as commonly understood took root significantly. The labor movement in particular was instrumental in forcing FDR’s New Deal social programs, even though populism and socialism as political movements had been successfully beaten back. Interestingly, the hallowed American nuclear family (limited in its scope racially), an ahistorical formation that enjoyed a roughly 30-year heyday from 1945 to 1975, coincides with the rise of the American middle class and now-aged democratic institutions. They’re all connected with widely distributed wealth and prosperity. But after the oil crisis and stagflation of the middle 1970s, gains enjoyed by the middle class have steadily eroded and/or been actively beaten back (again!) so that dominant themes today are austerity imposed on the masses and inequality coughing up hundy-billionaires with increasing frequency. Estimates are that 30–40% of the American citizenry lives in poverty, bumping up against failed-state territory. Inequality has returned to, if not exceeded, Gilded Age levels. Dasgupta fails to cite perhaps the major underlying cause of this shift away from affordable democracy, back toward the brutal, world principle of property: falling EROI (energy return on investment). Cheap foreign labor, productivity gains, and creation of a giant debtor society have simply not offset the disappearance of cheap energy.

Dasgupta’s further discussion of an emerging two-tier economy along with the Silicon Valley technocracy follows, but I’ll stop short here and encourage readers instead to investigate and think for themselves. Lots of guides and analyses help to illuminate the macrohistory, though I find the conclusions awful in their import. Dasgupta drives home the prognosis:

The neoliberal revolution aimed to restore the supremacy of capital after its twentieth-century subjugation by nation-states, and it has succeeded to an astonishing degree. As states compete and collude with gargantuan new private powers, a new political world arises. The principle of labor, which dominated the twentieth century—producing the industrious, democratic society we have come to regard, erroneously, as the norm—is once again being supplanted by a principle of property, the implications and consequences of which we know only too well from our history books.

I’ve often thought that my father was born at just the right time in the United States: too young to remember much of World War II, too old to be drafted into either the Korean War or the Vietnam War, yet well positioned to enjoy the fruits of the postwar boom and the 1960s counterculture. He retired early with a pension from the same company for which he had worked nearly the entirety of his adult life. Thus, he enjoyed the so-called Happy Days of the 1950s (the Eisenhower era) and all of the boom years, including the Baby Boom, the demographer’s term for my cohort (I came at the tail end). Good for him, I suppose. I admit some envy at his good fortune as most of the doors open to him were closed by the time I reached adulthood. It was the older (by 10–15 years) siblings of Boomers who lodged themselves in positions of power and influence. Accordingly, I’ve always felt somewhat like the snotty little brother clamoring for attention but who was overshadowed by the older brother always in the way. Luckily, my late teens and early twenties also fell between wars, so I never served — not that I ever supported the American Empire’s foreign escapades, then or now.

Since things have turned decidedly for the worse and industrial civilization can’t simply keep creaking along but will fail and collapse soon enough, my perspective has changed. Despite some life options having been withdrawn and my never having risen to world-beater status (not that that was ever my ambition), I recognize that, similar to my father, I was born at the right time to enjoy the relative peace and tranquility of the second half of the otherwise bogus “American Century.” My good fortune allowed me to lead a quiet, respectable life and reach a reasonable age (not yet retired) at which I now take stock. Mistakes were made, of course; that’s how we learn. But I’ve avoided the usual character deformations that spell disaster for lots of folks. (Never mind that some of those deformations are held up as admirable; the people who suffer them are in truth cretins of the first order, names withheld.)

Those born at the wrong time? Any of those drafted into war (conquest, revolutionary, civil, regional, or worldwide), and certainly anyone born in the last twenty years or so. Millennials appeared at the twilight of empire, many of whom are now mature enough to witness its fading glory but generally unable to participate in its bounties meaningfully. They are aware of their own disenfranchisement the same way oppressed groups (religious, ethnic, gender, working class, etc.) have always known they’re getting the shaft. Thus, the window of time one might claim optimal to have been born extends from around 1935 to around 1995, and my father and I both slot in. Beyond that fortuitous window, well, them’s the shakes.

/rant on

Since deleting from my blogroll all doom links and turning my attention elsewhere, the lurking dread of looming collapse (all sorts) has been at low ebb at The Spiral Staircase. Though many indicators of imminent collapse have likewise been purged from front-page and top-of-the-broadcast news, evidence continues to mount while citizens contend with other issues, some political and geopolitical, others day-to-day tribulations stemming from politics, economics, and the ongoing pandemic. For instance, I only recently learned that the Intergovernmental Panel on Climate Change (IPCC — oh yeah … them) issued AR6 last month, the sixth periodic Assessment Report (maybe instead call it a State of the Planet Address). It’s long, dense reading (the full report is nearly 4,000 pp., whereas the summary for policymakers is a mere 42 pp.) and subject to nearly continuous revision and error correction. The conclusion? Climate change is widespread, rapid, and intensifying. And although it’s true that mundane daily activities occupy center stage in the lives of average folks, there is simply no bigger story or concern for government leaders (I choke on that term) and journalists (that one, too) than climate change because it represents (oh, I dunno …) the collapse of industrial civilization and the early phase of mass extinction. Thus, all politics, geopolitics, economic warfare, class struggle, Wokeism, authoritarian seizure of power, and propaganda filling the minds of people at all levels as well as the institutions they serve amount to a serious misallocation of attention and effort. I will admit, though, that it’s both exhausting and by now futile to worry too much about collapse. Maybe that’s why the climate emergency (the new, improved term) is relegated to background noise easily tuned out.

It’s not just background noise, though, unlike the foreknowledge that death awaits decades from now if one is fortunate enough to persist into one’s 70s or beyond. No, it’s here now, outside (literally and figuratively), knocking on the door. Turn off your screens and pay attention! (Ironically, everyone now gets the lion’s share of information from screens, not print. So sue me.) Why am I returning to this yet again? Maybe I’ve been reviewing too many dystopian films and novels. The better answer is that those charged with managing and administering states and nations are failing so miserably. It’s not even clear that they’re trying, so pardon me, but I’m rather incensed. It’s not that there aren’t plenty of knowledgeable experts compiling data, writing scientific reports, publishing books, and offering not solutions exactly but at least better ways to manage our affairs. Among those experts, the inability to reverse the climate emergency is well enough understood though not widely acknowledged. (See Macro-Futilism on my blogroll for at least one truth teller who absolutely gets it.) Instead, some lame version of the same dire warning issues again and again: if action isn’t taken now (NOW, dammit!), it will be too late and all will be lost. The collective response is not, however, to pull back, rein in, or even prepare for something less awful than the worst imaginable hard landing where absolutely no one survives despite the existence of boltholes and underground bunkers. Instead, it’s a nearly gleeful acceleration toward doom, like a gambler happily forking over his last twenty at the blackjack table before losing and chucking himself off the top of the casino parking structure. Finally free (if fleetingly)!

Will festering public frustration over deteriorating social conditions tip over into outright revolt, revolution, civil war, and/or regime change? Doesn’t have to be just one. Why is the U.S. still developing and stockpiling armaments, maintaining hundreds of U.S. military bases abroad, and fighting costly, pointless wars of empire (defeat in withdrawal from Afghanistan notwithstanding)? Will destruction of purchasing power of the U.S. dollar continue to manifest as inflation of food and energy costs? Is the U.S. Dept. of Agriculture actually doing anything to secure food systems, or does it merely prepare reports like the AR6 that no one reads or acts upon? Will fragile supply lines be allowed to fail entirely, sparking desperation and unrest in the streets far worse than summer 2020? Famine is how some believe collapse will trigger a megadeath pulse, but I wouldn’t count out chaotic violence among the citizenry, probably exacerbated and escalated as regimes attempt (unsuccessfully) to restore social order. Are any meaningful steps being taken to stop sucking from the fossil fuel teat and return to small-scale agrarian social organization, establishing degrowth and effectively returning to the land (repatriation is my preferred term) instead of going under it? Greenwashing doesn’t count. This headline (“We Live In A World Without Consequences Where Everyone Is Corrupt”) demonstrates pretty well that garbage economics are what pass for governance, primarily preoccupied with maintaining the capitalist leviathan that has captured everything (capture ought to be the trending word of 2021 but sadly isn’t). Under such constraint, aged institutions are flatly unable to accomplish or even address their missions anymore. And this headline (“Polls Show That The American People Are Extremely Angry – And They Are About To Get Even Angrier”) promises that things are about to get much, much worse (omitting the obvious-but-erroneous follow-on “before they get better”) — for the obvious reason that more and more people are at the ends of their ropes while the privileged few attend the Met Gala, virtue signal with their butts, and behave as though society isn’t in fact cracking up. Soon enough, we’ll get to truth-test Heinlein’s misunderstood aphorism “… an armed society is a polite society.”

Those who prophesy dates or deadlines for collapse have often been slightly embarrassed (but relieved) that collapse didn’t arrive on schedule. Against all odds, human history keeps trudging further into borrowed time, kicking cans down roads, blowing bubbles, spinning false narratives, insisting that all this is fine, and otherwise living in make-believe land. Civilization has not quite yet reached the end of all things, but developments over the last couple months feel ever more keenly like the equivalent of Frodo and Sam sitting atop Mount Doom, just outside the Cracks of Doom (a/k/a Sammath Naur), except that humanity is not on a noble, sacrificial mission to unmake the One Ring, whatever that might represent outside of fiction (for Tolkien, probably industrial machines capable of planetary destruction, either slowly and steadily or all at once; for 21st-century armchair social critics like me, capitalism). All former certainties, guarantees, sureties, continuities, and warranties are slipping away despite the current administration’s assurances that the status quo will be maintained. Or maybe it’s merely the transition of summer into fall, presaging the annual dormancy of winter looking suspiciously this year like the great dying. Whatever. From this moment on and in a fit of exuberant pique, I’m willing to call the contest: humanity is now decidedly on the down slope. The true end of history approaches, as no one will be left to tell the tale. When, precisely, the runaway train finally careens over the cliff remains unknown though entirely foreseeable. The concentration of goofy references, clichés, and catchphrases above — usually the mark of sophomoric writing — inspires me to indulge (further) in gallows humor. Consider these metaphors (some mixed) suggesting that time is running out:

  • the show’s not over til it’s over, but the credits are rolling
  • the chickens are coming home to roost
  • the canary in the coal mine is gasping its last breath
  • the fat lady is singing her swan song
  • borrowed time is nearly up
  • time to join the great majority (I see dead people …)
  • the West fades into the west
  • kiss your babies goodnight and kiss your ass goodbye

/rant off

Watched Soylent Green (1973) a few days ago for the first time since boyhood. The movie, directed by Richard Fleischer, is based on Harry Harrison’s novel Make Room! Make Room! (which I haven’t read) and oddly enough has not yet been remade. How to categorize the film within familiar genres is tricky. Science fiction? Disaster? Dystopia? Police procedural? It checks all those boxes. Chief messages, considering its early 70s origin, are pollution and overpopulation, though global warming is also mentioned less pressingly. The opening montage looks surprisingly like what Godfrey Reggio did much better with Koyaanisqatsi (1982).

Soylent Green is set in 2022 — only a few months away now but a relatively remote future in 1973 — and the Earth is badly overpopulated, environmentally degraded, overheated, and struggling to support teeming billions mostly jammed into cities. Details are sketchy, and only old people can remember a time when the biosphere remained intact; whatever disaster had occurred was already long ago. Science fiction and futuristic films are often judged improperly by how correct their prophecies turn out to be, as though enjoyment were based on fidelity to reality. Soylent Green fares well in that respect despite its clunky, dated, 70s production design. Vehicles, computer screens, phones, wardrobe, and décor are all, shall we say, quaintly vintage. But consider this: had collapse occurred in the 70s, who’s to say that cellphones, flat screens, and the Internet would ever have been developed? Maybe the U.S. (and the world) would have been stalled in the 70s much the way Cuba is stuck in the 50s (when the monumentally dumb, ongoing U.S. embargo commenced).

The film’s star is Charlton Heston, who had established himself as a handsomely bankable lead in science fiction, disaster, and dystopian films (e.g., The Omega Man and the Planet of the Apes series). Though serviceable, his portrayal is remarkably plain, revealing Heston as a poor man’s Sean Connery or John Wayne (both far more charismatic contemporaries of Heston’s even in lousy films). In Soylent Green, Heston plays Detective Robert Thorn, though he’s mostly called “Thorn” onscreen. Other characters below the age of 65 or so also go by only one name. They all grew up after real foodstuffs (the titular Soylent Green being a synthetic wafer reputedly made out of plankton — the most palatable of three colors) and creature comforts became exceedingly scarce and expensive. Oldsters are given the respect of first and last names. Thorn investigates the assassination of a high-ranking industrialist, arriving at its well-known conspiratorial conclusion (hardly a spoiler anymore) in that iconic line at the very end of the film: “Soylent Green is people!” Seems industrialists, to keep people fed, are making food of human corpses. That eventual revelation drives the investigation and the film forward, a device far tamer than today’s amped-up action thrillers where, for instance, a mere snap of the fingers can magically wipe out or restore half of the universe. Once the truth is proclaimed by Thorn (after first being whispered into a couple of ears), the movie ends rather abruptly. That’s also what makes it a police procedural set in a disastrous, dystopic, science-fiction future stuck distinctively in the past: once the crime/riddle is solved, the story and film are over with no dénouement whatsoever.

Some of the details of the film, entirely pedestrian to modern audiences, are modestly enjoyable throwbacks. For instance, today’s penchant for memes and slang renaming of commonplace things is employed in Soylent Green. The catchphrase “Tuesday is Soylent Green Day” appears but is not overdriven. A jar of strawberries costs “150D,” which I first thought might be future currency in the form of debits or demerits but is probably just short for dollars. Front-end loaders used for crowd control are called “scoops.” High-end apartment building rentals come furnished with live-in girls (prostitutes or gold-diggers, really) known as Furniture Girls. OTOH, decidedly 70s-era trash trucks (design hasn’t really changed much) are not emblazoned with the corporate name or logo of the Soylent Corporation (why not?). Similarly, (1) dressing the proles in dull, gray work clothes and brimless caps, (2) having them sleep on stairways or in church refuges piled on top of each other so that characters have to step gingerly through them, (3) being so crammed together in protest when the Tuesday ration of Soylent Green runs short that they can’t avoid the scoops, (4) dripped blood clearly made of thick, oversaturated paint (at least on the DVD), and (5) a sepia haze covering daytime outdoor scenes are fairly lazy nods to world-building on a low budget. None of this is particularly high-concept filmmaking, though the restraint is appreciated. The sole meme (entirely unprepared) that should have been better deployed is “going home,” a euphemism for reporting voluntarily to a processing plant (into Soylent Green, of course) at the end of one’s suffering life. Those who volunteer are shown 30 minutes of scenes, projected in a 360-degree theater that envelops the viewer, depicting the beauty and grandeur of nature before it had disappeared. This final grace offered to people (rather needlessly) serves the environmental message of the film well and could have been “driven home” a bit harder.

Like other aspects of the film’s backstory, how agricultural systems collapsed is largely omitted. Perhaps such details (conjecture) are in the book. The film suggests persistent heat (no seasons), and accordingly, characters are made to look like they never stop sweating. Scientific projections of how global warming will manifest do in fact point to a hothouse Earth, though seasons will still occur in temperate latitudes. Because such changes normally occur in geological time, it’s an exceedingly slow process compared to human history and activity. Expert inquiry into the subject prophesied long ago that human activity would trigger and accelerate the transition. How long it will take is still unknown, but industrial civilization is definitely on that trajectory and humans have done little since the 70s to curb self-destructive appetites or behaviors — except of course talk, which in the end is just more hot air. Moreover, dystopian science fiction has shifted over the decades away from self-recrimination to a long, seemingly endless stream of superheroes fighting crime (and sometimes aliens). Considering film is entertainment meant to be enjoyed, the self-serious messages embedded in so many 70s-era disaster films warning us of human hubris are out of fashion. Instead, superpowers and supersuits rule cinema, transforming formerly uplifting science-fiction properties such as Star Trek into hypermilitaristic stories of interstellar social collapse. Soylent Green is a grim reminder that we once knew better, even in our entertainments.

“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signaling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but to human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently, albeit temporarily, live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations to funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coin. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest in this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are ineffective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. It used to be said that the Internet is forever, so wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) can be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

This article at Scientific American argues in support of a fundamental change to its style sheet. A style sheet, for the uninitiated, is a guide to how a publication presents its output, including formatting, commonly used spellings, and preferred grammar. For instance, should ordinals (e.g., 1st, 2nd, 3rd) be raised (set as superscripts)? Or should web be capitalized when referring to the World Wide Web? The change Scientific American just adopted is dropping the term climate change in favor of climate emergency. Well, good for Scientific American, I guess. My lack of enthusiasm or urgency — the very urgency signaled by the revised term now that the emergency is upon us (um, has been for decades already if one thinks in terms of geological or evolutionary time rather than mere decades of human history) — stems not from the truthfulness or effectiveness of the arguments but from my assessment that the problem is flatly unsolvable at this late date and that, as a global civilization, we’re doing almost nothing to combat it anyway. That’s been the case since the basic problem swung into public view in the 1970s, and it’s been the case since James Howard Kunstler published The Long Emergency in 2005.

Climate emergency deniers have claimed that recent volcanic eruptions in the Caribbean, Iceland, and Hawaii have erased or nullified all the efforts by humans to stem the release of greenhouse gases from industrial activity. According to this link, that’s comparing apples and oranges: peak volcanic activity vs. a sliver of human activity. Since 1750 (a conventional start date of the Industrial Revolution), it’s human activity driving the climate emergency, not volcanic activity. Moreover, industrial activity shows no signs of abating, at least until it all creaks to a halt when the infernal machine will no longer crank. The blocked Suez Canal and deep freeze in Texas both remind how fragile industrial infrastructure is; just wait for a Carrington Event to fry everything at once. This link explains human carbon emissions (also mentions volcanoes), which continue to increase in volume every single year. (This past year might (might!) be an anomaly due to the pandemic, but things are already ramping back up.) And besides, humans can’t control volcanoes (did someone suggest dropping nukes in them to “seal them up”?). We can’t even control ourselves.

Some while back, I purged from my blogroll all the doom links and determined that industrial civilization is in its death throes, so why bother blogging about it anymore? Similarly, the last time I cited the Doomsday Clock in January 2020, it was (metaphorically) 100 seconds to midnight. The Clock today still sits at that harrowing eve of destruction, and I didn’t link to the January 2021 statement, which includes discussions of the novel coronavirus, nuclear threats, and climate change (the older term), summarizing them together as a wake-up call. Really? So now it’s time to get serious? Nope, don’t think so. The proper time for action is long past, the catastrophic future is already locked in, and we’ve been steadfastly pretending there is nothing to see (so that there will eventually be nothing to do — a self-fulfilling prophecy). It’s merely unknown when members of the human species will begin dropping like flies.

/rant on

Remember all those folks in the weeks and days preceding election day on November 3, 2020, who were buying guns, ammo, and other provisions in preparation for civil breakdown? (No one known personally, of course, and gawd no, not actually any of us, either; just them other others who don’t read blogs or anything else.) Well, maybe they were correct in adopting the precautionary principle (notably absent from a host of other perils besetting us). But as of this writing, nothing remotely resembling widespread disruption — feared by some, hotly anticipated by others — has developed. But wait! There’s still time. Considering Americans were set up by both political parties to distrust the outcome of the presidential race no matter which candidate claimed to have prevailed, we now face weeks or months of legal challenges and impatient formations of agitators (again, both sides) demanding their candidate be declared the winner (now, dammit!) by the courts instead of either official ballot-counters or the liberal-biased MSM. To say our institutions have failed us, and further, that political operatives all the way up to the sitting president have been openly fomenting violence in the streets, is a statement of the obvious.

Among my concerns more pressing than who gets to sit in the big chair, however, is the whipsawing stock market. Although no longer an accurate proxy of overall economic health or asset valuation, the stock market reacts with thorough daily irrationality to every rumor of, say, a vaccine for the raging coronavirus, or of resumption of full economic activity and profitability despite widespread joblessness, renewed lockdowns, and a massive wave of homelessness in the offing due to bankruptcies, evictions, and foreclosures. None of this bodes well for the short-term future and maintenance of, oh, I dunno, supply lines to grocery stores. Indeed, I suspect we are rapidly approaching our very own Minsky Moment, which Wikipedia describes as “a sudden, major collapse of asset values which marks the end of the growth phase of a cycle in credit markets or business activity” [underlying links omitted]. This is another prospective event (overdue, actually) for which the set-up has been long prepared. Conspiratorial types call it “the great reset” — something quite different from a debt jubilee.

For lazy thinkers, rhyming comparisons with the past frequently resort to calling someone a Nazi (or the new Hitler) or reminding everyone of U.S. chattel slavery. At the risk of being accused of similar stupidity, I suggest that we’re not on the eve of a 1929-style market crash and ensuing second great depression (though those could well happen, too, bread lines having already formed in 2020) but are instead poised at the precipice of hyperinflation and intense humiliation akin to the Weimar Republic in 1923 or so. American humiliation will result from recognition that the U.S. is now a failed state and doesn’t even pretend anymore to look after its citizens or the commonweal. Look no further than the two preposterous presidential candidates, neither of whom made any campaign promises to improve the lives of average Americans. Rather, the state has been captured by kleptocrats. Accordingly, no more American exceptionalism and no more lying to ourselves how we’re the model for the rest of the world to admire and emulate.

Like Germany in the 1930s, the U.S. has also suffered military defeats and stagnation (perhaps by design) and currently demonstrates a marked inability to manage itself economically, politically, or culturally. Indeed, the American people may well be ungovernable at this point, nourished on a thin gruel of rugged individualism that forestalls our coming together to address adversity effectively. The possibility of another faux-populist savior arising out of necessity only to lead us over the edge (see the Great Man Theory of history) seems eerily likely, though the specific form that descent into madness would take is unclear. Recent history already indicates a deeply divided American citizenry having lost its collective mind but not yet having gone fully apeshit, flinging feces and destroying what remains of economically ravaged communities for the sheer sport of it. (I’ve never understood vandalism.) That’s what everyone was preparing for with emergency guns, ammo, and provisions. How narrowly we escaped catastrophe (or merely delayed it) should be clear in the fullness of time.

/rant off

I’ve been holding in mind for five months now the article at this link (an informal interview with neuroscientist and psychologist Oliver J. Robinson), waiting for conditions when I could return to forms of media consumption I prefer, namely, reading books, magazines, and long-form journalism. When I try to read something substantive these days, I find myself going over the same paragraph repeatedly, waiting in vain for it to register. Regrettably, the calm, composure, and concentration needed for deep reading have been effectively blocked since March 2020 as we wait (also in vain) for the pandemic to burn itself out. (I could argue that the soul-destroying prospect of industrial collapse and near-term human extinction is having the same effect for much longer.) So my attention and media habits have been resignedly diverted to crap news gathering, mostly via video, and cheap entertainments, mostly streaming TV (like everyone else, though others may complain less). The lack of nourishment is noticeable. Considering we’re only weeks away from the U.S. presidential election, stress levels are ratcheting up further, and civil authorities prepare for “election riots” (is that a new term?), which I can only assume means piling violence upon violence under the pretense of keeping-the-peace or law-and-order or some other word string rendered meaningless now that the police are widely acknowledged to be significant contributors to the very problems they are meant to address. These unresolved issues (pandemic, police violence, civil unrest) give rise to pathological anxiety, which explains (according to Robinson, disclaimers notwithstanding) why it’s so hard to read.

To say we live in unprecedented times is both obvious and banal. Unique stresses of modernity have led multiple times to widespread madness and conflict, as well as attempts to recapture things lost in previous shifts from other styles of social organization. Let me not mince words regarding what’s now happening: we’re in an era of repudiation of the Enlightenment, or a renewed Counter-Enlightenment. I’ve stated this before, and I’m not the only one making this diagnosis (just learned it’s a rather old idea — I’m always late to the party). For instance, Martin Jay’s essay “Dialectic of Counter-Enlightenment” appears to have been floating around in various forms since 2011. Correlation of this renewal of Counter-Enlightenment fervor with literacy seems clear. Despite basic literacy having improved widely worldwide over the past two centuries, especially in the developing world, deep literacy is eroding:

Beyond self-inflicted attention deficits, people who cannot deep read – or who do not use and hence lose the deep-reading skills they learned – typically suffer from an attenuated capability to comprehend and use abstract reasoning. In other words, if you can’t, or don’t, slow down sufficiently to focus quality attention – what Wolf calls “cognitive patience” – on a complex problem, you cannot effectively think about it.

Considering deep literacy is absolutely critical to clear thinking (or critical thought, if you prefer, not to be confused with the Frankfurt School’s critical theory discussed in Jay’s essay), its erosion threatens fundamental institutions (e.g., liberal democracy and the scientific method) that constitute the West’s primary cultural inheritance from the Enlightenment. The reach of destruction wrought by reversing course via the Counter-Enlightenment cannot be overstated. Yet many among us, completely unable to construct coherent ideas, are rallying behind abandonment of Enlightenment traditions. They’re ideologues who actively want to return to the Dark Ages (while keeping modern tech, natch). As with many aspects of unavoidable cultural, social, environmental, and civilizational collapse, I have difficulty knowing quite what to hope for. So I won’t condemn retrograde thinking wholly. In fact, I feel empathy toward calls to return to simpler times, such as with German Romanticism or American Transcendentalism, both examples of cultural and aesthetic movements leading away from the Enlightenment.

Long before these ideas coalesced for me, I had noted (see here, here, and here) how literacy is under siege and a transition back toward a predominantly oral culture is underway. The Counter-Enlightenment is either a cause or an effect, I can’t assess which. At the risk of being a Cassandra, let me suggest that, if these times aren’t completely different from dark episodes of the past, we are now crossing the threshold of a new period of immense difficulty that makes pathological anxiety blocking the ability to read and think a minor concern. Indeed, that has been my basic assessment since crafting the About Brutus blurb way back in 2006. Indicators keep piling up. So far, I have a half dozen points of entry from other cultural commentators exploring this theme to process and digest, though they typically don’t adopt wide enough historical or cultural perspectives. Like the last time I failed to synthesize my ideas into a multipart blog series, I don’t have a snazzy title, and this time, I don’t even have planned installment titles. But I will do my best to roll out in greater detail over several blog posts some of the ways the Counter-Enlightenment is manifesting anew.

Supporting the Vietnam war was dumb. Supporting the Iraq invasion after being lied to about Vietnam was an order of magnitude dumber. Supporting any US war agendas after being lied to about Iraq is an order of magnitude even dumber than that.
—Caitlin Johnstone

Upon rereading, and with the advantage of modest hindsight, I think I got it exactly correct in this 5-year-old blog post. Even the two brief comments are correct. More specifically, the United States is understood to be the sole remaining military superpower following the collapse of the Soviet Union in 1991. Never mind that numerous countries count themselves members of the nuclear club (cue Groucho Marx joke) and thus possess sufficient power to destroy the world. Never mind that the U.S. failed to win the Korean War or the Vietnam War (the two major U.S. military involvements post-WWII), or in fact any of numerous 21st-century wars (undeclared, de facto, continuing). Never mind that the U.S. has been successful at multiple smaller regime-change actions, often on the back of a civil war instigated by the U.S. and purposefully designed to install a puppet leader. And never mind that the capitalist competition for control of economic resources and capture of perpetual growth is being won handily by China. Nope, the U.S. is no longer the only superpower but is instead busy transitioning from superpower (military and economic) to failed state. Or in the language of that old blog post, the U.S. is now a geopolitical Strong/Stupid hybrid but is actively deploying stupidity in a feverish play to be merely Stupid. The weirdest aspect, perhaps, is that it’s being done right in front of god and everybody, yet few bother to take notice.

It’s no stretch to assert that in the U.S. in particular (but also true of nearly every regime across the world), we’re piling stupidity upon stupidity. If I were inclined to go full conspiracy like some QAnon fool, I’d have to say that the power elite have adopted a deep, 3D-chess strategy that means one of two possible things using the Rock-Paper-Scissors power dynamic algorithm (which, unlike tic-tac-toe, produces a winner) modified and inverted to Strong-Stupid-Smart: it’s either (1) very Smart of them to appear so Stupid, granting victory (against all appearances) over Strong (but only Strong in a three-legged contest), or (2) they reject the algorithm entirely in the misguided belief that nuthin’ beats stoopid. That second option would indeed be entirely consistent with Stupid.
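For anyone who wants the joke spelled out, here is a minimal sketch (in Python, purely illustrative and not from the original post being referenced) of a rock-paper-scissors-style cyclic dominance relation relabeled Strong-Stupid-Smart. It demonstrates only the structural point made above: any pairing of two distinct options produces a winner, unlike tic-tac-toe, which can end in a draw. The specific “who beats whom” assignments are my own assumption for illustration.

    # Purely illustrative sketch; the Strong/Stupid/Smart labels come from the post
    # above, but this particular "who beats whom" mapping is a hypothetical assumption,
    # not taken from the old blog post being referenced.
    BEATS = {
        "Strong": "Smart",   # assumed: raw force overwhelms cleverness
        "Smart": "Stupid",   # assumed: cleverness outmaneuvers stupidity
        "Stupid": "Strong",  # assumed: stupidity blunders past strength
    }

    def winner(a: str, b: str) -> str:
        """Return the prevailing strategy, or 'draw' if both are the same."""
        if a == b:
            return "draw"
        return a if BEATS[a] == b else b

    # Any two distinct strategies always yield a winner (no stalemate), which is
    # the sense in which this "algorithm" differs from tic-tac-toe.
    print(winner("Stupid", "Strong"))  # -> Stupid
    print(winner("Smart", "Stupid"))   # -> Smart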

Take for instance three looming issues: the pandemic (and its follow-on effects), the U.S. presidential election (ugh, sorry, it’s unavoidable), and climate change. They loom threateningly despite being well underway already. But with each, we’ve acted and behaved very stupidly, stunningly so I would argue, boxing ourselves in and doing worse damage over time than if we had taken proper steps early on. But as suggested in a previous blog post, the truth is that decision-makers haven’t really even tried to address these issues with the purpose of solving, resolving, winning, remedying, or ameliorating entirely predictable outcomes. Rather, issues are being either swept under the rug (ignored with the futile hope that they will go away or resolve themselves on their own) or displaced in time for someone else to handle. This second option occurs quite a lot, which is also known as kicking the can down the road or stealing from the future (as with sovereign debt). What happens when there’s no more future (for humans and their institutions, anyway) because it’s been squandered in the present? You already know the answer(s) to that question.