Posts Tagged ‘Recent History’

From a statement (PDF link) by Supreme Court Justice Neil Gorsuch regarding Title 42 (quite long for this type of post but worthwhile):

[T]he history of this case illustrates the disruption we have experienced over the last three years in how our laws are made and our freedoms observed.

Since March 2020, we may have experienced the greatest intrusions on civil liberties in the peacetime history of this country. Executive officials across the country issued emergency decrees on a breathtaking scale. Governors and local leaders imposed lockdown orders forcing people to remain in their homes.

They shuttered businesses and schools public and private. They closed churches even as they allowed casinos and other favored businesses to carry on. They threatened violators not just with civil penalties but with criminal sanctions too.

They surveilled church parking lots, recorded license plates, and issued notices warning that attendance at even outdoor services satisfying all state social-distancing and hygiene requirements could amount to criminal conduct. They divided cities and neighborhoods into color-coded zones, forced individuals to fight for their freedoms in court on emergency timetables, and then changed their color-coded schemes when defeat in court seemed imminent.

Federal executive officials entered the act too. Not just with emergency immigration decrees. They deployed a public-health agency to regulate landlord-tenant relations nationwide. They used a workplace-safety agency to issue a vaccination mandate for most working Americans.

They threatened to fire noncompliant employees, and warned that service members who refused to vaccinate might face dishonorable discharge and confinement. Along the way, it seems federal officials may have pressured social-media companies to suppress information about pandemic policies with which they disagreed.

While executive officials issued new emergency decrees at a furious pace, state legislatures and Congress—the bodies normally responsible for adopting our laws—too often fell silent. Courts bound to protect our liberties addressed a few—but hardly all—of the intrusions upon them. In some cases, like this one, courts even allowed themselves to be used to perpetuate emergency public-health decrees for collateral purposes, itself a form of emergency-lawmaking-by-litigation.

Doubtless, many lessons can be learned from this chapter in our history, and hopefully serious efforts will be made to study it. One lesson might be this: Fear and the desire for safety are powerful forces. They can lead to a clamor for action—almost any action—as long as someone does something to address a perceived threat. 

A leader or an expert who claims he can fix everything, if only we do exactly as he says, can prove an irresistible force. We do not need to confront a bayonet, we need only a nudge, before we willingly abandon the nicety of requiring laws to be adopted by our legislative representatives and accept rule by decree. Along the way, we will accede to the loss of many cherished civil liberties—the right to worship freely, to debate public policy without censorship, to gather with friends and family, or simply to leave our homes. 

We may even cheer on those who ask us to disregard our normal lawmaking processes and forfeit our personal freedoms. Of course, this is no new story. Even the ancients warned that democracies can degenerate toward autocracy in the face of fear.

But maybe we have learned another lesson too. The concentration of power in the hands of so few may be efficient and sometimes popular. But it does not tend toward sound government. However wise one person or his advisors may be, that is no substitute for the wisdom of the whole of the American people that can be tapped in the legislative process.

Decisions produced by those who indulge no criticism are rarely as good as those produced after robust and uncensored debate. Decisions announced on the fly are rarely as wise as those that come after careful deliberation. Decisions made by a few often yield unintended consequences that may be avoided when more are consulted. Autocracies have always suffered these defects. Maybe, hopefully, we have relearned these lessons too.

In the 1970s, Congress studied the use of emergency decrees. It observed that they can allow executive authorities to tap into extraordinary powers. Congress also observed that emergency decrees have a habit of long outliving the crises that generate them; some federal emergency proclamations, Congress noted, had remained in effect for years or decades after the emergency in question had passed.

At the same time, Congress recognized that quick unilateral executive action is sometimes necessary and permitted in our constitutional order. In an effort to balance these considerations and ensure a more normal operation of our laws and a firmer protection of our liberties, Congress adopted a number of new guardrails in the National Emergencies Act.

Despite that law, the number of declared emergencies has only grown in the ensuing years. And it is hard not to wonder whether, after nearly a half-century and in light of our Nation’s recent experience, another look is warranted. It is hard not to wonder, too, whether state legislatures might profitably reexamine the proper scope of emergency executive powers at the state level. 

At the very least, one can hope that the Judiciary will not soon again allow itself to be part of the problem by permitting litigants to manipulate our docket to perpetuate a decree designed for one emergency to address another. Make no mistake—decisive executive action is sometimes necessary and appropriate. But if emergency decrees promise to solve some problems, they threaten to generate others. And rule by indefinite emergency edict risks leaving all of us with a shell of a democracy and civil liberties just as hollow.


Douglas Murray wrote a book called The Strange Death of Europe (2017), which is behind a long list of books as yet unread by me. Doubtful I will ever get to it. My familiarity with his thesis stems from his many appearances in interviews and webcasts describing the book. Murray continues a long line of declinists (mentioned here) prophesying and/or chronicling the long, slow demise of Europe, and in an only slightly wider sense, the West. Analyses of the causes and manifestations of decline range all over the map but typically include a combination of incompetence, exhaustion, and inevitability. For Murray, the strangeness is that the demise appears self-inflicted and openly desired out of some misplaced sense of guilt and shame following an entirely atypical (for the West) moral reckoning, perhaps for having been the dominant global culture for roughly five hundred years. This blog already says plenty about disaster, destruction, and doom, so let me instead pose a different question: Is the cultural inheritance of the West at all worth honoring and preserving? Won’t be an easy answer.

The question is partly prompted by the breathless announcement last fall in the alumni magazine of one of my alma maters of a newly created position (a further example of needless administrative bloat): Associate Dean of Equity and Inclusion (among other flowery academic titles attached to the position). For reasons unknown, the full sequence Diversity, Equity, and Inclusion (a/k/a DIE — yes, the initialism is purposely disordered) was avoided, but as with many trendy turns of phrase, precision hardly matters since the omitted word is inferred. Let’s first consider the term cultural inheritance. If one takes a warts-and-all approach, the full spectrum from glory to atrocity accumulated throughout history informs what descends from our forebears in the form of inheritance. Indeed, highs and lows are scarcely separable. Whereas individual memory tends to select mostly good parts, cultural memory records everything — albeit with interpretive bias — so long as one has the inclination to consult history books.

If one takes a charitable view of history, admirable events, practices, norms, and artifacts become the celebratory focus while nasty parts may be acknowledged but are nonetheless elided, forgotten, or shoved forcibly down the memory hole. But if one takes a pessimistic view, the very worst parts represent an irredeemable, permanent stain on everything else (the aforementioned moral reckoning). Recent arguments (revisionist history, say many) that the founding of the United States of America, a political entity, should be redated to 1619 to coincide with the establishment of slavery rather than 1776 with the Declaration of Independence are an example. Uncharacteristically perhaps, I take the charitable view, while the très chic view in academe (echoed in corporate life and far-left government policy) is to condemn the past for failing to embody Woke standards of the present. Multiple influential segments of Western culture have thus succumbed to ideological possession as demonstrated by self-denunciation, self-flagellation, and complete loss of faith in Western institutions because of, well, slavery, genocide, colonialism, warfare, torture, racism, child labor, human trafficking, and other abominations, not all of which are relegated to the past.


From an article by Bruce Abramson called “Pity the Child” (ostensibly a review of the book Stolen Youth), published on April 4, 2023, at RealClearWire:

Though few recognized it as such at the time, the decision to shutter much of the world in March 2020 unraveled the entire socioeconomic fabric of modern life. As anyone who has ever studied or worked with any complex system can confirm, nothing ever restarts quite as it was before a shutdown. 

American society was hardly the exception. The hibernation derailed every pre-existing positive trend and accelerated all the negative. The restart, unfolding in uneven fits-and-starts over the course of two years, introduced an entirely new sociology. Though its precise contours are still taking shape, a few things are clear: Woke reigns supreme and children are expendable.

Ask parents what ambitions they harbor for their child or children and among the most patterned responses is “I just want them to be happy.” I find such an answer thoughtless and disingenuous, and the insertion of the hedge just to make happiness sound like a small ask is a red herring. To begin with, for most kids still in their first decade, happiness and playfulness are relatively effortless and natural so long as a secure, loving environment is provided. Certainly not a default setting, but it’s still quite commonplace. As the dreamy style of childhood cognition is gradually supplanted by supposedly more logical, rational, adult thinking, and as children become acquainted with iniquities of both history and contemporary life, innocence and optimism become impossible to retain. Cue the sullen teenager confronting the yawning chasm between desire and reality. Indeed, few people seem to make the transition into adulthood knowing with much clarity how to be happy in the midst of widespread travail and suffering. Instead, young adults frequently substitute self-destructive, nihilistic hedonism, something learned primarily (says me) from the posturing of movie characters and the celebrities who portray them. (Never understood the trope of criminals hanging at nightclubs, surrounded by drug addicts, nymphos, other unsavory types, and truly awful music, where they can indulge their assholery before everything inevitably goes sideways.)

Many philosophies recommend simplicity, naturalness, and independence as paths to happiness and moral rectitude. Transcendentalism was one such response to social and political complexities that spoil and/or corrupt. Yet two centuries on, the world has only gotten more and more complex, demanding of everyone information processing at a volume and sophistication that does not at all come naturally to most and is arguably not part of our evolutionary toolkit. Multiple social issues, if one is to engage them fairly, hinge on legalistic arguments and bewildering wordplay that render them fundamentally intractable. Accordingly, many wave away all nuance and adopt pro forma attitudes. Yet the airwaves, social media, the Internet, and even dinner conversations are suffused with the worst sorts of hypercomplexity and casuistry that confound even those who traffic regularly in such rhetoric. It’s a very long way from “I just want to be happy.”


Although disinclined to take the optimistic perspective inhabited by bright-siders, I’m nonetheless unable to live in a state of perpetual fear that facile thinkers might consider more fitting for a pessimist. Yet unrelenting fear is the dominant approach, with every major media outlet constantly stoking a toxic combination of fear and hatred, as though activating and continually conditioning the lizard brain (i.e., the amygdala — or maybe not) in everyone were a worthy endeavor, rather than a limited instinctual response that leaps to the fore only when an immediate threat presents itself. I can’t guess the motivations of the purveyors of constant fear or discern an endgame, but a few of the dynamics are clear enough to observe.

First thing that comes to mind is that the U.S. in the 1930s and 40s was pacifist and isolationist. Recent memory of the Great War was still keenly felt, and with the difficulties of the 1929 Crash and ensuing Great Depression still very much present, the prospect of engaging in a new, unlimited war (even over there) was not at all attractive to the citizenry. Of course, political leaders always regard (not) entering into war somewhat differently, maybe in terms of opportunity cost. Hard to say. Whether by hook or by crook (I don’t actually know whether advance knowledge of the Japanese attack on Pearl Harbor was suppressed), the U.S. was handily drawn into the war, and a variety of world-historical developments followed that elevated the U.S. (and its sprawling, unacknowledged empire) to global hegemon, at least after the Soviet Union collapsed and before China rose from a predominantly peasant culture into a world economic power. A not-so-subtle hindsight lesson was learned, namely, that against widespread public sentiment and at great cost, the war effort could (not would) provide substantial benefits (if ill-gotten and of questionable desirability).

None of the intervening wars (never declared) or Wars for Dummies (e.g., the war on poverty, the war on crime, the war on drugs) provided similar benefits except to government agencies and careerist administrators. Nor did the war on terror following the 9/11 attacks or subsequent undeclared wars and bombings in Afghanistan, Iraq, Syria, Libya, Yemen, and elsewhere provide benefits. All were massive boondoggles with substantial destruction and loss of life. Yet after 9/11, a body of sweeping legislation was enacted without much public debate or scrutiny — “smuggled in under cover of fear” one might say. The Patriot Act and The National Defense Authorization Act are among the most notable. The conditioned response by the citizenry to perceived but not actual existential fear was consistent: desperate pleading to keep everyone safe from threat (even if it originates in the U.S. government) and tacit approval to roll back civil liberties (even though the citizenry is not itself the threat). The wisdom of the old Benjamin Franklin quote, born of a very different era and now rendered more nearly as a bromide, has long been lost on many Americans.

The newest omnipresent threat, literally made-to-order (at least according to some — who can really know when it comes to conspiracy), is the Covid pandemic. Nearly every talking, squawking head in government and the mainstream media (the latter now practically useless except for obvious propaganda functions) is telling everyone who still watches (video and broadcast being the dominant modes) to cower in fear of each other, reduce or refuse human contact and social function, and most of all, take the vaccine-not-really-a-vaccine followed by what is developing into an ongoing series of boosters to maintain fear and anxiety if not indeed provide medical efficacy (no good way to measure and substantiate that, anyway). The drumbeat is loud and unabated, and a large, unthinking (or spineless) portion of the citizenry, cowed and cowering, has basically joined the drum circle, spreading a social consensus that is very, well, un-American. Opinions as to other nations on similar tracks are not ventured here. Running slightly ahead of the pandemic is the mind virus of wokery and its sufferers who demand, among other things, control over others’ thoughts and speech through threats and intimidation, censorship, and social cancellation — usually in the name of safety but without any evidence that driving independent thought underground or into hiding accomplishes anything worthwhile.

Again, motivations and endgame in all this are unclear, though the concentration of power to compel seems to be exhilarating. In effect, regular folks are being told, “stand on one leg; good boy; now bark like a dog; very good boy; now get used to it because this shit is never going to end but will surely escalate to intolerability.” It truly surprises me to see police forces around the world harassing, beating, and terrorizing citizens for failing to do as told, however arbitrary or questionable the order or the underlying justification. Waiting for the moment to dawn on rank-and-file officers that their monopoly on use of force is serving and protecting the wrong constituency. (Not holding my breath.) This is the stuff of dystopic novels, except that it’s not limited to fiction and frankly never was. The hotspot(s) shift in terms of time and place, but totalitarian mind and behavioral control never seems to fade or invalidate itself as one might expect. Covid passports granting full participation in society (signalling compliance, not health) are an early step already adopted by some countries. The creeping fascism (more a coercive style than a form of government) I have warned about repeatedly over the years appears to be materializing before our very eyes. I’m afraid of what this portends, but with what remains of my intact mind, I can’t live in perpetual fear, come what may.

/rant on

The ongoing epistemological crisis is getting no aid or relief from the chattering classes. Case in point: the Feb. 2021 issue of Harper’s Magazine has a special supplement devoted to “Life after Trump,” which divides recent history neatly into reality and unreality commencing from either the announcement of Trump’s candidacy, his unexpected success in the Republican primaries, his even less expected election (and inauguration), or now his removal from office following electoral defeat in Nov. 2020. Take your pick which signals the greatest deflection from history’s “proper” course before being derailed into a false trajectory. Charles Yu and Olivia Laing adopt the reality/unreality dichotomy in their contributions to the special supplement. Yu divides (as do many others) the nation into us and them: supporters of a supposed departure from reality/sanity and those whose clear perception penetrates the illusion. Laing bemoans the inability to distinguish fiction and fantasy from truth, unreality masquerading as your truth, my truth, anyone’s truth given repetition and persuasion sufficient to make it stick. Despite familiarity with these forced, unoriginal metaphors, I don’t believe them for a moment. Worse, they do more to encourage siloed thinking and congratulate the “Resistance” for being on the putative correct side of the glaringly obvious schism in the voting populace. Their arguments support a false binary, perpetuating and reinforcing a distorted and decidedly unhelpful interpretation of recent history. Much better analyses than theirs are available.

So let me state emphatically: like the universe, infinity, and oddly enough consciousness, reality is all-encompassing and unitary. Sure, different aspects can be examined separately, but the whole is nonetheless indivisible. Reality is a complete surround, not something one can opt into or out of. That doesn’t mean one’s mind can’t go elsewhere, either temporarily or permanently, but that does not create or constitute an alternate reality. It’s merely dissociation. Considering the rather extreme limitations of human perceptual apparatuses, it’s frankly inevitable that each of us occupies a unique position, an individual perspective, within a much, much (much, much …) larger reality. Add just a couple more axes to the graph below for time (from nanoseconds to eons) and physical scale (from subatomic to cosmic), and the available portion of reality anyone can grasp is clearly infinitesimally small, yet that tiny, tiny portion is utterly everything for each individual. It’s a weird kind of solipsism.

I get that Harper’s is a literary magazine and that writers/contributors take advantage of the opportunity to flex for whatever diminishing readership has the patience to actually finish their articles. Indeed, in the course of the special supplement, more than a few felicitous concepts and turns of phrase appeared. However, despite commonplace protestations, the new chief executive at the helm of the ship of state has not in fact returned the American scene to normal reality after an awful but limited interregnum.

Aside: Citizens are asked to swallow the whopper that the current president, an elder statesman, the so-called leader of the free world, is in full control of his faculties. Funny how his handlers repeatedly erupt like a murder of crows at the first suggestion that a difficult, unvetted question might be posed, inviting the poor fellow to veer even slightly off the teleprompter script. Nope. Lest yet another foot-in-mouth PR disaster occur (too many already to count), he’s whisked away, out of range of cameras and mics before any lasting damage can be done. Everyone is supposed to pretend this charade is somehow normal. On the other hand, considering how many past presidents were plainly puppets, spokespersons, or charlatans (or at least denied the opportunity to enact an agenda), one could argue that the façade is normal. “Pay no attention to the man [or men] behind the curtain. I am the great and powerful Wizard of Oz!”

With some dismay, I admit that the tiny sliver of reality to which many attend incessantly is an even smaller subset of reality, served up via small, handheld devices that fit neatly in one’s pocket. One could say theirs is a pocket reality, mostly mass media controlled by Silicon Valley platforms and their censorious algorithms. Constrained by all things digital, and despite voluminous ephemera, that reality bears little resemblance to what digital refuseniks experience without the blue glare of screens washing all the color from their faces and their own authentic thoughts out of their heads. Instead, I recommend getting outside, into the open air and under the warm glow of the yellow sun, to experience life as an embodied being, not as a mere processor of someone else’s pocket reality. That’s how we all start out as children before getting sucked into the machine.

Weirdly, only when the screen size ramps up to 30 feet tall do consumers grow skeptical and critical of storytelling. At just the moment cinema audiences are invited to suspend disbelief, the Reality Principle and logic are applied to character, dialogue, plotting, and make-believe gadgetry, which often fail to ring true. Why does fiction come under such careful scrutiny while reality skates right on by, allowing the credulous to believe whatever they’re fed?

/rant off

I’ve often thought that my father was born at just the right time in the United States: too young to remember much of World War II, too old to be drafted into either the Korean War or the Vietnam War, yet well positioned to enjoy the fruits of the postwar boom and the 1960s counterculture. He retired early with a pension from the same company for which he had worked nearly the entirety of his adult life. Thus, he enjoyed the so-called Happy Days of the 1950s (the Eisenhower era) and all of the boom years, including the Baby Boom, the demographer’s term for my cohort (I came at the tail end). Good for him, I suppose. I admit some envy at his good fortune as most of the doors open to him were closed by the time I reached adulthood. It was the older (by 10–15 years) siblings of Boomers who lodged themselves in positions of power and influence. Accordingly, I’ve always felt somewhat like the snotty little brother clamoring for attention but who was overshadowed by the older brother always in the way. Luckily, my late teens and early twenties also fell between wars, so I never served — not that I ever supported the American Empire’s foreign escapades, then or now.

Since things have turned decidedly for the worse and industrial civilization can’t simply keep creaking along but will fail and collapse soon enough, my perspective has changed. Despite some life options having been withdrawn and my never having risen to world-beater status (not that that was ever my ambition), I recognize that, similar to my father, I was born at the right time to enjoy the relative peace and tranquility of the second half of the otherwise bogus “American Century.” My good fortune allowed me to lead a quiet, respectable life and reach a reasonable age (not yet retired) at which I now take stock. Mistakes were made, of course; that’s how we learn. But I’ve avoided the usual character deformations that spell disaster for lots of folks. (Never mind that some of those deformations are held up as admirable; the people who suffer them are in truth cretins of the first order, names withheld.)

Those born at the wrong time? Any of those drafted into war (conquest, revolutionary, civil, regional, or worldwide), and certainly anyone in the last twenty years or so. Millennials appeared at the twilight of empire, many of whom are now mature enough to witness its fading glory but generally unable to participate in its bounties meaningfully. They are aware of their own disenfranchisement the same way oppressed groups (religious, ethnic, gender, working class, etc.) have always known they’re getting the shaft. Thus, the window of time one might claim optimal to have been born extends from around 1935 to around 1995, and my father and I both slot in. Beyond that fortuitous window, well, them’s the shakes.

Watched Soylent Green (1973) a few days ago for the first time since boyhood. The movie, directed by Richard Fleischer, is based on Harry Harrison’s novel Make Room! Make Room! (which I haven’t read) and oddly enough has not yet been remade. How to categorize the film within familiar genres is tricky. Science fiction? Disaster? Dystopia? Police procedural? It checks all those boxes. Chief messages, considering its early 70s origin, are pollution and overpopulation, though global warming is also mentioned less pressingly. The opening montage looks surprisingly like what Godfrey Reggio did much better with Koyaanisqatsi (1982).

Soylent Green is set in 2022 — only a few months away now but a relatively remote future in 1973 — and the Earth is badly overpopulated, environmentally degraded, overheated, and struggling to support teeming billions mostly jammed into cities. Details are sketchy, and only old people can remember a time when the biosphere remained intact; whatever disaster had occurred was already long ago. Science fiction and futuristic films are often judged improperly by how correct prophecies turn out in reality, as though enjoyment were based on fidelity to reality. Soylent Green fares well in that respect despite its clunky, dated, 70s production design. Vehicles, computer screens, phones, wardrobe, and décor are all, shall we say, quaintly vintage. But consider this: had collapse occurred in the 70s, who’s to say that cellphones, flat screens, and the Internet would ever have been developed? Maybe the U.S. (and the world) would have been stalled in the 70s much the way Cuba is stuck in the 50s (when the monumentally dumb, ongoing U.S. embargo commenced).

The film’s star is Charlton Heston, who had established himself as a handsomely bankable lead in science fiction, disaster, and dystopian films (e.g., The Omega Man and The Planet of the Apes series). Though serviceable, his portrayal is remarkably plain, revealing Heston as a poor man’s Sean Connery or John Wayne (both far more charismatic contemporaries of Heston’s even in lousy films). In Soylent Green, Heston plays Detective Robert Thorn, though he’s mostly called “Thorn” onscreen. Other characters below the age of 65 or so also go by only one name. They all grew up after real foodstuffs (the titular Soylent Green being a synthetic wafer reputedly made out of plankton — the most palatable of three colors) and creature comforts became exceedingly scarce and expensive. Oldsters are given the respect of first and last names. Thorn investigates the assassination of a high-ranking industrialist to its well-known conspiratorial conclusion (hardly a spoiler anymore) in that iconic line at the very end of the film: “Soylent Green is people!” Seems industrialists, to keep people fed, are making food of human corpses. That eventual revelation drives the investigation and the film forward, a device far tamer than today’s amped-up action thrillers where, for instance, a mere snap of the fingers can magically wipe out or restore half of the universe. Once the truth is proclaimed by Thorn (after first being whispered teasingly into a couple of ears), the movie ends rather abruptly. That’s also what makes it a police procedural set in a disastrous, dystopic, science-fiction future stuck distinctively in the past: once the crime/riddle is solved, the story and film are over with no dénouement whatsoever.

Some of the details of the film, entirely pedestrian to modern audiences, are modestly enjoyable throwbacks. For instance, today’s penchant for memes and slang renaming of commonplace things is employed in Soylent Green. The catchphrase “Tuesday is Soylent Green Day” appears but is not overdriven. A jar of strawberries costs “150D,” which I first thought might be future currency in the form of debits or demerits but is probably just short for dollars. Front end loaders used for crowd control are called “scoops.” High-end apartment building rentals come furnished with live-in girls (prostitutes or gold-diggers, really) known as Furniture Girls. OTOH, decidedly 70s-era trash trucks (design hasn’t really changed much) are not emblazoned with the corporate name or logo of the Soylent Corporation (why not?). Similarly, (1) dressing the proles in dull, gray work clothes and brimless caps, (2) having them sleep on stairways or church refuges piled on top of each other so that characters have to step gingerly through them, (3) being so crammed together in protest when the Tuesday ration of Soylent Green runs short that they can’t avoid the scoops, (4) dripped blood clearly made of thick, oversaturated paint (at least on the DVD), and (5) a sepia haze covering daytime outdoor scenes are fairly lazy nods to world building on a low budget. None of this is particularly high-concept filmmaking, though the restraint is appreciated. The sole meme (entirely unprepared) that should have been better deployed is “going home,” a euphemism for reporting voluntarily to a processing plant (into Soylent Green, of course) at the end of one’s suffering life. Those who volunteer are shown 30 minutes of scenes, projected on a 360-degree theater that envelops the viewer, depicting the beauty and grandeur of nature before it had disappeared. This final grace offered to people (rather needlessly) serves the environmental message of the film well and could have been “driven home” a bit harder.

Like other aspects of the film’s back story, how agricultural systems collapsed is largely omitted. Perhaps such details (conjecture) are in the book. The film suggests persistent heat (no seasons), and accordingly, characters are made to look like they never stop sweating. Scientific projections of how global warming will manifest do in fact point to hothouse Earth, though seasons will still occur in temperate latitudes. Because such changes normally occur in geological time, it’s an exceedingly slow process compared to human history and activity. Expert inquiry into the subject prophesied long ago that human activity would trigger and accelerate the transition. How long it will take is still unknown, but industrial civilization is definitely on that trajectory, and humans have done little since the 70s to curb self-destructive appetites or behaviors — except of course talk, which in the end is just more hot air. Moreover, dystopian science fiction has shifted over the decades away from self-recrimination to a long, seemingly endless stream of superheroes fighting crime (and sometimes aliens). Considering film is entertainment meant to be enjoyed, the self-serious messages embedded in so many 70s-era disaster films warning us of human hubris are out of fashion. Instead, superpowers and supersuits rule cinema, transforming formerly uplifting science-fiction properties such as Star Trek into hypermilitaristic stories of interstellar social collapse. Soylent Green is a grim reminder that we once knew better, even in our entertainments.

Guy McPherson used to say in his presentations that we’re all born into bondage, meaning that there is no escape from Western civilization and its imperatives, including especially participation in the money economy. The oblique reference to chattel slavery is clumsy, perhaps, but the point is nonetheless clear. For all but a very few, civilization functions like Tolkien’s One Ring, bringing everyone ineluctably under its dominion. Enlightenment cheerleaders celebrate that circumstance and the undisputed material and technological (same thing, really) bounties of the industrial age, but Counter-Enlightenment thinkers recognize reasons for profound discontent. Having blogged at intervals about the emerging Counter-Enlightenment and what’s missing from modern technocratic society, my gnawing guilt by virtue of forced participation in the planet-killing enterprise of industrial civilization is growing intolerable. Skipping past the conclusion drawn by many doomers that collapse and ecocide due to unrestrained human consumption of resources (and the waste stream that follows) have already launched a mass extinction that will extirpate most species (including large mammals such as humans), let me focus instead on gross dysfunction occurring at levels falling more readily within human control.

An Empire of War

Long overdue U.S. troop withdrawal from Afghanistan has already yielded Taliban resurgence, which was a foregone conclusion at whatever point U.S. troops left (and before them, Soviets). After all, the Taliban lives there and had only to wait. Distasteful and inhumane as it may be to Westerners, a powerful faction (religious fanatics) truly wants to live under a 7th-century style of patriarchy. Considering how long the U.S. occupied the country, a new generation of wannabe patriarchs came to adulthood — an unbroken intergenerational descent. Of course, the U.S. (and others) keeps arming them. Indeed, I heard that the U.S. military is considering bombing raids to destroy the war machines left behind as positions were so swiftly abandoned. Oops, too late! This is the handiest example of how failed U.S. military escapades extending over decades net nothing of value to anyone besides weapons and ordnance manufacturers and miserable careerists within various government branches and agencies. The costs (e.g., money, lives, honor, sanity) are incalculable and spread with each country where the American Empire engages. Indeed, the military-industrial complex chooses intervention and war over peace at nearly every opportunity (though careful not to poke them bears too hard). And although the American public’s inability to affect policy (unlike the Vietnam War era) doesn’t equate with participation, the notion that it’s a government of the people deposits some of the blame on our heads anyway. My frustration is that nothing is learned and the same mistakes (war crimes, really) keep being committed by maniacs who ought to know better.

Crony and Vulture Capitalism

Critics of capitalism are being proven correct far more often than are apologists and earnest capitalists. The two subcategories I most deplore are crony capitalism and vulture capitalism, both of which typically accrue to the benefit of those in no real need of financial assistance. Crony capitalism is deeply embedded within our political system and tilts the economic playing field heavily in favor of those willing to both pay for and grant favors rather than let markets sort themselves out. Vulture capitalism extracts value out of vulnerable resource pools (dead hosts, in effect) by attacking and often killing them off (e.g., Microsoft, Walmart, Amazon), or more charitably, absorbing them to create monopolies, often by hostile takeover at steep discounts. Distressed mortgage holders forced into short sales, default, and eviction are a contemporary example. Predatory behavior is routinely rationalized as mere competition.

Other historical economic systems had similarly skewed hierarchies, but none have reached quite the same heartless, absurd levels of inequality as late-stage capitalism. Pointing to competing systems and the rising tide that lifts all boats misdirects people into making ahistorical comparisons. Human psychology normally restricts one’s points of comparison to contemporaries in the same country/region. Under such narrow comparison, the rank injustice of hundred-billionaires (or even simply billionaires) existing at the same time as giant populations of political/economic/climate refugees and the unhoused (the new, glossy euphemism for homelessness) demonstrates the soul-forfeiting callousness of the top quintile and/or 1% — an ancient lesson never learned. Indeed, aspirational nonsense repackages suffering and sells it back to the underclass, which as a matter of definition will always exist but need not live as though on an entirely different planet from Richistan.

Human Development

Though I’ve never been a big fan of behaviorism, the idea that a hypercomplex stew of influences, inputs, and stimuli leads to better or worse individual human development, especially in critical childhood years but also throughout life, is pretty undeniable. As individuals aggregate into societies, the health and wellbeing of a given society is linked to the health and wellbeing of those very individuals who are understood metaphorically as the masses. Behaviorism would aim to optimize conditions (as if such a thing were possible), but because American institutions and social systems have been so completely subordinated to capitalism and its distortions, society has stumbled and fumbled from one brand of dysfunction to another, barely staying ahead of revolution or civil war (except that one time …). Indeed, as the decades have worn on from, say, the 1950s (a nearly idyllic postwar reset that looms large in the memories of today’s patrician octogenarians), it’s difficult to imagine how conditions could have deteriorated any further short of a third world war.

Look no further than the U.S. educational system, both K–12 and higher ed. As with other institutions, education has had its peaks and valleys. However, the crazy, snowballing race to the bottom witnessed in the last few decades is utterly astounding. Stick a pin in it: it’s done. Obviously, some individuals manage to get educated (some doing quite well, even) despite the minefield that must be navigated, but the exception does not prove the rule. Countries that value quality education (e.g., Finland, China, Singapore, Japan, South Korea) in deed, not just in empty words trotted out predictably by every presidential campaign, routinely trounce decidedly middling results in the U.S. and reveal that dysfunctional U.S. political systems and agencies (Federal, state, municipal) just can’t get the job done properly anymore. (Exceptions are always tony suburbs populated by high-earning and -achieving parents who create opportunities and unimpeded pathways for their kids.) Indeed, for many, education has actually become a giant babysitting project that morphs into underclass school-to-prison and school-to-military-service (cannon fodder) pipelines. The opportunity cost of failing to invest in education (or by proxy, American youth) is already having follow-on effects. The low-information voter is not a fiction, and it extends to every American institution that requires clarity to see through the fog machine operated by the mainstream media.

As an armchair social critic, I often struggle to reconcile how history unfolds without a plan, and similarly, how society self-organizes without a plan. Social engineering gets a bad rap for reasons: it doesn’t work (small exceptions exist) and subverts the rights and freedoms of individuals. However, the rank failure to achieve progress (in human terms, not technological terms) does not suggest stasis. By many measures, the conditions in which we live are cratering. For instance, Dr. Gabor Maté discusses the relationship of stress to addiction in a startling interview at Democracy Now! Just how bad is it for most people?

… it never used to be that children grew up in a stressed nuclear family. That wasn’t the normal basis for child development. The normal basis for child development has always been the clan, the tribe, the community, the neighborhood, the extended family. Essentially, post-industrial capitalism has completely destroyed those conditions. People no longer live in communities which are still connected to one another. People don’t work where they live. They don’t shop where they live. The kids don’t go to school, necessarily, where they live. The parents are away most of the day. For the first time in history, children are not spending most of their time around the nurturing adults in their lives. And they’re spending their lives away from the nurturing adults, which is what they need for healthy brain development.

Does that not sound like self-hobbling? A similar argument can be made about human estrangement from the natural world, considering how rural-to-urban migration (largely completed in the U.S. but accelerating in the developing world) has rendered many Americans flatly unable to cope with, say, bugs and dirt and labor (or indeed most any discomfort). Instead, we’ve trapped ourselves within a society that is, as a result of its organizing principles, slowly grinding down everyone and everything. How can any of us (at least those of us without independent wealth) choose not to participate in this wretched concatenation? Nope, we’re all guilty.

“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signalling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently albeit temporarily live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations to funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coin. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are kludges, and ineffective ones, precisely because thoughts are not synonymous with their online expression; one merely points to the other. It used to be said that the Internet is forever, so one should wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) could be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control, specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established, and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.