Archive for the ‘History’ Category

From the outset, credit goes to Jonathan Haidt for providing the ideas to launch this blog post. He appears to be making the rounds again flogging his most recent publication (where? I dunno, maybe The Atlantic). In the YouTube interview I caught, Haidt admits openly that as a social and behavioral psychologist, he’s prone to recommending incentives, programs, and regulations to combat destructive developments in contemporary life — especially those in the academy and on social media that have spread into politics and across the general public. Haidt wears impressive professional armor in support of arguments and contentions; I lack such rigor rather conspicuously. Accordingly, I offer no recommendations but instead try to limit myself to describing dynamics as an armchair social critic. Caveat emptor.

Haidt favors viewpoint diversity (see, for example, Heterodox Academy, which he helped to found and now chairs). Simple enough, right? Not so fast there, Señor Gonzalez! Any notion that even passing acquaintance with a given subject requires knowing both pros and cons is anathema to many of today’s thinkers, who would rather plug their ears and pretend opposition voices, principled or otherwise, are simply incoherent, need not be considered, and further, should be silenced and expunged. As a result, extremist branches of any faction tend to be ideological echo chambers. Cardinal weaknesses in such an approach are plain enough for critical thinkers to recognize, but if one happens to fall into one of those chambers, silos, or bubbles (or attend a school that trains students in rigid thinking), invitations to challenge cherished and closely held beliefs, upon which identity is built, mostly fall on deaf ears. The effect is bad enough in individuals, but when spread across organizations that adopt ill-advised solutionism, Haidt’s assessment is that institutional stupidity sets in. The handy example is higher education (now an oxymoron). Many formerly respectable institutions have essentially abandoned reason (ya know, the way reasonable people think) and begun flagellating themselves in abject shame over, for instance, a recovered history of participation in any of the cultural practices now cause for immediate and reflexive cancellation.

By way of analogy, think of one’s perspective as a knife (tool, not weapon) that requires periodic sharpening to retain effectiveness. Refusing to entertain opposing viewpoints is like sharpening only one side of the blade, resulting in a blunt, useless tool. That metaphor suggests a false dualism: two sides to an argument/blade when in fact many facets inform most complex issues, thus viewpoint diversity. By working in good faith with both supporters and detractors, better results (though not perfection) can be obtained than when radicalized entities come to dominate and impose their one-size-fits-all will indiscriminately. In precisely that way, it’s probably better not to become any too successful or powerful lest one be tempted to embrace a shortsighted will to power and accept character distortions that accompany a precipitous rise.

As mentioned a couple blog posts ago, an unwillingness to shut up, listen, and learn (why bother? solutions are just … so … obvious …) has set many people on a path of activism. The hubris of convincing oneself of possession of solutions to intractable issues is bizarre. Is there an example of top-down planning, channeling, and engineering of a society that actually worked without tyrannizing the citizenry in the process? I can’t think of one. Liberal democratic societies determined centuries ago that freedom and self-determination mixed with assumed responsibility and care within one’s community are preferable to governance that treats individuals as masses to be forced into conformity (administrative or otherwise), regulated heavily, and/or disproportionately incarcerated like in the U.S. But the worm has turned. Budding authoritarians now seek reforms and uniformity to manage diverse, messy populations.

Weirdly, ideologues also attempt to purge and purify history, which is chock full of villainy and atrocity. Those most ideologically possessed seek both historical and contemporary targets to denounce and cancel, not even excluding themselves because, after all, the scourges of history are so abject and everyone benefited from them somehow. Search oneself for inherited privilege and all pay up for past iniquities! That’s the self-flagellating aspect: taking upon oneself (and depositing on others) the full weight of and responsibility for the sins of our forebears. Yet stamping out stubborn embers of fires allegedly still burning from many generations ago is an endless task. Absolutely no one measures up to expectations of sainthood when situated within an inherently and irredeemably evil society of men and women. That’s original sin, which can never be erased or forgiven. Just look at what humanity (via industrial civilization) has done to the surface of the planet. Everyone is criminally culpable. So give up all aspirations; no one can ever be worthy. Indeed, who even deserves to live?

When the Canadian Freedom Convoy appeared out of nowhere over a month ago and managed to bring the Canadian capital (Ottawa, Ontario) to a grinding halt, the news was reported with a variety of approaches. Witnessing “democracy” in action, even though initiated by a small but important segment of society, became a cause célèbre, some rallying behind the truckers as patriots and others deploring them as terrorists. Lots of onlookers in the middle ground, to be certain, but the extremes tend to define issues these days, dividing people into permafeuding Hatfields and McCoys. The Canadian government stupidly branded the truckers as terrorists, finally dispersing the nonviolent protest with unnecessary force. The Canadian model sparked numerous copycat protests around the globe.

One such copycat protest, rather late to the party, is The People’s Convoy in the U.S., which is still underway. Perhaps the model works only in the first instance, or maybe U.S. truckers learned something from the Canadian example, such as the illegal seizure of crowdfunded financial support there. Or maybe the prospect of confronting the U.S. military in one of the most heavily garrisoned locations in the world gave pause. (Hard to imagine Ottawa, Ontario, being ringed by military installations like D.C. is.) Whatever the case, The People’s Convoy has not attempted to blockade D.C. Nor has the U.S. convoy been widely reported as was the Canadian version, which was a grass-roots challenge to government handling of the pandemic. Yeah, there’s actually an underlying issue. Protesters are angry about public health mandates and so-called vaccine passports that create a two-tier society. Regular folks must choose between bodily autonomy and freedom of movement on one hand and on the other compliance with mandates that have yet to prove themselves effective against the spread of the virus. Quite a few people have already chosen to do as instructed, whether out of earnest belief in the efficacy of mandated approaches or to keep from falling into the lower of the two tiers. So they socially distance, wear masks, take the jab (and follow-up boosters), and provide papers upon demand. Protesters are calling for all those measures to end.

If the Canadian convoy attracted worldwide attention, the U.S. convoy has hardly caused a stir and is scarcely reported outside the foreign press and a few U.S. superpatriot websites. I observed years ago about The Republic of Lakota that the U.S. government essentially stonewalled that attempt at secession. Giving little or no official public attention to the People’s Convoy, especially while attention has turned to war between Russia and Ukraine, has boiled down to “move along, nothing to see.” Timing for the U.S. truckers could not possibly be worse. However, my suspicion is that privately, contingency plans were made to avoid the embarrassment the Canadian government suffered, which must have included instructing the media not to report on the convoy and getting search engines to demote search results that might enable the movement to go viral, so to speak. The conspiracy of silence is remarkable. Yet people line the streets and highways in support of the convoy. Sorta begs the question “what if they threw a protest but no one came?” A better question might be “what if they started a war but no one fought?”

Gross (even criminal) mismanagement of the pandemic is quickly being shoved down the memory hole as other crises and threats displace a two-year ordeal that resulted in significant loss of life and even greater, widespread loss of livelihoods and financial wellbeing among many people who were already teetering on the edge. Psychological impacts are expected to echo for generations. Frankly, I’m astonished that a far-reaching civil crack-up hasn’t already occurred. Yet despite these foreground tribulations and more besides (e.g., inflation shifting into hyperinflation, food and energy scarcity, the financial system failing every few years, and the epistemological crisis that has made every institution flatly untrustworthy), the background crisis is still the climate emergency. Governments around the world, for all the pomp and circumstance of the IPCC and periodic cheerleading conferences, have stonewalled that issue, too. Some individuals take the climate emergency quite seriously; no government does, at least by their actions. Talk is comparatively cheap. Like foreground and background, near- and far-term prospects just don’t compete. Near-term appetites and desires always win. Psychologists report that deferred gratification (e.g., the marshmallow test) is among the primary predictors of future success for individuals. Institutions, governments, and societies are in aggregate mindless and can’t formulate plans beyond the next election cycle, academic year, or business quarter to execute programs that desperately need doing. This may well be why political theorists observe that liberal democracies are helpless to truly accomplish things, whereas authoritarian regimes centered on an individual (i.e., a despot) can get things done though at extreme costs to members of society.

There’s a Joseph Conrad title with which I’ve always struggled, not having read the short story: The Secret Sharer (1910). The problem for me is the modifier secret. Is a secret being shared or is someone sharing in secret? Another ambiguous term came up recently at Macro-Futilism (on my blogroll) regarding the term animal farm (not the novel by George Orwell). Is the animal farming or is the animal being farmed? Mention was made that ants and termites share with humans the characteristic that we farm. Apparently, several other species do as well. Omission of humans in the linked list is a frustratingly commonplace failure to observe, whether out of ignorance or stupid convention, that humans are animals, too. I also recalled ant farms from boyhood, and although I never had one (maybe because I never had one), I misunderstood, thinking the ants themselves were doing the farming rather than the keeper of the kit farming the ants.

The additional detail at Macro-Futilism that piqued my curiosity, citing John Gowdy’s book Ultrasocial: The Evolution of Human Nature and the Quest for a Sustainable Future (2021), is the contention that animals that farm organize themselves into labor hierarchies (e.g., worker/drone, soldier, and gyne/queen). Whether those hierarchies are a knowing choice (at least on the part of humans) or merely blind adaptation to the needs of agriculturalism is not clearly stated in the blog post or quotations, nor is the possibility of exceptions to formation of hierarchies in the list of other farming species. (Is there a jellyfish hierarchy?) However, lumping together humans, ants, and termites as ultrasocial agricultural species rather suggests that social and/or cultural evolution is driving their inner stratification, not foresight or planning. Put more plainly, humans are little or no different from insects after discovery and adoption of agriculture except for the obviously much higher complexity of human society over other animal farms.

I’ve suggested many times on this blog that humans are not really choosing the course of history (human or otherwise) as it unfolds around us, and further, that trying to drive or channel history in a chosen direction is futile. Rather, history is like a headless (thus, mindless) beast, and humans are mostly along for the ride. Gowdy’s contention regarding agricultural species supports the idea that no one is or can be in charge and that we’re all responding to survival pressure and adapting at unconscious levels. We’re not mindless, like insects, but neither are we able to choose our path in the macro-historical sense. The humanist in me — an artifact of Enlightenment liberalism, perhaps (more to say about that in forthcoming posts) — clings still to the assertion that we have agency, meaning choices to make. But those choices typically operate at a far more mundane level than human history. Perhaps political leaders and industrial tycoons have greater influence over human affairs by virtue of armies, weapons, and machinery, but my fear is that those decision-makers can really only dominate and destroy, not preserve or create in ways that allow for human flourishing.

Does this explain away scourges like inequality, exploitation, institutional failure, rank incompetence, and corruption, given that each of us responds to a radically different set of influences and available options? Impossible question to answer.

Although disinclined to take the optimistic perspective inhabited by bright-siders, I’m nonetheless unable to live in a state of perpetual fear that would, to facile thinkers, be more fitting for a pessimist. Yet unrelenting fear is the dominant approach, with every major media outlet constantly stoking a toxic combination of fear and hatred, as though activation and ongoing conditioning of the lizard brain (i.e., the amygdala — or maybe not) in everyone were worthy of the endeavor rather than a limited instinctual response, leaping to the fore only when an immediate threat presents. I can’t guess the motivations of purveyors of constant fear to discern an endgame, but a few of the dynamics are clear enough to observe.

First thing that comes to mind is that the U.S. in the 1930s and ’40s was pacifist and isolationist. Recent memory of the Great War was still keenly felt, and with the difficulties of the 1929 Crash and ensuing Great Depression still very much present, the prospect of engaging in a new, unlimited war (even over there) was not at all attractive to the citizenry. Of course, political leaders always regard (not) entering into war somewhat differently, maybe in terms of opportunity cost. Hard to say. Whether by hook or by crook (I don’t actually know whether advance knowledge of the Japanese attack on Pearl Harbor was suppressed), the U.S. was handily drawn into the war, and a variety of world-historical developments followed that promoted the U.S. (and its sprawling, unacknowledged empire) into the global hegemon, at least after the Soviet Union collapsed and before China rose from a predominantly peasant culture into world economic power. A not-so-subtle hindsight lesson was learned, namely, that against widespread public sentiment and at great cost, the war effort could (not would) provide substantial benefits (if ill-gotten and of questionable desirability).

None of the intervening wars (never declared) or Wars for Dummies (e.g., the war on poverty, the war on crime, the war on drugs) provided similar benefits except to government agencies and careerist administrators. Nor did the war on terror following the 9/11 attacks or subsequent undeclared wars and bombings in Afghanistan, Iraq, Syria, Libya, Yemen, and elsewhere provide benefits. All were massive boondoggles with substantial destruction and loss of life. Yet after 9/11, a body of sweeping legislation was enacted without much public debate or scrutiny — “smuggled in under cover of fear” one might say. The Patriot Act and The National Defense Authorization Act are among the most notable. The conditioned response by the citizenry to perceived but not actual existential fear was consistent: desperate pleading to keep everyone safe from threat (even if it originates in the U.S. government) and tacit approval to roll back civil liberties (even though the citizenry is not itself the threat). The wisdom of the old Benjamin Franklin quote, born of a very different era and now rendered more nearly as a bromide, has long been lost on many Americans.

The newest omnipresent threat, literally made-to-order (at least according to some — who can really know when it comes to conspiracy), is the Covid pandemic. Nearly every talking, squawking head in government and the mainstream media (the latter now practically useless except for obvious propaganda functions) is telling everyone who still watches (video and broadcast being the dominant modes) to cower in fear of each other, reduce or refuse human contact and social function, and most of all, take the vaccine-not-really-a-vaccine followed by what is developing into an ongoing series of boosters to maintain fear and anxiety if not indeed provide medical efficacy (no good way to measure and substantiate that, anyway). The drumbeat is loud and unabated, and a large, unthinking (or spineless) portion of the citizenry, cowed and cowering, has basically joined the drum circle, spreading a social consensus that is very, well, un-American. Opinions as to other nations on similar tracks are not ventured here. Running slightly ahead of the pandemic is the mind virus of wokery and its sufferers who demand, among other things, control over others’ thoughts and speech through threats and intimidation, censorship, and social cancellation — usually in the name of safety but without any evidence of how driving independent thought underground or into hiding accomplishes anything worthwhile.

Again, motivations and endgame in all this are unclear, though concentration of power to compel seems to be exhilarating. In effect, regular folks are being told, “stand on one leg; good boy; now bark like a dog; very good boy; now get used to it because this shit is never going to end but will surely escalate to intolerability.” It truly surprises me to see police forces around the world harassing, beating, and terrorizing citizens for failing to do as told, however arbitrary or questionable the order or the underlying justification. Still waiting for the moment to dawn on rank-and-file officers that their monopoly on use of force is serving and protecting the wrong constituency. (Not holding my breath.) This is the stuff of dystopic novels, except that it’s not limited to fiction and frankly never was. The hotspot(s) shift in terms of time and place, but totalitarian mind and behavioral control never seems to fade or invalidate itself as one might expect. Covid passports granting full participation in society (signaling compliance, not health) are an early step already adopted by some countries. The creeping fascism (more coercive style than form of government) I’ve warned about repeatedly over the years appears to be materializing before our very eyes. I’m afraid of what this portends, but with what remains of my intact mind, I can’t live in perpetual fear, come what may.

/rant on

Remember when the War on Christmas meme materialized a few years ago out of thin air, even though no one in particular was on the attack? Might have to rethink that one. Seems that every holiday now carries an obligation to revisit the past, recontextualize origin stories, and confess guilt over tawdry details of how the holiday came to be celebrated. Nearly everyone who wants to know knows by now that Christmas is a gross bastardization of pagan celebrations of the winter solstice, co-opted by various organized churches (not limited to Christians!) before succumbing to the awesome marketing onslaught (thanks, Coca-Cola) that makes Xmas the “most wonderful time of the year” (as the tune goes) and returning the holiday to its secular roots. Thanksgiving is now similarly ruined, no longer able to be celebrated and enjoyed innocently (like a Disney princess story reinterpreted as a white or male savior story — or even worse, a white male one) but instead used as an excuse to admit America’s colonial and genocidal past and its mistreatment and extermination of native populations as white Europeans encroached ever more persistently on lands the natives held more or less as a commons. Gone are the days when one could gather among family and friends, enjoy a nice meal and good company, and give humble, sincere thanks for whatever bounty fortuna had bestowed. Now it’s history lectures and acrimony and families rent asunder along generational lines, typically initiated by those newly minted graduates of higher education and their newfangled ideas about equity, justice, and victimhood. Kids these days … get off my lawn!

One need not look far afield to find alternative histories that position received wisdom about the past in the cross-hairs just to enact purification rituals that make it/them, what, clean? accurate? whole? I dunno what the real motivation is except perhaps to force whites to self-flagellate over sins of our ancestors. Damn us to hell for having cruelly visited iniquity upon everyone in the process of installing white, patriarchal Europeanness as the dominant Western culture. I admit all of it, though I’m responsible for none of it. Moreover, history stops for no man, no culture, no status quo. White, patriarchal Europeanness is in serious demographic retreat and has arguably already lost its grip on cultural dominance. The future is female (among other things), amirite? Indeed, whether intended or not, that was the whole idea behind the American experiment: the melting pot. Purity was never the point. Mass migration out of politically, economically, and ecologically ravaged regions means that the experiment is no longer uniquely American.

Interdisciplinary approaches to history, if academic rigidity can be overcome, regularly develop new understandings to correct the historical record. Accordingly, the past is remarkably dynamic. (I’ve been especially intrigued by Graham Hancock’s work on ancient civilizations, mostly misunderstood and largely forgotten except for megalithic projects left behind.) But the past is truly awful, with disaster piled upon catastrophe followed by calamity and cataclysm. Still waiting for the apocalypse. Peering too intently into the past is like staring at the sun: it scorches the retinas. Moreover, the entire history of history is replete with stories being told and retold, forgotten and recovered, transformed in each iteration from folklore into fable into myth into legend before finally passing entirely out of human memory systems. How many versions of Christmas are there across cultures and time? Or Thanksgiving, or Halloween, or any Hallmark® holiday that has crossed oceans and settled into foreign lands? What counts as the true spirit of any of them when their histories are so mutable?

/rant off

What if everyone showed up to an event with an expectation of using all their tech and gadgets to facilitate the group objective only to discover that nothing worked? You go to a fireworks display but the fireworks won’t ignite. You go to a concert but the instruments and voices make no sound. You go to a sporting event but none of the athletes can move. Does everyone just go home? Come back another day to try again? Or better yet, you climb into your car to go somewhere but it won’t start. Does everyone just stay home and the event never happens?

Those questions relate to a new “soft” weapons system called Scorpius (no link). The device or devices are said to disrupt and disable enemy tech by issuing a narrowly focused electromagnetic beam. (Gawd, just call it a raygun or phaser. No embarrassment over on-the-nose naming of other things.) Does the beam fry the electronics of its target, like a targeted Carrington event, or simply scramble the signals, making the tech inoperable? Can tech be hardened against attack, such as being encased in a Faraday cage? Wouldn’t such isolation itself make tech nonfunctional, since electronic communications between locations are the essence of modern devices, especially for targeting and telemetry? These are a few more idle questions (unanswered, since announcements of new weaponry I consulted read like advertising copy) about this latest advance (euphemism alert) in the arms race. Calling a device that can knock a plane (um, warplane) out of the sky (crashing somewhere, obviously) “soft protection” because the mechanism is a beam rather than a missile rather obfuscates the point. Sure, ground-based technologies might be disabled without damage, but would that require continuous beam-based defense?

I recall an old Star Trek episode, the one with the Gorn, where omnipotent aliens disabled all weapons systems of two spaceships postured for battle by superheating controls, making them too hot to handle. Guess no one thought of oven mitts or pencils to push the “Fire!” buttons. (Audiences were meant to think, considering Star Trek was a thinking person’s entertainment, but not too much.) Instead of mass carnage, the two captains were transported to the surface of a nearby planet to battle by proxy (human vs. reptile). In quintessential Star Trek fashion — imagining a hopeful future despite militaristic trappings — the human captain won not only the physical battle but the moral battle (with self) by refusing to dispatch the reptile captain after he/it was disabled. The episode posed interesting questions so long as no one searched in the weeds for plausibility.

We’re confronted now, and again, with many of these same questions, some technical, some strategic, but more importantly, others moral and ethical. Thousands of years of (human) history have already demonstrated the folly of war (but don’t forget profitability). It’s a perennial problem, and from my vantage point, combatants on all sides are no closer to Trekkie moral victory now than in the 1960s. For instance, the U.S. and its allies are responsible for millions of deaths in Iraq, Afghanistan, Syria, and elsewhere just in the last two decades. Go back further in time and imperial designs look more and more like sustained extermination campaigns. But hey, we came to play, and any strategic advantage must be developed and exploited, moral quandaries notwithstanding.

It’s worth pointing out that in the Gorn episode, the captains were deprived of their weapons and resorted to brute force before the human improvised a projectile weapon out of materials handily strewn about, suggesting perhaps that intelligence is the most deadly weapon. Turns out to have been just another arms race.

A quick search revealed that over 15 years of blog posts, the word macrohistory has been used only once. On reflection, macrohistory is something in which I’ve been involved for some time — mostly as a dilettante. Several book reviews and three book-blogging series (one complete, two either on hiatus or fully abandoned) concern macrohistory, and my own several multi-part blogs connect disparate dots over broader topics (if not quite history in the narrow sense). My ambition, as with macrohistory, is to tease out better (if only slightly) understandings of ourselves (since humans and human culture are obviously the most captivating thing evar). Scientists direct similar fascination to the inner workings of nonhuman systems — or at least larger systems in which humans are embedded. Thus, macrohistory can be distinguished from natural history by their objects of study. Relatedly, World-Systems Theory associated with Immanuel Wallerstein and The Fourth Turning (1997 book by William Strauss and Neil Howe) take similarly broad perspectives and attempt to identify historical dynamics and patterns not readily apparent. Other examples undoubtedly exist.

This is all preliminary to discussing a rather depressing article from the December 2020 issue of Harper’s Magazine: Rana Dasgupta’s disquieting (ahem) essay “The Silenced Majority” (probably behind a paywall). The subtitle poses the question, “Can America still afford democracy?” This innocuous line begs the question whether the U.S. (America and the United States of America [and its initialisms U.S. and U.S.A.] being sloppily equivalent almost everywhere, whereas useful distinctions describe the United Kingdom, Great Britain, and England) actually has or practices democracy anymore, to which many would answer flatly “nope.” The essay is an impressive exercise, short of book length, in macrohistory, though it’s limited to Western cultures, which is often the case with history told from inside the bubble. Indeed, if (as the aphorism goes) history is written/told primarily by the victors, one might expect to hear only of an ongoing series of victories and triumphs with all the setbacks, losses, and discontinuities excised like some curated (read: censored) Twitter or Meta (formerly Facebook) discussion. One might also wonder how that same history reads when told from the perspective of non-Western countries, especially those in transitional regions such as Poland, Ukraine, Turkey, and Iran or those with histories long predating the rise of the West roughly 500 years ago, i.e., China, Japan, Egypt, and the lost cultures of Central America. Resentments of the Islamic world, having been eclipsed by the West, are a case in point. My grasp of world history is insufficient to entertain those perspectives. I note, however, that with globalism, the histories of all regions of the world are now intimately interconnected even while perspectives differ.

Dasgupta describes foundational Enlightenment innovations that animate Western thinking, even though the ideas are often poorly contextualized or understood. To wit:

In the seventeenth century, England was an emerging superpower. Supremacy would come from its invention of a world principle of property. This principle was developed following contact with the Americas, where it became possible to conjure vast new English properties “out of nothing”—in a way that was impracticable, for instance, in the militarized, mercantile societies of India. Such properties were created by a legal definition of ownership designed so that it could be applied only to the invaders. “As much land as a man tills, plants, improves, cultivates, and can use the product of,” John Locke wrote in 1689, “so much is his property.” When combined with other new legal categories such as “the savage” and “the state of nature,” this principle of property engendered societies such as Carolina, where Locke’s patron, the first earl of Shaftesbury, was a lord proprietor.

Obvious, isn’t it, that by imposing the notion of private property on indigenous inhabitants of North America, colonialists established ownership rights over territories where none had previously existed? Many consider that straightforward theft (again, begging the question) or at least fencing the commons. (Attempts to do the same in the open oceans and in space [orbit] will pick up as technology allows, I surmise.) In addition, extension of property ownership to human trafficking, i.e., slavery and its analogues still practiced today, has an exceptionally long history and was imported to the Americas, though the indigenous population proved to be poor candidates for subjugation. Accordingly, others were brought to North America in a slave trade that extended across four centuries.

Dasgupta goes on:

From their pitiless opposition to the will of the people, we might imagine that British elites were dogmatic and reactionary. (Period dramas depicting stuck-up aristocrats scandalized by eccentricity and innovation flatter this version of history.) The truth is that they were open-minded radicals. They had no sentimentality about the political order, cutting the head off one king and sending another into exile. They could invent financial and legal structures (such as the Bank of England, founded in 1694) capable of releasing unprecedented market energies. Even their decision to exploit American land with African labor demonstrated their world-bending pursuit of wealth. Their mines and plantations would eventually supply the capital for the first industrial revolution. They loved fashion and technology, they believed in rationality, progress, and transparency. They were the “founding fathers” of our modern world.

And yet they presided over a political system as brutal as it was exclusive. Why? The answer is simple. They could not afford democracy, but also, crucially, they did not need it. [emphasis in original]

So much for the awe and sacred respect in which Enlightenment philosophers and the Founders are held — or used to be. Statues of these dudes (always dudes, natch) are being pulled down all the time. Moreover, association of liberal democracy with the 17th century is a fundamental mistake, though neoliberalism (another poorly defined and understood term) aims to shift backwards to a former or hybrid state of human affairs some are beginning to call digital feudalism.

The article goes on to discuss the balancing act and deals struck over the course of centuries to maintain economic and political control by the ownership class. It wasn’t until the 1930s and the postwar economic boom in the U.S. that democracy as commonly understood took root significantly. The labor movement in particular was instrumental in forcing FDR’s New Deal social programs, even though populism and socialism as political movements had been successfully beaten back. Interestingly, the hallowed American nuclear family (limited in its scope racially), an ahistorical formation that enjoyed a roughly 30-year heyday from 1945 to 1975, coincides with the rise of the American middle class and now-aged democratic institutions. They’re all connected with widely distributed wealth and prosperity. But since the oil crisis and stagflation of the mid-1970s, gains enjoyed by the middle class have steadily eroded and/or been actively beaten back (again!) so that the dominant themes today are austerity imposed on the masses and inequality coughing up hundy-billionaires with increasing frequency. Estimates are that 30–40% of the American citizenry lives in poverty, bumping up against failed-state territory. Inequality has returned to Gilded Age levels if not exceeded them. Dasgupta fails to cite perhaps the major underlying cause of this shift away from affordable democracy, back toward the brutal world principle of property: falling EROI. Cheap foreign labor, productivity gains, and creation of a giant debtor society have simply not offset the disappearance of cheap energy.

Dasgupta’s further discussion of an emerging two-tier economy along with the Silicon Valley technocracy follows, but I’ll stop short here and encourage readers instead to investigate and think for themselves. Lots of guides and analyses help to illuminate the macrohistory, though I find the conclusions awful in their import. Dasgupta drives home the prognosis:

The neoliberal revolution aimed to restore the supremacy of capital after its twentieth-century subjugation by nation-states, and it has succeeded to an astonishing degree. As states compete and collude with gargantuan new private powers, a new political world arises. The principle of labor, which dominated the twentieth century—producing the industrious, democratic society we have come to regard, erroneously, as the norm—is once again being supplanted by a principle of property, the implications and consequences of which we know only too well from our history books.

I’ve often thought that my father was born at just the right time in the United States: too young to remember much of World War II, too old to be drafted into either the Korean War or the Vietnam War, yet well positioned to enjoy the fruits of the postwar boom and the 1960s counterculture. He retired early with a pension from the same company for which he had worked nearly the entirety of his adult life. Thus, he enjoyed the so-called Happy Days of the 1950s (the Eisenhower era) and all of the boom years, including the Baby Boom, the demographer’s term for my cohort (I came at the tail end). Good for him, I suppose. I admit some envy at his good fortune as most of the doors open to him were closed by the time I reached adulthood. It was the older (by 10–15 years) siblings of Boomers who lodged themselves in positions of power and influence. Accordingly, I’ve always felt somewhat like the snotty little brother clamoring for attention but who was overshadowed by the older brother always in the way. Luckily, my late teens and early twenties also fell between wars, so I never served — not that I ever supported the American Empire’s foreign escapades, then or now.

Since things have turned decidedly for the worse and industrial civilization can’t simply keep creaking along but will fail and collapse soon enough, my perspective has changed. Despite some life options having been withdrawn and my never having risen to world-beater status (not that that was ever my ambition), I recognize that, similar to my father, I was born at the right time to enjoy the relative peace and tranquility of the second half of the otherwise bogus “American Century.” My good fortune allowed me to lead a quiet, respectable life and to reach a reasonable age (not yet retired) at which I now take stock. Mistakes were made, of course; that’s how we learn. But I’ve avoided the usual character deformations that spell disaster for lots of folks. (Never mind that some of those deformations are held up as admirable; the people who suffer them are in truth cretins of the first order, names withheld.)

Those born at the wrong time? Any of those drafted into war (conquest, revolutionary, civil, regional, or worldwide), and certainly anyone in the last twenty years or so. Millennials appeared at the twilight of empire, many of whom are now mature enough to witness its fading glory but generally unable to participate in its bounties meaningfully. They are aware of their own disenfranchisement the same way oppressed groups (religious, ethnic, gender, working class, etc.) have always known they’re getting the shaft. Thus, the window of time one might claim optimal to have been born extends from around 1935 to around 1995, and my father and I both slot in. Beyond that fortuitous window, well, them’s the shakes.

Guy McPherson used to say in his presentations that we’re all born into bondage, meaning that there is no escape from Western civilization and its imperatives, including especially participation in the money economy. The oblique reference to chattel slavery is clumsy, perhaps, but the point is nonetheless clear. For all but a very few, civilization functions like Tolkien’s One Ring, bringing everyone ineluctably under its dominion. Enlightenment cheerleaders celebrate that circumstance and the undisputed material and technological (same thing, really) bounties of the industrial age, but Counter-Enlightenment thinkers recognize reasons for profound discontent. Having blogged at intervals about the emerging Counter-Enlightenment and what’s missing from modern technocratic society, I find that my gnawing guilt over forced participation in the planet-killing enterprise of industrial civilization is growing intolerable. Skipping past the conclusion drawn by many doomers that collapse and ecocide due to unrestrained human consumption of resources (and the waste stream that follows) have already launched a mass extinction that will extirpate most species (including large mammals such as humans), let me focus instead on gross dysfunction occurring at levels falling more readily within human control.

An Empire of War

Long-overdue U.S. troop withdrawal from Afghanistan has already yielded Taliban resurgence, which was a foregone conclusion at whatever point U.S. troops left (and before them, Soviets). After all, the Taliban lives there and had only to wait. Distasteful and inhumane as it may be to Westerners, a powerful faction (religious fanatics) truly wants to live under a 7th-century style of patriarchy. Considering how long the U.S. occupied the country, a new generation of wannabe patriarchs came to adulthood — an unbroken intergenerational descent. Of course, the U.S. (and others) keeps arming them. Indeed, I heard that the U.S. military is considering bombing raids to destroy the war machines left behind as positions were so swiftly abandoned. Oops, too late! This is the handiest example of how failed U.S. military escapades extending over decades net nothing of value to anyone besides weapons and ordnance manufacturers and miserable careerists within various government branches and agencies. The costs (e.g., money, lives, honor, sanity) are incalculable and spread to each new country where the American Empire engages. Indeed, the military-industrial complex chooses intervention and war over peace at nearly every opportunity (though careful not to poke them bears too hard). And although the American public’s inability to affect policy (unlike the Vietnam War era) doesn’t equate with participation, the notion that it’s a government of the people deposits some of the blame on our heads anyway. My frustration is that nothing is learned and the same war crimes (er, mistakes) keep being committed by maniacs who ought to know better.

Crony and Vulture Capitalism

Critics of capitalism are being proven correct far more often than are apologists and earnest capitalists. The two subcategories I most deplore are crony capitalism and vulture capitalism, both of which typically accrue to the benefit of those in no real need of financial assistance. Crony capitalism is deeply embedded within our political system and tilts the economic playing field heavily in favor of those willing to both pay for and grant favors rather than let markets sort themselves out. Vulture capitalism extracts value out of dying hosts, i.e., vulnerable resource pools, by attacking and often killing them off (e.g., Microsoft, Walmart, Amazon), or more charitably, absorbing them to create monopolies, often by hostile takeover at steep discounts. Distressed mortgage holders forced into short sales, default, and eviction are the contemporary example. Rationalizing predatory behavior as mere competition is a regular rhetorical ploy.

Other historical economic systems had similarly skewed hierarchies, but none have reached quite the same heartless, absurd levels of inequality as late-stage capitalism. Pointing to competing systems and the rising tide that lifts all boats misdirects people to make ahistorical comparisons. Human psychology normally restricts one’s points of comparison to contemporaries in the same country/region. Under such narrow comparison, the rank injustice of hundred-billionaires (or even simply billionaires) existing at the same time as giant populations of political/economic/climate refugees and the unhoused (the new, glossy euphemism for the homeless) demonstrates the soul-forfeiting callousness of the top quintile and/or 1% — an ancient lesson never learned. Indeed, aspirational nonsense repackages suffering and sells it back to the underclass, which as a matter of definition will always exist but need not live as though on an entirely different planet from Richistan.

Human Development

Though I’ve never been a big fan of behaviorism, the idea that a hypercomplex stew of influences, inputs, and stimuli leads to better or worse individual human development, especially in critical childhood years but also throughout life, is pretty undeniable. As individuals aggregate into societies, the health and wellbeing of a given society is linked to the health and wellbeing of those very individuals who are understood metaphorically as the masses. Behaviorism would aim to optimize conditions (as if such a thing were possible), but because American institutions and social systems have been so completely subordinated to capitalism and its distortions, society has stumbled and fumbled from one brand of dysfunction to another, barely staying ahead of revolution or civil war (except that one time …). Indeed, as the decades have worn on from, say, the 1950s (a nearly idyllic postwar reset that looms large in the memories of today’s patrician octogenarians), it’s difficult to imagine how conditions could have deteriorated further short of a third world war.

Look no further than the U.S. educational system, both K–12 and higher ed. As with other institutions, education has had its peaks and valleys. However, the crazy, snowballing race to the bottom witnessed in the last few decades is utterly astounding. Stick a pin in it: it’s done. Obviously, some individuals manage to get educated (some doing quite well, even) despite the minefield that must be navigated, but the exception does not prove the rule. Countries that value quality education (e.g., Finland, China, Singapore, Japan, South Korea) in deed, not just in empty words trotted out predictably by every presidential campaign, routinely trounce decidedly middling results in the U.S. and reveal that dysfunctional U.S. political systems and agencies (Federal, state, municipal) just can’t get the job done properly anymore. (Exceptions are always tony suburbs populated by high-earning and -achieving parents who create opportunities and unimpeded pathways for their kids.) Indeed, the giant babysitting project that morphs into underclass school-to-prison and school-to-military service (cannon fodder) pipelines is what education has actually become for many. The opportunity cost of failing to invest in education (or by proxy, American youth) is already having follow-on effects. The low-information voter is not a fiction, and the problem extends to every American institution that requires clarity to see through the fog machine operated by the mainstream media.

As an armchair social critic, I often struggle to reconcile how history unfolds without a plan, and similarly, how society self-organizes without a plan. Social engineering gets a bad rap for reasons: it doesn’t work (small exceptions exist) and subverts the rights and freedoms of individuals. However, the rank failure to achieve progress (in human terms, not technological terms) does not suggest stasis. By many measures, the conditions in which we live are cratering. For instance, Dr. Gabor Maté discusses the relationship of stress to addiction in a startling interview at Democracy Now! Just how bad is it for most people?

… it never used to be that children grew up in a stressed nuclear family. That wasn’t the normal basis for child development. The normal basis for child development has always been the clan, the tribe, the community, the neighborhood, the extended family. Essentially, post-industrial capitalism has completely destroyed those conditions. People no longer live in communities which are still connected to one another. People don’t work where they live. They don’t shop where they live. The kids don’t go to school, necessarily, where they live. The parents are away most of the day. For the first time in history, children are not spending most of their time around the nurturing adults in their lives. And they’re spending their lives away from the nurturing adults, which is what they need for healthy brain development.

Does that not sound like self-hobbling? A similar argument can be made about human estrangement from the natural world, considering how rural-to-urban migration (largely completed in the U.S. but accelerating in the developing world) has rendered many Americans flatly unable to cope with, say, bugs and dirt and labor (or indeed most any discomfort). Instead, we’ve trapped ourselves within a society that is, as a result of its organizing principles, slowly grinding down everyone and everything. How can any of us (at least those of us without independent wealth) choose not to participate in this wretched concatenation? Nope, we’re all guilty.

“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signalling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently (albeit temporarily) live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations to funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coinages. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are ineffective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. It used to be said that the Internet is forever, so one should wait a reasonable duration before posting or tweeting so that irresponsible impulses (opinion and trolling, mostly) can be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.