Archive for the ‘Corporatism’ Category

With each successive election cycle, I become more cynical (how is that even possible?) about the candidates and their supposed earnest mission to actually serve the public interest. The last couple cycles have produced a new meme that attempts to shift blame for poor governance to the masses: the low-information voter. Ironically, considering the fact that airwaves, magazines, books, public addresses, online venues, and even dinner conversations (such as they still exist if diners aren’t face-planted in their screens) are positively awash in political commentary and pointless debate and strategizing, there is no lack of information available. However, being buried under a deluge of information is akin to a defense attorney hiding damning discovery in an ocean of irrelevance, so I have some sympathy for voters who are thwarted in attempts to make even modestly informed decisions about political issues.

Multiply this basic relationship across many facets of ordinary life and the end result is the low-information citizen (also low-information consumer). Some parties (largely sellers of things, including ideas) possess a profusion of information, whereas useful, actionable information is hidden from the citizen/consumer by an information avalanche. For example, onerous terms of an insurance contract, debt instrument, liability waiver, or even routine license agreement are almost never read prior to signing or otherwise consenting; the acronym tl;dr (stands for “too long; didn’t read”) applies. In other situations, information is withheld entirely, such as pricing comparisons one might undertake if high-pressure sales tactics were not deployed to force potential buyers into decisions right here, right now, dammit! Or citizens are disempowered from exercising any critical judgment by erecting secrecy around a subject, national security being the utility excuse for everything the government doesn’t want people to know.

Add to this the concerted effort (plain enough to see if one bothers to look) to keep the population uneducated, without options and alternatives, scrambling just to get through the day/week/month (handily blocking information gathering), and thus trapped in a condition of low information. Such downward pressure (survival pressure, one might say when considering the burgeoning homeless population) is affecting a greater portion of the population than ever. The American Dream that energized and buoyed the lives of many generations of people (including immigrants) has morphed into the American Nightmare. Weirdly, the immigrant influx has not abated but rather intensified. However, I consider most of those folks (political, economic, and ecological) refugees, not immigrants.

So those are the options available to power players, where knowledge is power: (1) withhold information, (2) if information can’t be withheld, then bury it as a proverbial needle in a haystack, and (3) render a large percentage of the public unable to process and evaluate information by keeping them undereducated. Oh, here’s another: (4) produce a mountain of mis- and disinformation that bewilders everyone. This last one is arguably the same as (2) except that the intent is propaganda or psyop. One could also argue that miseducating the public (e.g., various grievance studies blown into critical race theory now being taught throughout the educational system) is the same as undereducating. Again, intent matters. Turning someone’s head and radicalizing them with a highly specialized toolkit (mostly rhetorical) for destabilizing social relations is tantamount to making them completely deranged (if not merely bewildered).

These are elements of the ongoing epistemological crisis I’ve been observing for some time now, with the side effect of a quick descent into social madness being used to justify authoritarian (read: fascist) concentration of power and rollback of individual rights and freedoms. The trending term sensemaking also applies, referring to reality checks needed to keep oneself aligned with truth, which is not the same as consensus. Groups are forming up precisely for that purpose, centered on evidentiary rigor as well as skepticism toward the obvious disinformation issuing from government agencies and journalists who shape information according to rather transparently brazen agendas. I won’t point to any particular trusted source but instead recommend everyone do their best (no passivity!) to keep their wits about them and think for themselves. Not an easy task when the information environment is so thoroughly polluted — one might even say weaponized — that it requires special workarounds to navigate effectively.

/rant on

Since deleting from my blogroll all doom links and turning my attention elsewhere, the lurking dread of looming collapse (all sorts) has been at low ebb at The Spiral Staircase. Despite many indicators of imminent collapse likewise purged from front-page and top-of-the-broadcast news, evidence continues to mount while citizens contend with other issues, some political and geopolitical, others day-to-day tribulations stemming from politics, economics, and the ongoing pandemic. For instance, I only just recently learned that the Intergovernmental Panel on Climate Change (IPCC — oh yeah … them) issued AR6 last month, the sixth periodic Assessment Report (maybe instead call it the State of the Planet Report). It’s long, dense reading (the full report is nearly 4,000 pp., whereas the summary for policymakers is a mere 42 pp.) and subject to nearly continuous revision and error correction. The conclusion? Climate change is widespread, rapid, and intensifying. And although it’s true that mundane daily activities occupy center stage in the lives of average folks, there is simply no bigger story or concern for government leaders (I choke on that term) and journalists (that one, too) than climate change because it represents (oh, I dunno …) the collapse of industrial civilization and the early phase of mass extinction. Thus, all politics, geopolitics, economic warfare, class struggle, Wokeism, authoritarian seizure of power, and propaganda filling the minds of people at all levels as well as the institutions they serve amount to a serious misallocation of attention and effort. I will admit, though, that it’s both exhausting and by now futile to worry too much about collapse. Maybe that’s why the climate emergency (the new, improved term) is relegated to background noise easily tuned out.

It’s not just background noise, though, unlike the foreknowledge that death awaits decades from now if one is fortunate enough to persist into one’s 70s or beyond. No, it’s here now, outside (literally and figuratively), knocking on the door. Turn off your screens and pay attention! (Ironically, everyone now gets the lion’s share of information from screens, not print. So sue me.) Why am I returning to this yet again? Maybe I’ve been reviewing too many dystopian films and novels. A better answer is that those charged with managing and administering states and nations are failing so miserably. It’s not even clear that they’re trying, so pardon me, but I’m rather incensed. It’s not that there aren’t plenty of knowledgeable experts compiling data, writing scientific reports, publishing books, and offering not solutions exactly but at least better ways to manage our affairs. Among those experts, the inability to reverse the climate emergency is well enough understood though not widely acknowledged. (See Macro-Futilism on my blogroll for at least one truth teller who absolutely gets it.) Instead, some lame version of the same dire warning issues again and again: if action isn’t taken now (NOW, dammit!), it will be too late and all will be lost. The collective response is not, however, to pull back, rein in, or even prepare for something less awful than the worst imaginable hard landing where absolutely no one survives despite the existence of boltholes and underground bunkers. Instead, it’s a nearly gleeful acceleration toward doom, like a gambler happily forking over his last twenty at the blackjack table before losing and chucking himself off the top of the casino parking structure. Finally free (if fleetingly)!

Will festering public frustration over deteriorating social conditions tip over into outright revolt, revolution, civil war, and/or regime change? Doesn’t have to be just one. Why is the U.S. still developing and stockpiling armaments, maintaining hundreds of U.S. military bases abroad, and fighting costly, pointless wars of empire (defeat in withdrawal from Afghanistan notwithstanding)? Will destruction of purchasing power of the U.S. dollar continue to manifest as inflation of food and energy costs? Is the U.S. Dept. of Agriculture actually doing anything to secure food systems, or does it merely prepare reports like the AR6 that no one reads or acts upon? Will fragile supply lines be allowed to fail entirely, sparking desperation and unrest in the streets far worse than summer 2020? Famine is how some believe collapse will trigger a megadeath pulse, but I wouldn’t count out chaotic violence among the citizenry, probably exacerbated and escalated as regimes attempt (unsuccessfully) to restore social order. Are any meaningful steps being taken to stop sucking from the fossil fuel teat and return to small-scale agrarian social organization, establishing degrowth and effectively returning to the land (repatriation is my preferred term) instead of going under it? Greenwashing doesn’t count. This headline (“We Live In A World Without Consequences Where Everyone Is Corrupt”) demonstrates pretty well that garbage economics are what pass for governance, primarily preoccupied with maintaining the capitalist leviathan that has captured everything (capture ought to be the trending word of 2021 but sadly isn’t). Under such constraint, aged institutions are flatly unable to accomplish or even address their missions anymore.
And this headline (“Polls Show That The American People Are Extremely Angry – And They Are About To Get Even Angrier”) promises that things are about to get much, much worse (omitted the obvious-but-erroneous follow-on “before they get better”) — for the obvious reason that more and more people are at the ends of their ropes while the privileged few attend the Met Gala, virtue signal with their butts, and behave as though society isn’t in fact cracking up. Soon enough, we’ll get to truth-test Heinlein’s misunderstood aphorism “… an armed society is a polite society.”

Those who prophesy dates or deadlines for collapse have often been slightly embarrassed (but relieved) that collapse didn’t arrive on schedule. Against all odds, human history keeps trudging further into borrowed time, kicking cans down roads, blowing bubbles, spinning false narratives, insisting that all this is fine, and otherwise living in make-believe land. Civilization has not quite yet reached the end of all things, but developments over the last couple months feel ever more keenly like the equivalent of Frodo and Sam sitting atop Mount Doom, just outside the Cracks of Doom (a/k/a Sammath Naur), except that humanity is not on a noble, sacrificial mission to unmake the One Ring, whatever that might represent outside of fiction (for Tolkien, probably industrial machines capable of planetary destruction, either slowly and steadily or all at once; for 21st-century armchair social critics like me, capitalism). All former certainties, guarantees, sureties, continuities, and warranties are slipping away despite the current administration’s assurances that the status quo will be maintained. Or maybe it’s merely the transition of summer into fall, presaging the annual dormancy of winter looking suspiciously this year like the great dying. Whatever. From this moment on and in a fit of exuberant pique, I’m willing to call the contest: humanity is now decidedly on the down slope. The true end of history approaches, as no one will be left to tell the tale. When, precisely, the runaway train finally careens over the cliff remains unknown though entirely foreseeable. The concentration of goofy references, clichés, and catchphrases above — usually the mark of sophomoric writing — inspires me to indulge (further) in gallows humor. Consider these metaphors (some mixed) suggesting that time is running out:

  • the show’s not over til it’s over, but the credits are rolling
  • the chickens are coming home to roost
  • the canary in the coal mine is gasping its last breath
  • the fat lady is singing her swan song
  • borrowed time is nearly up
  • time to join the great majority (I see dead people …)
  • the West fades into the west
  • kiss your babies goodnight and kiss your ass goodbye

/rant off

Watched Soylent Green (1973) a few days ago for the first time since boyhood. The movie, directed by Richard Fleischer, is loosely based on Harry Harrison’s novel Make Room! Make Room! (which I haven’t read) and oddly enough has not yet been remade. How to categorize the film within familiar genres is tricky. Science fiction? Disaster? Dystopia? Police procedural? It checks all those boxes. Chief messages, considering its early 70s origin, are pollution and overpopulation, though global warming is also mentioned less pressingly. The opening montage looks surprisingly like what Godfrey Reggio did much better with Koyaanisqatsi (1982).

Soylent Green is set in 2022 — only a few months away now but a relatively remote future in 1973 — and the Earth is badly overpopulated, environmentally degraded, overheated, and struggling to support teeming billions mostly jammed into cities. Details are sketchy, and only old people can remember a time when the biosphere remained intact; whatever disaster had occurred was already long ago. Science fiction and futuristic films are often judged improperly by how correct their prophecies turn out to be, as though enjoyment were based on predictive accuracy. Soylent Green fares well in that respect despite its clunky, dated, 70s production design. Vehicles, computer screens, phones, wardrobe, and décor are all, shall we say, quaintly vintage. But consider this: had collapse occurred in the 70s, who’s to say that cellphones, flat screens, and the Internet would ever have been developed? Maybe the U.S. (and the world) would have been stalled in the 70s much the way Cuba is stuck in the 50s (when the monumentally dumb, ongoing U.S. embargo commenced).

The film’s star is Charlton Heston, who had established himself as a handsomely bankable lead in science fiction, disaster, and dystopian films (e.g., The Omega Man and the Planet of the Apes series). Though serviceable, his portrayal is remarkably plain, revealing Heston as a poor man’s Sean Connery or John Wayne (both far more charismatic contemporaries of Heston’s even in lousy films). In Soylent Green, Heston plays Detective Robert Thorn, though he’s mostly called “Thorn” onscreen. Other characters below the age of 65 or so also go by only one name. They all grew up after real foodstuffs (the titular Soylent Green being a synthetic wafer reputedly made out of plankton — the most palatable of three colors) and creature comforts became exceedingly scarce and expensive. Oldsters are given the respect of first and last names. Thorn investigates the assassination of a high-ranking industrialist to its well-known conspiratorial conclusion (hardly a spoiler anymore) in that iconic line at the very end of the film: “Soylent Green is people!” Seems industrialists, to keep people fed, are making food of human corpses. That eventual revelation drives the investigation and the film forward, a device far tamer than today’s amped-up action thrillers where, for instance, a mere snap of the fingers can magically wipe out or restore half of the universe. Once the truth is proclaimed by Thorn (after first being whispered into a couple ears), the movie ends rather abruptly. That’s also what makes it a police procedural set in a disastrous, dystopic, science-fiction future stuck distinctively in the past: once the crime/riddle is solved, the story and film are over with no dénouement whatsoever.

Some of the details of the film, entirely pedestrian to modern audiences, are modestly enjoyable throwbacks. For instance, today’s penchant for memes and slang renaming of commonplace things is employed in Soylent Green. The catchphrase “Tuesday is Soylent Green Day” appears but is not overdriven. A jar of strawberries costs “150D,” which I first thought might be future currency in the form of debits or demerits but is probably just short for dollars. Front end loaders used for crowd control are called “scoops.” High-end apartment building rentals come furnished with live-in girls (prostitutes or gold-diggers, really) known as Furniture Girls. OTOH, decidedly 70s-era trash trucks (design hasn’t really changed much) are not emblazoned with the corporate name or logo of the Soylent Corporation (why not?). Similarly, (1) dressing the proles in dull, gray work clothes and brimless caps, (2) having them sleep on stairways or church refuges piled on top of each other so that characters have to step gingerly through them, (3) being so crammed together in protest when the Tuesday ration of Soylent Green runs short that they can’t avoid the scoops, (4) dripped blood clearly made of thick, oversaturated paint (at least on the DVD), and (5) a sepia haze covering daytime outdoor scenes are fairly lazy nods to world building on a low budget. None of this is particularly high-concept filmmaking, though the restraint is appreciated. The sole meme (entirely unprepared) that should have been better deployed is “going home,” a euphemism for reporting voluntarily to a processing plant (into Soylent Green, of course) at the end of one’s suffering life. Those who volunteer are shown 30 minutes of scenes, projected on a 360-degree theater that envelops the viewer, depicting the beauty and grandeur of nature before it had disappeared. This final grace offered to people (rather needlessly) serves the environmental message of the film well and could have been “driven home” a bit harder.

Like other aspects of the film’s back story, how agriculture systems collapsed is largely omitted. Perhaps such details (conjecture) are in the book. The film suggests persistent heat (no seasons), and accordingly, characters are made to look like they never stop sweating. Scientific projections of how global warming will manifest do in fact point to hothouse Earth, though seasons will still occur in temperate latitudes. Because such changes normally occur in geological time, it’s an exceedingly slow process compared to human history and activity. Expert inquiry into the subject prophesied long ago that human activity would trigger and accelerate the transition. How long it will take is still unknown, but industrial civilization is definitely on that trajectory and humans have done little since the 70s to curb self-destructive appetites or behaviors — except of course talk, which in the end is just more hot air. Moreover, dystopian science fiction has shifted over the decades away from self-recrimination to a long, seemingly endless stream of superheroes fighting crime (and sometimes aliens). Considering film is entertainment meant to be enjoyed, the self-serious messages embedded in so many 70s-era disaster films warning us of human hubris are out of fashion. Instead, superpowers and supersuits rule cinema, transforming formerly uplifting science-fiction properties such as Star Trek into hypermilitaristic stories of interstellar social collapse. Soylent Green is a grim reminder that we once knew better, even in our entertainments.

Guy McPherson used to say in his presentations that we’re all born into bondage, meaning that there is no escape from Western civilization and its imperatives, including especially participation in the money economy. The oblique reference to chattel slavery is clumsy, perhaps, but the point is nonetheless clear. For all but a very few, civilization functions like Tolkien’s One Ring, bringing everyone ineluctably under its dominion. Enlightenment cheerleaders celebrate that circumstance and the undisputed material and technological (same thing, really) bounties of the industrial age, but Counter-Enlightenment thinkers recognize reasons for profound discontent. Having blogged at intervals about the emerging Counter-Enlightenment and what’s missing from modern technocratic society, my gnawing guilt by virtue of forced participation in the planet-killing enterprise of industrial civilization is growing intolerable. Skipping past the conclusion drawn by many doomers that collapse and ecocide due to unrestrained human consumption of resources (and the waste stream that follows) have already launched a mass extinction that will extirpate most species (including large mammals such as humans), let me focus instead on gross dysfunction occurring at levels falling more readily within human control.

An Empire of War

Long overdue U.S. troop withdrawal from Afghanistan has already yielded Taliban resurgence, which was a foregone conclusion at whatever point U.S. troops left (and before them, Soviets). After all, the Taliban lives there and had only to wait. Distasteful and inhumane as it may be to Westerners, a powerful faction (religious fanatics) truly wants to live under a 7th-century style of patriarchy. Considering how long the U.S. occupied the country, a new generation of wannabe patriarchs came to adulthood — an unbroken intergenerational descent. Of course, the U.S. (and others) keeps arming them. Indeed, I heard that the U.S. military is considering bombing raids to destroy the war machines left behind as positions were so swiftly abandoned. Oops, too late! This is the handiest example of how failed U.S. military escapades extending over decades net nothing of value to anyone besides weapons and ordnance manufacturers and miserable careerists within various government branches and agencies. The costs (e.g., money, lives, honor, sanity) are incalculable and spread with each country where the American Empire engages. Indeed, the military-industrial complex chooses intervention and war over peace at nearly every opportunity (though careful not to poke them bears too hard). And although the American public’s inability to affect policy (unlike the Vietnam War era) doesn’t equate with participation, the notion that it’s a government of the people deposits some of the blame on our heads anyway. My frustration is that nothing is learned and the same mistakes (war crimes, really) keep being committed by maniacs who ought to know better.

Crony and Vulture Capitalism

Critics of capitalism are being proven correct far more often than are apologists and earnest capitalists. The two subcategories I most deplore are crony capitalism and vulture capitalism, both of which typically accrue to the benefit of those in no real need of financial assistance. Crony capitalism is deeply embedded within our political system and tilts the economic playing field heavily in favor of those willing to both pay for and grant favors rather than let markets sort themselves out. Vulture capitalism extracts value out of vulnerable resource pools (soon-to-be dead hosts) by attacking and often killing them off (e.g., Microsoft, Walmart, Amazon), or more charitably, absorbing them to create monopolies, often by hostile takeover at steep discounts. Distressed mortgage holders forced into short sales, default, and eviction are the contemporary example. The rationalization of predatory behavior as mere competition is deployed regularly.

Other historical economic systems had similarly skewed hierarchies, but none have reached quite the same heartless, absurd levels of inequality as late-stage capitalism. Pointing to competing systems and the rising tide that lifts all boats misdirects people to make ahistorical comparisons. Human psychology normally restricts one’s points of comparison to contemporaries in the same country/region. Under such narrow comparison, the rank injustice of hundred-billionaires (or even simply billionaires) existing at the same time as giant populations of political/economic/climate refugees and the unhoused (the new, glossy euphemism for the homeless) demonstrates the soul-forfeiting callousness of the top quintile and/or 1% — an ancient lesson never learned. Indeed, aspirational nonsense repackages suffering and sells it back to the underclass, which as a matter of definition will always exist but need not live as though on an entirely different planet from Richistan.

Human Development

Though I’ve never been a big fan of behaviorism, the idea that a hypercomplex stew of influences, inputs, and stimuli leads to better or worse individual human development, especially in critical childhood years but also throughout life, is pretty undeniable. As individuals aggregate into societies, the health and wellbeing of a given society is linked to the health and wellbeing of those very individuals who are understood metaphorically as the masses. Behaviorism would aim to optimize conditions (as if such a thing were possible), but because American institutions and social systems have been so completely subordinated to capitalism and its distortions, society has stumbled and fumbled from one brand of dysfunction to another, barely staying ahead of revolution or civil war (except that one time …). Indeed, as the decades have worn on from, say, the 1950s (a nearly idyllic postwar reset that looms large in the memories of today’s patrician octogenarians), it’s difficult to imagine how conditions could have deteriorated any further short of a third world war.

Look no further than the U.S. educational system, both K–12 and higher ed. As with other institutions, education has had its peaks and valleys. However, the crazy, snowballing race to the bottom witnessed in the last few decades is utterly astounding. Stick a pin in it: it’s done. Obviously, some individuals manage to get educated (some doing quite well, even) despite the minefield that must be navigated, but the exception does not prove the rule. Countries that value quality education (e.g., Finland, China, Singapore, Japan, South Korea) in deed, not just in empty words trotted out predictably by every presidential campaign, routinely trounce decidedly middling results in the U.S. and reveal that dysfunctional U.S. political systems and agencies (Federal, state, municipal) just can’t get the job done properly anymore. (Exceptions are always tony suburbs populated by high-earning and -achieving parents who create opportunities and unimpeded pathways for their kids.) Indeed, the giant babysitting project that morphs into underclass school-to-prison and school-to-military service (cannon fodder) pipelines is what education has actually become for many. The opportunity cost of failing to invest in education (or by proxy, American youth) is already having follow-on effects. The low-information voter is not a fiction, and it extends to every American institution that requires clarity to see through the fog machine operated by the mainstream media.

As an armchair social critic, I often struggle to reconcile how history unfolds without a plan, and similarly, how society self-organizes without a plan. Social engineering gets a bad rap for reasons: it doesn’t work (small exceptions exist) and subverts the rights and freedoms of individuals. However, the rank failure to achieve progress (in human terms, not technological terms) does not suggest stasis. By many measures, the conditions in which we live are cratering. For instance, Dr. Gabor Maté discusses the relationship of stress to addiction in a startling interview at Democracy Now! Just how bad is it for most people?

… it never used to be that children grew up in a stressed nuclear family. That wasn’t the normal basis for child development. The normal basis for child development has always been the clan, the tribe, the community, the neighborhood, the extended family. Essentially, post-industrial capitalism has completely destroyed those conditions. People no longer live in communities which are still connected to one another. People don’t work where they live. They don’t shop where they live. The kids don’t go to school, necessarily, where they live. The parents are away most of the day. For the first time in history, children are not spending most of their time around the nurturing adults in their lives. And they’re spending their lives away from the nurturing adults, which is what they need for healthy brain development.

Does that not sound like self-hobbling? A similar argument can be made about human estrangement from the natural world, considering how rural-to-urban migration (largely completed in the U.S. but accelerating in the developing world) has rendered many Americans flatly unable to cope with, say, bugs and dirt and labor (or indeed most any discomfort). Instead, we’ve trapped ourselves within a society that is, as a result of its organizing principles, slowly grinding down everyone and everything. How can any of us (at least those of us without independent wealth) choose not to participate in this wretched concatenation? Nope, we’re all guilty.

Continuing from part 1.

So here’s the dilemma: knowing a little bit about media theory and how the medium shapes the message, I’m spectacularly unconvinced that the cheerleaders are correct and that an entirely new mediascape (a word I thought maybe I had just made up, but alas, no) offers to correct the flaws of the older, inherited mediascape. It’s clearly not journalists leading the charge. Rather, comedians, gadflies, and a few academics (behaving as public intellectuals) command disproportionate attention among the digital chattering classes as regular folks seek entertainment and stimulation superior to the modal TikTok video. No doubt a significant number of news junkies still dote on their favorite journalists, but almost no journalist has escaped self-imposed limitations of the chosen media to offer serious reporting. Rather, they offer “commentary” and half-assed observations on human nature (much like comedians who believe themselves especially insightful — armchair social critics like me probably fit that bill, too). If the sheer count of aggregate followers and subscribers across social media platforms is any indication (it isn’t …), athletes, musicians (mostly teenyboppers and former pop tarts, as I call them), and the irritatingly ubiquitous Kardashian/Jenner clan are the most influential, especially among Millennials and Gen Z, whose tastes skew toward the frivolous. Good luck getting insightful analysis out of those folks. Maybe in time they’ll mature into thoughtful, engaged citizens. After all, Kim Kardashian apparently completed a law degree (but has yet to pass the bar). Don’t quite know what to think of her three failed marriages (so far). Actually, I try not to.

I’ve heard arguments that the public is voting with its attention and financial support for new media and increasingly disregarding the so-called prestige media (no such thing anymore, though legacy media is still acceptable). That may well be, but it seems vaguely ungrateful for established journalists and comedians, having enjoyed the opportunity to apprentice under seasoned professionals, to take acquired skills to emerging platforms. Good information gathering and shaping — even for jokes — doesn’t happen in a vacuum, and responsible journalism in particular can’t simply be repackaging information gathered by others (e.g., Reuters, the Associated Press, and Al Jazeera) with the aforementioned “commentary.” A frequent reason cited for jumping ship is the desire to escape editorial control and institutional attempts to distort the news itself according to some corporate agenda or ideology. Just maybe new platforms have made that possible in a serious way. However, the related desire to take a larger portion of the financial reward for one’s own work (typically as celebrities seeking to extend their 15 minutes of fame — ugh) is a surefire way to introduce subtle, new biases and distortions. The plethora of metrics available online, for instance, allows content creators to see what “hits” or goes viral, inviting service to public interest that is decidedly less than wholesome (like so much rubbernecking).

It’s also curious that, despite all the talk about engaging with one’s audience, new media is mired in broadcast mode, meaning that most content is presented to be read or heard or viewed with minimal or no audience participation. It’s all telling, and because comments sections quickly run off the rails, successful media personalities ignore them wholesale. One weird feature some have adopted during livestreams is to display viewer donations accompanied by brief comments and questions, the donation being a means of separating and promoting one’s question to the top of an otherwise undifferentiated heap. To my knowledge, none has yet tried the established talk radio gambit of taking live telephone calls, giving the public a chance to make a few (unpurchased) remarks before the host resumes control. Though I’ve never been invited (an invitation is required) and would likely decline to participate, the Clubhouse smartphone app appears to offer regular folks a venue to discuss and debate topics of the day. However, reports on the platform dynamics suggest that the number of eager participants quickly rises to an impossible number for realistic group discussion (the classroom, or better yet, the graduate seminar, establishes better limits). A workable moderation mechanism has yet to emerge. Instead, participants must “raise their hand” to be called upon to speak (i.e., be unmuted) and can be kicked out of the “room” arbitrarily if the moderator(s) so decide. This is decidedly not how conversation flows face-to-face.

What strikes me is that while different broadcast modes target and/or capture different demographics, they all still organize content around the same principle: purporting to have obtained information and expertise to be shared with or taught to audiences. Whether subject matter is news, science, psychology, comedy, politics, etc., they have something ostensibly worth telling you (and me), hopefully while enhancing fame, fortune, and influence. So it frankly doesn’t matter that much whether the package is a 3-minute news segment, a brief celebrity interview on a late night talk show, an article published in print or online, a blog post, a YouTube video of varying duration, a private subscription to a Discord server, a subreddit, or an Instagram or Twitter feed; they are all lures for one’s attention. Long-form conversations hosted by Jordan Peterson, Joe Rogan, and Lex Fridman break out of the self-imposed time limitations of the typical news segment and flow more naturally, but they also meander and get seriously overlong for anyone but long-haul truckers. (How many times have I tuned out partway into Paul VanderKlay’s podcast commentary or given up on Matt Taibbi’s Substack (tl;dr)? Yeah, lost count.) Yet these folks enthusiastically embrace the shifting mediascape. The digital communications era is already mature enough that several generations of platforms have come and gone as well-developed media are eventually coopted or turned commercial and innovators drive out weaker competitors. Remember MySpace, Google Plus, or America Online? The list of defunct social media is actually quite long. Because public attention is a perpetually moving target, I’m confident that those now enjoying their moment in the sun will face new challenges until it all eventually goes away amidst societal collapse. What then?

Coming back to this topic after some time (pt. 1 here). My intention was to expand upon demands for compliance, and unsurprisingly, relevant tidbits continuously pop up in the news. The dystopia American society is building for itself doesn’t disappoint — not that anyone is hoping for such a development (one would guess). It’s merely that certain influential elements of society reliably move toward consolidation of power and credulous citizens predictably forfeit their freedom and autonomy with little or no hesitation. The two main examples to discuss are Black Lives Matter (BLM) and the response to the global pandemic, which have occurred simultaneously but are not particularly related.

The BLM movement began in summer 2013 but boiled over in summer 2020 on the heels of the George Floyd killing, with protests spilling over into straightforward looting, mayhem, and lawlessness. That fit of high emotional pique found many protesters accosting random strangers in public and demanding a raised fist in support of the movement, which was always ideologically disorganized but became irrational and power-hungry as Wokedom discovered its ability to submit others to its will. In response, many businesses erected what I’ve heard called don’t-hurt-me walls in apparent support of BLM and celebration of black culture so that windows would not be smashed and stores ransacked. Roving protests in numerous cities demanded shows of support, though with what exactly was never clear, from anyone encountered. Ultimately, protests morphed into a sort of protection racket, and agitators learned to enjoy making others acquiesce to arbitrary demands. Many schools and corporations now conduct mandatory training to, among other things, identify unconscious bias, which has the distinct aroma of original sin that can never be assuaged or forgiven. It’s entirely understandable that many individuals, under considerable pressure to conform as moral panic seized the country, play along to keep the peace or keep their jobs. Backlash is building, of course.

The much larger example affecting everyone, nationwide and globally, is the response to the pandemic. Although quarantines have been used in the past to limit regional outbreaks of infectious disease, the global lockdown of business and travel was something entirely new. Despite a lack of evidence of efficacy, the precautionary principle prevailed and nearly everyone was forced into home sequestration and later, after an embarrassingly stupid scandal (in the U.S.), made to don masks when venturing out in public. As waves of viral infection and death rolled across the globe, political leaders learned to enjoy making citizens acquiesce to capricious and often contradictory demands. Like BLM, a loose consensus emerged about the “correct” way to handle the needs of the moment, but the science and demographics of the virus produced widely varying interpretations of such correctness. A truly coordinated national response in the U.S. never coalesced, and hindsight has judged the whole morass a fundamentally botched job of maintaining public health in most countries.

But political leaders weren’t done demanding compliance. An entirely novel vaccine protocol was rushed into production after emergency use authorization was obtained and indemnification (against what?) was granted to the pharma companies that developed competing vaccines. Whether this historical moment will turn out to be something akin to the thalidomide scandal remains to be seen, but at the very least, the citizenry is being driven heavily toward participation in a global medical experiment. Some states even offer million-dollar lotteries to incentivize individuals to comply and take the jab. Open discussion of risks associated with the new vaccines has been largely off limits, and a two-tier society is already emerging: the vaccinated and the unclean (which is ironic, since many of the unclean have never been sick).

Worse yet (and like the don’t-hurt-me walls), many organizations are adopting as-yet-unproven protocols and requiring vaccination for participants in their activities (e.g., schools, sports, concerts) or simply to keep one’s job. The mask mandate was a tolerable discomfort (though not without many principled refusals), but forcing others to be experimental test subjects is well beyond the pale. Considering how the narrative continues to evolve and transform, thoughtful individuals trying to evaluate competing truth claims for themselves are unable to get clear, authoritative answers. Indeed, it’s hard to imagine a situation where authorities in politics, medicine, science, and journalism could have worked so assiduously to undermine their own credibility. Predictably, heads (or boards of directors) of many organizations are learning to enjoy the newly discovered power to transform their organizations into petty fiefdoms and demand compliance from individuals — usually under the claim of public safety (“for the children” being unavailable this time). Considering how little efficacy has yet been truly demonstrated with any of the various regimes erected to contain or stall the pandemic, the notion that precautions undertaken have been worth giving injudicious authority to people up and down various power hierarchies to compel individuals remains just that: a notion.

Tyrants and bullies never seem to tire of watching others do the submission dance. In the next round, be ready to hop on one leg and/or bark like a dog when someone flexes on you. Land of the free and home of the brave no longer.

Addendum

The CDC just announced an emergency meeting to be held (virtually) June 18 to investigate reports (800+ via the Vaccine Adverse Event Reporting System (VAERS), which almost no one had heard of only a month ago) of heart inflammation in adolescents following vaccination against the covid virus. Significant underreporting is anticipated following the circular logic that since authorities declared the vaccines safe prematurely (without standard scientific evidence to support such a statement), the effects cannot be due to the vaccine. What will be the effect of over 140 million people having been assured that vaccination is entirely safe, having taken the jab, and then having discovered “wait! maybe not so much …”? Will the complete erosion of trust in what we’re told by officialdom and its mouthpieces in journalism spark widespread, organized, grassroots defiance once the bedrock truth is laid bare? Should it?

On the heels of a series of snowstorms, ice storms, and deep freezes (mid-Feb. 2021) that have inundated North America and knocked out power to millions of households and businesses, I couldn’t help but notice inane remarks and single-pane comics to the effect “wish we had some global warming now!” Definitely, things are looking distinctly apocalyptic as folks struggle with deprivation, hardship, and existential threats. However, the common mistake here is to substitute one thing for another, failing to distinguish weather from climate.

National attention is focused on Texas, expected to be declared a disaster zone by Pres. Biden once he visits (a flyover, one suspects) to survey and assess the damage. It’s impossible to say that current events are without precedent. Texas has been in the cross-hairs for decades, suffering repeated droughts, floods, fires, and hurricanes that used to be prefixed by 50-year or 100-year. One or another is now occurring practically every year, which is exactly what climate chaos delivers. And in case the deep freeze and busted water pipes all over Texas appear to have been unpredictable, this very thing happened in Arizona in 2011. Might have been a shot across the bow for Texas to learn from and prepare, but its self-reliant, gun-totin’, freedom-lovin’ (fuck, yeah!), secessionist character is instead demonstrated by having its own electrical grid covering most of the state, separated from other North American power grids, ostensibly to skirt federal regulations. Whether that makes Texas’ grid more or less vulnerable to catastrophic failure is an open question, but events of the past week tested it sorely. It failed badly. People literally froze to death as a result. Some reports indicate Texas was mere moments away from an even greater failure that would have meant months to rebuild and reestablish electrical service. A substantial diaspora would have ensued, essentially meaning more climate refugees.

So where’s the evil in this? Well, let me tell you. Knowledge that we humans are on track to extirpate ourselves via ongoing industrial activity has been reported and ignored for generations. Guy McPherson’s essay “Extinction Foretold, Extinction Ignored” has this to say at the outset:

The warnings I will mention in this short essay were hardly the first ones about climate catastrophe likely to result from burning fossil fuels. A little time with your favorite online search engine will take you to George Perkins Marsh sounding the alarm in 1847, Svante Arrhenius’s relevant journal article in 1896, Richard Nixon’s knowledge in 1969, and young versions of Al Gore, Carl Sagan, and James Hansen testifying before the United States Congress in the 1980s. There is more, of course, all ignored for a few dollars in a few pockets. [links in original]

My personal acquaintance with this large body of knowledge began accumulating in 2007 or so. Others with decision-making capacity have known for much, much longer. Yet short-term motivations shoved aside the responsible planning and preparation that are precisely the warrant of governments at all levels, especially, say, the U.S. Department of Energy. Sure, climate change is reported as controversy, or worse, as conspiracy, but in my experience, only a few individuals are willing to speak the obvious truth. They are often branded kooks. Institutions dither, distract, and even issue gag orders to, oh, I dunno, prop up real estate values in south Florida soon to be underwater. I’ve suggested repeatedly that U.S. leaders and institutions should be acting to manage contraction and alleviate suffering as best as possible, knowing that civilization will fail anyway. To pretend otherwise and guarantee — no — drive us toward worst-case scenarios is just plain evil. Of course, the megalomania of a few tech billionaires who mistakenly believe they can engineer around society’s biggest problems is just as bad.

Writ small (there’s a phrase no one uses denoting narrowing scope), meaning at a scale less than anthropogenic climate change (a/k/a unwitting geoengineering), American society has struggled to prioritize guns vs. butter for over a century. The profiteering military-industrial complex has clearly won that debate, leaving infrastructure projects, such as bridge and road systems and public utilities, woefully underfunded and extremely vulnerable to market forces. Refusal to recognize public health as a right or public good demanding a national health system (like other developed countries have) qualifies as well. As inflated Pentagon budgets reveal, the U.S. never lacks money to oppress, fight, and kill those outside the U.S. Inside the U.S., however, cities and states fall into ruin, and American society is allowed to slowly unwind for lack of support. Should we withdraw militarily from the world stage and focus on domestic needs, such as homelessness and joblessness? Undoubtedly. Would that leave us open to attack or invasion (other than the demographic invasion of immigrants seeking refuge in the U.S.)? Highly doubtful. Other countries have their own domestic issues to manage and would probably appreciate a cessation of interference and intervention from the U.S. One might accuse me of substituting one thing for another, as I accused others at top, but the guns-vs.-butter debate is well established. Should be obvious that it’s preferable to prioritize caring for our own society rather than devoting so much of our limited time and resources to destroying others.

So far, this multipart blog post has trafficked in principles and generalities. Let me try now to be more specific, starting with an excerpt from Barry Lynn’s article in Harper’s Magazine titled “The Big Tech Extortion Racket” (Sept. 2020):

… around the middle of the nineteenth century, Americans began to develop technologies that could not be broken into component pieces. This was especially true of the railroad and the telegraph … Such corporations [railroad and telegraph companies] posed one overarching challenge: they charged some people more than others to get to market. They exploited their control over an essential service in order to extort money, and sometimes political favors … Americans found the answer to this problem in common law. For centuries, the owners of ferries, stagecoaches, and inns had been required to serve all customers for the same price and in the order in which they arrived. In the late nineteenth century, versions of such “common carrier” rules were applied to the new middleman corporations.

Today we rightly celebrate the Sherman Antitrust Act of 1890, which gave Americans the power to break apart private corporations. But in many respects, the Interstate Commerce Act of 1887 was the more important document. This act was based on the understanding that monopoly networks like the railroad and the telegraph could be used to influence the actions of people who depend on them, and hence their power must be carefully restricted …

For a century and a half, Americans used common carrier policies to ensure the rule of law in activities that depended on privately held monopolies … regulations freed Americans to take full advantage of every important network technology introduced during these years, including telephones, water and electrical services, energy pipelines, and even large, logistics-powered retailers. Citizens did not have to worry that the men who controlled the technologies involved would exploit their middleman position to steal other people’s business or disrupt balances of power.

I appreciate that Barry Lynn brings up the Interstate Commerce Act. If this legal doctrine appeared in the net neutrality debate a few years ago, it must have escaped my notice. While Internet Service Providers (ISPs) enable network access and connectivity, those utilities have not yet exhibited let’s-be-evil characteristics. Similarly, phone companies (including cell phones) and public libraries may well be eavesdropping and/or monitoring activities of the citizenry, but the real action lies elsewhere, namely, on social media networks and with online retailers. Evil is arguably concentrated in the FANG (or FAANG) corporations but has now grown to be ubiquitous in all social networks (e.g., Twitter) operating as common carriers (Zoom? Slack?) and across academe, nearly all of which have succumbed to moral panic. Sad to observe, they are correctly interpreting demands to censor and sanitize others’ no-longer-free speech appearing on their networks or within their realms. How much deeper it goes toward shaping politics and social engineering is quasi-conspiratorial and impossible for me to assess.

Much as I would prefer to believe that individuals possess the good sense to shift their activities away from social networks or turn their attention from discomfiting information sources, that does not appear to be the case. Demands for trigger warnings and safe spaces commonplace a few years ago on college campuses have instead morphed into censorious removal, deplatforming, and cancellation from the entire public sphere. Those are wrong responses in free societies, but modern institutions and technologies have gotten out of hand and outstripped the limits of normal human cognition. In short, we’re a society gone mad. So rather than accept responsibility to sort out information overflow oneself, many are demanding that others do it for them, and evil private corporations are complying (after a fashion). Moreover, calls for creation of an Orwellian Ministry of Truth, rebranded as a Truth Commission and Reality Czar, could hardly be any more chillingly and fascistically bizarre. People really need someone to decide for them what is real? Has anyone at the New York Times actually read Orwell’s dystopian novel 1984 and taken to heart its lessons?

Considering the acceleration of practically everything in the late-modern world (postmodern refers to something quite different), planning one’s higher education has become somewhat fraught: the subject matter studied may be rendered flatly out-of-date or moribund by the time of either graduation or entry into the workforce. Accordingly, I’ve heard it recommended that expertise in any particular subject area may be less important than developing expertise in at least one subject that takes a systems approach. That system might be language and communications, mathematics (or any other hard science), history, economics and finance, business administration, computer coding, law and governance, etc. So long as a rigorous understanding of procedures and rules is developed, a structuralist mindset can be repeated and transferred into other subject areas. Be careful, however, not to conflate this approach with a liberal arts education, which is sometimes described as learning how to learn and is widely applicable across disciplines. The liberal arts have fallen distinctly out of favor in the highly technological and technocratic world, which cares little for human values resistant to quantification. Problem is, Western societies in particular are based on liberal democratic institutions now straining due to their sclerotic old age. And because a liberal arts education is scarcely undertaken anymore, civics and citizenship are no longer taught. Even the study of English has now been corrupted (postmodern does apply here) to the point that the basic liberal arts skill of critical thinking is being lost through attrition. Nowhere is that more abundantly clear than in bristling debate over free speech and censorship.

Aside. Although society tinkers and refines itself (sometimes declines) over time, a great body of cultural inheritance informs how things are done properly within an ideology or system. When tinkering and refinement become outright intransigence and defiance of an established order, it’s commonplace to hear the objection “but that’s not how _______ works.” For instance, debate over climate science or the utility of vaccines often has one party proclaiming “trust [or believe] the science.” However, that’s not how science works (i.e., through unquestioning trust or belief). The scientific method properly understood includes verification, falsification, and revision when results and assertions fail to establish reasonable certainty (not the same as consensus). Similarly, critical thinking includes a robust falsification check before “facts” can be accepted at face value. So-called “critical studies” (a/k/a grievance studies), like religious faith, typically positions bald assertions beyond the reach of falsification. Well, sorry, that’s not how critical thinking works.

Being older and educated before critical studies were fully legitimized (or gave rise to things as risible as feminist glaciology), my understanding has always been that free speech and other rights are absolutes that cannot be sliced and diced into bits. That way lies casuistry, where law founders frequently. Thus, if one wishes, say, to trample or burn the U.S. flag in protest, no law can be passed or constitutional amendment enacted to carve out an exception disallowing that instance of dissenting free speech. A lesser example is kneeling silently rather than participating in singing the national anthem before a sporting event. Though offensive to certain individuals’ sensibilities, silencing speech is far worse according to liberal democratic values. Whatever our ideological or political differences are, we cannot work them out when one party has the power to place topics out of bounds or remove others from discussion entirely. The point at which spirited debate crosses over into inciting violence or fomenting insurrection is a large gray area, which is the subject of the second impeachment of 45. Civil law covers such contingencies, so abridging free speech, deplatforming, and adopting the formulation “language is violence” are highly improper responses under the liberal form of government codified in the U.S. Constitution, which includes the Bill of Rights, originally omitted but quickly added to articulate those rights fully.

Liberal democratic ideology arose in mercantile, substantially agrarian Western societies before scientific, industrial, and capitalist revolutions built a full head of steam, so to speak. Considering just how much America has developed since the Colonial Period, it’s no surprise society has outgrown its own founding documents. More pointedly, the intellectual commons was a much smaller environment, often restricted to a soapbox in the town square and the availability of books, periodicals, and broadsides. Today, the public square has moved online to a bewildering array of social media platforms that enable publication of one’s ideas well beyond the sound of one’s voice over a crowd or the bottleneck of a publisher’s printing press. It’s an entirely new development, and civil law has not kept pace. Whether Internet communications are regulated like the airwaves or nationalized like the U.S. military, it’s clear that the Wild West uber-democratic approach (where anyone can basically say anything) has failed. Demands for regulation (restrictions on free speech) are being taken seriously and acted upon by the private corporations that run social media platforms. During this interim phase, it’s easy for me, as a subscriber to liberal democratic values, to insist reflexively on free speech absolutism. The apparent mood of the public lies elsewhere.

Evil exists in the world. History and current events both bear this out amply. Pseudo-philosophers might argue that, like emotions and other immaterial sensations, good and evil are merely reified concepts, meaning they are human constructs with no palpable external reality. Go tell that to victims of evildoers. Human suffering can’t be anonymized, rationalized, or philosophized away quite so handily.

It was sort of refreshing, back in the day, when Google’s motto and/or corporate code of conduct was simple: “Don’t Be Evil.” It acknowledged the potential for being or becoming evil (like any of the Bigs: Big Tobacco, Big Soda, Big Oil, Big Pharma, Big Media, Big Agriculture, etc.) and presumably aspired to resist obvious temptations. That was then (from 2000 to 2018), this is now (2021 until death take us — soon enough, I fear). But like all entities possessed of absurd levels of wealth and power, Google (now reorganized as a subsidiary of Alphabet, but who actually refers to it that way?) and its Silicon Valley brethren have succumbed to temptation and become straight-up evil.

One might charitably assess this development as something unbidden, unanticipated, and unexpected, but that’s no excuse, really. I certainly don’t envy celebrity executives experiencing difficulty resulting from having created unmanageable behemoths loosed on both public and polity unable to recognize beastly fangs until already clamped on their necks. As often occurs, dystopian extrapolations are explored in fiction, sometimes satirically. The dénouement of the HBO show Silicon Valley depicts tech mogul wannabes succeeding in creating an AI (or merely a sophisticated algorithm? doesn’t matter …) that would in time become far too powerful in blind execution of its inner imperative. In the show, characters recognize what they had done and kill their own project rather than allow it to destroy the world. In reality, multiple developers of computer tech platforms (and their embedded dynamic, including the wildly unhelpful albeit accurate term algorithm) lacked the foresight to anticipate awful downstream effects of their brainchildren. Yet now that those effects are manifesting recognizably, these corporations continue to operate and wreak havoc.

Silicon Valley shows an extended software development period of bungling ineptitude punctuated by brilliant though momentary breakthroughs. Characters are smart, flawed people laughably unable to get out of the way of their own success. The pièce de résistance was yoking one so-called “learning machine” to another and initiating what would become a runaway doomsday process (either like ecological collapse, building slowly then making the biosphere uninhabitable all at once, or like the gray goo problem, progressively “processing” biomass at the molecular level until all that remains is lifeless goo). It was a final act of bumbling that demanded the characters’ principled, ethical response before the window of opportunity closed. Real Silicon Valley tech platforms are in the (ongoing) process of rending the social fabric, which is no laughing matter. The issue du jour surrounds free speech and its inverse, censorship. More broadly, real Silicon Valley succeeded in gaming human psychology for profit in at least two aspects (could be more as yet unrecognized): (1) mining behavioral data as an exploitable resource, and (2) delivering inexhaustible streams of extremely divisive content (not its own) to drive persistent engagement with its platforms. Yoked together, they operate to drive society mad, and yet, mounting evidence of this development has not produced even an inkling that maybe the damned doomsday devices ought to be shut off. As with the environment, we operate with freedom enough to destroy ourselves. Instead, politicians issue stunningly ineffectual calls for regulation or break-up of monopolies. In the meantime, ever more absurd wealth and power are concentrated in the hands of a few executives who have clearly punted and decided “let’s be evil.” No restraints on their behavioral experimentation across whole societies exist.

Much more to say on this topic in additional parts to come.