Posts Tagged ‘Cancel Culture’

Americans have always been accused of being anti-intellectual. Nothing new there. However, I'm noticing an upswing in remarks that question the value of knowledge, understanding, and education. Admittedly, those four terms (intellect/intelligence, knowledge, understanding, education) are distinct but overlapping in their meanings, and what it means to value something properly also has multiple interpretations. I vaguely remember somewhere unlocatable in my backblog having proposed some definitions for those terms, but I hardly have the last word on epistemology. Suffice it to say for the purpose of this blog entry that intelligence is raw processing power (typically in the mind, often linguistic, mathematical, and/or abstract); knowledge is possession of information without resorting to, say, a Google search to look something up; understanding is how knowledge is organized and contextualized; and education is the process, whether formal schooling or informal throughout life (e.g., experience), of acquiring knowledge and understanding. Any one of these terms can certainly be picked apart, but I'll settle on these provisional statements so that I can get to the main point of this post.

Without providing links, the number of times just in the past week that something has come to my attention questioning the value of knowledge has surprised me. Motivations differ, for example, as to whether one would want, in some ideal circumstance, to know the approach of death (i.e., to see the metaphorical bullet coming). That foreknowledge is unavailable to most of us absent the diagnosis of some terminal illness and a ballpark of days/weeks/months left to live. Foreknowledge of death is of course part of the human condition, but most are arguably content putting the prospect out of mind until infirmity lands the inevitability squarely within a fairly immediate timeframe. Given the doomer nature of this blog, a similar question might be asked about foreknowledge of the collapse of industrial civilization. Is it important to know and understand the circumstances currently leading to collapse (only the most recent of many), or should one bury one's head and live in ignorant bliss (were that possible)? To answer my own questions, I'm content knowing nothing certain about the how/why/when of my death but can't pretend not to be simultaneously fascinated and horrified by the death spiral in which civilization finds itself. Frankly, I'm surprised by the incompetence and apparent unconcern of those who might be in a position not to keep it from happening (that ship sailed long ago) but at least to prepare and to diminish the anticipated suffering. Instead, civilization is careening heedlessly and headlong into catastrophe. At least, that's my assessment.

For stakes lower than life and death, the question has arisen anew whether knowledge and understanding are self-justifying, valuable in and of themselves beyond any sort of monetizable consideration. Individual responses differ widely, of course, but if one surveys the wider American scene, those clamoring to develop themselves with a rigorous breadth of understanding (as distinguished from a money-grubbing professional specialization) are remarkably few or are at least laboring in anonymity. I'd venture that popular entertainments (TV, cinema, team sports, video games, jousting on X, etc.) are far more widely sought as objects of attention, enjoyment, and devotion. But wait, it gets worse. Through a variety of influencers, mostly in journalistic media and government (who are all mysteriously supplied the same scripts and talking points), individuals are directed not to think for themselves, not to do their own investigations and research, but instead to simply swallow the predigested narratives shoveled at them as so much slop for the hogs. And for those with the temerity to defy approved narratives out in public and beyond the confines of the dinner table, well, those folks come under heinous attack, like knowledge itself. Shouting matches in media are not uncommon, though the emotional fury with which an opinion is prosecuted does nothing to strengthen one's arguments or to convince.

Among the intractable debates in the marketplace of ideas, the one now disrupting society most vehemently is the Israeli-Palestinian conflict. It has a long, tortured history that has become inflamed in the past few months, and it has the potential to engulf the Middle East and indeed the world in a new world war. Most take one or the other side of the debate, either excusing bloodshed as necessary or at least empathizing with reasons for it to occur. The conflict is also the occasion of no small amount of propaganda, which muddies the waters considerably. Since escalations have been raging, quite a few experts on one side or the other have told and retold the history, advocating for one action or another. Sympathy with the underdog in this wholly mismatched contest makes far more emotional sense to me considering that many reports of events (told by the victors, as history always is) have been shown to be outright fabrications. The hapless public, if the party line is not adopted uncritically, is thus forced to decide whose narratives are more or less true and whose are garbage.

As a pacifist (why are there so few of us?), I withdraw my support from anyone who advocates or excuses further bloodshed and destruction, though in truth I'm only a passive observer. It aggrieves me that the U.S. government is up to its neck in this conflict. But I nonetheless form opinions as best I can, sorting through conflicting characterizations of events in search of reliable reports. Calls to mind the remark "the first casualty of war is truth," another instance of knowledge under attack during the epistemological crisis I've been writing about for years (see, for example, here).

It happened again, though it’s been a long time since the previous occurrence: I stumbled headlong into someone’s hot-button topic, pressed the button unwittingly, and was promptly put on blast. I withhold the subject matter because it’s the behavior that interests me. Who was right or wrong (if such assessments can be made) matters little in such exchanges. Once someone is triggered, all reasonable discourse and manners depart the scene just as I did.

The familiar injunction never to discuss religion or politics at the dinner table (or at a bar, one presumes) probably applies. Problem is, those topics are central to human endeavor and can't be avoided easily or completely. Indeed, how is one to learn, grow, and flourish if important topics are declared (permanently?) off limits or out of bounds? Relatedly, after ignoring the persistent prompt in my YouTube feed, I finally listened to novelist Andrew Klavan's speech "How to Live a Crap Life: The Seven Deadly Sins of Leftism" (sorry, no link). He presents himself as a wise old man dispensing sage advice to college students, basically things that become clear after many decades of life but are not so obvious to young adults. At one point, he mentions the joys of lively intellectual life and debate. (The flip side — perils and frustrations — is not mentioned.) Accordingly, those fully engaged tend to wade in despite dangers of (oh my!) disagreements, hurt feelings, and wounded pride. Of course, reveling in working out ideas in the company of others doesn't appeal to everyone. Many are focused on other things, openly apathetic, or avoid intellectual life altogether so long as issues don't impact them personally and directly. Seventeen years of blogging demonstrates that apathy and disengagement are not my style.

In the aftermath of being blasted yet again, I couldn't help but recognize some features in common with nasty exchanges I've seen online, typically in news segments, talk shows, and congressional hearings. The most immediate feature is that the sheer emotional fury of being yelled at and/or verbally threatened does absolutely nothing to convince or compel. Rather, one's guard goes up quickly when someone loses their shit and becomes unhinged. Moreover, protesters who self-immolate in public to demonstrate the depth of their conviction are dismissed as nut jobs. So, too, with people who lose emotional control.

A second feature is to project positions onto an interlocutor after a snap judgment following mere mention of a contentious subject. No waiting to interrogate or discover actual beliefs: if one cites Nazis, then one IS a Nazi; if one says any number of disallowed words, then one IS that word; if one plays devil's advocate or attempts to steelman a position, then one obviously believes those ideas. In addition, even if one is only trying to understand the parameters of an issue, questions and provisional positions are purposely misunderstood as intolerance, advocacy, and/or activism calling forth winners and losers, oppressors and victims.

A third is to position oneself within the bubble or adjacent to it so that first- or second-hand experience confers authority. Citing oneself, close friends, or family members who have dealt with some personal issue undoubtedly affords insight, but proximity is not generally sufficient authority to generalize one's experience to everyone else. Subtle and gross variations in experience make it impossible to settle most issues definitively.

A fourth is to resort to name calling as though such labeling does anything to advance an argument. Once branded a __________, all one’s other positions can be inferred and/or one can be summarily dismissed. Recognizing categorical distinctions is basic to human cognition and unavoidable, but applying heuristics haphazardly in the heat of argument is plain foolish. Besides, after all is said and done, despicable others are still one’s family, neighbors, fellow Americans, and fellow humans. No one lives in a society of totally like-minded individuals, so getting along despite differences is essential.

Every time I hear someone called a name, even one that might be accepted proudly, I now substitute “baby eater.” (There might be a handful of baby eaters in existence.) Stark absence of evidence is normal; name calling is intended as a slur. Even if the charge sticks, nothing is accomplished. So it no longer puts me on my heels to be accused of being a __________. OK then, let’s say I am. Now what? How do we work through our issues? What does cancellation accomplish?

Possessing a sovereign mind, I am free to believe what makes sense to me, right, wrong, or somewhere in between. This sovereignty applies to everyone and is not without consequences at times. Indeed, I often check myself against the opinions of others to determine whether I'm siloed. However, I don't adopt opinions without some exploration of a given topic. Hot takes are never demanded of me. Others may not adhere to that threshold. In addition, the sources and authorities to which different people subscribe can lead to wildly different conclusions even in good faith. When asking others, I often find I know more about a subject than they do and am given replies based on cursory examination and conclusions drawn recklessly. Who, then, can be an effective check against my excesses? Certainly not someone who unleashes on me the full force of their emotional fury.

/rant on

New Year's Day (or just prior) is the annual cue for fools full of loose talk to provide, unasked, their year-in-review and "best of" articles summarizing the previous calendar year. I don't go in for such clichéd forms of curation but certainly recognize an appetite among Web denizens for predigested content that tells them where to park their attention and what or how to think rather than thinking for themselves. Considering how mis- and under-educated the public has grown to be since the steady slippage, if not outright destruction, of educational standards and curricula began in the 1970s (says me), I suppose that appetite might be better characterized as a need, in much the same way children need guidance and rules enforced by seasoned authorities, beginning with parents yet never truly ending, only shifting over to various institutions that inform and restrain society as a whole. I continue to be flabbergasted by the failure of parents (and teachers) to curb the awful effects of electronic media. I also find it impossible not to characterize social media and other hyperstimuli as gateways into the minds of impressionable youth (and permanent adult children), very much like certain drugs (e.g., nicotine, alcohol, and cannabis) are characterized as gateways to even worse drugs. No doubt everyone must work out a relationship with these unavoidable, ubiquitous influences, but that's not equivalent to throwing wide open the gate for everything imaginable to parade right in, as many do.

Hard to assess whether foundations beneath American institutions (to limit my focus) were allowed to decay through neglect and inattention or were actively undermined. Either way, their corruption and now inter-generational inability to function effectively put everyone in a wildly precarious position. The know-how, ambition, and moral focus needed to do anything other than game sclerotic systems for personal profit and acquisition of power are eroding so quickly that operations requiring widespread subscription by the public (such as English literacy) or taking more than the push of a button or click of a mouse to initiate preprogrammed commands are entering failure mode. As in the accidental horror film Idiocracy, the point will come when too few possess the knowledge and skills to get things done and the rest can only indulge in crass spectacle with their undeveloped minds. Because this is a date-related blog post, I point out that Idiocracy depicts the results of cultural decay 500 years hence. It won't take nearly that long. Just one miserable example is the fascist, censorious mood — a style of curation — that has swept through government agencies and Silicon Valley offices intent on installing unchallenged orthodoxies, and, for that matter, through news junkies and social media users content to accept coerced thinking. Religions of old played that gambit, but there's no need to wait for a new Inquisition to arise. Heretics are already persecuted via cancel culture, which includes excommunication (social expulsion), suspension and/or cancellation of media accounts, and confiscation of bank deposits.

A similar point can be made about the climate emergency. Fools point to weather rather than climate to dispel urgency. Reports extrapolating trends often focus on the year 2100, well after almost all of us now alive will have departed this Earth, as a bogus target date for eventualities like disappearance of sea and glacial ice, sea level rise, unrecoverable greenhouse gas concentrations in the atmosphere, pH imbalance in the oceans, and other runaway, self-reinforcing consequences of roughly 300 years of industrial activity that succeeded unwittingly in terraforming the planet, along the way making it fundamentally uninhabitable for most species. The masses labor in 2023 under the false impression that everyone is safely distanced from those outcomes or indeed any of the consequences of institutional failure that don’t take geological time to manifest fully. Such notions are like assurances offered to children who seek to understand their own mortality: no need to worry about that now, that’s a long, long way off. Besides, right now there are hangovers to nurse, gifts to return for cash, snow to shovel, and Super Bowl parties to plan. Those are right now or at least imminent. Sorry to say, so is the full-on collapse of institutions that sustain and protect everyone. The past three years have already demonstrated just how precarious modern living arrangements are, yet most mental models can’t or won’t contemplate the wholesale disappearance of this way of life, and if one has learned of others pointing to this understanding, well, no need to worry about that just yet, that’s a long, long way off. However, the slide down the opposite side of all those energy, population, and wealth curves won’t take nearly as long as it took to climb up them.

/rant off

In the sense that all news is local and all learning is individual, meaning that it's only when something is individualized and particularized that it takes on context and meaning, I may finally understand (some doubt still) Sheldon Wolin's term "inverted totalitarianism," part of the subtitle of his 2008 book Democracy Incorporated: Managed Democracy and the Specter of Inverted Totalitarianism. Regrettably, this book is among the (many) dozens that await my attention, so I can't yet claim to have done the work. (I did catch a long YouTube interview of Wolin conducted by Chris Hedges, but that's a poor substitute for reading the book.) My process is to percolate on a topic and its ancillary ideas over time until they come together satisfactorily, and my provisional understanding of the issues is closer to "proxy tyranny" than "inverted totalitarianism."

I daresay most of us conceptualize tyranny and totalitarianism in the bootheel versions that manifested in several 20th-century despotic regimes (and survive in several others in the 21st century, names and locations withheld), where population management is characterized by stomping people down, grinding them into dust, and treating them as an undifferentiated mass. Administrators (e.g., secret police) paid close attention to anyone who might pose a problem for the regimes, and neighbors and family members were incentivized to betray or inform on anyone who might be on officialdom's radar. The 21st-century manifestation is different in that computers do most information gathering — a dragnet thrown over everyone — and we inform on ourselves by oversharing online. Close attention is still paid, but human eyes may never see the extensive dossiers (forever records?) kept on each of us until something prompts attention. A further distinction is that in bootheel totalitarianism, intense scrutiny and punishment were ubiquitous, whereas at least in 21st-century America, a sizeable portion of the population can be handily ignored, abandoned, and/or forgotten. They're powerless, harmless, and inconsequential, not drawing attention. Additionally, there is no bottom to how low they can sink, as the burgeoning homeless population demonstrates.

If tyranny is normally understood as emanating from the top down, its inversion is bottom up. Wolin's inverted totalitarianism is not a grassroots phenomenon but rather corporate capture of government. While Wolin's formulation may be true (especially at the time his book was published), government has relinquished none of its power so much as realigned its objectives to fit corporate profit motives and, in doing so, shifted administrative burdens to proxies. Silicon Valley corporations (of the big data type especially) are the principal water carriers, practicing surveillance capitalism and, as private entities, exercising censorious cancellation of dissenting opinion that no formal government could countenance. Similarly, an entire generation of miseducated social justice warriors scours social media for evidence of nonconforming behavior, usually some offense against the meme of the moment a/k/a "I support the current thing" (though racism is the perennial accusation — an original sin that can never be forgiven or assuaged), waiting to pounce in indignation and destroy lives and livelihoods. Cancel culture is a true bottom-up phenomenon, with self-appointed emissaries doing the work that the government is only too happy to hand off to compliant, brainwashed ideologues.

In the Covid era, nonconforming individuals (e.g., those who refuse the jab(s) or call bullshit on continuously shifting narratives announced by various agencies that lack legal standing to compel anything) are disenfranchised in numerous ways even while the wider culture accepts that the pandemic is indeed endemic and simply gets on with life. Yet every brick-and-mortar establishment has been authorized, deputized, and indeed required to enforce unlawful policies of the moment as proxies for government application of force. Under threat of extended closure, every restaurant, retailer, arts organization, and sports venue demanded the literal or figurative equivalent of "papers please" to enter and assemble. As with the airlines, people are increasingly regarded as dehumanized cargo, treated roughly like the famous luggage ape (and not always without good reason). In most places, restrictions have been lifted; in others they persist. But make no mistake, this instantiation of proxy tyranny — compelling others to do the dirty work so that governments can (not so) plausibly deny direct responsibility — is the blueprint for future mistreatment. Personally, I'm rather ashamed that so few Americans stood up for what is right and true (according to me, obviously), echoing this famous admission of moral failure. For my own part, I've resisted (and paid the price for that resistance) in several instances.

At the outset, credit goes to Jonathan Haidt for providing the ideas that launched this blog post. He appears to be making the rounds again flogging his most recent publication (where? I dunno, maybe The Atlantic). In the YouTube interview I caught, Haidt admits openly that as a social and behavioral psychologist, he's prone to recommending incentives, programs, and regulations to combat destructive developments in contemporary life — especially those in the academy and on social media that have spread into politics and across the general public. Haidt wears impressive professional armor in support of his arguments and contentions; I lack such rigor rather conspicuously. Accordingly, I offer no recommendations but instead try to limit myself to describing dynamics as an armchair social critic. Caveat emptor.

Haidt favors viewpoint diversity (see, for example, Heterodox Academy, which he helped to found and now chairs). Simple enough, right? Not so fast there, Señor Gonzalez! Any notion that even passing acquaintance with a given subject requires knowing both pros and cons is anathema to many of today's thinkers, who would rather plug their ears and pretend opposition voices, principled or otherwise, are simply incoherent, need not be considered, and further, should be silenced and expunged. As a result, extremist branches of any faction tend to become ideological echo chambers. Cardinal weaknesses in such an approach are plain enough for critical thinkers to recognize, but if one happens to fall into one of those chambers, silos, or bubbles (or attend a school that trains students in rigid thinking), invitations to challenge cherished and closely held beliefs, upon which identity is built, mostly fall on deaf ears. The effect is bad enough in individuals, but when spread across organizations that adopt ill-advised solutionism, Haidt's assessment is that institutional stupidity sets in. The handy example is higher education (now an oxymoron). Many formerly respectable institutions have essentially abandoned reason (ya know, the way reasonable people think) and begun flagellating themselves in abject shame over, for instance, a recovered history of participation in any of the cultural practices now considered cause for immediate and reflexive cancellation.

By way of analogy, think of one's perspective as a knife (tool, not weapon) that requires periodic sharpening to retain effectiveness. Refusing to entertain opposing viewpoints is like sharpening only one side of the blade, resulting in a blunt, useless tool. That metaphor suggests a false dualism: two sides to an argument/blade when in fact many facets inform most complex issues, thus the need for viewpoint diversity. By working in good faith with both supporters and detractors, better results (though not perfection) can be obtained than when radicalized entities come to dominate and impose their one-size-fits-all will indiscriminately. In precisely that way, it's probably better not to become too successful or powerful lest one be tempted to embrace a shortsighted will to power and accept the character distortions that accompany a precipitous rise.

As mentioned a couple blog posts ago, an unwillingness to shut up, listen, and learn (why bother? solutions are just … so … obvious …) has set many people on a path of activism. The hubris of convincing oneself of possession of solutions to intractable issues is bizarre. Is there an example of top-down planning, channeling, and engineering of a society that actually worked without tyrannizing the citizenry in the process? I can't think of one. Liberal democratic societies determined centuries ago that freedom and self-determination mixed with assumed responsibility and care within one's community are preferable to governance that treats individuals as masses to be forced into conformity (administrative or otherwise), regulated heavily, and/or disproportionately incarcerated, as in the U.S. But the worm has turned. Budding authoritarians now seek reforms and uniformity to manage diverse, messy populations.

Weirdly, ideologues also attempt to purge and purify history, which is chock full of villainy and atrocity. Those most ideologically possessed seek both historical and contemporary targets to denounce and cancel, not even excluding themselves because, after all, the scourges of history are so abject and everyone benefited from them somehow. Search oneself for inherited privilege and let all pay up for past iniquities! That's the self-flagellating aspect: taking upon oneself (and depositing on others) the full weight of and responsibility for the sins of our forebears. Yet stamping out stubborn embers of fires allegedly still burning from many generations ago is an endless task. Absolutely no one measures up to expectations of sainthood when situated within an inherently and irredeemably evil society of men and women. That's original sin, which can never be erased or forgiven. Just look at what humanity (via industrial civilization) has done to the surface of the planet. Everyone is criminally culpable. So give up all aspirations; no one can ever be worthy. Indeed, who even deserves to live?

Heard a remark (can't remember where) that most people these days would attack as openly ageist. Basically, if you're young (let's say below 25 years of age), then it's your time to shut up, listen, and learn. Some might even say that true wisdom doesn't typically emerge until much later in life, if indeed it appears at all. Exceptions only prove the rule. On the flip side, the energy, creativity, and indignation (e.g., "it's not fair!") needed to drive social movements are typically the domain of those who have less to lose and everything to gain, meaning those just starting out in adult life. A full age range is needed, I suppose, since society isn't generally age stratified except at the extremes (childhood and advanced age). (Turns out that what to call old people and what counts as old is rather clumsy, though probably not especially controversial.)

With this in mind, I can't help but wonder what's going on with recent waves of social unrest and irrational ideology. Competing factions agitate vociferously in favor of one social/political ideology or another as though most of the ideas presented have no history. (Resemblances to Marxism, Bolshevism, and white supremacy are quite common. Liberal democracy, not so much.) Although factions aren't by any means populated solely by young people, I observe that roughly a decade ago, higher education in particular transformed itself into an incubator for radicals and revolutionaries. Whether dissatisfaction began with the faculty and infected the students, or the reverse, is impossible for me to assess. I'm not inside that intellectual bubble. However, urgent calls for radical reform have since moved well beyond the academy. A political program or ideology has yet to be put forward that I can support fully. (My doomer assessment of what the future holds forestalls knowing with any confidence into what sort of program or ideology to pour my waning emotional and intellectual energy.) It's still fairly simple to criticize and denounce, of course. Lots of things egregiously wrong in the world.

My frustration with what passes for political debate (if Twitter is any indication) is the marked tendency to immediately resort to comparisons with Yahtzees in general or Phitler in particular. It's unhinged and unproductive. Yahtzees are cited as an emotional trigger, much like baseless accusations of racism send everyone scrambling for cover lest they be cancelled. Typically, the Yahtzee/Phitler comparison or accusation itself is enough to put someone on their heels, but wiser folks (those lucky few) recognize the cheap rhetorical trick. The Yahtzee Protocol isn't quite the same as Godwin's Law, which states that the longer a discussion goes on (on Usenet in the earliest examples), the greater the likelihood of someone bringing up Yahtzees and Phitler and ruining useful participation. The protocol has been deployed effectively in the Russia-Ukraine conflict, though I'm at a loss to determine in which direction. The mere existence of the now-infamous Azov Battalion, purportedly composed of Yahtzees, means that automatically, reflexively, the fight is on. Who can say what the background rate of Yahtzee sympathizers (whatever that means) might be in any fighting force or indeed the general population? Not me. Similarly, what threshold qualifies a tyrant to stand beside Phitler on a list of worst evers? Those accusations are flung around like cooked spaghetti thrown against the wall just to see what sticks. Even if the accusation does stick, what possible good does it do? Ah, I know: it makes the accuser look like a virtuous fool.

Ask parents what ambitions they harbor for their child or children and among the most patterned responses is "I just want them to be happy." I find such an answer thoughtless and disingenuous, and the insertion of the hedge "just" to make happiness sound like a small ask is a red herring. To begin with, for most kids still in their first decade, happiness and playfulness are relatively effortless and natural so long as a secure, loving environment is provided. Certainly not a default setting, but it's still quite commonplace. As the dreamy style of childhood cognition is gradually supplanted by supposedly more logical, rational, adult thinking, and as children become acquainted with the iniquities of both history and contemporary life, innocence and optimism become impossible to retain. Cue the sullen teenager confronting the yawning chasm between desire and reality. Indeed, few people seem to make the transition into adulthood knowing with much clarity how to be happy in the midst of widespread travail and suffering. Instead, young adults frequently substitute self-destructive, nihilistic hedonism, something learned primarily (says me) from the posturing of movie characters and the celebrities who portray them. (Never understood the trope of criminals hanging out at nightclubs, surrounded by drug addicts, nymphos, other unsavory types, and truly awful music, where they can indulge their assholery before everything inevitably goes sideways.)

Many philosophies recommend simplicity, naturalness, and independence as paths to happiness and moral rectitude. Transcendentalism was one such response to social and political complexities that spoil and/or corrupt. Yet two centuries on, the world has only gotten more and more complex, pressing on everyone demands for information processing at a volume and sophistication that do not at all come naturally to most and are arguably not part of our evolutionary toolkit. Multiple social issues, if one is to engage them fairly, hinge on legalistic arguments and bewildering wordplay that render them fundamentally intractable. Accordingly, many wave away all nuance and adopt pro forma attitudes. Yet the airwaves, social media, the Internet, and even dinner conversations are suffused with the worst sorts of hypercomplexity and casuistry that confound even those who traffic regularly in such rhetoric. It's a very long way from "I just want to be happy."


"Language is dynamic" is a phrase invoked in praise or derision of shifts in usage. Corollaries include "the only constant is change" and "time's arrow points in only one direction" — both signaling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but to human nature's failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently, albeit temporarily, live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call "alpha chimps") who often use state power to subjugate vulnerable populations and funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that's a different diatribe.

Although I'm sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I'm resistant to adopting most new coinages. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the "blue church" is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one's past of impropriety as though the tweet or post never happened. (I've never deleted a post nor have any plans to.) Silicon Valley hegemons can't resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone's content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are ineffective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. It used to be said that the Internet is forever, so one should wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) can be tempered. Who knows who possesses the technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage the narrative and isn't a plea for forgiveness, which doesn't exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn't in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control, specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established, and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

Continuing my book-blogging project on Orality and Literacy: Ong provides context for the oral tradition that surrounded the two great Homeric classics, The Iliad and The Odyssey. According to Ong, it took decades for literary critics and sociologists to overcome their bias, born of literacy, and recognize how formulaic are the two epics. They are essentially pastiches of commonplace plots, phrases, and sayings of the time, which was a notable strength when oral delivery based on memorization was how epic poetry was transmitted. In a literate era, such clichés are to be avoided (like the plague).

Aside: my review of David Sirota's Back to Our Future mentions the dialect he and his brother developed, filled with one-liners and catchphrases from entertainment media, especially TV and movies. The three-word (also three-syllable) form seems to be optimal: "Beam me up" (Star Trek), "Use the Force" (Star Wars), "Make my day" (Dirty Harry), "I'll be back" (The Terminator), etc. This construction is short, punchy, and memorable. The first holder of high office in the U.S. to attempt to govern by catchphrase was probably Ronald Reagan, followed (of course) by Arnold Schwarzenegger and then Donald Trump. Mustn't overlook that all three (and others) came to prominence via the entertainment industry rather than through earnest (Kennedyesque) public service. Trump's numerous three-word phrases (shtick, really) lend themselves especially well to being chanted by adoring crowds at his pep rallies, swept up in groupthink, with a recognizable beat-beat-beat-(silence) structure. The rock band Queen stumbled upon this same elemental rhythm with the famous stomp-stomp-clap-(wait) from its anthem "We Will Rock You," consciously intended for audience participation (as I understand it).

Further aside: "We Will Rock You" combines its iconic rhythm with a recitation tone sourced in antiquity. Make of that what you will.

Ong goes on to provide a discussion of the psychodynamics of orality, which I list here without substantive discussion (read for yourself):

  • orality is additive rather than subordinative
  • orality is aggregative rather than analytic
  • orality is redundant or copious
  • orality is conservative or traditionalist
  • orality is close to the human lifeworld
  • orality is agonistically toned
  • orality is empathetic and participatory rather than objectively distanced
  • orality is homeostatic
  • orality is situational rather than abstract

Of particular interest is Ong’s description of how language functions within oral cultures distinctly from literate cultures, which is the source of the bias mentioned above. To wit:

Fully literate persons can only with great difficulty imagine what a primary oral culture is like, that is, a culture with no knowledge whatsoever of writing or even the possibility of writing … In a primary oral culture, the expression ‘to look up something’ is an empty phrase … [w]ithout writing, words as such have no visual presence, even when the objects they represent are visual … [for] ‘primitive’ (oral) people … language is a mode of action and not simply a countersign of thought — oral people commonly, and probably universally, consider words to have great power. [pp. 31–32]

If this sounds conspicuously reminiscent of this previous post, well, congratulations on connecting the dots. The whole point, according to a certain perspective, is that words are capable of violence, a view that is (re)gaining adherents as our mental frameworks undergo continuous revision. It's no small thing that slurs, insults, and fighting words (again) provoke offense and violent response and that mere verbal offense equates to violence. Not long ago, nasty words were reclaimed, nullified, and thus made impotent (with varying degrees of irrationality in the rules of usage). Well, now they sting again and are used as ammo to cancel (a form of administrative violence, often undertaken anonymously, bureaucratically, and with the assistance of the digital mob) anyone with improper credentials to deploy them.

Let me draw another connection. Here’s a curious quote by Walter Pater, though not well known:

All art constantly aspires towards the condition of music. For while in all other kinds of art it is possible to distinguish the matter from the form, and the understanding can always make this distinction, yet it is the constant effort of art to obliterate it.

Put another way, the separation of signifier from signified, an abstraction conditioned by literacy and rationalism (among other things), is removed ("obliterated") by music, which connects to emotion more directly than representational art. Similarly, speech within primary oral cultures exists purely as sound and possesses an ephemeral, even evanescent (Ong's term) quality only experienced in the flow of time. (Arguably, all of human experience takes place within the flow of time.) Music and "primitive" speech are accordingly dynamic and cannot be reduced to static snapshots, that is, fixed on a page as text or committed to a canvas or photograph as a still image (hence, the strange term still life). That's why a three-word, three-syllable chant, or better yet, the Queen rhythm or the Wave in sports arenas (a gesture requiring the subscription of everyone), can possess inherent power, especially as individuals are entrained in groupthink. Music and words-as-violence get inside us and are nearly wholly subjective, not objective — something we all experience organically in early childhood before being taught to read and write (if in fact those skills are learned beyond functional literacy). Does that mean culture is reverting to an earlier stage of development, more primitive, childlike, and irrational?

So far, this multipart blog post has trafficked in principles and generalities. Let me try now to be more specific, starting with an excerpt from Barry Lynn’s article in Harper’s Magazine titled “The Big Tech Extortion Racket” (Sept. 2020):

… around the middle of the nineteenth century, Americans began to develop technologies that could not be broken into component pieces. This was especially true of the railroad and the telegraph … Such corporations [railroad and telegraph companies] posed one overarching challenge: they charged some people more than others to get to market. They exploited their control over an essential service in order to extort money, and sometimes political favors … Americans found the answer to this problem in common law. For centuries, the owners of ferries, stagecoaches, and inns had been required to serve all customers for the same price and in the order in which they arrived. In the late nineteenth century, versions of such “common carrier” rules were applied to the new middleman corporations.

Today we rightly celebrate the Sherman Antitrust Act of 1890, which gave Americans the power to break apart private corporations. But in many respects, the Interstate Commerce Act of 1887 was the more important document. This act was based on the understanding that monopoly networks like the railroad and the telegraph could be used to influence the actions of people who depend on them, and hence their power must be carefully restricted …

For a century and a half, Americans used common carrier policies to ensure the rule of law in activities that depended on privately held monopolies … regulations freed Americans to take full advantage of every important network technology introduced during these years, including telephones, water and electrical services, energy pipelines, and even large, logistics-powered retailers. Citizens did not have to worry that the men who controlled the technologies involved would exploit their middleman position to steal other people’s business or disrupt balances of power.

I appreciate that Barry Lynn brings up the Interstate Commerce Act. If this legal doctrine appeared in the net neutrality debate a few years ago, it must have escaped my notice. While Internet Service Providers (ISPs) enable network access and connectivity, those utilities have not yet exhibited let's-be-evil characteristics. Similarly, phone companies (including cell carriers) and public libraries may well be eavesdropping and/or monitoring activities of the citizenry, but the real action lies elsewhere, namely, on social media networks and with online retailers. Evil is arguably concentrated in the FANG (or FAANG) corporations but has now grown to be ubiquitous in all social networks (e.g., Twitter) operating as common carriers (Zoom? Slack?) and across academe, nearly all of which have succumbed to moral panic. They are, sad to observe, correctly interpreting demands to censor and sanitize others' no-longer-free speech appearing on their networks or within their realms. How much deeper it goes toward shaping politics and social engineering is quasi-conspiratorial and impossible for me to assess.

Much as I would prefer to believe that individuals possess the good sense to shift their activities away from social networks or turn their attention from discomfiting information sources, that does not appear to be the case. Demands for trigger warnings and safe spaces, commonplace a few years ago on college campuses, have instead morphed into censorious removal, deplatforming, and cancellation from the entire public sphere. Those are wrong responses in free societies, but modern institutions and technologies have gotten out of hand and outstripped the limits of normal human cognition. In short, we're a society gone mad. So rather than accept responsibility to sort out information overflow oneself, many are demanding that others do it for them, and evil private corporations are complying (after a fashion). Moreover, calls for the creation of an Orwellian Ministry of Truth, rebranded as a Truth Commission and a Reality Czar, could hardly be more chillingly or fascistically bizarre. People really need someone to brainwash them, er, decide for them what is real? Has anyone at the New York Times actually read Orwell's dystopian novel 1984 and taken to heart its lessons?