Archive for the ‘Intellectual History’ Category

From Ran Prieur (no link, note nested reply):
I was heavily into conspiracy theory in the 90’s. There was a great paper magazine, Kenn Thomas’s Steamshovel Press, that always had thoughtful and well-researched articles exploring anomalies in the dominant narrative.

Another magazine, Jim Martin’s Flatland, was more dark and paranoid but still really smart. A more popular magazine, Paranoia, was stupid but fun.

At some point, conspiracy culture shifted to grand narratives about absolute evil. This happened at the same time that superhero movies (along with Harry Potter and Lord of the Rings) took over Hollywood. The more epic and the more black-and-white the story, the more humans are drawn to it.

This is my half-baked theory: It used to be that ordinary people would accept whatever the TV said — or before that, the church. Only a few weirdos developed the skill of looking at a broad swath of potential facts, and drawing their own pictures.

It’s like seeing shapes in the clouds. It’s not just something you do or don’t do — it’s a skill you can develop, to see more shapes more easily. And now everyone is learning it.

Through the magic of the internet, everyone is discovering that they can make reality look like whatever they want. They feel like they’re finding truth, when really they’re veering off into madness.

SamuraiBeanDog replies: Except that the real issue with the current conspiracy crisis is that people are just replacing the old TV and church sources with social media and YouTube. The masses of conspiracy culture aren’t coming up with their own realities, they’re just believing whatever shit they’re told by conspiracy influencers.

Something that’s rarely said about influencers, and propaganda in general, is that they can’t change anyone’s mind — they have to work with what people already feel good about believing.

“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signalling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently albeit temporarily live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations to funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coin. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are not effective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. Used to be said that the Internet is forever, so wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) can be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterance. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

Having grown up in an ostensibly free, open society animated by liberal Western ideology, it’s fair to say in hindsight that I internalized a variety of assumptions (and illusions) regarding the role of the individual vis-à-vis society. The operative word here is ostensibly owing to the fact that society has always restricted pure expressions of individuality to some degree through socialization and pressure to conform, so freedom has always been constrained. That was one of the takeaways from my reading (long ago in high school) of Albert Camus’ novel The Stranger (1942) (British: The Outsider; French: L’Étranger), namely, that no matter how free one might believe oneself to be, if one refuses (radically, absurdly) to play by society’s rules and expectations, one will be destroyed. The basic, irresolvable conflict is also present in the concerto principle in classical music, which presents the soloist in dialogue with or in antithesis to the ensemble. Perhaps no work exemplifies this better than the 2nd movement of Ludwig van Beethoven’s Concerto No. 4 for piano and orchestra. A similar dialogue is found in the third movement of Gustav Mahler’s Symphony No. 3, though dialogue there might be better understood as man vs. nature. The significant point of similarity is not the musical style or themes but how the individual/man is ultimately subdued or absorbed by society/nature.

Aside: A broader examination of narrative conflict would include four traditional categories: (1) man vs. man, (2) man vs. nature, (3) man vs. self, and (4) man vs. society. Updated versions, often offered as tips for aspiring writers, sometimes include breakout conflicts (i.e., subcategories): (1) person vs. fate/god, (2) person vs. self, (3) person vs. person, (4) person vs. society, (5) person vs. nature, (6) person vs. supernatural, and (7) person vs. technology. Note that modern sensibilities demand use of person instead of man.

My reason for bringing up such disparate cultural artifacts is to provide context. Relying on my appreciation of the Zeitgeist, I sense that liberal Western ideology is undergoing a radical rethinking, with Woke activists in particular pretending to emancipate oppressed people when flattening of society is probably the hidden objective. Thus, Wokesters are not really freeing anyone, and flattening mechanisms are pulling people down, not building people up. On top of that, they are targeting the wrong oppressors. If leveling is meant to occur along various markers of identity (race, sexual and gender orientation, religion, political affiliation, nationality, etc.), the true conflict in the modern era has always been socioeconomic, i.e., the ownership class against all others. Sure, differences along identitarian lines have been used to oppress, but oppressors are merely using a surface characteristic to distract from their excessive power. The dispossessed oddly fail to recognize their true enemies, projecting animus instead on those with whom grievances are shared. Similarly, Wokesters are busy exploiting their newfound (albeit momentary) power to question the accepted paradigm and force RightThink on others. Yet traditional power holders are not especially threatened by squabbles among the oppressed masses. Moreover, it’s not quite accurate to say that the identitarian left is rethinking the established order. Whatever is happening is arguably occurring at a deeper, irrational level than any thoughtful, principled, political action meant to benefit a confluence of interest groups (not unlike the impossible-to-sort confluence of identities everyone has).

Although I haven’t read Howard Zinn’s A People’s History of the United States (1980), I gather that Zinn believed history should not be told from the winners’ perspective (i.e., that of the ownership and ruling classes, significant overlap acknowledged), or from top down, but instead through the lens of the masses (i.e., the people, a large segment of whom are oppressed and/or dispossessed), or from the bottom up. This reorientation applies not only within a given society or political entity but among nations. (Any guess which countries are the worst oppressors at the moment? Would be a long list.) Moreover, counter to the standard or accepted histories most of us learn, preparation of the U.S. Constitution and indeed quite a lot of U.S. history are deeply corrupt and oppressive by design. It should be obvious that the state (or nation, if one prefers), with its insistence on personal property and personal freedom (though only for a narrow class of landed gentry back in the day, plutocrats and corporatists today), systematically rolled over everyone else — none so egregiously as Native Americans, African slaves, and immigrants. Many early institutions in U.S. political history were in fact created as bulwarks against various forms of popular resistance, notably slave revolts. Thus, tensions and conflicts that might be mistakenly chalked up as man vs. society can be better characterized as man vs. the state, with the state having been erected specifically to preserve prerogatives of the ownership class.

More to come in part 2 and beyond.

The backblog at The Spiral Staircase includes numerous book reviews and three book-blogging projects — one completed and two others either abandoned or on semi-permanent hiatus. I’m launching a new project on Walter Ong’s Orality and Literacy: The Technologizing of the Word (1982), which comes highly recommended and appears quite interesting given my preoccupations with language, literacy, and consciousness. To keep my thinking fresh, I have not consulted any online reviews or synopses.

Early on, Ong provides curious (but unsurprising) definitions I suspect will contribute to the book’s main thesis. Here is one from the intro:

It is useful to approach orality and literacy synchronically, by comparing oral cultures and chirographic (i.e., writing) cultures that coexist at a given period of time. But it is absolutely essential to approach them also diachronically or historically, by comparing successive periods with one another. [p. 2]

I don’t recall reading the word chirographic before, but I blogged about the typographic mind (in which Ong’s analyses are discussed) and lamented that the modern world is moving away from literacy, back toward orality, which feels (to me at least) like retrogression and retreat. (Someone is certain to argue return to orality is actually progress.) As a result, Western institutions such as the independent press are decaying. Moreover, it’s probably fair to say that democracy in the West is by now only a remnant fiction, replaced by oligarchic rule and popular subscription to a variety of fantasy narratives easily dispelled by modest inventory of what exists in actuality.

Here is another passage and definition:

A grapholect is a transdialectal language formed by deep commitment to writing. Writing gives a grapholect a power far exceeding that of any purely oral dialect. The grapholect known as standard English has accessible for use a recorded vocabulary of at least a million and a half words, of which not only the present meanings but also hundreds of thousands of past meanings are known. A simply oral dialect will commonly have resources of only a few thousand words, and its users will have virtually no knowledge of the real semantic history of any of these words. [p. 8]

My finding is that terms such as democracy, liberalism, social justice, etc. fail to mean anything (except perhaps to academics and committed readers) precisely because their consensus usage has shifted so wildly over time that common historical points of reference are impossible to establish in a culture heavily dominated by contemporary memes, slang, talking heads, and talking points — components of orality rather than literacy. And as part of a wider epistemological crisis, one can no longer rely on critical thinking to sort out competing truth claims because the modifier critical, now bandied about recklessly in academia and infecting the workplace and politics, has unironically reversed its meaning and requires uncritical doublethink to swallow what’s taught and argued. Let me stress, too, that playing word games (such as dissembling what is means) is a commonplace tactic to put off criticism by distorting word meanings beyond recognition.

Although it’s unclear just yet (to me, obviously) what Ong argues in his book beyond the preliminary comparison and contrast of oral and chirographic cultures (or in terms of the title of the book, orality and literacy), I rather doubt he argues as I do that the modern world has swung around to rejection of literacy and the style of thought that flows from deep engagement with the written word. Frankly, it would surprise me if he did; the book predates the Internet, social media, and what’s now become omnimedia. The last decade in particular has demonstrated that by placing a cheap, personal, 24/7/365 communications device in the hands of every individual from the age of 12 or so, a radical social experiment was launched that no one in particular designed — except that once the outlines of the experiment began to clarify, those most responsible (i.e., social media platforms in particular but also biased journalists and activist academics) have refused to admit that they are major contributors to the derangement of society. Cynics learned long ago to expect that advertisers, PR hacks, and politicians should be discounted, which requires ongoing skepticism and resistance to omnipresent lures, cons, and propaganda. Call it waking up to reality or simply growing up and behaving responsibly in an information environment designed to be disorienting. Accordingly, the existence of counterweights — information networks derived from truth, authority, and integrity — has always been, um, well, critical. Their extinction presages much graver losses as information structures and even the memory of mental habits that society needs to function are simply swept aside.

Evil exists in the world. History and current events both bear this out amply. Pseudo-philosophers might argue that, like emotions and other immaterial sensations, good and evil are merely reified concepts, meaning they are human constructs with no palpable external reality. Go tell that to victims of evildoers. Human suffering can’t be anonymized, rationalized, or philosophized away quite so handily.

It was sort of refreshing, back in the day, when Google’s motto and/or corporate code of conduct was simple: “Don’t Be Evil.” It acknowledged the potential for being or becoming evil (like any of the Bigs: Big Tobacco, Big Soda, Big Oil, Big Pharma, Big Media, Big Agriculture, etc.) and presumably aspired to resist obvious temptations. That was then (from 2000 to 2018), this is now (2021 until death take us — soon enough, I fear). But like all entities possessed of absurd levels of wealth and power, Google (now reorganized as a subsidiary of Alphabet, but who actually refers to it that way?) and its Silicon Valley brethren have succumbed to temptation and become straight-up evil.

One might charitably assess this development as something unbidden, unanticipated, and unexpected, but that’s no excuse, really. I certainly don’t envy celebrity executives experiencing difficulty resulting from having created unmanageable behemoths loosed on both public and polity unable to recognize beastly fangs until already clamped on their necks. As often occurs, dystopian extrapolations are explored in fiction, sometimes satirically. The dénouement of the HBO show Silicon Valley depicts tech mogul wannabes succeeding in creating an AI (or merely a sophisticated algorithm? doesn’t matter …) that would in time become far too powerful in blind execution of its inner imperative. In the show, characters recognize what they had done and kill their own project rather than allow it to destroy the world. In reality, multiple developers of computer tech platforms (and their embedded dynamic, including the wildly unhelpful albeit accurate term algorithm) lacked the foresight to anticipate awful downstream effects of their brainchildren. Yet now that those effects are manifesting recognizably, these corporations continue to operate and wreak havoc.

Silicon Valley shows an extended software development period of bungling ineptitude punctuated by brilliant though momentary breakthroughs. Characters are smart, flawed people laughably unable to get out of the way of their own success. The pièce de résistance was yoking one so-called “learning machine” to another and initiating what would become a runaway doomsday process (either like ecological collapse, building slowly then making the biosphere uninhabitable all at once, or like the gray goo problem, progressively “processing” biomass at the molecular level until all that remains is lifeless goo). It was a final act of bumbling that demanded the characters’ principled, ethical response before the window of opportunity closed. Real Silicon Valley tech platforms are in the (ongoing) process of rending the social fabric, which is no laughing matter. The issue du jour surrounds free speech and its inverse censorship. More broadly, real Silicon Valley succeeded in gaming human psychology for profit in at least two aspects (could be more as yet unrecognized): (1) mining behavioral data as an exploitable resource, and (2) delivering inexhaustible streams of extremely divisive content (not its own) to drive persistent engagement with its platforms. Yoked together, they operate to drive society mad, and yet, mounting evidence of this development has not produced even an inkling that maybe the damned doomsday devices ought to be shut off. As with the environment, we operate with freedom enough to destroy ourselves. Instead, politicians issue stunningly ineffectual calls for regulation or break-up of monopolies. In the meantime, ever more absurd wealth and power are concentrated in the hands of a few executives who have clearly punted and decided “let’s be evil.” No restraints on their behavioral experimentation across whole societies exist.

Much more to say on this topic in additional parts to come.

I simply can’t keep up with all the reading, viewing, and listening in my queue. Waking hours are too few, and concentration dissipates long before sleep overtakes. Accordingly, it’s much easier to settle into couch-potato mode and watch some mindless drivel, such as the Netflix hit Bridgerton binged in two sittings. (Unlike cinema critics, I’m not bothered especially by continuity errors, plot holes, clunky dialogue, weak character motivation, gaps of logic, or glossy decadence of the fictional worlds. I am bothered by the Kafka trap sprung on anyone who notices casting decisions that defy time and place — an ill-advised but now commonplace historical revisionism like editing Mark Twain.) As a result, blog posts are less frequent than they might perhaps be as I pronounce upon American (or more broadly, Western) culture, trying vainly to absorb it as a continuously moving target. Calls to mind the phrase Après moi, le déluge, except that there is no need to wait. A deluge of entertainment, news, analysis, punditry, and trolling has buried everyone already. So rather than the more careful consideration I prefer to post, here are some hot takes.

The Irregular Aphorist. Caitlin Johnstone offers many trenchant observations in the form of aphorisms (some of which I’ve quoted before), all gathered under the subtitle Notes From The Edge Of The Narrative Matrix. The modifier irregular only means that aphorisms are a regular but not constant feature. Her site doesn’t have a tag to that effect but probably ought to. Here’s one in particular that caught my attention:

Everything our species has tried has led us to a dying world and a society that is stark raving mad, so nobody is in any position to tell you that you are wrong.

Twin truths here are (1) the dying world and (2) societal madness, both of which I’ve been describing for some time. Glad when others recognize them, too.

Piling on. Though few still are willing to admit it, nonpharmaceutical interventions (NPIs, e.g., distancing, masks, and lockdowns) to stall or reduce the spread of the virus failed to achieve their objectives according to this study. Instead, NPIs piled on suffering no one could forestall. I read somewhere (no link) that the world is approaching half of total, cumulative deaths/infections predicted had nothing been done to impede the pandemic running its course. Adding in deaths of despair (numbers not entirely up to date), we’re using the wrong tools to fight the wrong battle. Of course, interventions opened up giant opportunities for power grabs and vulture capitalism, so the cynic in me shrugs and wonders half aloud “what did you expect, really?”

Growth of the Managerial Bureaucracy. A blog called Easily Distracted by Timothy Burke (never on my blogroll) publishes only a few times per year, but his analysis is terrific — at least when it doesn’t wind up being overlong and inconclusive. Since a student debt jubilee is back in the news (plenty of arguments pro and con), unintended consequences are anticipated in this quote:

When you set out to create elaborate tiers that segregate the deserving poor from the comfortable middle-class and the truly wealthy, you create a system that requires a massive bureaucracy to administer and a process that forces people into petitionary humiliation in order to verify their eligibility. You create byzantine cutoff points that become business opportunities for predatory rentiers.

Something similar may well be occurring with stimulus checks being issued pro rata (has anyone actually gotten one?), but at least we’re spared any petitionary humiliations. We get whatever the algorithms (byzantine cutoff points) dictate. How those funds will be gamed and attached is not yet clear. Stay alert.

No Defense of Free Speech. Alan Jacobs often recommends deleting, unsubscribing, and/or ignoring social media accounts (after his own long love-hate relationship with them) considering how they have become wholly toxic to a balanced psyche as well as principal enablers of surveillance capitalism and narrative control. However, in an article about the manorial elite, he’s completely lost the plot: absolutism is required in defense of free speech. It’s not sufficient to be blasé or even relieved when 45 is kicked off Twitter permanently or when multiple parties conspire to kill Parler. Establishing your own turf beyond the reach of Silicon Valley censors is a nice idea but frankly impractical. Isn’t that what whoever ran Parler (or posted there) must have thought? And besides, fencing off the digital commons these very entities created has catapulted them into the unenviable position of undemocratic, unelected wielders of monopolistic power and co-conspirators to boot. That’s what needs to be curtailed, not free speech.

The Taxonomic Apocalypse. Although drawn from fiction and thus largely hypothetical, a new book (coming late 2021) by Adam Roberts called It’s the End of the World: But What Are We Really Afraid Of? surveys doomsday stories and categorizes different versions of how it all ends. Alan Jacobs (yeah, him again — must have an advance copy of the manuscript) recommends it as “a delightful and provocative little book” but fails to grok two things: (1) these stories are rehearsals-cum-preparations for the real thing, and (2) the real thing really is bearing down on us implacably and so is no longer a mere hypothetical to contemplate and categorize for shits and grins. Despite acceptance of the eventualities that await all of us, reading Roberts’ taxonomy is not something I would expect to find delightful. Skip.

Narrative Collapse. Ran Prieur (no link) sometimes makes statements revealing an unexpected god’s-eye view:

[45] is a mean rich kid who figured out that if he does a good Archie Bunker impression, every lost soul with an authoritarian father will think he’s the messiah. We’re lucky that he cares only about himself, instead of having some crazy utopian agenda. But the power, and the agency, is with the disaffected citizens of a declining empire, tasting barbarism.

This is all about people wanting to be part of a group that’s part of a story. Lately, some of the big group-stories have been dying: sky father religion, American supremacy, the conquest of nature, the virtue of wealth-seeking. In their place, young and clumsy group-stories struggle and rise.

Collapse of certain fundamental stories that animate our thinking is at the core of The Spiral Staircase (see About Brutus at top), though it’s often couched in terms of consciousness in transition. Getting through the transition (only temporarily, see previous item in list) probably means completion of the Counter-Enlightenment historical arc, which necessarily includes further descent into barbarism.

Hail Mary for Individualism. I always take special notice when someone cites Allan Bloom. Alan Jacobs (um, yeah, he’s prolific and I’m using his ideas again — sue me) cites Bloom to argue that individualism or the sovereign self, a product of the Enlightenment, is already dead. No doubt, the thought-world described so ably by Bloom no longer exists, but individualism has not yet died out by attrition or been fully dissolved in nonduality. Many of us born before the advent of the Internet retain selfhood and authenticity not yet coopted by or incorporated into mass mind. Moreover, ongoing struggles over identity (e.g., gender, sexual orientation, and race that are often used improperly to define the self) result from an inchoate sense that individualism is eroding precipitously, not that it’s already passé. Defiant attempts to (re)establish an authentic self (contravening all logic and becoming critical theory of one sort or another) in the face of this loss may well be a last-ditch effort to save the self, but it’s failing.

Returning to the subject of this post, I asserted that the modern era frustrates a deep, human yearning for meaning. As a result, the Medieval Period, and to a lesser degree, life on the highroad, became narrative fixations. Had I time to investigate further, I would read C.S. Lewis’ The Discarded Image (1964), but my reading list is already overfull. Nonetheless, I found an executive summary of how Lewis describes the Medieval approach to history and education:

Medieval historians varied in that some of them were more scientific, but most historians tried to create a “picture of the past.” This “picture” was not necessarily based in fact and was meant more to entertain curiosity than to seriously inform. Educated people in medieval times, however, had a high standard for education composed of The Seven Liberal Arts of grammar, dialectic, rhetoric, arithmetic, music, geometry, and astronomy.

In the last chapter, Lewis summarizes the influence of the Medieval Model. In general, the model was widely accepted, meaning that most people of the time conformed to the same way of thinking. The model, he reiterates, satisfied imagination and curiosity, but was not necessarily accurate or factual, specifically when analyzed by modern thinkers.

Aside. Regular readers of The Spiral Staircase may also recognize how consciousness informs this blog post. Historical psychology offers a glimpse into worldviews of bygone eras, with the Medieval Period perhaps being the easiest to excavate and contemplate due to proximity. Few storytellers (cinema or literature) attempt to depict what the world was truly like in the past (best as we can know) but instead resort to an ahistorical modern gloss on how men and women thought and behaved. One notable exception may be the 1986 film The Name of the Rose, which depicts the emerging rational mind in stark conflict with the cloistered Medieval mind. Sword-and-sandal epics set in ancient Rome and Greece get things even more wrong.

From a lengthy blog post by Timothy Burke, which sparked considerable follow-on discussion in the comments:

What the liberal-progressive world largely doesn’t understand is that the 35% of the electorate that stand[s] with Trump no matter what he does (maybe a quarter of people resident inside the borders of the US) do[es] not believe in democracy. It is not that they don’t realize that Trump is an authoritarian, etc., that democracy is in danger. They realize it and they’re glad. Mission accomplished. They have a different view of power and political process, of social relations. They are brutalists. Fundamentally they think power is a zero-sum game. You hold it or you are held by it. You are the boot on someone’s neck or there will be a boot on yours. They agree that what they have was taken from others; they think that’s the way of all things. You take or are taken from.

They do not believe in liberty and justice for all, or even really for themselves: it is not that they reserve liberty for themselves, because they believe that even they should be subject to the will of a merciless authority (who they nevertheless expect to favor them as an elect of that authority). We often ask how evangelicals who think this way can stand the notion of a God who would permit a tornado to destroy a church and kill the innocents gathered in it for shelter. They can stand it because they expect that of authority: that authority is cruel and without mercy because it must be. They simply expect authority to be far more cruel to others than it is to them. And they expect to be cruel with the authority they possess.

Fantasies and delusions rush into the space
that reason has vacated in fear of its life.

—James Howard Kunstler

Since I first warned that this blog post was forthcoming, conditions of modern American life we might have hoped would be resolved by now remain intransigently with us. Most are scrambling to adjust to the new normal: no work (for tens of millions), no concerts, no sports (except for events staged for the camera to be broadcast later), little or no new cinema (but plenty of streaming TV), no school or church (except for abysmal substitutes via computer), no competent leadership, and no end in sight. The real economy swirls about the drain despite the fake economy (read: the stock market a/k/a the Richistan economy) having first shed value faster than ever before in history, then staged a precipitous taxpayer-funded, debt-fueled recovery, only to position itself for imminent resumption of its false-started implosion. The pandemic ebbed elsewhere and then resumed, but in the U.S. it scarcely ebbed at all, and the country now leads the world in clownish mismanagement of the crisis. Throughout it all, we extend and pretend that the misguided modern age isn’t actually coming to a dismal close, based as it is on a consumption-and-growth paradigm that anyone even modestly numerically literate can recognize is, um, (euphemism alert) unsustainable.

Before full-on collapse (already rising over the horizon like those fires sweeping across the American West) hits, however, we’ve got unfinished business: getting our heads (and society) right regarding which of several competing ideologies can or should establish itself as the righteous path forward. That might sound like the proverbial arranging of deck chairs on the RMS Titanic, but in an uncharacteristically charitable moment, let me suggest that righting things before we’re done might be an earnest obligation even if we can’t admit openly just how close looms the end of (human) history. According to market fundamentalists, corporatists, and oligarchs, Socialism and Marxism, or more generally collectivism, must finally have a stake driven through its undead heart. According to radical progressives, Black Lives Matter, and Antifa, fascism and racism, or more generally intolerance, deserve to be finally stamped out, completing the long arc of history stalled after the Civil Rights Era. And according to barely-even-a-majority-anymore whites (or at least the conservative subset), benefits and advantages accrued over generations, or more generally privilege, must be leveraged, solidified, and maintained lest the status quo be irretrievably lost. Other factions no doubt exist. Thus, we are witnessing a battle royale among narratives and ideologies, none of which IMO crystallize the moment adequately.

Of those cited above, the first and third are easy to dismiss as moribund and self-serving. Only the second demonstrates any concern for the wellbeing of others. However, and despite its putative birthplace in the academy, it has twisted itself into pretzel logic and become every bit as intolerant as the scourges it rails against. Since I need a moniker for this loose, uncoordinated network of movements, I’ll refer to them as the Woke Left, which signifies waking up (i.e., being woke) to injustice and inequity. Sustained analysis of the Woke Left is available from James Lindsay through a variety of articles and interviews (do a search). Lindsay demonstrates handily how the Woke Left’s principal claims, often expressed through its specialized rhetoric called Critical Theory, are actually an inversion of everything it pretends to be. This body of thought has legitimate historical and academic lineage, so it’s arguable that only its most current incarnation in the Woke Left deserves scorn.

Two recently published books exemplify the rhetoric of the Woke Left: White Fragility (2018) by Robin DiAngelo and How to Be an Antiracist (2019) by Ibram Kendi. Although I’ve read neither book, I’m aware of numerous scathing reviews that point out fundamental problems with the books and their authors’ arguments. Foremost among them is what’s sometimes called a Kafka trap, a Catch-22 because all avenues of argument lead inescapably toward guilt, typically some form of original sin. Convinced they are on the righteous right side of history, Woke Left protesters and agitators have been harassing and physically threatening strangers to demand support for the cause, i.e., compliance. What cause is a good question, considering a coherent program has yet to be articulated. Forcing others to choose either side of a false binary — with us or against us — is madness, but that’s the cultural moment at which we’ve arrived. Everyone must align their ideology with some irrational narrative while being put at risk of cancellation and/or destruction no matter what alignment is ventured.

If things go south badly on the heels of contested election results this fall as many expect — the pump already primed for such conflict — and a second civil war ensues, I rather expect the Woke Left to be the first to fail and the other two, each representing the status quo (though different kinds), to be in an extended battle for control of whatever remains of the union. I can’t align with any of them, since by my lights they’re all different kinds of crazy. Sorta makes ya wonder, taking history as an indicator, if a fourth or fifth faction won’t appear before it’s a wrap. I don’t hold out any hope for any faction steering us competently through this crisis.

In educational philosophy, learning is often categorized in three domains: the cognitive, the affective, and the psychomotor (called Bloom’s Taxonomy). Although formal education admittedly concentrates primarily on the cognitive domain, a well-rounded person gives attention to all three. The psychomotor domain typically relates to tool use and manipulation, but if one considers the body itself a tool, then athletics and physical workouts are part of a balanced approach. The affective domain is addressed through a variety of mechanisms, not least of which is narrative, much of it entirely fictional. We learn how to process emotions through vicarious experience as a safe way to prepare for the real thing. Indeed, dream life is described as the unconscious mind’s mechanism for consolidating memory and experience as well as rehearsing prospective events (strategizing) in advance. Nightmares are, in effect, worst-case scenarios dreamt up for the purpose of avoiding the real thing (e.g., falling from a great height or venturing too far into the dark — a proxy for the unknown). Intellectual workouts address the cognitive domain. While some are happy to remain unbalanced, focusing on strengths found exclusively in a single domain (gym rats, eggheads, actors) and thus remaining physically, emotionally, or intellectually stunted or immature, most understand that workouts in all domains are worth seeking out as elements of healthy development.

One form of intellectual workout is debate, now offered by various media and educational institutions. Debate is quite old but has been embraced with renewed gusto in a quest to develop content (using new media) capable of drawing in viewers, which mixes educational objectives with commercial interests. The time-honored political debate used to be good for determining where to cast one’s vote but has become nearly useless in the last few decades as neither the sponsoring organizations, the moderators, nor the candidates seem to understand anymore how to run a debate or behave properly. Instead, candidates use the opportunity to attack each other, ignore questions and glaring issues at hand, and generally refuse to offer meaningful responses to the needs of voters. Indeed, this last was among the principal innovations of Bill Clinton: roll out some appealing bit of vacuous rhetoric yet offer little to no guidance what policies will actually be pursued once in office. Two presidential administrations later, Barack Obama did much the same, which I consider a most egregious betrayal or bait-and-switch. Opinions vary.

In a recent Munk Debate, the proposition under consideration was whether humankind’s best days lie ahead or behind. Optimists won the debate by a narrow margin (determined by audience vote); however, debate on the issue is not binding truth, nor does debate really resolve the question satisfactorily. The humor and personalities of the debaters probably had more influence than their arguments. Admitting that I possess biases, I found myself inclined favorably toward the most entertaining character, though what I find entertaining is itself a further bias not especially shared by many others. In addition, I suspect the audience did not include many working-class folks or others who see their prospects for better lives diminishing rapidly, which skews the resulting vote. The age-old parental desire to leave one’s children a better future than one’s own is imperiled according to this poll (polls may vary considerably — do your own search). How one understands “better off” is highly variable, but the usual sense is material wellbeing.

Folks on my radar (names withheld) range widely in their enthusiasm or disdain for debate. The poles appear to run from default refusal to accept invitations to debate (often couched as open challenges to professed opinions) as a complete waste of time, to earnest desire to participate in, host, and/or moderate debates as a means of informing the public through expert argumentation. As an intellectual workout, I appreciate the opportunity to hear debates (at least when I’m not exasperated by a speaker’s lack of discipline or end-around arguments), but readers can guess from the title of this post that I expect nothing to be resolved by debate. Were I ever to be offered an opportunity to participate, I can well imagine accepting the invitation and having some fun flexing my intellectual muscles, but I would enter into the event with utterly no expectation of being able to convince anyone of anything. Minds are already too well made up on most issues. If I were offered a spot on some bogus news-and-opinion show to be a talking head, shot from the shoulders up and forced to shout and interrupt to get a brief comment or soundbite in edgewise, that I would decline handily as a total waste of time.