Posts Tagged ‘Cancel Culture’

From the outset, credit goes to Jonathan Haidt for providing the ideas to launch this blog post. He appears to be making the rounds again flogging his most recent publication (where? I dunno, maybe The Atlantic). In the YouTube interview I caught, Haidt admits openly that as a social and behavioral psychologist, he’s prone to recommending incentives, programs, and regulations to combat destructive developments in contemporary life — especially those in the academy and on social media that have spread into politics and across the general public. Haidt wears impressive professional armor in support of arguments and contentions; I lack such rigor rather conspicuously. Accordingly, I offer no recommendations but instead try to limit myself to describing dynamics as an armchair social critic. Caveat emptor.

Haidt favors viewpoint diversity (see, for example, Heterodox Academy, which he helped to found and now chairs). Simple enough, right? Not so fast there, Señor Gonzalez! Any notion that even passing acquaintance with a given subject requires knowing both pros and cons is anathema to many of today’s thinkers, who would rather plug their ears and pretend opposition voices, principled or otherwise, are simply incoherent, need not be considered, and further, should be silenced and expunged. As a result, extremist branches of any faction tend to be ideological echo chambers. Cardinal weaknesses in such an approach are plain enough for critical thinkers to recognize, but if one happens to fall into one of those chambers, silos, or bubbles (or attends a school that trains students in rigid thinking), invitations to challenge cherished and closely held beliefs, upon which identity is built, mostly fall on deaf ears. The effect is bad enough in individuals, but when it spreads across organizations that adopt ill-advised solutionism, Haidt assesses that institutional stupidity sets in. The handy example is higher education (now an oxymoron). Many formerly respectable institutions have essentially abandoned reason (ya know, the way reasonable people think) and begun flagellating themselves in abject shame over, for instance, a recovered history of participation in any of the cultural practices now cause for immediate and reflexive cancellation.

By way of analogy, think of one’s perspective as a knife (tool, not weapon) that requires periodic sharpening to retain effectiveness. Refusing to entertain opposing viewpoints is like sharpening only one side of the blade, resulting in a blunt, useless tool. That metaphor suggests a false dualism: two sides to an argument/blade, when in fact many facets inform most complex issues; hence, viewpoint diversity. By working in good faith with both supporters and detractors, one can obtain better results (though not perfection) than when radicalized entities come to dominate and impose their one-size-fits-all will indiscriminately. In precisely that way, it’s probably better not to become too successful or powerful lest one be tempted to embrace a shortsighted will to power and accept the character distortions that accompany a precipitous rise.

As mentioned a couple blog posts ago, an unwillingness to shut up, listen, and learn (why bother? solutions are just … so … obvious …) has set many people on a path of activism. The hubris of convincing oneself that one possesses solutions to intractable issues is bizarre. Is there an example of top-down planning, channeling, and engineering of a society that actually worked without tyrannizing the citizenry in the process? I can’t think of one. Liberal democratic societies determined centuries ago that freedom and self-determination mixed with assumed responsibility and care within one’s community are preferable to governance that treats individuals as masses to be forced into conformity (administrative or otherwise), regulated heavily, and/or disproportionately incarcerated, as in the U.S. But the worm has turned. Budding authoritarians now seek reforms and uniformity to manage diverse, messy populations.

Weirdly, ideologues also attempt to purge and purify history, which is chock-full of villainy and atrocity. Those most ideologically possessed seek both historical and contemporary targets to denounce and cancel, not even excluding themselves because, after all, the scourges of history are so abject and everyone benefited from them somehow. Search oneself for inherited privilege and all pay up for past iniquities! That’s the self-flagellating aspect: taking upon oneself (and depositing on others) the full weight of and responsibility for the sins of our forebears. Yet stamping out stubborn embers of fires allegedly still burning from many generations ago is an endless task. Absolutely no one measures up to expectations of sainthood when situated within an inherently and irredeemably evil society of men and women. That’s original sin, which can never be erased or forgiven. Just look at what humanity (via industrial civilization) has done to the surface of the planet. Everyone is criminally culpable. So give up all aspirations; no one can ever be worthy. Indeed, who even deserves to live?

Heard a remark (can’t remember where) that most people these days would attack as openly ageist. Basically, if you’re young (let’s say below 25 years of age), then it’s your time to shut up, listen, and learn. Some might even say that true wisdom doesn’t typically emerge until much later in life, if indeed it appears at all. Exceptions only prove the rule. On the flip side, the energy, creativity, and indignation (e.g., “it’s not fair!”) needed to drive social movements are typically the domain of those who have less to lose and everything to gain, meaning those just starting out in adult life. A full age range is needed, I suppose, since society isn’t generally age stratified except at the extremes (childhood and advanced age). (Turns out that what to call old people and what counts as old are rather clumsy questions, though probably not especially controversial ones.)

With this in mind, I can’t help but wonder what’s going on with recent waves of social unrest and irrational ideology. Competing factions agitate vociferously in favor of one social/political ideology or another as though most of the ideas presented have no history. (Resemblances to Marxism, Bolshevism, and white supremacy are quite common. Liberal democracy, not so much.) Although factions aren’t by any means populated solely by young people, I observe that roughly a decade ago, higher education in particular transformed itself into an incubator for radicals and revolutionaries. Whether dissatisfaction began with the faculty and infected the students is impossible for me to assess. I’m not inside that intellectual bubble. However, urgent calls for radical reform have since moved well beyond the academy. A political program or ideology has yet to be put forward that I can support fully. (My doomer assessment of what the future holds forestalls knowing with any confidence into what sort of program or ideology to pour my waning emotional and intellectual energy.) It’s still fairly simple to criticize and denounce, of course. Lots of things are egregiously wrong in the world.

My frustration with what passes for political debate (if Twitter is any indication) is the marked tendency to resort immediately to comparisons with Yahtzees in general or Phitler in particular. It’s unhinged and unproductive. Yahtzees are cited as an emotional trigger, much like baseless accusations of racism send everyone scrambling for cover lest they be cancelled. Typically, the Yahtzee/Phitler comparison or accusation itself is enough to put someone on their heels, but wise folks (those lucky few) recognize the cheap rhetorical trick. The Yahtzee Protocol isn’t quite the same as Godwin’s Law, which states that the longer a discussion goes on (on Usenet in the earliest examples), the more likely someone brings up Yahtzees and Phitler, ruining useful participation. The protocol has been deployed effectively in the Russia-Ukraine conflict, though I’m at a loss to determine in which direction. The mere existence of the now-infamous Azov Battalion, purportedly composed of Yahtzees, means that automatically, reflexively, the fight is on. Who can say what the background rate of Yahtzee sympathizers (whatever that means) might be in any fighting force or indeed the general population? Not me. Similarly, what threshold qualifies a tyrant to stand beside Phitler on a list of worst evers? Those accusations are flung around like cooked spaghetti thrown against the wall just to see what sticks. Even if the accusation does stick, what possible good does it do? Ah, I know: it makes the accuser look like a virtuous fool.

Ask parents what ambitions they harbor for their child or children and among the most predictable responses is “I just want them to be happy.” I find such an answer thoughtless and disingenuous, and the insertion of the hedge “just” to make happiness sound like a small ask is a red herring. To begin with, for most kids still in their first decade, happiness and playfulness are relatively effortless and natural so long as a secure, loving environment is provided. Such an environment is certainly not a default setting, but it’s still quite commonplace. As the dreamy style of childhood cognition is gradually supplanted by supposedly more logical, rational, adult thinking, and as children become acquainted with the iniquities of both history and contemporary life, innocence and optimism become impossible to retain. Cue the sullen teenager confronting the yawning chasm between desire and reality. Indeed, few people seem to make the transition into adulthood knowing with much clarity how to be happy in the midst of widespread travail and suffering. Instead, young adults frequently substitute self-destructive, nihilistic hedonism, something learned primarily (says me) from the posturing of movie characters and the celebrities who portray them. (Never understood the trope of criminals hanging at nightclubs, surrounded by drug addicts, nymphos, other unsavory types, and truly awful music, where they can indulge their assholery before everything inevitably goes sideways.)

Many philosophies recommend simplicity, naturalness, and independence as paths to happiness and moral rectitude. Transcendentalism was one such response to social and political complexities that spoil and/or corrupt. Yet two centuries on, the world has only grown more complex, pressing on everyone demands for information processing at a volume and sophistication that do not at all come naturally to most and are arguably not part of our evolutionary toolkit. Multiple social issues, if one is to engage them fairly, hinge on legalistic arguments and bewildering wordplay that render them fundamentally intractable. Accordingly, many wave away all nuance and adopt pro forma attitudes. Yet the airwaves, social media, the Internet, and even dinner conversations are suffused with the worst sorts of hypercomplexity and casuistry, which confound even those who traffic regularly in such rhetoric. It’s a very long way from “I just want to be happy.”


“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signalling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently albeit temporarily live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations to funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coin. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” refers to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms, or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety, as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are ineffective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. Used to be said that the Internet is forever, so wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) can be tempered. Who knows who possesses the technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control, specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established, and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

Continuing my book-blogging project on Orality and Literacy, Ong provides context for the oral tradition that surrounded the two great Homeric classics: The Iliad and The Odyssey. According to Ong, it took decades for literary critics and sociologists to overcome their bias, borne out of literacy, and recognize how formulaic are the two epics. They are essentially pastiches of commonplace plots, phrases, and sayings of the time, which was a notable strength when oral delivery based on memorization was how epic poetry was transmitted. In a literate era, such clichés are to be avoided (like the plague).

Aside: my review of David Sirota’s Back to Our Future mentions the dialect he and his brother developed, filled with one-liners and catchphrases from entertainment media, especially TV and movies. The three-word (also three-syllable) form seems to be optimal: “Beam me up” (Star Trek), “Use the Force” (Star Wars), “Make my day” (Dirty Harry), “I’ll be back” (The Terminator), etc. This construction is short, punchy, and memorable. The first holder of high office in the U.S. to attempt to govern by catchphrase was probably Ronald Reagan, followed (of course) by Arnold Schwarzenegger and then Donald Trump. Mustn’t overlook that all three (and others) came to prominence via the entertainment industry rather than through earnest (Kennedyesque) public service. Trump’s numerous three-word phrases (shtick, really) lend themselves especially well to being chanted by adoring crowds at his pep rallies, swept up in groupthink, with a recognizable beat-beat-beat-(silence) structure. The rock band Queen stumbled upon this same elemental rhythm with its famous stomp-stomp-clap-(wait) from the anthem “We Will Rock You,” consciously intended for audience participation (as I understand it).

Further aside: “We Will Rock You” combines its iconic rhythm with a recitation tone sourced in antiquity. Make of that what you will.

Ong goes on to provide a discussion of the psychodynamics of orality, which I list here without substantive discussion (read for yourself):

  • orality is additive rather than subordinative
  • orality is aggregative rather than analytic
  • orality is redundant or copious
  • orality is conservative or traditionalist
  • orality is close to the human lifeworld
  • orality is agonistically toned
  • orality is empathetic and participatory rather than objectively distanced
  • orality is homeostatic
  • orality is situational rather than abstract

Of particular interest is Ong’s description of how language functions differently within oral cultures than within literate cultures, a difference that is the source of the bias mentioned above. To wit:

Fully literate persons can only with great difficulty imagine what a primary oral culture is like, that is, a culture with no knowledge whatsoever of writing or even the possibility of writing … In a primary oral culture, the expression ‘to look up something’ is an empty phrase … [w]ithout writing, words as such have no visual presence, even when the objects they represent are visual … [for] ‘primitive’ (oral) people … language is a mode of action and not simply a countersign of thought — oral people commonly, and probably universally, consider words to have great power. [pp. 31–32]

If this sounds conspicuously reminiscent of this previous post, well, congratulations on connecting the dots. The whole point, according to a certain perspective, is that words are capable of violence, a view that is (re)gaining adherents as our mental frameworks undergo continuous revision. It’s no small thing that slurs, insults, and fighting words (again) provoke offense and violent response and that mere verbal offense equates to violence. Not long ago, nasty words were reclaimed, nullified, and thus made impotent (with varying, often irrational, rules of usage). Well, now they sting again and are used as ammo to cancel (a form of administrative violence, often undertaken anonymously, bureaucratically, and with the assistance of the digital mob) anyone with improper credentials to deploy them.

Let me draw another connection. Here’s a curious quote from Walter Pater, though not a well-known one:

All art constantly aspires towards the condition of music. For while in all other kinds of art it is possible to distinguish the matter from the form, and the understanding can always make this distinction, yet it is the constant effort of art to obliterate it.

Put another way, the separation of signifier from signified, an abstraction conditioned by literacy and rationalism (among other things), is removed (“obliterated”) by music, which connects to emotion more directly than representational art does. Similarly, speech within primary oral cultures exists purely as sound and possesses an ephemeral, even evanescent (Ong’s term), quality experienced only in the flow of time. (Arguably, all of human experience takes place within the flow of time.) Music and “primitive” speech are accordingly dynamic and cannot be reduced to static snapshots, that is, fixed on a page as text or committed to a canvas or photograph as a still image (hence the strange term still life). That’s why a three-word, three-syllable chant, or better yet, the Queen rhythm or the Wave in sports arenas (a gesture requiring everyone’s buy-in), can possess inherent power, especially as individuals are entrained in groupthink. Music and words-as-violence get inside us and are nearly wholly subjective, not objective — something we all experience organically in early childhood before being taught to read and write (if in fact those skills are learned beyond functional literacy). Does that mean culture is reverting to an earlier stage of development: more primitive, childlike, and irrational?

So far, this multipart blog post has trafficked in principles and generalities. Let me try now to be more specific, starting with an excerpt from Barry Lynn’s article in Harper’s Magazine titled “The Big Tech Extortion Racket” (Sept. 2020):

… around the middle of the nineteenth century, Americans began to develop technologies that could not be broken into component pieces. This was especially true of the railroad and the telegraph … Such corporations [railroad and telegraph companies] posed one overarching challenge: they charged some people more than others to get to market. They exploited their control over an essential service in order to extort money, and sometimes political favors … Americans found the answer to this problem in common law. For centuries, the owners of ferries, stagecoaches, and inns had been required to serve all customers for the same price and in the order in which they arrived. In the late nineteenth century, versions of such “common carrier” rules were applied to the new middleman corporations.

Today we rightly celebrate the Sherman Antitrust Act of 1890, which gave Americans the power to break apart private corporations. But in many respects, the Interstate Commerce Act of 1887 was the more important document. This act was based on the understanding that monopoly networks like the railroad and the telegraph could be used to influence the actions of people who depend on them, and hence their power must be carefully restricted …

For a century and a half, Americans used common carrier policies to ensure the rule of law in activities that depended on privately held monopolies … regulations freed Americans to take full advantage of every important network technology introduced during these years, including telephones, water and electrical services, energy pipelines, and even large, logistics-powered retailers. Citizens did not have to worry that the men who controlled the technologies involved would exploit their middleman position to steal other people’s business or disrupt balances of power.

I appreciate that Barry Lynn brings up the Interstate Commerce Act. If this legal doctrine appeared in the net neutrality debate a few years ago, it must have escaped my notice. While Internet Service Providers (ISPs) enable network access and connectivity, those utilities have not yet exhibited let’s-be-evil characteristics. Similarly, phone companies (including cell phone carriers) and public libraries may well be eavesdropping on and/or monitoring activities of the citizenry, but the real action lies elsewhere, namely, on social media networks and with online retailers. Evil is arguably concentrated in the FANG (or FAANG) corporations but has now grown to be ubiquitous across all social networks (e.g., Twitter), services operating as de facto common carriers (Zoom? Slack?), and academe, nearly all of which have succumbed to moral panic. Sad to observe, they are correctly interpreting demands to censor and sanitize others’ no-longer-free speech appearing on their networks or within their realms. How much deeper it goes toward shaping politics and social engineering is quasi-conspiratorial and impossible for me to assess.

Much as I would prefer to believe that individuals possess the good sense to shift their activities away from social networks or turn their attention from discomfiting information sources, that does not appear to be the case. Demands for trigger warnings and safe spaces, commonplace a few years ago on college campuses, have instead morphed into censorious removal, deplatforming, and cancellation from the entire public sphere. Those are wrong responses in free societies, but modern institutions and technologies have gotten out of hand and outstripped the limits of normal human cognition. In short, we’re a society gone mad. So rather than accept responsibility to sort out information overflow oneself, many are demanding that others do it for them, and evil private corporations are complying (after a fashion). Moreover, calls for creation of an Orwellian Ministry of Truth, rebranded as a Truth Commission and Reality Czar, could hardly be more chillingly and fascistically bizarre. People really need someone to brainwash, er, decide for them what is real? Has anyone at the New York Times actually read Orwell’s dystopian novel Nineteen Eighty-Four and taken its lessons to heart?

Fantasies and delusions rush into the space
that reason has vacated in fear of its life.

—James Howard Kunstler

Since I first warned that this blog post was forthcoming, conditions of modern American life we might have hoped would be resolved by now remain intransigently with us. Most are scrambling to adjust to the new normal: no work (for tens of millions), no concerts, no sports (except for events staged for the camera to be broadcast later), little or no new cinema (but plenty of streaming TV), no school or church (except for abysmal substitutes via computer), no competent leadership, and no end in sight. The real economy swirls about the drain despite the fake economy (read: the stock market a/k/a the Richistan economy) having first shed value faster than ever before in history, then staged a precipitous taxpayer-funded, debt-fueled recovery, only to position itself for imminent resumption of its false-started implosion. The pandemic ebbed elsewhere and then saw its own resumption, but not in the U.S., where it scarcely ebbed at all; the U.S. now leads the world in clownish mismanagement of the crisis. Throughout it all, we extend and pretend that the misguided modern age isn’t actually coming to a dismal close, based as it is on a consumption-and-growth paradigm that anyone even modestly numerically literate can recognize is, um, (euphemism alert) unsustainable.

Before full-on collapse (already rising over the horizon like those fires sweeping across the American West) hits, however, we’ve got unfinished business: getting our heads (and society) right regarding which of several competing ideologies can or should establish itself as the righteous path forward. That might sound like the proverbial arranging of deck chairs on the RMS Titanic, but in an uncharacteristically charitable moment, let me suggest that righting things before we’re done might be an earnest obligation even if we can’t admit openly just how close looms the end of (human) history. According to market fundamentalists, corporatists, and oligarchs, Socialism and Marxism, or more generally collectivism, must finally have a stake driven through its undead heart. According to radical progressives, Black Lives Matter, and Antifa, fascism and racism, or more generally intolerance, deserve to be finally stamped out, completing the long arc of history stalled after the Civil Rights Era. And according to barely-even-a-majority-anymore whites (or at least the conservative subset), benefits and advantages accrued over generations, or more generally privilege, must be leveraged, solidified, and maintained lest the status quo be irretrievably lost. Other factions no doubt exist. Thus, we are witnessing a battle royale among narratives and ideologies, none of which IMO crystallize the moment adequately.

Of those cited above, the first and third are easy to dismiss as moribund and self-serving. Only the second demonstrates any concern for the wellbeing of others. However, and despite its putative birthplace in the academy, it has twisted itself into pretzel logic and become every bit as intolerant as the scourges it rails against. Since I need a moniker for this loose, uncoordinated network of movements, I’ll refer to them as the Woke Left, which signifies waking up (i.e., being woke) to injustice and inequity. Sustained analysis of the Woke Left is available from James Lindsay through a variety of articles and interviews (do a search). Lindsay demonstrates handily how the Woke Left’s principal claims, often expressed through its specialized rhetoric called Critical Theory, are actually an inversion of everything it pretends to be. This body of thought has legitimate historical and academic lineage, so it’s arguable that only its most current incarnation in the Woke Left deserves scorn.

Two recently published books exemplify the rhetoric of the Woke Left: White Fragility (2018) by Robin DiAngelo and How to Be an Antiracist (2019) by Ibram Kendi. Although I’ve read neither book, I’m aware of numerous scathing reviews that point out fundamental problems with the books and their authors’ arguments. Foremost among them is what’s sometimes called a Kafka trap, a Catch-22 because all avenues of argument lead inescapably toward guilt, typically some form of original sin. Convinced they are on the righteous right side of history, Woke Left protesters and agitators have been harassing and physically threatening strangers to demand support for the cause, i.e., compliance. What cause is a good question, considering a coherent program has yet to be articulated. Forcing others to choose either side of a false binary — with us or against us — is madness, but that’s the cultural moment at which we’ve arrived. Everyone must align their ideology with some irrational narrative while being put at risk of cancellation and/or destruction no matter what alignment is ventured.

If things go south badly on the heels of contested election results this fall as many expect — the pump already primed for such conflict — and a second civil war ensues, I rather expect the Woke Left to be the first to fail and the other two, each representing the status quo (though different kinds), to be in an extended battle for control of whatever remains of the union. I can’t align with any of them, since by my lights they’re all different kinds of crazy. Sorta makes ya wonder, taking history as an indicator, if a fourth or fifth faction won’t appear before it’s a wrap. I don’t hold out any hope for any faction steering us competently through this crisis.

Caveat: this post is uncharacteristically long and perhaps a bit disjointed. Or perhaps an emerging blogging style is being forged. Be forewarned.

Sam Harris has been the subject of or mentioned in numerous previous blog posts. His podcast Making Sense (formerly, Waking Up), partially behind a paywall but generously offered for free (no questions asked) to those claiming financial hardship, used to be among those I would tune into regularly. As with the Joe Rogan Experience (soon moving to Spotify — does that mean its disappearance from YouTube?), the diversity of guests and reliable intellectual stimulation have been attractive. Calling his podcast Making Sense aligns with my earnest concern over actually making sense of things as the world spins out of control and the epistemological crisis deepens. Yet Harris has been a controversial figure since coming to prominence as a militant atheist. I really want to like what Harris offers, but regrettably, he has lost (most of) my attention. Others reaching the same conclusion have written or vlogged their reasons, e.g., “Why I’m no longer a fan of ….” Do a search.

Having already ranted over specific issues Harris has raised, let me instead register three general complaints. First, once a subject is open for discussion, it’s flogged to death, often without reaching any sort of conclusion, or frankly, helping to make sense. For instance, Harris’ solo discussion (no link) regarding facets of the killing of George Floyd in May 2020, which event sparked still unabated civil unrest, did more to confuse than clarify. It was as though Harris were trying the court case by himself, without a judge, jury, or opposing counsel. My second complaint is that Harris’ verbosity, while impressive in many respects, leads to interviews marred by long-winded, one-sided speeches where the thread is hopelessly lost, blocking an interlocutor from tracking and responding effectively. Whether Harris intends to bury others under an avalanche of argument or does so uncontrollably doesn’t matter. It’s still a Gish gallop. Third is his over-emphasis on hypotheticals and thought experiments. Extrapolation is a useful but limited rhetorical technique, as is distillation. However, treating prospective events as certainties is tantamount to building arguments on poor foundations, namely, abstractions. Much as I admire Harris’ ambition to carve out a space within the public sphere to get paid for thinking and discussing topics of significant political and philosophical currency, he frustrates me enough that I rarely tune in anymore.

In contrast, the Rebel Wisdom channel on YouTube offers considerably more useful content, including a series on sensemaking. The face of Rebel Wisdom is documentarian David Fuller, who asks informed questions but avoids positioning himself in the expository center. Quite a change from the too-familiar news-anchor-as-opinion-maker approach taken by most media stars. If there were a blog, I would add it to my blogroll. However, the offer of memberships ranging from $5 to $500 per month irks me. Paid-for VIP status too closely resembles the selling of empty cachet or Catholic indulgences, especially tiers with guarantees of “special access.”

I became especially interested in Daniel Schmachtenberger’s appearances on Rebel Wisdom and his approach to sensemaking. Lots of exciting ideas; clearly the fellow has developed an impressive framework for the dynamics involved. But to make it really useful, as opposed to purely theoretical, formal study akin to taking a philosophy course is needed. Maybe there’s written material available, but without a clear text resource, the prospect of sifting unguided through a growing collection of YouTube videos caused me to retreat (out of frustration? laziness?). At some later point, I learned that Schmachtenberger was a participant among a loose collection of under-the-radar intellectuals (not yet having elevated themselves to thought leaders) working on an alternative to politics-and-civilization-as-usual called Game B (for lack of a better name). A good article about Schmachtenberger and what’s called “The War on Sensemaking” (numerous Internet locations) is found here.

While the Game B gang seems to have imploded over disagreements and impasses (though there may well be Internet subcultures still carrying the torch), its main thrust has been picked up by Bret Weinstein and his DarkHorse Podcast (var.: Dark Horse), co-hosted by his wife Heather Heying. Together, they analyze contemporary political and cultural trends through the twin perspectives of evolutionary biology and game theory. They also live in Portland, Oregon, home to the most radical leftist civil unrest currently under way this summer of 2020. They further warn unambiguously that we Americans are at grave risk of losing the grand melting-pot experiment the U.S. represents as the self-anointed leader of the free world and standard-bearer of liberal democratic values sprung from the Enlightenment. What protesters mean to succeed the current regime in this proto-revolutionary moment is wildly unclear, but it looks to be decidedly fascist in character. Accordingly, Weinstein and Heying are actively promoting Unity 2020 (var.: Unity2020 and Un1ty2020) to select and nominate an independent U.S. presidential candidate — “Not Trump. Not Biden.” Unless you’re avidly jacked into the Internet and political discussions, it’s quite easy to overlook this emergent political reform movement. I was vaguely aware of Articles of Unity and its “Plan to Save the Republic” yet still had trouble locating it via Web searches. Weinstein’s penchant (shared with his brother Eric) for coining new terms with flexible spelling is no aid.

Like Rebel Wisdom, Weinstein and Heying, each on their individual Patreon pages, offer multiple levels of membership and access: $2 to $250 per month for him, $5 to $17 per month for her. Why such high divergence, I wonder? I raise paid memberships repeatedly because, while acknowledging the need to fund worthwhile endeavors and to earn a living, there is something tacky and unseemly about enabling concentric inner circles exclusively through paid access — no other apparent qualification needed. More pointedly, an article called “The Corrupting Power Of The Inner Ring” by Rod Dreher at The American Conservative discusses David Brooks’ column about Alan Jacobs’ book How to Think (2017), where Jacobs cites C.S. Lewis’ concept of the inner ring — something to be distrusted. (Sorry about that long string of names.) The chain also demonstrates how ideas are highly derivative of antecedents found throughout culture and history.

Anyway, the DarkHorse Podcast provides some of the best analysis (not to be confused with news reporting or journalism, neither of which is even remotely successful at sensemaking anymore) to be found among those inserting themselves into the public conversation (if such a thing can be said to exist). Willingness to transform oneself into a pundit and then opine freely about anything and everything is a shared attribute of the people profiled above. (I specifically disclaimed punditry as a goal of mine when launching this blog.) That most of them have embraced podcasting (not blogging — I’m so unhip, committed to a legacy medium that both came and went with surprising celerity) as the primary medium of information exchange is quite contemporary. I surmise it’s silent acknowledgement that Americans (on the whole) no longer read and that text has fallen out of favor compared to speech, especially the eavesdropped conversational type. Podcasting doesn’t complete the information gathering and sensemaking shift from text and newsprint to TV and video begun many decades ago but certainly intensifies it. Podcasting has also demonstrated real money-making potential if one succeeds in attracting a sufficient audience (driving ad revenue) and/or a cadre of subscribers and contributors. Potential for real political engagement is unproven as yet.

Another public intellectual I cited briefly a couple years ago, Thomas Sowell, crossed my browsing path yet again. And yet again, I found myself somewhat credulously led down the primrose path set by his reckless (or savvy?) juxtaposition of facts and details until a seemingly logical conclusion appeared magically without his ever having made it manifest. In the two-year-old interview I watched (no link), Sowell states cause-and-effect (or substitutes one combo for another) confidently while simultaneously exuding false humility. He basically serves up a series of small sells leading to the big sell, except that the small sells don’t combine convincingly unless one is swept unawares into their momentum. But the small sells work individually, and I found myself agreeing repeatedly before having to recognize and refuse the final sale. I also recognize in Sowell’s reliance on facts and numerical data my own adherence to evidence. That’s an epistemological foundation we should all share. Moreover, my willingness to consider Sowell’s remarks is a weak stab at heterodoxy. But as the modern information environment has made abundantly clear, lying with numbers and distortion of facts (or more simply, fake news and narrative spin) are precisely what makes sensemaking so difficult yet critical. For instance, I have echoed Sowell recently in suggesting that inequality and structural violence may be less rooted in snarling, overt racism (at least since the Civil Rights Era) than in simple greed and opportunism, while acknowledging that virulent white supremacism does still exist. Yet others insist that everything is political, or racist, or owing to class conflict, or subsumed entirely by biology, chemistry, or physics (or religion). Take your pick of interpretations of reality a/k/a sensemaking. I had halfway expected someone to take me to task for failing to voice the approved radical leftist orthodoxy or try to cancel me for publishing something nominally conservative or Sowellesque. But no one cares what I blog about; I have succeeded in avoiding punditry.

With such a sprawling survey of sensemakers good and bad, successful and unsuccessful (according to me), there is no handy conclusion. Instead, let me point to the launching point for this blog post: my blog post called “Mad World Preamble.” Even before that, I blogged about Iain McGilchrist’s book The Master and His Emissary (2010), drawing particular attention to chap. 12 as his diagnosis of how and when the modern world went mad. Perhaps we have indeed managed to step back from the atomic brink (MAD) only to totter and stumble through a few extra decades as PoMo madness overtook us completely in the latter half of the 20th century; and maybe the madness is not yet the hallucinatory type fully evident at a glance. However, look no further than the two gibbering fools foisted upon the voting public in the upcoming U.S. presidential election. Neither is remotely capable of serving responsibly. Every presidential election in the 21st century has been accompanied by breathless analysis prophesying the implosion of either political party following an electoral loss. Well, they both imploded and can’t field a proper candidate for high office anymore. There is probably no stronger test case for societal and institutional madness than the charade we’re now witnessing. Maybe Unity 2020 is onto something.

One of the victims of cancel culture, coming to my attention only days ago, is Kate Smith (1907–1986), a singer of American popular song. Though Smith had a singing career spanning five decades, she is best remembered for her version(s) of Irving Berlin’s God Bless America, which justifiably became a bit of Americana. The decades of Smith’s peak activity were the 1930s and 40s.

/rant on

I dunno what goes through people’s heads, performing purity rituals or character excavation on folks long dead. The controversy stems from Smith having a couple other songs in her discography: That’s Why Darkies Were Born (1931) and Pickaninny Heaven from the movie Hello, Everybody! (1933). Hate to break it to anyone still living under a rock, but these dates are not far removed from minstrelsy, blackface, and The Birth of a Nation (1915) — a time when typical Americans referred to blacks with a variety of terms we now consider slurs. Such references were still used during the American civil rights movement (1960s) and are in use among some virulent white supremacists even today. I don’t know the full context of Kate Smith having sung those songs, but I suspect I don’t need to. In that era, popular entertainment had few of the sensibilities regarding race we now have (culture may have moved on, but it’s hard to say with a straight face that it’s evolved or progressed humanely), and uttering commonly used terms back then was not automatic evidence of any sort of snarling racism.

I remember having heard my grandparents, nearly exact contemporaries of Kate Smith, referring to blacks (the term I grew up with, still acceptable I think) with other terms we no longer consider acceptable. It shocked me, but to them, that’s simply what blacks were called (the term(s) they grew up with). Absolutely nothing in my grandparents’ character or behavior indicated a nasty, racist intent. I suspect the same was true of Kate Smith in the 1930s.

Back when I was a librarian, I also saw plenty of sheet music published before 1920 or so with the term darkie (or darkey) in the title. See for example this. The Library of Congress still uses the subject headings “negro spirituals” (is there another kind?) and “negro songs” to refer to various subgenres of American folk song that includes slave songs, work songs, spirituals, minstrel music, protest songs, etc. Maybe we should cancel the Library of Congress. Some published music titles from back then even call them coon songs. That last one is totally unacceptable today, but it’s frankly part of our history, and like changing character names in Mark Twain’s Huckleberry Finn, sanitizing the past does not make it go away or any less discomfiting. But if you wanna bury your head in the sand, go ahead, ostrich.

Also, if some person or entity ever does some questionably racist, sexist, or malign thing (even something short of abominable) situated contextually in the past, does that mean he, she, or it must be cancelled irrevocably? If that be the case, then I guess we gotta cancel composer Richard Wagner, one of the most notorious anti-Semites of the 19th century. Also, stop watching Pixar, Marvel, and Star Wars films (among others), because remember that time when Walt Disney Studios (now Walt Disney Company) made a racist musical film, Song of the South (1946)? Disney’s tainted legacy (extending well beyond that one movie) is at least as awful as, say, Kevin Spacey, and we’re certainly not about to rehabilitate him.

/rant off