Archive for the ‘Debate’ Category

The Bulletin of the Atomic Scientists has reset its Doomsday Clock ten seconds closer to midnight (figuratively, nuclear Armageddon), bringing the world the closest to catastrophe in the clock’s history. Bizarrely, its statement published today points squarely at Russia as the culprit but fails to mention the participation of the United States and other NATO countries in the conflict. Seems to me a rather unethical deployment of distinctly one-sided rhetoric. With the clock poised so close to midnight, there’s almost nowhere left to move it before the bombs fly. In contrast, two of the blogs I read that are openly critical of provocations and escalation against Russia, not by Russia, are Bracing Views (on my blogroll) and Caitlin Johnstone (not on my blogroll). Neither minces words, and both either suggest or say openly that U.S. leadership are the bad guys indulging in nuclear brinkmanship. Many more examples of that ongoing debate are available. Judge for yourself whose characterizations are more accurate and useful.

Although armed conflict at one scale or another, at one time or another, is inescapable given mankind’s violent nature, it takes no subtle morality or highfalutin ethics to default to being antiwar, whereas U.S. leadership is uniformly prowar. Can’t know for sure what the motivations are, but the usual suspects include fear of appearing weak (um, it’s actually that fear that’s weak, dummies) and the profit motive (war is a racket). Neither convinces me in the slightest that squandering lives, energy, and lucre makes war worth contemplating except in extremis. Military action (and its logistical support, such as the U.S. is providing to Ukraine) should only be undertaken with the gravest regret and reluctance. Instead, war is glorified and valorized. Fools rush in ….

No need to think hard on this subject, no matter where one gets information and reporting (alternately, disinformation and misreporting). Also doesn’t ultimately matter who are the good guys or the bad guys. What needs to happen in all directions and by all parties is deescalation and diplomacy. Name calling, scapegoating, and finger pointing might soothe some that they’re on the right side of history. The bad side of history will be when nuclear powers go MAD, at which point no one can stake any moral high ground.

My inquiries into media theory long ago led me to Alan Jacobs and his abandoned, reactivated, then reabandoned blog Text Patterns. Jacobs is a promiscuous thinker and even more promiscuous technologist in that he has adopted and abandoned quite a few computer apps and publishing venues over time, offering explanations each time. Always looking for better tools, perhaps, but this roving public intellectual requires persistent attention lest one lose track of him. His current blog (for now) is The Homebound Symphony (not on my ruthlessly short blogroll), which is updated roughly daily, sometimes with linkfests or simply an image, other times with thoughtful analysis. Since I’m not as available as most academics to spend all day reading and synthesizing what I’ve read to put into a blog post, college class, or book, I am not on any sort of schedule and only publish new blog posts when I’m ready. Discovered in my latest visit to The Homebound Symphony was a plethora of super-interesting subject matter, which I daresay is relevant to the more literate and literary among us. Let me draw out the one that most piqued my interest. (That was the long way of tipping my hat to Jacobs for the link.)

In an old (by Internet standards) yet fascinating book review by Michael Walzer of Siep Stuurman’s The Invention of Humanity: Equality and Cultural Difference in World History (2017), Walzer describes the four inequalities that have persisted throughout human history, adding a fifth identified by Stuurman:

  • geographic inequality
  • racial inequality
  • hierarchical inequality
  • economic inequality
  • temporal inequality

I won’t unpack what each means if they’re not apparent on their face. Read for yourself. Intersections and overlapping are common in taxonomies of this sort, so don’t expect categories to be completely separate and distinct. The question of equality (or its inverse, inequality) is a fairly recent development, part of a stew of 18th-century thought in the West that was ultimately distilled to one famous phrase: “all men are created equal.” Seems obvious, but the phrase is fraught, and we’ve never really been equal, have we? So is it equality before God? Equality before the law? Equality in all opportunities and outcomes, as social justice warriors now insist? On a moment’s inspection, no one can possibly believe we’re all equal despite aspirations that everyone be treated fairly. The very existence of perennial inequalities puts the lie to any notion of equality trucked in with the invention of humanity during the Enlightenment.

To those inequalities I would add a sixth: genetic inequality. Again, overlap with the others is acknowledged, but it might be worth observing that divergent inherited characteristics (other than wealth) appear quite early in life among siblings and peers, before most others manifest. By that, I certainly don’t mean race or sex, though differences clearly exist there as well. Think instead of intelligence, height, beauty, athletic ability, charisma, health and constitution, and even longevity (life span). Each of us has a mixture of characteristics that are plainly different from those of others and which provide either springboards or disadvantages. Just as it’s unusual to find someone in possession of all positive characteristics at once — the equivalent of rolling a perfect 18 (on 3d6) for each attribute of a new D&D character — few possess all negatives (a string of minimum rolls), either. Also, there’s probably no good way to rank best to worst, strongest to weakest, or most to least successful. Bean counters from one discipline or another might try, but that runs counter to the mythology “all men are created equal” and thus becomes a taboo to acknowledge, much less scrutinize.
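Just how rare an all-maximum character would be can be worked out with a bit of arithmetic. A short Python sketch, assuming the classic method of rolling 3d6 for each of six attributes (dice conventions vary by edition, so treat this as illustrative only):

```python
from fractions import Fraction

# Probability of rolling 18 (three sixes) on a single 3d6 attribute roll.
p_max_one = Fraction(1, 6) ** 3          # 1/216

# A classic character has six attributes, each rolled independently.
p_all_max = p_max_one ** 6               # (1/216)^6

print(f"Odds of a perfect character: 1 in {p_all_max.denominator:,}")
# → Odds of a perfect character: 1 in 101,559,956,668,416
```

Roughly one in a hundred trillion, which is the point: nobody rolls all maximums, and nobody rolls all minimums, either.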

What to do with the knowledge that all men are not in fact created equal and never will be? That some are stronger; more charming; smarter; taller with good teeth (or these days, dentists), hair, vision, and square jaws; luckier in the genetic lottery? Well, chalk it up, buddy. We all lack some things and possess others.

Didn’t expect to come back to this one. Five years after having blogged on this topic, I was delighted to see Graham Hancock get full Netflix documentary treatment under the title Ancient Apocalypse. No doubt streaming video is shaped in both tone and content to fit modern audiences, not modern readers. We are no longer people of the book but instead people of the screen. (An even earlier mode, displaced by the onset of the Gutenberg Era, was the oral tradition, but that was a different blog post.) As a result, the eight episodes come across as tabloid-style potboilers, which regrettably undermines Hancock’s authority. Having read two of Hancock’s books exploring the subject, I was already familiar with many of the ancient sites discussed and depicted, though some reports are updated from his books. The main thesis is that archeological structures and cultural origin stories all around the world point to a major human civilization now lost but being gradually rediscovered. The phase of destruction is unaccountably saved until episode eight, namely, a roughly twelve-hundred-year period known as the Younger Dryas, marked by repeated, severe climatic events, most notably the Great Flood that raised sea level by more than 400 ft. Suspected causes of these events range from the breaking of ice dams and subsequent breakup of the continental ice sheets to multiple meteor impacts to a coronal mass ejection. Could be more than one.

Several YouTube reviews have already weighed in on strengths and weaknesses of the documentary. Learning that others have been completely absorbed by Hancock’s books is a little like discovering a lost sibling. Intellectual brethren focused on decidedly arcane subject matter is quite different from mass market fandom (or as I once heard someone joke, “You like pizza? I like pizza! BFF!”). Of course, beyond enthusiasts and aficionados are the scoffers, who come under specific attack by Hancock for refusing to examine new evidence, instead adhering blindly to established, status quo academic consensus. Although some would argue the principal takeaway of Ancient Apocalypse is filling in gaps in the story of human development (a cosmology or better origin story), my assessment, perhaps a result of prior familiarity with Hancock’s work, is that officialdom as instantiated in various institutions is an abject and unremitting failure. The Catholic Church’s persecution of numerous proto-scientists as heretics during the Middle Ages, or similarly, what has recently become known derisively as “YouTube science” (where heterodox discussion is summarily demonetized in a pointless attempt to shut down dissent) should be concerning to anyone who supports the scientific method or wants to think for themselves. Whether refusals to even consider alternatives to cherished beliefs are a result of human frailty, power struggles, careerism, or sheer stupidity someone else can decide. Could be more than one.

A couple wild suggestions came up in the reviews I caught. For instance, lost knowledge of how to work stone into megaliths used to construct giant monuments is said to be related to either activating resonance in the stone or indeed a completely different form of energy from anything now known. A similar suggestion was made about how the World Trade Center and other nearby structures were demolished when 9/11 occurred. Specifically, purported “directed free-energy technology” was deployed to weaken the molecular coherence of solid metal and concrete to collapse the buildings. (Video demonstrations of iron bars/beams being bent are available on YouTube.) For megaliths, the suggestion is that they are temporarily made into a softer, lighter (?) marshmallow-like substance to be positioned, reformed, and rehardened in situ. Indeed, material phase changes under extremes of pressure and temperature are both obvious and ubiquitous. However, to novices and the scientifically illiterate, this is the stuff of magic and alchemy or straight-up conspiracy (if one prefers). I’m largely agnostic when it comes to such assertions about megalithic structures, but those theories are at least as tantalizing as evidence of existence of a lost civilization — especially when officialdom instructs everyone not to look there, or if one does anyway, not to believe one’s lying eyes.

As observed in my earlier blog on this subject, the possibility, nay inevitability, of destruction of our present civilization, whether from forces external or internal, would make putting aside petty squabbles and getting going on preparations (i.e., prepping for human survival) paramount. Good luck getting humanity all together on that project. Are there secret underground bunkers into which the financial and political elite can flee at the propitious moment, abandoning the masses to their fate? Again, conspiratorial types say yes, both now and in the ancient past. Good luck to any survivors, I guess, in the hellscape that awaits. I don’t want to be around after the first major catastrophe.

The comic below struck a chord and reminded me of Gary Larson’s clumsily drawn but often trenchant The Far Side comics on scientific subjects.

This one masquerades as science but is merely wordplay, i.e., puns, double entendres, and unexpectedly funny malapropisms (made famous by Yogi Berra, among others). Wordplay is also found in various cultural realms, including comic strips and stand-up comedy, advertising and branding, politics, and now Wokedom (a subset of grassroots politics, some might argue). Playing with words has gone from being a clever, sometimes enjoyable diversion (e.g., crossword puzzles) to fully deranging, weaponized language. Some might be inclined to wave away the seriousness of that contention using the childhood retort “sticks and stones ….” Indeed, I’m far less convinced of the psychological power of verbal nastiness than those who insist words are violence. But it’s equally wrong to say that words don’t matter (much) or have no effect whatsoever. Otherwise, why would those acting in bad faith work so tirelessly to control the narrative, often by restricting free speech (as though writing or out-loud speech were necessary for thoughts to form)?

It’s with some exasperation that I observe words no longer retain their meanings. Yeah, yeah … language is dynamic. But semantic shifts usually occur slowly as language evolves. Moreover, for communication to occur effectively, senders and receivers must be aligned in their understandings of words. If you and I have divergent understandings of, say, yellow, we won’t get very far in discussions of egg yolks and sunsets. The same is true of words such as liberal, fascist, freedom, and violence. A lack of shared understanding of terms, perhaps born of ignorance, bias, or agenda, leads to communications breakdown. But it’s gotten far worse than that. The meanings of words have been thrown wide open to PoMo reinterpretation that often inverts their meanings in precisely the way George Orwell observed in his novel 1984 (published 1949): “War is peace. Freedom is slavery. Ignorance is strength.” Thus, earnest discussions of limitations on free speech, and actual restrictions on social media platforms (often via algorithmic identification of keywords that fails to account for irony, sarcasm, or context), fail to register that the implementation of restrictive kludges already means free speech is essentially gone. The usual exceptions (obscenity, defamation, incitement, gag orders, secrecy, deceptive advertising, student speech, etc.) are not nearly as problematic because they have been adjudicated for several generations and accepted as established practice. Indeed, many exceptions have been relaxed considerably (e.g., obscenity that has become standard patois now fails to shock or offend), and slimy workarounds are now commonplace (e.g., using “people are saying …” to say something horrible yet shielding oneself while saying it). Another gray area includes fighting words and offensive words, which are being expanded (out of a misguided campaign to sanitize?) to include many words that originated as clinical and scientific terms, as well as faux offense used to stifle speech.
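The weakness of keyword-based moderation is easy to demonstrate. A minimal Python sketch (the blocked terms and sample posts are hypothetical, not any platform’s real list or policy) shows how such a filter flags an ironic mention while missing genuine menace phrased without the keywords:

```python
import re

# A naive keyword filter of the kind described above: it flags any post
# containing a blocked term, with no notion of irony, sarcasm, or context.
BLOCKED = {"violence", "attack"}

def is_flagged(post: str) -> bool:
    """Return True if the post contains any blocked keyword."""
    words = set(re.findall(r"[a-z]+", post.lower()))
    return not words.isdisjoint(BLOCKED)

# A sarcastic or scholarly mention gets flagged ...
print(is_flagged("Great, another lecture on why words are violence."))  # → True
# ... while a genuinely menacing post avoiding the keywords sails through.
print(is_flagged("You know what happens to people who cross us."))      # → False
```

The filter matches surface tokens, not meaning, which is the kludge in a nutshell.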

Restrictions on free speech are working in many respects, as many choose to self-censor to avoid running afoul of various self-appointed watchdogs or roving Internet thought police (another Orwell prophecy come true) ready to pounce on some unapproved utterance. One can argue whether self-censorship is cowardly or judicious, I suppose. However, silence and the pretense of agreement only conceal thoughts harbored privately and left unexpressed, which is why restrictions on public speech are fool’s errands and strategic blunders. Maybe the genie can be bottled for a time, but that only produces resentment (not agreement), which boils over into seething rage (and worse) at some point.

At this particular moment in U.S. culture, however, restrictions are not my greatest concern. Rather, it’s the wholesale control of information gathering and reporting that misrepresents or removes from the public sphere ingredients needed to form coherent thoughts and opinions. It’s not happening only to the hoi polloi; those in positions of power and control are manipulated, too. (How many lobbyists per member of Congress, industry after industry, whispering in their ears like so many Wormtongues?) And in extreme cases of fame and cult of personality, a leader or despot unwittingly surrounds him- or herself with a coterie of yes-men frankly afraid to tell the truth out of careerist self-interest or because of shoot-the-messenger syndrome. It’s lonely at the top, right?

Addendum: Mere minutes after publishing this post, I wandered over to Bracing Views (on my blogroll) and found this post saying some of the same things, namely, that choking information off at the source results in a degraded information landscape. Worth a read.

For this blog post, let me offer short and long versions of the assertion and argument, of which one of Caitlin Johnstone’s many aphorisms is the short one:

Short version: Modern mainstream feminism is just one big celebration of the idea that women can murder, predate, oppress, and exploit for power and profit just as well as any man.

Long version: Depicting strength in terms of quintessential masculine characteristics is ruining (fictional) storytelling. (Offenders in contemporary cinema and streaming will go unnamed, but examples abound now that the strong-female-lead meme has overwhelmed characters, plots, and stories. Gawd, I tire of it.) One could survey the past few decades to identify ostensibly strong women basically behaving like asshole men just to — what? — show that it can be done? Is this somehow better than misogynist depictions of characters using feminine wiles (euphemism alert) to get what they want? These options coexist today, plus some mixture of the two. However, the main reason the strong female lead fails as storytelling — punching, fighting, and shooting toe-to-toe with men — is that it bears little resemblance to reality.

In sports (combat sports especially), men and women are simply not equally equipped for reasons physiological, not ideological. Running, jumping, throwing, swinging, and punching in any sport where speed and power are principal attributes favor male physiology. Exceptions under extraordinary conditions (e.g., ultradistance running) only prove the rule. Sure, a well-trained and well-conditioned female in her prime can defeat an untrained and poorly conditioned male. If one of those MMA females came after me, I’d be toast because I’m entirely untrained and well beyond the age of a cage fighter. But that’s not what’s usually depicted onscreen. Instead, it’s one badass going up against another badass, trading blows until a victor emerges. If the female is understood as the righteous one, she is typically shown victorious despite the egregious mismatch.

Nonadherence to reality can be entertaining, I suppose, which might explain why the past two decades have delivered so many overpowered superheroes and strong female leads, both of which are quickly becoming jokes and producing backlash. Do others share my concern that, as fiction bleeds into reality, credulous women might be influenced by what they see onscreen to engage recklessly in fights with men (or for that matter, other women)? Undoubtedly, a gallant or chivalrous man would take a few shots before fighting back, but if not felled quickly, my expectation is that the fight is far more likely to go very badly for the female. Moreover, what sort of message does it communicate to have females engaging in violence and imposing their will on others, whether in the service of justice or otherwise? That’s the patriarchy in a nutshell. Rebranding matriarchal social norms in terms of negative male characteristics, even for entertainment purposes, serves no one particularly well. I wonder if hindsight will prompt the question “What on Earth were we thinking?” Considering how human culture is stuck in permanent adolescence, I rather doubt it.

In sales and marketing (as I understand them), one of the principal techniques to close a sale is to generate momentum by getting the prospective buyer (the mark) to agree to a series of minor statements (small sells) leading to the eventual purchasing decision (the big sell or final sale). It’s narrow to broad, the reverse of the broad-to-narrow paragraph form many of us were taught in school. Both organizational forms proceed through assertions that are easy to swallow before getting to the intended conclusion. That conclusion could be either an automotive purchase or adoption of some argument or ideology. When the product, service, argument, or ideology is sold effectively by a skilled salesman or spin doctor (a.k.a. narrative manager), that person may be recognized as a closer, as in sealing the deal.

Many learn to recognize the techniques of the presumptive closer and resist being drawn in too easily. One of those techniques is to reframe the additional price of something as equivalent to, say, one’s daily cup of coffee purchased at some overpriced coffee house. The presumption is that if one has the spare discretionary income to buy coffee every day, then one can put that coffee money instead toward a higher monthly payment. Suckers might fall for it — even if they don’t drink coffee — because the substitution, though a bogus false equivalence, is easy to swallow. The canonical too-slick salesman no one trusts is the dude on the used car lot wearing some awful plaid jacket and sporting a pornstache. That stereotype, born of the 1970s, barely exists anymore but is kept alive by repetitive reinforcement in TV and movies set in that decade or at least citing the stereotype for cheap effect (just as I have). But how does one spot a skilled rhetorician spewing social and political hot takes to drive custom narratives? Let me identify a few markers.
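The coffee reframe runs on trivial arithmetic, which is part of its charm as a pitch. A tiny Python sketch of the pitch as stated (the $5 cup is an assumed, hypothetical price, not a figure from anywhere):

```python
# The "cup of coffee" reframe: a hypothetical daily coffee habit recast
# as room in the budget for a higher monthly payment.
daily_coffee = 5.00        # assumed price per cup
days_per_month = 30

freed_up = daily_coffee * days_per_month
print(f"'Skip the coffee' pitch: ${freed_up:.0f}/month toward the payment")
# → 'Skip the coffee' pitch: $150/month toward the payment

# The substitution is bogus, of course, if the buyer never bought coffee
# to begin with: the "freed up" money was never being spent.
```

The arithmetic is sound; the premise is what the closer slips past you.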

Thomas Sowell penned a brief article entitled “Point of No Return.” I surmise (admitting my lack of familiarity) that the site hosting it is a conservative website, which all by itself does not raise any flags. Indeed, in heterodox fashion, I want to read well-reasoned arguments with which I may not always agree. My previous disappointment that Sowell fails in that regard was only reinforced by the linked article. Take note that the entire article uses paragraphs that are reduced to bite-sized chunks of only one or two sentences. Those are small sells, inviting closure with every paragraph break.

Worse yet, only five (small) paragraphs in, Sowell succumbs to Godwin’s Law and cites Nazis recklessly to put the U.S. on a slippery slope toward tyranny. The obvious learned function of mentioning Nazis is to trigger a reaction, much like baseless accusations of racism, sexual misconduct, or baby eating. It puts everyone on the defensive without having to demonstrate the assertion responsibly, which is why the first mention of Nazis in argument is usually sufficient to disregard anything else written or said by the person in question. I might have agreed with Sowell in his more general statements, just as conservatism (as in conservation) appeals as more and more slips away while history wears on, but after writing “Nazi,” he lost me entirely (again).

Sowell also raises several straw men just to knock them down, assessing (correctly or incorrectly, who can say?) what the public believes as though there were monolithic consensus. I won’t defend the general public’s grasp of history, ideological placeholders, legal maneuvers, or cultural touchstones. Plenty of comedy bits demonstrate the deplorable level of awareness of individual members of society as though they were fully representative of the whole. Yet plenty of people pay attention and accordingly don’t make the cut when idiocy is offered up for entertainment. (What fun, ridiculing fools!) The full range of opinion on any given topic is not best characterized by however many idiots and ignoramuses can be found by walking down the street and shoving a camera and mic in their astonishingly unembarrassed faces.

So in closing, let me suggest that, in defiance of the title of this blog post, Thomas Sowell is in fact not a closer. Although he drops crumbs and morsels gobbled up credulously by those unable to recognize they’re being sold a line of BS, they do not make a meal. Nor should Sowell’s main point, i.e., the titular point of no return, be accepted when his burden of proof has not been met. That does not necessarily mean Sowell is wrong, in the sense that even a stopped clock tells the time correctly twice a day. The danger is that even if he’s partially correct some of the time, his perspective and program (nonpartisan freedom! whatever that may mean) must be considered with circumspection and disdain. Be highly suspicious before buying what Sowell is selling. Fundamentally, he’s a bullshit artist.

Cynics knew it was inevitable: weaponized drones and robots. Axon Enterprises, Inc., maker of police weaponry (euphemistically termed “public safety technologies”), announced its development of taser-equipped drones presumed capable of neutralizing an active shooter in under 60 seconds. Who knows what sorts of operating parameters restrict their functions, or whether they can be made invulnerable to hacking or barred from use as offensive weapons?

A sane, civilized society would recognize that, despite bogus memes about an armed society being a polite society, the prospect of everyone being strapped (like the fabled Old American West) and public spaces (schools, churches, post offices, laundromats, etc.) each being outfitted with neutralizing technologies is fixing the wrong problem. But we are no longer a sane society (raising the question whether we ever were). So let me suggest something radical yet obvious: the problem is not technological, it’s cultural. The modern world has made no progress with respect to indifference toward the suffering of others. Dehumanizing attitudes and technologies are no longer, well, medieval, but they’re no less cruel. For instance, people are not put in public stocks or drawn and quartered anymore, but they are shamed, cancelled, tortured, terrorized, propagandized, and abandoned in other ways that allow maniacs to pretend to others and to themselves that they are part of the solution. Hard to believe that one could now feel nostalgia for the days when, in the aftermath of yet another mass shooting, calls for gun control were met with inaction (other than empty rhetoric) rather than escalation.

The problem with diagnosing the problem as cultural is that no one is in control. Like water, culture goes where it goes, apparently seeking its lowest level. Attempts to channel, direct, and uplift culture might work on a small scale, but at the level of society — and with the distorted incentives freedom is certain to deliver — malefactors are guaranteed to appear. Indeed, anything that contributes to the arms race (now tiny, remote-controlled, networked killing devices rather than giant atomic/nuclear ones) only invites greater harm and is not a solution. Those maniacs (social and technical engineers promising safety) have the wrong things wrong.

Small, insular societies with strict internal codes of conduct may have figured out something that large, free societies have not, namely, that mutual respect, knowable communities, and repudiation of advanced technologies give individuals something and someone to care about, a place to belong, and things to do. When the entire world is thrown open, such as with social media, populations become atomized and anonymized, unable to position or understand themselves within a meaningful social context. Anomie and nihilism are often the rotten fruit. Splintered family units, erosion of community involvement, and dysfunctional institutions add to the rot. Those symptoms of cultural collapse need to be addressed even if they are among the most difficult wrong things to get right.

This intended follow-up has been stalled (pt. 1 here) for one simple reason: the premise presented in the embedded YouTube video is (for me at least) easy to dismiss out of hand and I haven’t wanted to revisit it. Nevertheless, here’s the blurb at the top of the comments at the webpage:

Is reality created in our minds, or are the things you can touch and feel all that’s real? Philosopher Bernardo Kastrup holds doctorates in both philosophy and computer science, and has made a name for himself by arguing for metaphysical idealism, the idea that reality is essentially a mental phenomenon.

Without going into point-by-point discussion, the top-level assertion, if I understand it correctly (not assured), is that material reality comes out of mental experience rather than the reverse. It’s a chicken-and-egg question, with materialism and idealism (fancy technical terms not needed) each vying for primacy. The string of conjectures (mental gymnastics, really, briefly impressive until one recognizes how quickly they lose correlation with how most of us think about and experience reality) that inverts the basic relationship of inner experience to outer reality is an example of waaaay overthinking a problem. No doubt quite a lot of erudition can be brought to bear on such questions, but even if those questions were resolved satisfactorily on an intellectual level and an internally coherent structure or system were developed or revealed, it doesn’t matter or lead anywhere. Humans are unavoidably embodied beings. Each individual existence also occupies a tiny sliver of time (the timeline extending in both directions to infinity). Suggesting that mental experience is briefly instantiated in personhood but is actually drawn out of some well of souls, collective consciousness, or panpsychism, and rejoins it in heaven, hell, or elsewhere upon expiration, is essentially a religious claim. It’s also an attractive supposition, granting each of us not permanence or immortality but rather something somehow better (?) though inscrutable because it lies beyond perception (but not conceptualization). Except for an eternity of torments in hell, I guess, if one deserves that awful fate.

One comment about Kastrup. He presents his perspective (his findings?) with haughty derision of others who can’t see or understand what is so (duh!) obvious. He falls victim to the very same over-intellectualized flim-flam he mentions when dismissing materialists who need miracles and shortcuts to smooth over holes in their scientific/philosophical systems. The very existence of earnest disagreement by those who occupy themselves with such questions might suggest some humility, as in “here’s my explanation, they have theirs, judge for yourself.” But there’s a third option: the great unwashed masses (including nearly all our ancestors) for whom such questions are never even fleeting thoughts. It’s all frankly immaterial (funnily, both the right and wrong word at once). Life is lived and experienced fundamentally on its surface — unless, for instance, one has been incubated too long within the hallowed halls of academia, lost touch with one’s brethren, and become preoccupied with cosmic investigations. Something quite similar happens to politicians and the wealthy, who typically hyperfocus on gathering to themselves power and then exercising that power over others (typically misunderstood as simply pulling the levers and operating the mechanisms of society). No wonder their pocket of reality looks so strikingly different.

From the outset, credit goes to Jonathan Haidt for providing the ideas to launch this blog post. He appears to be making the rounds again flogging his most recent publication (where? I dunno, maybe The Atlantic). In the YouTube interview I caught, Haidt admits openly that as a social and behavioral psychologist, he’s prone to recommending incentives, programs, and regulations to combat destructive developments in contemporary life — especially those in the academy and on social media that have spread into politics and across the general public. Haidt wears impressive professional armor in support of arguments and contentions; I lack such rigor rather conspicuously. Accordingly, I offer no recommendations but instead try to limit myself to describing dynamics as an armchair social critic. Caveat emptor.

Haidt favors viewpoint diversity (see, for example, Heterodox Academy, which he helped to found and now chairs). Simple enough, right? Not so fast there, Señor Gonzalez! Any notion that even passing acquaintance with a given subject requires knowing both pros and cons is anathema to many of today’s thinkers, who would rather plug their ears and pretend opposition voices, principled or otherwise, are simply incoherent, need not be considered, and further, should be silenced and expunged. As a result, extremist branches of any faction tend to be ideological echo chambers. Cardinal weaknesses in such an approach are plain enough for critical thinkers to recognize, but if one happens to fall into one of those chambers, silos, or bubbles (or attend a school that trains students in rigid thinking), invitations to challenge cherished and closely held beliefs, upon which identity is built, mostly fall on deaf ears. The effect is bad enough in individuals, but when spread across organizations that adopt ill-advised solutionism, Haidt’s assessment is that institutional stupidity sets in. The handy example is higher education (now an oxymoron). Many formerly respectable institutions have essentially abandoned reason (ya know, the way reasonable people think) and begun flagellating themselves in abject shame over, for instance, a recovered history of participation in any of the cultural practices now cause for immediate and reflexive cancellation.

By way of analogy, think of one’s perspective as a knife (tool, not weapon) that requires periodic sharpening to retain effectiveness. Refusing to entertain opposing viewpoints is like sharpening only one side of the blade, resulting in a blunt, useless tool. That metaphor suggests a false dualism: two sides to an argument/blade when in fact many facets inform most complex issues — thus viewpoint diversity. By working in good faith with both supporters and detractors, better results (though not perfection) can be obtained than when radicalized entities come to dominate and impose their one-size-fits-all will indiscriminately. In precisely that way, it’s probably better not to become too successful or powerful lest one be tempted to embrace a shortsighted will to power and accept the character distortions that accompany a precipitous rise.

As mentioned a couple blog posts ago, an unwillingness to shut up, listen, and learn (why bother? solutions are just … so … obvious …) has set many people on a path of activism. The hubris of convincing oneself of possession of solutions to intractable issues is bizarre. Is there an example of top-down planning, channeling, and engineering of a society that actually worked without tyrannizing the citizenry in the process? I can’t think of one. Liberal democratic societies determined centuries ago that freedom and self-determination mixed with assumed responsibility and care within one’s community are preferable to governance that treats individuals as masses to be forced into conformity (administrative or otherwise), regulated heavily, and/or disproportionately incarcerated like in the U.S. But the worm has turned. Budding authoritarians now seek reforms and uniformity to manage diverse, messy populations.

Weirdly, ideologues also attempt to purge and purify history, which is chock full of villainy and atrocity. Those most ideologically possessed seek both historical and contemporary targets to denounce and cancel, not even excluding themselves because, after all, the scourges of history are so abject and everyone benefited from them somehow. Search oneself for inherited privilege and all pay up for past iniquities! That’s the self-flagellating aspect: taking upon oneself (and depositing on others) the full weight of and responsibility for the sins of our forebears. Yet stamping out stubborn embers of fires allegedly still burning from many generations ago is an endless task. Absolutely no one measures up to expectations of sainthood when situated within an inherently and irredeemably evil society of men and women. That’s original sin, which can never be erased or forgiven. Just look at what humanity (via industrial civilization) has done to the surface of the planet. Everyone is criminally culpable. So give up all aspirations; no one can ever be worthy. Indeed, who even deserves to live?

Heard a remark (can’t remember where) that most people these days would attack as openly ageist. Basically, if you’re young (let’s say below 25 years of age), then it’s your time to shut up, listen, and learn. Some might even say that true wisdom doesn’t typically emerge until much later in life, if indeed it appears at all. Exceptions only prove the rule. On the flip side, the energy, creativity, and indignation (e.g., “it’s not fair!”) needed to drive social movements are typically the domain of those who have less to lose and everything to gain, meaning those just starting out in adult life. A full age range is needed, I suppose, since society isn’t generally age stratified except at the extremes (childhood and advanced age). (Turns out that what to call old people and what counts as old is rather clumsy, though probably not especially controversial.)

With this in mind, I can’t help but wonder what’s going on with recent waves of social unrest and irrational ideology. Competing factions agitate vociferously in favor of one social/political ideology or another as though most of the ideas presented have no history. (Resemblances to Marxism, Bolshevism, and white supremacy are quite common. Liberal democracy, not so much.) Although factions aren’t by any means populated solely by young people, I observe that roughly a decade ago, higher education in particular transformed itself into an incubator for radicals and revolutionaries. Whether dissatisfaction began with the faculty and infected the students is impossible for me to assess. I’m not inside that intellectual bubble. However, urgent calls for radical reform have since moved well beyond the academy. A political program or ideology has yet to be put forward that I can support fully. (My doomer assessment of what the future holds forestalls knowing with any confidence what sort of program or ideology into which to pour my waning emotional and intellectual energy.) It’s still fairly simple to criticize and denounce, of course. Lots of things are egregiously wrong in the world.

My frustration with what passes for political debate (if Twitter is any indication) is the marked tendency to immediately resort to comparisons with Yahtzees in general or Phitler in particular. It’s unhinged and unproductive. Yahtzees are cited as an emotional trigger, much like baseless accusations of racism send everyone scrambling for cover lest they be cancelled. Typically, the Yahtzee/Phitler comparison or accusation itself is enough to put someone on their heels, but wised-up folks (those lucky few) recognize the cheap rhetorical trick. The Yahtzee Protocol isn’t quite the same as Godwin’s Law, which states that the longer a discussion goes on (at Usenet in the earliest examples), the greater the likelihood of someone bringing up Yahtzees and Phitler and ruining useful participation. The protocol has been deployed effectively in the Russia-Ukraine conflict, though I’m at a loss to determine in which direction. The mere existence of the now-infamous Azov Battalion, purportedly comprised of Yahtzees, means that automatically, reflexively, the fight is on. Who can say what the background rate of Yahtzee sympathizers (whatever that means) might be in any fighting force or indeed the general population? Not me. Similarly, what threshold qualifies a tyrant to stand beside Phitler on a list of worst evers? Those accusations are flung around like cooked spaghetti thrown against the wall just to see what sticks. Even if the accusation does stick, what possible good does it do? Ah, I know: it makes the accuser look like a virtuous fool.