Posts Tagged ‘Rhetoric’

Buzzwords circulating heavily in the public sphere these days include equality, equity, inclusion, representation, diversity, pluralism, multiculturalism, and privilege. How they are defined, understood, and implemented are contentious issues that never seem to resolve. Indeed, looking back on decade after decade of activism undertaken to address various social scourges, limited progress has been made, which amounts to tinkering around the edges. The underlying bigotry (opinion, motivation, activity) has never really been eradicated, though it may have diminished somewhat through attrition. Sexism and racism in particular (classics in an expanding universe of -isms) continue to rage despite surface features taking on salutary aspects. Continued activism uses the buzzwords above (and others) as bludgeons to win rhetorical battles — frequently attacks on language itself through neologism, redefinition, and reclamation — without really addressing the stains on our souls that perpetuate problematic thinking (as though anyone ever had a lock on RightThink). Violence (application of actual force, not just mean or emphatic words) is the tool of choice deployed by those convinced their agenda is more important than general societal health and wellbeing. Is violence sometimes appropriate in pursuit of social justice? Yes, in some circumstances, probably so. Is this a call for violent protest? No.

Buzzwords stand in for what we may think we want as a society. But there’s built-in tension between competition and cooperation, or alternatively, individual and society (see the start of this multipart blog post) and all the social units that nest between. Each has its own desires (known in politics by the quaint term special interests), which don’t usually combine to form just societies despite mythology to that effect (e.g., the invisible hand). Rather, competition results in winners and losers even when the playing field is fair, which isn’t often. To address what is sometimes understood (or misunderstood, hard to know) as structural inequity, activists advocate privileging disenfranchised groups. Competence and merit are often sacrificed in the process. Various forms of criminal and sociopathic maneuvering also keep a sizeable portion of the population (the disenfranchised) in a perpetual and unnecessary state of desperation. That’s the class struggle.

So here’s my beef: if privilege (earned or unearned) is categorically bad because it’s been concentrated in a narrow class (who then position themselves to retain and/or grow it), why do activists seek to redistribute privilege by bestowing it on the downtrodden? Isn’t that a recipe for destroying ambition? If the game instead becomes about deploying one or more identifiers of oppression to claim privilege rather than working diligently to achieve a legitimate goal, such as acquiring skill or understanding, why bother to try hard? Shortcuts magically appear and inherent laziness is incentivized. Result: the emergent dynamic flattens valuable, nay necessary, competence hierarchies. In its communist formulation, social justice is achieved by making everyone equally precarious and miserable. Socialism fares somewhat better. Ideologues throughout history have wrecked societies (and their members) by redistributing or demolishing privilege forcibly while hypocritically retaining privilege for themselves. Ideology never seems to work out as theorized, though the current state of affairs (radical inequality) is arguably no more just.

More to unpack in further installments.

The Bulletin of the Atomic Scientists has reset its Doomsday Clock ten seconds closer to midnight (figuratively, nuclear Armageddon), bringing the world the closest to catastrophe it has ever been in the clock’s history. Bizarrely, its statement published today points squarely at Russia as the culprit but fails to mention the participation of the United States and other NATO countries in the conflict. Seems to me a rather unethical deployment of distinctly one-sided rhetoric. With the clock poised so close to midnight, there’s almost no room left to move it again before the bombs fly. In contrast, two of the blogs I read that are openly critical of provocations and escalation against Russia, not by Russia, are Bracing Views (on my blogroll) and Caitlin Johnstone (not on my blogroll). Neither minces words, and both either suggest or say openly that U.S. leaders are the bad guys indulging in nuclear brinkmanship. Many more examples of that ongoing debate are available. Judge for yourself whose characterizations are more accurate and useful.

Although armed conflict at one scale or another, one time or another, is inescapable given mankind’s violent nature, it takes no subtle morality or highfalutin ethics to be antiwar by default, whereas U.S. leadership is uniformly prowar. Can’t know for sure what the motivations are, but the usual suspects include fear of appearing weak (um, it’s actually that fear that’s weak, dummies) and the profit motive (war is a racket). Neither convinces me in the slightest that squandering lives, energy, and lucre makes war worth contemplating except in extremis. Military action (and its logistical support such as the U.S. is providing to Ukraine) should only be undertaken with the gravest regret and reluctance. Instead, war is glorified and valorized. Fools rush in ….

No need to think hard on this subject, no matter where one gets information and reporting (alternatively, disinformation and misreporting). Also doesn’t ultimately matter who are the good guys or the bad guys. What needs to happen in all directions and by all parties is deescalation and diplomacy. Name calling, scapegoating, and finger pointing might soothe some into believing they’re on the right side of history. The bad side of history will be when nuclear powers go MAD, at which point no one can stake any moral high ground.

In sales and marketing (as I understand them), one of the principal techniques to close a sale is to generate momentum by getting the prospective mark (er, buyer) to agree to a series of minor statements (small sells) leading to the eventual purchasing decision (the big sell or final sale). It’s narrow to broad, the reverse of the broad-to-narrow paragraph form many of us were taught in school. Both organizational forms proceed through assertions that are easy to swallow before getting to the intended conclusion. That conclusion could be either an automotive purchase or adoption of some argument or ideology. When the product, service, argument, or ideology is sold effectively by a skilled salesman or spin doctor (er, narrative manager), that person may be recognized as a closer, as in sealing the deal.

Many learn to recognize the techniques of the presumptive closer and resist being drawn in too easily. One of those techniques is to reframe the additional price of something as equivalent to, say, one’s daily cup of coffee purchased at some overpriced coffee house. The presumption is that if one has the spare discretionary income to buy coffee every day, then one can put that coffee money instead toward a higher monthly payment. Suckers might fall for it — even if they don’t drink coffee — because the false equivalence is an easily grasped though bogus substitution. The canonical too-slick salesman no one trusts is the dude on the used car lot wearing some awful plaid jacket and sporting a pornstache. That stereotype, born of the 1970s, barely exists anymore but is kept alive by repetitive reinforcement in TV and movies set in that decade or at least citing the stereotype for cheap effect (just as I have). But how does one spot a skilled rhetorician spewing social and political hot takes to drive custom narratives? Let me identify a few markers.

Thomas Sowell penned a brief article entitled “Point of No Return.” I surmise (admitting my lack of familiarity) that creators.com is a conservative website, which all by itself does not raise any flags. Indeed, in heterodox fashion, I want to read well-reasoned arguments with which I may not always agree. I’ve been disappointed before that Sowell fails in that regard, and the linked article only reinforced that disappointment. Take note that the entire article is built from paragraphs reduced to bite-sized chunks of only one or two sentences. Those are small sells, inviting closure with every paragraph break.

Worse yet, only five (small) paragraphs in, Sowell succumbs to Godwin’s Law and cites Nazis recklessly to put the U.S. on a slippery slope toward tyranny. The obvious learned function of mentioning Nazis is to trigger a reaction, much like baseless accusations of racism, sexual misconduct, or baby eating. It puts everyone on the defensive without having to demonstrate the assertion responsibly, which is why the first mention of Nazis in an argument is usually sufficient grounds to disregard anything else written or said by the person in question. I might have agreed with Sowell in his more general statements, just as conservatism (as in conservation) appeals as more and more slips away while history wears on, but after writing “Nazi,” he lost me entirely (again).

Sowell also raises several straw men just to knock them down, assessing (correctly or incorrectly, who can say?) what the public believes as though there were a monolithic consensus. I won’t defend the general public’s grasp of history, ideological placeholders, legal maneuvers, or cultural touchstones. Plenty of comedy bits demonstrate the deplorable level of awareness of individual members of society as though they were fully representative of the whole. Yet plenty of people pay attention and accordingly don’t make the cut when offering up idiocy for entertainment. (What fun, ridiculing fools!) The full range of opinion on any given topic is not best characterized by however many idiots and ignoramuses can be found by walking down the street and shoving a camera and mic in their astonishingly unembarrassed faces.

So in closing, let me suggest that, in defiance of the title of this blog post, Thomas Sowell is in fact not a closer. Although he drops crumbs and morsels gobbled up credulously by those unable to recognize they’re being sold a line of BS, those crumbs do not make a meal. Nor should Sowell’s main point, i.e., the titular point of no return, be accepted when his burden of proof has not been met. That does not necessarily mean Sowell is wrong; even a stopped clock tells the time correctly twice a day. The danger is that even if he’s partially correct some of the time, his perspective and program (nonpartisan freedom! whatever that may mean) must be considered with circumspection and disdain. Be highly suspicious before buying what Sowell is selling. Fundamentally, he’s a bullshit artist.

Heard a remark (can’t remember where) that most these days would attack as openly ageist. Basically, if you’re young (let’s say below 25 years of age), then it’s your time to shut up, listen, and learn. Some might even say that true wisdom doesn’t typically emerge until much later in life, if indeed it appears at all. Exceptions only prove the rule. On the flip side, energy, creativity, and indignation (e.g., “it’s not fair!”) needed to drive social movements are typically the domain of those who have less to lose and everything to gain, meaning those just starting out in adult life. A full age range is needed, I suppose, since society isn’t generally age stratified except at the extremes (childhood and advanced age). (Turns out that what to call old people and what counts as old is rather clumsy, though probably not especially controversial.)

With this in mind, I can’t help but wonder what’s going on with recent waves of social unrest and irrational ideology. Competing factions agitate vociferously in favor of one social/political ideology or another as though most of the ideas presented have no history. (Resemblances to Marxism, Bolshevism, and white supremacy are quite common. Liberal democracy, not so much.) Although factions aren’t by any means populated solely by young people, I observe that roughly a decade ago, higher education in particular transformed itself into an incubator for radicals and revolutionaries. Whether dissatisfaction began with the faculty and infected the students is impossible for me to assess. I’m not inside that intellectual bubble. However, urgent calls for radical reform have since moved well beyond the academy. A political program or ideology has yet to be put forward that I can support fully. (My doomer assessment of what the future holds forestalls knowing with any confidence what sort of program or ideology to pour my waning emotional and intellectual energy into.) It’s still fairly simple to criticize and denounce, of course. Lots of things egregiously wrong in the world.

My frustration with what passes for political debate (if Twitter is any indication) is the marked tendency to resort immediately to comparisons with Yahtzees in general or Phitler in particular. It’s unhinged and unproductive. Yahtzees are cited as an emotional trigger, much like baseless accusations of racism send everyone scrambling for cover lest they be cancelled. Typically, the Yahtzee/Phitler comparison or accusation is itself enough to put someone on their heels, but wised-up folks (those lucky few) recognize the cheap rhetorical trick. The Yahtzee Protocol isn’t quite the same as Godwin’s Law, which states that the longer a discussion goes on (on Usenet in the earliest examples), the greater the likelihood of someone bringing up Yahtzees and Phitler and ruining useful participation. The protocol has been deployed effectively in the Russia-Ukraine conflict, though I’m at a loss to determine in which direction. The mere existence of the now-infamous Azov Battalion, purportedly comprised of Yahtzees, means that automatically, reflexively, the fight is on. Who can say what the background rate of Yahtzee sympathizers (whatever that means) might be in any fighting force or indeed the general population? Not me. Similarly, what threshold qualifies a tyrant to stand beside Phitler on a list of worst evers? Those accusations are flung around like cooked spaghetti thrown against the wall just to see what sticks. Even if the accusation does stick, what possible good does it do? Ah, I know: it makes the accuser look like a virtuous fool.

After a hiatus due to health issues, Jordan Peterson has reappeared in the public sphere. Good for him. I find him one of the most stimulating public intellectuals to appear thus far into the 21st century, though several others (unnamed) spring to mind who have stronger claims on my attention. Yet I’m wary of Peterson as an effective evaluator of every development coughed up for public consideration. It’s simply not necessary or warranted for him to opine recklessly about every last damn thing. (Other podcasters are doing the same, and although I don’t want to instruct anyone to stay in their lane, I also recognize that Joe “Talkity-Talk” Blow’s hot take or rehash on this, that, and every other thing really isn’t worth my time.) With the inordinate volume of text in his books, video on his YouTube channel (classroom lectures, podcasts, interviews) and as a guest on others’ podcasts, and third-party writing about him (like mine), it’s inevitable that Peterson will run afoul of far better analysis than he himself can bring to bear. However, he declares his opinions forcefully and with overbearing confidence, then decamps to obfuscation and reframing whenever someone pushes back effectively (which isn’t often, at least when in direct communication). With exasperation, I observe that he’s basically up to his old rhetorical tricks.

In a wide-ranging discussion on The Joe Rogan Experience from January 2022 (found exclusively on Spotify for anyone somehow unaware of Rogan’s influence in the public sphere), the thing that most irked me was Peterson’s take on the climate emergency. He described climate as too complex, with too many variables and unknowns, to embody in scientific models over extended periods of time. Seems to me Peterson has that entirely backwards. Weather (and extreme weather events) in the short term can’t be predicted very accurately, so daily/weekly/monthly forecasts give wide ranges of, say, cloud cover, temperature, and precipitation. But over substantial time (let’s start with a few decades, which is still a blink in geological time), trends and boundaries reveal themselves pretty reliably, which is why disturbances — such as burning enough fossil fuels to alter the chemical composition of the planet’s atmosphere — that upset the climate steady-state known as Garden Earth are not merely cause for serious concern but harbingers of doom. And then, as others often do, Peterson reframed the climate emergency largely in terms of economics (the same thing happened with the pandemic, though not by Peterson so far as I know), suggesting that the problem is characterized by inefficiencies and grass-roots policy that would be magically different if more people were lifted out of poverty and could advocate for solutions rather than simply struggle to survive. Dude apparently hasn’t grasped that wealth in the modern world is an outgrowth of the very thing — fossil fuels — that is the root of the problem. Further, industrial civilization is a heat engine that binds us to a warming trend. That’s a thermodynamic principle flatly immune to half-baked economic theories and ecological advocacy. Peterson also gives no indication of ever having acknowledged Jevons Paradox.

So let me state somewhat emphatically: the climate emergency is in fact an existential crisis on several fronts (e.g., resource depletion and scarcity, ecological despoliation, extreme weather events, and loss of habitat, all resulting in civilizational collapse). The rate of species extinction — before human population has begun to collapse in earnest, 8 Billion Day looms near — is several orders of magnitude greater than historical background rates. Humans are unlikely to survive to the end of the century even if we refrain from blowing ourselves up over pointless geopolitical squabbles. I’ll defer to Peterson in his area of expertise: personality inventories. I’ll also grant him space to explore myth and symbolism in Western culture. But speaking on climate, he sounds like an ignoramus — the dangerous sort who leads others astray. And when challenged by someone armed with knowledge of governing principles, grasp of detail, and thus analysis superior to what he can muster (such as when debating Richard Wolff about Marxism), Peterson frequently resorts to a series of motte-and-bailey assertions that confound inexpert interlocutors. “Well, that depends on what you mean by ….” His retreat to faux safety is sometimes so astonishingly complete that he resorts to questioning the foundation of reality: “Why the sun? Why this sky? Why these stars? Why not something else completely?” Also, Peterson’s penchant for pointing out that the future is contingent and unknown, as though no prediction or model can offer more than a whisper of accuracy about future outcomes (despite all indicators positively screaming to stop destroying our own habitat), is mere rhetoric to forestall losing an argument.

As I’ve asserted repeatedly, sufficiency is the crucible in which all decisions are forged because earnest information gathering cannot persist interminably. Tipping points (ecological ones, sure, but more importantly, psychological ones) actually exist, where one must act despite incomplete knowledge and unclear prognosis. Accordingly, every decision is on some level a leap into the unknown and/or an act of faith. That doesn’t mean every decision is a wild, reckless foray based on nothing. Rather, when the propitious moment arrives (if one has the wherewithal to recognize it), one has to go with what one’s got, knowing that mistakes will be made and corrections will be needed.

Peterson’s improvisational speaking style is both impressive and inscrutable. I’m sometimes reminded of Marshall McLuhan, whose purported Asperger’s Syndrome (low-grade autism, perhaps, I’m unsure) afforded him unique insights into the emerging field of media theory that were not easily distilled into speech. Greta Thunberg is another, more recent, public figure whose cognitive character allows her to recognize rather acutely how human institutions have completely botched the job of keeping industrial civilization from consuming itself. Indeed, people from many backgrounds, not hemmed in by the rigid dictates of politics, economics, and science, intuit through diverse ways of knowing (e.g., kinesthetic, aesthetic, artistic, religious, psychedelic) what I’ve written about repeatedly under the title “Life Out of Balance.” I’ve begun to understand Peterson as a mystic overwhelmed by the sheer beauty of existence but simultaneously horrified by the unspeakably awful evils humans perpetrate on each other. Glimpses of both (and humor, a bit unexpectedly) often provoke cracks in his voice, sniffles, and tears as he speaks, clearly choking back emotions to keep his composure. Peterson’s essential message (if I can be so bold), like that of other mystics, is aspirational, transcendental, and charismatic. Such messages are impossible to express fully and are frankly ill-suited to 21st-century Western culture. That we’re severely out of balance, unable to regain an upright and righteous orientation, is plain to nearly everyone not already lost in the thrall of mass media and social media, but so long as the dominant culture remains preoccupied with wealth, consumption, celebrity, geopolitical violence, spectacle, and trash entertainment, I can’t envision any sort of return to piety and self-restraint. Plus, we can’t outrun the climate emergency bearing down on us.

“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signalling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but to human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently, albeit temporarily, live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations and funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coin. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are kludges, and not effective ones, precisely because thoughts are not synonymous with their online expression; one merely points to the other. Used to be said that the Internet is forever, so wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) can be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is being taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control, specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established, and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

Fantasies and delusions rush into the space
that reason has vacated in fear of its life.

—James Howard Kunstler

Since I first warned that this blog post was forthcoming, conditions of modern American life we might have hoped would be resolved by now remain intransigently with us. Most are scrambling to adjust to the new normal: no work (for tens of millions), no concerts, no sports (except for events staged for the camera to be broadcast later), little or no new cinema (but plenty of streaming TV), no school or church (except for abysmal substitutes via computer), no competent leadership, and no end in sight. The real economy swirls about the drain despite the fake economy (read: the stock market a/k/a the Richistan economy) having first shed value faster than ever before in history, then staged a precipitous taxpayer-funded, debt-fueled recovery, only to position itself for imminent resumption of its false-started implosion. The pandemic ebbed elsewhere and then resumed, but in the U.S. it scarcely ebbed at all, and we now lead the world in clownish mismanagement of the crisis. Throughout it all, we extend and pretend that the misguided modern age isn’t actually coming to a dismal close, based as it is on a consumption-and-growth paradigm that anyone even modestly numerically literate can recognize is, um, (euphemism alert) unsustainable.

Before full-on collapse (already rising over the horizon like those fires sweeping across the American West) hits, however, we’ve got unfinished business: getting our heads (and society) right regarding which of several competing ideologies can or should establish itself as the righteous path forward. That might sound like the proverbial arranging of deck chairs on the RMS Titanic, but in an uncharacteristically charitable moment, let me suggest that righting things before we’re done might be an earnest obligation even if we can’t admit openly just how close looms the end of (human) history. According to market fundamentalists, corporatists, and oligarchs, Socialism and Marxism, or more generally collectivism, must finally have a stake driven through its undead heart. According to radical progressives, Black Lives Matter, and Antifa, fascism and racism, or more generally intolerance, deserve to be finally stamped out, completing the long arc of history stalled after the Civil Rights Era. And according to barely-even-a-majority-anymore whites (or at least the conservative subset), benefits and advantages accrued over generations, or more generally privilege, must be leveraged, solidified, and maintained lest the status quo be irretrievably lost. Other factions no doubt exist. Thus, we are witnessing a battle royale among narratives and ideologies, none of which IMO crystallize the moment adequately.

Of those cited above, the first and third are easy to dismiss as moribund and self-serving. Only the second demonstrates any concern for the wellbeing of others. However, and despite its putative birthplace in the academy, it has twisted itself into pretzel logic and become every bit as intolerant as the scourges it rails against. Since I need a moniker for this loose, uncoordinated network of movements, I’ll refer to them as the Woke Left, which signifies waking up (i.e., being woke) to injustice and inequity. Sustained analysis of the Woke Left is available from James Lindsay through a variety of articles and interviews (do a search). Lindsay demonstrates handily how the Woke Left’s principal claims, often expressed through its specialized rhetoric called Critical Theory, are actually an inversion of everything it pretends to be. This body of thought has legitimate historical and academic lineage, so it’s arguable that only its most current incarnation in the Woke Left deserves scorn.

Two recently published books exemplify the rhetoric of the Woke Left: White Fragility (2018) by Robin DiAngelo and How to Be an Antiracist (2019) by Ibram Kendi. Although I’ve read neither book, I’m aware of numerous scathing reviews that point out fundamental problems with the books and their authors’ arguments. Foremost among them is what’s sometimes called a Kafka trap: a Catch-22 in which all avenues of argument lead inescapably toward guilt, typically framed as some form of original sin. Convinced they are on the righteous right side of history, Woke Left protesters and agitators have been harassing and physically threatening strangers to demand support for the cause, i.e., compliance. What cause is a good question, considering a coherent program has yet to be articulated. Forcing others to choose either side of a false binary — with us or against us — is madness, but that’s the cultural moment at which we’ve arrived. Everyone must align their ideology with some irrational narrative while being put at risk of cancellation and/or destruction no matter what alignment is ventured.

If things go south badly on the heels of contested election results this fall as many expect — the pump already primed for such conflict — and a second civil war ensues, I rather expect the Woke Left to be the first to fail and the other two, each representing the status quo (though different kinds), to be in an extended battle for control of whatever remains of the union. I can’t align with any of them, since by my lights they’re all different kinds of crazy. Sorta makes ya wonder, taking history as an indicator, if a fourth or fifth faction won’t appear before it’s a wrap. I don’t hold out any hope for any faction steering us competently through this crisis.

Once in a while, when discussing current events and their interpretations and implications, a regular interlocutor of mine will impeach me, saying “What do you know, really?” I’m always forced to reply that I know only what I’ve learned through various media sources, faulty though they may be, not through first-hand observation. (Reports of anything I have observed personally tend to differ considerably from my own experience once the news media completes its work.) How, then, can I know, to take a very contemporary instance this final week of July 2020, what’s going on in Portland from my home in Chicago other than what’s reported? Makes no sense to travel there (or much of anywhere) in the middle of a public health crisis just to see a different slice of protesting, lawbreaking, and peacekeeping [sic] activities with my own eyes. Extending the challenge to its logical extremity, everything I think I know collapses into solipsism. The endpoint of that trajectory is rather, well, pointless.

If you read my previous post, you encountered an argument, not falsified any too handily, that what we understand about ourselves and the world we inhabit is actually a constructed reality. To which I reply: is there any other kind? That construction achieves a fair amount of consensus about basics, more than one might even guess, but that still leaves quite a lot of space for idiosyncratic and/or personal interpretations that conflict wildly. In the absence of stabilizing authority and expertise, it has become impossible to tease a coherent story out of the many voices pressing on us with their interpretations of how we ought to think and feel. Twin conspiracies foisted on us by the Deep State and MSM known as RussiaGate and BountyGate attest to this. I’ll have more to say about our inability to figure things out when I complete my post called Making Sense and Sensemaking.

In the meantime, the modern world has in effect constructed its own metaphorical Tower of Babel (borrowing from Jonathan Haidt — see below). It’s not different languages we speak so much (though it’s that, too) as the conflicting stories we tell. Democratization of media has given each of us — authorities, cranks, and everyone in between — new platforms and vehicles for promulgating pet stories, interpretations, and conspiracies. Most of it is noise, and divining the worthwhile signal portion is a daunting task even for disciplined, earnest folks trying their best to penetrate the cacophony. No wonder so many simply turn away in disgust.

/rant on

MAD (mutually assured destruction) is a term I haven’t thought about for a good long while. No illusions here regarding that particularly nasty genie having been stuffed back into its lamp. Nope, it lingers out there in some weird liminal space, routinely displaced by more pressing concerns. However, MAD came back into my thoughts because of saber-rattling by U.S. leadership suggesting resumed above-ground nuclear testing might be just the ticket to remind our putative enemies around the world what complete assholes we are. Leave it to Americans to be the very last — in the midst of a global pandemic (that’s redundant, right?) — to recognize that geopolitical squabbles (alert: reckless minimization of severity using that word squabble) pale in comparison to other looming threats. Strike that: we never learn; we lack the reflective capacity. Still, we ought to reorient in favor of mutual aid and assistance instead of our MAD, insane death pact.

The authoritative body that normally springs to mind when MAD is invoked is the Bulletin of the Atomic Scientists. Ironically, it appears to be an independent, nonprofit 501(c)(3) entity, a media organization, not an actual collection of atomic scientists. (I’ll continue to italicize Bulletin as though it’s a publication like the New York Times even though it’s arguably something else.) I’ve blogged repeatedly about its iconic Doomsday Clock. In an otherwise astute post against sloppy appeals to authority using the increasingly meaningless term expert, Alan Jacobs takes the Bulletin to task for straying out of its lane to consider threats that are political in nature rather than scientific. Reminded me of when Pope Francis in his encyclical deigned to acknowledge climate change, recognizing that Mother Earth is our “common home” and maybe we shouldn’t be raping her. (OK, that coarse bit at the end is mine.) What? He’s not a climatologist! How dare he opine on something outside his official capacity? Go back to saving souls!

At the same time we desperately need expertise to accomplish things like building bridges that don’t fall down (yet still do) or performing an appendectomy without killing the patient, it’s inevitable that people form opinions about myriad subjects without the benefit of complete authority or expertise, if such a thing even exists. As students, citizens, and voters, we’re enjoined to inform ourselves, discuss, and learn rather than forfeit all opinion-making to, oh I dunno, the chattering classes. That’s intellectual sovereignty, unless one is unfortunate enough to live in a totalitarian regime practicing thought control. Oh, wait … So it’s a sly form of credentialing to fence off or police opinion expressed from inexpert quarters as some sort of thought crime. Regarding MAD, maybe the era has passed when actual atomic scientists assessed our threat level. Now it’s a Science and Security Board made up of people few have ever heard of, and the scope of their concern, like the Pope’s, is wide enough to include all existential threats, not just the one assigned to them by pointy-headed categorists. Are politicians better qualified on such matters? Puhleeze! (OK, maybe Al Gore, but he appears to be busy monetizing climate change.)

As a self-described armchair social critic, I, too, recognized more than a decade ago the existential threat (extinction level, too) of climate change and have blogged about it continuously. Am I properly credentialed to see and state the, um, obvious? Maybe not. That’s why I don’t argue the science and peer-reviewed studies. But the dynamics, outlines, and essentials of climate change are eminently understandable by laypersons. That was true as well for Michael Ruppert, who was impeached by documentarians for lacking the supposedly requisite credentialed expertise yet still having the temerity to state the obvious and sound the alarm. Indeed, considering our failure to act meaningfully to ameliorate even the worst-case scenario, we’ve now got a second instance of mutually assured destruction, a suicide pact, and this one doesn’t rely on game-theoretical inevitability. It’s already happening all around us as we live and breathe … and die.

/rant off

Here’s a rather strange interaction: destruction budgets and moral license. The former refers to a theoretical or proposed budget for allowable environmental destruction. The latter refers to how doing something good allows rationalization of doing something bad as though one offsets (recognize that word?) the other. A familiar example is a physical workout that justifies a later sugar binge.

So just maybe some (outside executive offices anyway) are coming round to the idea that ongoing destruction of nature ought to be curtailed or better regulated. That’s the thrust of an article in Nature that mentions emissions budgets, which I’ve renamed destruction budgets. The article provides a decent overview of the largest threats, or environmental tipping points, that lead to an uninhabitable Earth. Human activity isn’t only about greenhouse gas emissions, however. Because industrial civilization has essentially had an unlimited destruction budget in the past, we’ve depleted and toxified air, soil, and water at such an alarming rate that we now have a limited number of harvests left and already face fresh water shortages that are only expected to worsen.

Turning to the viral pandemic, keeping large segments of the population at home on lockdown triggered a different sort of destruction budget that didn’t exist before it suddenly did: economic destruction, joblessness, and financial ruin. For many Americans already stretched thin financially and psychologically, if the virus doesn’t get you first, then bankruptcy and despair will. Several rounds of bailouts (based on money that doesn’t exist) followed the economic slowdown and are freighted with moral hazard and moral license. Prior bailouts make clear where most of the money goes: deep corporate pockets, banks, and Wall Street. According to this unsophisticated poll, a clear majority do not want banks and financial institutions bailed out. There is even stronger public support for conditions on corporate bailouts, especially those conditions designed to protect employees.

Since we’re in wildly uncharted terrain from only 1.5 months of whatever this new paradigm is, it’s nearly impossible to predict what will occur by summertime or the fall. We’ve blown way past any reasonable destruction budget. In truth, such budgets probably never existed in the first place but were only used as metaphors to make plans no one expects to be binding, much like the toothless 2016 Paris Agreement. Every time we set a hypothetical self-imposed limit, we exceed it. That’s why, to me at least, 350.org is such a cruel joke: the 350 ppm target ceiling was breached decades before the organization was even founded in 2009, and atmospheric CO2 hasn’t slowed its rate of increase since then. In effect, we’ve given ourselves license to disregard any imaginary budgets we might impose on ourselves. The pertinent question was raised by Rep. Thomas Massie (R-KY) regarding the first new bailout bill when he openly challenged the number: “If getting us into $6 trillion more debt doesn’t matter, then why are we not getting $350 trillion more in debt so that we can give a check of $1 million to every person in the country?” How weird is it that both issues cite the number 350?
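For what it’s worth, the arithmetic behind Massie’s figure is a simple back-of-the-envelope calculation, assuming he rounds the U.S. population (roughly 330 million in 2020) up to 350 million:

\[
\$1\ \text{million per person} \times 350\ \text{million people} = \$350\ \text{trillion}
\]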