Posts Tagged ‘Culture’

The comic below struck a chord and reminded me of Gary Larson’s clumsily drawn but often trenchant Far Side comics on scientific subjects.

This one masquerades as science but is merely wordplay, i.e., puns, double entendres, and unexpectedly funny malapropisms (made famous by Yogi Berra, among others). Wordplay is also found in various cultural realms, including comic strips and stand-up comedy, advertising and branding, politics, and now Wokedom (a subset of grassroots politics, some might argue). Playing with words has gone from being a clever, sometimes enjoyable diversion (e.g., crossword puzzles) to fully deranging, weaponized language. Some might be inclined to wave away the seriousness of that contention using the childhood retort “sticks and stones ….” Indeed, I’m far less convinced of the psychological power of verbal nastiness than those who insist words are violence. But it’s equally wrong to say that words don’t matter (much) or have no effect whatsoever. Otherwise, why would those acting in bad faith work so tirelessly to control the narrative, often by restricting free speech (as though writing or out-loud speech were necessary for thoughts to form)?

It’s with some exasperation that I observe words no longer retain their meanings. Yeah, yeah … language is dynamic. But semantic shifts usually occur slowly as language evolves. Moreover, for communication to occur effectively, senders and receivers must be aligned in their understandings of words. If you and I have divergent understandings of, say, yellow, we won’t get very far in discussions of egg yolks and sunsets. The same is true of words such as liberal, fascist, freedom, and violence. A lack of shared understanding of terms, perhaps born of ignorance, bias, or agenda, leads to communication breakdown. But it’s gotten far worse than that. The meanings of words have been thrown wide open to PoMo reinterpretation that often inverts their meanings in precisely the way George Orwell observed in his novel 1984 (published 1949): “War is peace. Freedom is slavery. Ignorance is strength.” Thus, earnest discussion of limitations on free speech and actual restriction on social media platforms, often via algorithmic identification of keywords that fails to account for irony, sarcasm, or context, fail to register that implementation of restrictive kludges already means free speech is essentially gone. The usual exceptions (obscenity, defamation, incitement, gag orders, secrecy, deceptive advertising, student speech, etc.) are not nearly as problematic because they have been adjudicated for several generations and accepted as established practice. Indeed, many exceptions have been relaxed considerably (e.g., obscenity that has become standard patois now fails to shock or offend), and slimy workarounds are now commonplace (e.g., using “people are saying …” to say something horrible yet shielding oneself while saying it). Another gray area includes fighting words and offensive words, which are being expanded (out of a misguided campaign to sanitize?) to include many words with origins as clinical and scientific terms, and faux offense used to stifle speech.

Restrictions on free speech are working in many respects, as many choose to self-censor to avoid running afoul of various self-appointed watchdogs or roving Internet thought police (another Orwell prophecy come true) ready to pounce on some unapproved utterance. One can argue whether self-censorship is cowardly or judicious, I suppose. However, silence and the pretense of agreement only conceal thoughts harbored privately and left unexpressed, which is why restrictions on public speech are fool’s errands and strategic blunders. Maybe the genie can be bottled for a time, but that only produces resentment (not agreement), which boils over into seething rage (and worse) at some point.

At this particular moment in U.S. culture, however, restrictions are not my greatest concern. Rather, it’s the wholesale control of information gathering and reporting that misrepresents or removes from the public sphere ingredients needed to form coherent thoughts and opinions. It’s not happening only to the hoi polloi; those in positions of power and control are manipulated, too. (How many lobbyists per member of Congress, industry after industry, whispering in their ears like so many Wormtongues?) And in extreme cases of fame and cult of personality, a leader or despot unwittingly surrounds him- or herself with a coterie of yes-men frankly afraid to tell the truth out of careerist self-interest or because of shoot-the-messenger syndrome. It’s lonely at the top, right?

Addendum: Mere minutes after publishing this post, I wandered over to Bracing Views (on my blogroll) and found this post saying some of the same things, namely, that choking information off at the source results in a degraded information landscape. Worth a read.

A few months ago, I discovered David Hurwitz’s YouTube channel, which offers reviews of classical music recordings (as opposed to live concerts). Hurwitz shares with me (or is it the other way around?) an apparent fascination with the German symphonic repertoire. As executive editor of Classics Today, he has access to a far wider discography and, for purposes of comparison, delves into historical recordings from the 1930s to 60s far more thoroughly than I do. He deplores streaming services (I do, too), preferring physical media, though I will admit I stream plenty of recordings I don’t (yet) own, primarily to make a purchasing decision. (It’s a little weird that so much of the recorded repertoire is available to stream, essentially for free.) My opinions about specific recordings (orchestras, soloists, conductors) vary widely from those of Hurwitz, which is just fine since I’m not a newbie in need of guidance. Still, Hurwitz always has interesting things to say and some biases I find inexplicable.

Having heard quite a lot of Hurwitz’s discussions of various symphony cycles, I was prompted to go back and listen to discs (both LPs and CDs) not spun in a while. Just yesterday, I recovered a startling memory, namely, that in my early adulthood (pre-Internet), releases of new recordings were not publicized, and it was only when one appeared in record stores (remember them?) that I was caught between the horns of an obvious dilemma: whether to purchase (with my rather limited funds) or defer. More importantly, I recalled febrile excitement when something appeared I really, really wanted to hear and own. On more than a few occasions, I had to prioritize and/or sacrifice in order to obtain the venerated object(s), something less true now than then. Leaving something behind was disappointing but inevitable.

That singular excitement felt at the availability of some new objet d’art is commonplace in various fandoms, though individual tastes and predilections channel people toward different things. For instance, I’ve never camped out or even queued for the initial release of a new model iPhone (back in the day, derisively called the “Jesus phone”). Nor do I attend the opening night of a new movie out of a desire to be among the first viewers. Overpaying for tickets to a championship sporting event doesn’t appeal to me. I also don’t pay for pointless upgrades (e.g., airline tickets, valet parking) that function more as markers of status than as desirable, enhanced services. However, for many others, these are the venerated objects and services for which they are prepared to pay and/or sacrifice — sometimes quite a lot.

Age and wealth inform the calculus. The heightened emotionalism of my youth has been alleviated over the decades so that I now only infrequently venerate some object or experience. It’s too exhausting, but back when I had an abundance of emotional energy, it was commonplace. Also, had I the wealth to simply obtain everything I ever wanted without deferral or sacrifice, it’s not clear that anything would have gained special significance bordering on the sacred. This may well be one of the inevitable pitfalls of excess wealth: draining meaning out of things others are able to enjoy with enthusiasm precisely because of scarcity or hardship.

Reflecting on these ideas, I also realized that there is still one category of venerated object for which I lust. It’s not a branded fashion item, luxury German sedan, pampered vacation, or second home on a secluded lake somewhere. Those are arguably within my reach but tend to be the domains of others far better situated financially than am I. No, my remaining venerated objects are obvious given what I’ve written above: high-end audio components. Whereas most recordings are quite easily obtained for less than what is now spent on a typical fast-food combo meal, the truly exceptional high-end audio I venerate starts around $15k and climbs from there. As with all luxury goods, diminishing returns set in early despite considerable emotional investment, so I have settled instead on an audiophile middle tier that frankly puts to shame the degraded listening environments most are only too happy to accept, typically out of ignorance and under-developed taste. Their veneration is projected onto other things.

For this blog post, let me offer short and long versions of the assertion and argument, of which one of Caitlin Johnstone’s many aphorisms is the short one:

Short version: Modern mainstream feminism is just one big celebration of the idea that women can murder, predate, oppress, and exploit for power and profit just as well as any man.

Long version: Depicting strength in terms of quintessential masculine characteristics is ruining (fictional) storytelling. (Offenders in contemporary cinema and streaming will go unnamed, but examples abound now that the strong-female-lead meme has overwhelmed characters, plots, and stories. Gawd, I tire of it.) One could survey the past few decades to identify ostensibly strong women basically behaving like asshole men just to — what? — show that it can be done? Is this somehow better than misogynist depictions of characters using feminine wiles (euphemism alert) to get what they want? These options coexist today, plus some mixture of the two. However, the main reason the strong female lead fails as storytelling — punching, fighting, and shooting toe-to-toe with men — is that it bears little resemblance to reality.

In sports (combat sports especially), men and women are simply not equally equipped for reasons physiological, not ideological. Running, jumping, throwing, swinging, and punching in any sport where speed and power are principal attributes favors male physiology. Exceptions under extraordinary conditions (i.e., ultradistance running) only demonstrate the rule. Sure, a well-trained and -conditioned female in her prime can beat and/or defeat an untrained and poorly conditioned male. If one of those MMA females came after me, I’d be toast because I’m entirely untrained and I’m well beyond the age of a cage fighter. But that’s not what’s usually depicted onscreen. Instead, it’s one badass going up against another badass, trading blows until a victor emerges. If the female is understood as the righteous one, she is typically shown victorious despite the egregious mismatch.

Nonadherence to reality can be entertaining, I suppose, which might explain why the past two decades have delivered so many overpowered superheroes and strong female leads, both of which are quickly becoming jokes and producing backlash. Do others share my concern that, as fiction bleeds into reality, credulous women might be influenced by what they see onscreen to engage recklessly in fights with men (or for that matter, other women)? Undoubtedly, a gallant or chivalrous man would take a few shots before fighting back, but if not felled quickly, my expectation is that the fight is far more likely to go very badly for the female. Moreover, what sort of message does it communicate to have females engaging in violence and inflicting their will on others, whether in the service of justice or otherwise? That’s the patriarchy in a nutshell. Rebranding matriarchal social norms in terms of negative male characteristics, even for entertainment purposes, serves no one particularly well. I wonder if hindsight will prompt the question “what on Earth were we thinking?” Considering how human culture is stuck in permanent adolescence, I rather doubt it.

In the sense that all news is local and all learning is individual, meaning that it’s only when something is individualized and particularized that it takes on context and meaning, I may finally understand (some doubt still) Sheldon Wolin’s term “inverted totalitarianism,” part of the subtitle of his 2008 book Democracy Incorporated: Managed Democracy and the Specter of Inverted Totalitarianism. Regrettably, this book is among the (many) dozens that await my attention, so I can’t yet claim to have done the work. (I did catch a long YouTube interview of Wolin conducted by Chris Hedges, but that’s a poor substitute for reading the book.) My process is to percolate on a topic and its ancillary ideas over time until they come together satisfactorily, and my provisional understanding of the issues is closer to “proxy tyranny” than “inverted totalitarianism.”

I daresay most of us conceptualize tyranny and totalitarianism in the bootheel versions that manifested in several 20th-century despotic regimes (and survive in several others in the 21st century, names and locations withheld) where population management is characterized by stomping people down, grinding them into dust, and treating them as an undifferentiated mass. Administrators (e.g., secret police) paid close attention to anyone who might pose a problem for the regimes, and neighbors and family members were incentivized to inform on anyone who might be on officialdom’s radar. The 21st-century manifestation is different in that computers do most information gathering — a dragnet thrown over everyone — and we inform on ourselves by oversharing online. Close attention is still paid, but human eyes may never see extensive dossiers (forever records?) kept on each of us until something prompts attention. A further distinction is that in bootheel totalitarianism, intense scrutiny and punishment were ubiquitous, whereas at least in 21st-century America, a sizeable portion of the population can be handily ignored, abandoned, and/or forgotten. They’re powerless, harmless, and inconsequential, not drawing attention. Additionally, there is no bottom to how low they can sink, as the burgeoning homeless population demonstrates.

If tyranny is normally understood as emanating from the top down, its inversion is bottom up. Wolin’s inverted totalitarianism is not a grassroots phenomenon but rather corporate capture of government. While Wolin’s formulation may be true (especially at the time his book was published), government has relinquished none of its power so much as realigned its objectives to fit corporate profit motives, and in doing so, shifted administrative burdens to proxies. Silicon Valley corporations (of the big data type especially) are the principal water carriers, practicing surveillance capitalism and as private entities exercising censorious cancellation of dissenting opinion that no formal government could countenance. Similarly, an entire generation of miseducated social justice warriors scours social media for evidence of nonconforming behavior, usually some offense against the meme of the moment a/k/a “I support the current thing” (though racism is the perennial accusation — an original sin that can never be forgiven or assuaged), waiting to pounce in indignation and destroy lives and livelihoods. Cancel culture is a true bottom-up phenomenon, with self-appointed emissaries doing the work that the government is only too happy to hand off to compliant, brainwashed ideologues.

In the Covid era, nonconforming individuals (e.g., those who refuse the jab(s) or call bullshit on continuously shifting narratives announced by various agencies that lack legal standing to compel anything) are disenfranchised in numerous ways even while the wider culture accepts that the pandemic is indeed endemic and simply gets on with life. Yet every brick-and-mortar establishment has been authorized, deputized, and indeed required to enforce unlawful policies of the moment as proxies for government application of force. Under threat of extended closure, every restaurant, retailer, arts organization, and sports venue demanded the literal or figurative equivalent of “papers please” to enter and assemble. Like the airlines, people are increasingly regarded as dehumanized cargo, treated roughly like the famous luggage ape (and not always without good reason). In most places, restrictions have been lifted; in others they persist. But make no mistake, this instantiation of proxy tyranny — compelling others to do the dirty work so that governments can not-so-plausibly deny direct responsibility — is the blueprint for future mistreatment. Personally, I’m rather ashamed that so few Americans stood up for what is right and true (according to me, obviously), echoing this famous admission of moral failure. For my own part, I’ve resisted (and paid the price for that resistance) in several instances.

Search the tag Counter-Enlightenment at the footer of this blog to find roughly ten disparate blog posts, all circling around the idea that intellectual history, despite all the obvious goodies trucked in with science and technology, is turning decidedly away from long-established Enlightenment values. A fair number of resources are available online and in book form exploring various movements against the Enlightenment over the past few centuries, none of which I have consulted. Instead, I picked up Daniel Schwindt’s The Case Against the Modern World: A Crash Course in Traditionalist Thought (2016), which was gifted to me. The book was otherwise unlikely to attract my attention considering that Schwindt takes Catholicism as a starting point whereas I’m an avowed atheist, though with no particular desire to proselytize or attempt to convince others of anything. However, The Case Against is suffused with curious ideas, so it is a good subject for a new book blogging project, which in characteristic fashion (for me) will likely proceed in fits and starts.

Two interrelated ideas Schwindt puts forward early in the book fit with multiple themes of this blog, namely, (1) the discovery and/or development of the self (I refer more regularly to consciousness) and (2) the reductive compartmentalization of thought and behavior. Let’s take them in order. Here’s a capsule of the first issue:

(more…)

Most poets in the West believe that some sort of democracy is preferable to any sort of totalitarian state and accept certain political obligations … but I cannot think of a single poet of consequence whose work does not, either directly or by implication, condemn modern civilisation as an irremediable mistake, a bad world which we have to endure because it is there and no one knows how it could be made into a better one, but in which we can only retain our humanity in the degree to which we resist its pressures. — W.H. Auden

A while back, I made an oblique reference (a comment elsewhere, no link) to a famous Krishnamurti quote: “It is no measure of health to be well adjusted to a profoundly sick society.” Taken on its face, who would agree to be swept up in the madness and absurdity of any given historical moment? Turns out, almost everyone — even if that means self-destruction. The brief reply to my comment was along the lines of “Why shouldn’t you or I also make mental adjustments to prevailing sickness to obtain peace of mind and tranquility amidst the tumult?” Such an inversion of what seems to me right, proper, and acceptable caused me to reflect and recall the satirical movie Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb. The full title is not often given, but the forgotten second part is what’s instructive (e.g., mutually assured destruction: MAD). Events spinning out of control? Nothing any individual can do to restore sanity? Stop squirming and embrace it.

That’s one option when faced with the prospect of futile resistance, I suppose. Give in, succumb, and join the party (more like a rager since the beginning of the Cold War). I also recognize that I’m not special enough to warrant any particular consideration for my intransigence. Yet it feels like self-betrayal to abandon the good character I’ve struggled (with mixed success) to build and maintain over the course of a lifetime. Why chuck all that now? Distinguishing character growth from decay is not always so simple. In addition, given my openness to new ideas and interpretations, established bodies of thought (often cultural consensus) are sometimes upended and destabilized by someone arguing cogently for or against something settled and unexamined for a long time. And then there is the epistemological crisis that has rendered sense-making nearly impossible. That crisis is intensified by a variety of character types acting in bad faith to pollute the public sphere and drive false narratives.

For instance, the show trial public hearings just begun regarding the January 6 attack on the U.S. Capitol (or whatever it’s being called, I prefer “Storming of the Capitol”) are commonly understood, at least from one side of the political spectrum, as a deliberate and brazen attempt to brainwash the public. I decline to tune in. But that doesn’t mean my opinions on that topic are secure any more than I know how true and accurate was the 2020 election that preceded and sparked the Jan. 6 attack. Multiple accounts of the election and subsequent attack aim to convert me (opinion-wise) to one exclusive narrative or another, but I have no way to evaluate narrative claims beyond whatever noise reaches me through the mainstream media I try to ignore. Indeed, those in the streets and Capitol building on Jan. 6 were arguably swept into a narrative maelstrom that provoked a fairly radical if ultimately harmless event. No one knew at the time, of course, exactly how it would play out.

So that’s the current state of play. Ridiculous, absurd events, each with competing narratives, have become the new normal. Your facts and beliefs do daily battle with my facts and beliefs in an ideological battle of all against all — at least until individuals form into tribes, declare their political identity, and join that absurdity.

From the outset, credit goes to Jonathan Haidt for providing the ideas to launch this blog post. He appears to be making the rounds again flogging his most recent publication (where? I dunno, maybe The Atlantic). In the YouTube interview I caught, Haidt admits openly that as a social and behavioral psychologist, he’s prone to recommending incentives, programs, and regulations to combat destructive developments in contemporary life — especially those in the academy and on social media that have spread into politics and across the general public. Haidt wears impressive professional armor in support of arguments and contentions; I lack such rigor rather conspicuously. Accordingly, I offer no recommendations but instead try to limit myself to describing dynamics as an armchair social critic. Caveat emptor.

Haidt favors viewpoint diversity (see, for example, Heterodox Academy, which he helped to found and now chairs). Simple enough, right? Not so fast there, Señor Gonzalez! Any notion that even passing acquaintance with a given subject requires knowing both pros and cons is anathema to many of today’s thinkers, who would rather plug their ears and pretend opposition voices, principled or otherwise, are simply incoherent, need not be considered, and further, should be silenced and expunged. As a result, extremist branches of any faction tend to be ideological echo chambers. Cardinal weaknesses in such an approach are plain enough for critical thinkers to recognize, but if one happens to fall into one of those chambers, silos, or bubbles (or attend a school that trains students in rigid thinking), invitations to challenge cherished and closely held beliefs, upon which identity is built, mostly fall on deaf ears. The effect is bad enough in individuals, but when spread across organizations that adopt ill-advised solutionism, Haidt’s assessment is that institutional stupidity sets in. The handy example is higher education (now an oxymoron). Many formerly respectable institutions have essentially abandoned reason (ya know, the way reasonable people think) and begun flagellating themselves in abject shame over, for instance, a recovered history of participation in any of the cultural practices now cause for immediate and reflexive cancellation.

By way of analogy, think of one’s perspective as a knife (tool, not weapon) that requires periodic sharpening to retain effectiveness. Refusing to entertain opposing viewpoints is like sharpening only one side of the blade, resulting in a blunt, useless tool. That metaphor suggests a false dualism: two sides to an argument/blade when in fact many facets inform most complex issues, thus viewpoint diversity. By working in good faith with both supporters and detractors, better results (though not perfection) can be obtained than when radicalized entities come to dominate and impose their one-size-fits-all will indiscriminately. In precisely that way, it’s probably better not to become any too successful or powerful lest one be tempted to embrace a shortsighted will to power and accept character distortions that accompany a precipitous rise.

As mentioned a couple blog posts ago, an unwillingness to shut up, listen, and learn (why bother? solutions are just … so … obvious …) has set many people on a path of activism. The hubris of convincing oneself of possession of solutions to intractable issues is bizarre. Is there an example of top-down planning, channeling, and engineering of a society that actually worked without tyrannizing the citizenry in the process? I can’t think of one. Liberal democratic societies determined centuries ago that freedom and self-determination mixed with assumed responsibility and care within one’s community are preferable to governance that treats individuals as masses to be forced into conformity (administrative or otherwise), regulated heavily, and/or disproportionately incarcerated like in the U.S. But the worm has turned. Budding authoritarians now seek reforms and uniformity to manage diverse, messy populations.

Weirdly, ideologues also attempt to purge and purify history, which is chock full of villainy and atrocity. Those most ideologically possessed seek both historical and contemporary targets to denounce and cancel, not even excluding themselves because, after all, the scourges of history are so abject and everyone benefited from them somehow. Search oneself for inherited privilege and all pay up for past iniquities! That’s the self-flagellating aspect: taking upon oneself (and depositing on others) the full weight of and responsibility for the sins of our forebears. Yet stamping out stubborn embers of fires allegedly still burning from many generations ago is an endless task. Absolutely no one measures up to expectations of sainthood when situated within an inherently and irredeemably evil society of men and women. That’s original sin, which can never be erased or forgiven. Just look at what humanity (via industrial civilization) has done to the surface of the planet. Everyone is criminally culpable. So give up all aspirations; no one can ever be worthy. Indeed, who even deserves to live?

Heard a remark (can’t remember where) that most these days would attack as openly ageist. Basically, if you’re young (let’s say below 25 years of age), then it’s your time to shut up, listen, and learn. Some might even say that true wisdom doesn’t typically emerge until much later in life, if indeed it appears at all. Exceptions only prove the rule. On the flip side, energy, creativity, and indignation (e.g., “it’s not fair!”) needed to drive social movements are typically the domain of those who have less to lose and everything to gain, meaning those just starting out in adult life. A full age range is needed, I suppose, since society isn’t generally age stratified except at the extremes (childhood and advanced age). (Turns out that what to call old people and what counts as old is rather clumsy, though probably not especially controversial.)

With this in mind, I can’t help but to wonder what’s going on with recent waves of social unrest and irrational ideology. Competing factions agitate vociferously in favor of one social/political ideology or another as though most of the ideas presented have no history. (Resemblances to Marxism, Bolshevism, and white supremacy are quite common. Liberal democracy, not so much.) Although factions aren’t by any means populated solely by young people, I observe that roughly a decade ago, higher education in particular transformed itself into an incubator for radicals and revolutionaries. Whether dissatisfaction began with the faculty and infected the students is impossible for me to assess. I’m not inside that intellectual bubble. However, urgent calls for radical reform have since moved well beyond the academy. A political program or ideology has yet to be put forward that I can support fully. (My doomer assessment of what the future holds forestalls knowing with any confidence what sort of program or ideology into which to pour my waning emotional and intellectual energy.) It’s still fairly simple to criticize and denounce, of course. Lots of things egregiously wrong in the world.

My frustration with what passes for political debate (if Twitter is any indication) is the marked tendency to immediately resort to comparisons with Yahtzees in general or Phitler in particular. It’s unhinged and unproductive. Yahtzees are cited as an emotional trigger, much like baseless accusations of racism send everyone scrambling for cover lest they be cancelled. Typically, the Yahtzee/Phitler comparison or accusation itself is enough to put someone on their heels, but wiser folks (those lucky few) recognize the cheap rhetorical trick. The Yahtzee Protocol isn’t quite the same as Godwin’s Law, which states that the longer a discussion goes on (at Usenet in the earliest examples), the greater the likelihood of someone bringing up Yahtzees and Phitler and ruining useful participation. The protocol has been deployed effectively in the Russia-Ukraine conflict, though I’m at a loss to determine in which direction. The mere existence of the now-infamous Azov Battalion, purportedly comprised of Yahtzees, means that automatically, reflexively, the fight is on. Who can say what the background rate of Yahtzee sympathizers (whatever that means) might be in any fighting force or indeed the general population? Not me. Similarly, what threshold qualifies a tyrant to stand beside Phitler on a list of worst evers? Those accusations are flung around like cooked spaghetti thrown against the wall just to see what sticks. Even if the accusation does stick, what possible good does it do? Ah, I know: it makes the accuser look like a virtuous fool.

I use the tag redux to signal that the topic of a previous blog post is being revisited, reinforced, and repurposed. The choice of title for this one could easily have gone instead to Your Brain on Postmodernism, Coping with the Post-Truth World, or numerous others. The one chosen, however, is probably the best fit given that compounding crises continue pushing along the path of self-annihilation. Once one crisis grows stale — at least in terms of novelty — another is rotated in to keep us shivering in fear, year after year. The date of civilizational collapse is still unknown, which is really more a process anyway, also of an unknown duration. Before reading what I’ve got to offer, perhaps wander over to Clusterfuck Nation and read James Howard Kunstler’s latest take on our current madness.

/rant on

So yeah, various cultures and subcultures are either in the process of going mad or have already achieved that sorry state. Because madness is inherently irrational and unrestrained, specific manifestations are unpredictable. However, the usual trigger for entire societies to lose their tether to reality is relatively clear: existential threat. And boy howdy are those threats multiplying and gaining intensity. Pick which of the Four Horsemen of the Apocalypse to ride with to the grave, I guess. Any one will do; all four are galloping simultaneously, plus a few other demonic riders not identified in that mythological taxonomy. Kunstler’s focus du jour is censorship and misinformation (faux disambiguation: disinformation, malinformation, dishonesty, gaslighting, propaganda, fake news, falsehood, lying, cozenage, confidence games, fraud, conspiracy theories, psyops, personal facts), about which I’ve blogged repeatedly under the tag epistemology. Although major concerns, censorship and misinformation are outgrowths of spreading madness, not the things that will kill anyone directly. Indeed, humans have shown a remarkable capacity to hold in mind crazy belief systems or stuff down discomfiting and disapproved thoughts even without significant threat. Now that significant threats spark the intuition that time is running perilously short, no wonder so many have fled reality into the false safety of ideation. Inability to think and express oneself freely or to detect and divine truth does, however, block what few solutions to problems remain to be discovered.

Among recent developments I find unsettling and dispiriting is news that U.S. officials, in their effort to — what? — defeat the Russians in a war we’re not officially fighting, are just making shit up and issuing statements to their dutiful stenographers in the legacy press to report. As I understand it, there isn’t even any pretense about it. So to fight phantoms, U.S. leaders conjure out of nothingness justifications for involvements, strategies, and actions that are the stuff of pure fantasy. This is fully, recognizably insane: to fight monsters, we must become monsters. It’s also maniacally stupid. Further, it’s never been clear to me that Russians are categorically baddies. They have dealt with state propaganda and existential threats (e.g., the Bolshevik Revolution, WWII, the Cold War, the Soviet collapse, being hemmed in by NATO countries) far more regularly than most Americans and know better than to believe blindly what they’re told. On a human level, who can’t empathize with their plights? (Don’t answer that question.)

In other denial-of-reality news, demand for housing in Sun Belt cities has driven rent increases ranging between approximately 30% and 60% over the past two years, compared to well under 10% in many northern cities. Americans are migrating to the Sun Belt despite, for instance, catastrophic drought and wildfires. Lake Powell sits at an historically low level, threatening reductions in water and electrical power. What happens when desert cities in CA, AZ, NV, and NM become uninhabitable? Texas isn’t far behind. This trend has been visible for decades, yet many Americans (and immigrants, too) are positioning themselves directly in harm’s way.

I’ve been a doomsayer for over a decade now, reminding my two or three readers (on and off) that the civilization humans built for ourselves cannot stand much longer. Lots of people know this yet act as though concerns are overstated or irrelevant. It’s madness, no? Or is it one last, great hurrah before things crack up apocalyptically? On balance, what’s a person to do but to keep trudging on? No doubt the Absurdists got something correct.

/rant off

The major media — particularly, the elite media that set the agenda that others generally follow — are corporations “selling” privileged audiences to other businesses. It would hardly come as a surprise if the picture of the world they present were to reflect the perspectives and interests of the sellers, the buyers, and the product … Furthermore, those who occupy managerial positions in the media, or gain status within them as commentators, belong to the same privileged elites, and might be expected to share the perceptions, aspirations, and attitudes of their associates, reflecting their own class interests as well. Journalists entering the system are unlikely to make their way unless they conform to these ideological pressures, generally by internalizing the values; it is not easy to say one thing and believe another, and those who fail to conform will tend to be weeded out by familiar mechanisms.
—Noam Chomsky (Necessary Illusions: Thought Control in Democratic Societies)

In U.S. politics, received wisdom instructs citizens to work within the system, not to challenge the system directly in protest, rebellion, or revolt. Yet it’s often paradoxically believed that only an outsider can reform or fix problems that endure generation after generation. The 2016 U.S. presidential election was emblematic of this second sentiment: an outsider who had never held political office but was unexpectedly installed in the Oval Office anyway — largely on the basis of several three-word promises to accomplish things only he, an outsider unbeholden to existing power structures, could do. Since that chief executive no longer holds office and none of his three-word promises came to fruition, one might pause to wonder why the putatively intrepid outsider is still held up in some circles as preferable to the insider. Was he beholden to existing power structures after all? Or was he transformed quickly into a faithful tool of the establishment despite antipathy toward it and his coarse, unorthodox style?

These are unanswerable questions, and one could argue that conjecture on the subject doesn’t matter, either. The reality most of us experience outside the halls of power is markedly different from that of those on the inside. Further, when a recently hired journalist or newly elected government official completes their orientation period, they reliably become insiders, too. The Chomsky quote above is directed at that process, which is a system dynamic that operates without anyone already inside needing to twirl a mustache or rub their hands in a cliché of evil. The outsider becomes an insider simply by being hired or elected and seeing how things get done by colleagues. No need to name names. Bringing the outside inside appears to be an effective mechanism for nullifying authoritative dissent and watchdog action that used to be handled by the 4th Estate in particular. What’s appeared following journalistic abandonment of that role is a variety of citizens and breakaway journalists on alternative media. The job is getting done, sorta.

Principled dissent, not the two-party theatrics that pass for opposition, is needed to keep self-governance from falling prey to capture. Since roughly the 1990s, when the Democratic Party betrayed its working-class constituency and became corporate boosters, opposition dried up and corporate was effectively appended to military-industrial complex, an old term that drew attention to a unified chorus of pro-war military leaders and arms manufacturers that had captured government in the early days of the Cold War. Indeed, for decades now, very few prominent insiders in journalism and government have even bothered to try to steer the U.S. away from war and nonstop military escapades. Popular opposition among the citizenry unfailingly falls on deaf ears. Do insiders know things we outsiders don’t? I rather doubt it.