Archive for the ‘Cognition’ Category

After a hiatus due to health issues, Jordan Peterson has reappeared in the public sphere. Good for him. I find him one of the most stimulating public intellectuals to appear thus far into the 21st century, though several others (unnamed) spring to mind who have stronger claims on my attention. Yet I’m wary of Peterson as an effective evaluator of every development coughed up for public consideration. It’s simply not necessary or warranted for him to opine recklessly about every last damn thing. (Other podcasters are doing the same, and although I don’t want to instruct anyone to stay in their lane, I also recognize that Joe “Talkity-Talk” Blow’s hot take or rehash on this, that, and every other thing really isn’t worth my time.) With the inordinate volume of text in his books, video on his YouTube channel (classroom lectures, podcasts, interviews) and as a guest on others’ podcasts, and third-party writing about him (like mine), it’s inevitable that Peterson will run afoul of far better analysis than he himself can bring to bear. However, he declares his opinions forcefully and with overbearing confidence, then decamps to obfuscation and reframing whenever someone pushes back effectively (which isn’t often, at least in direct communication). With exasperation, I observe that he’s basically up to his old rhetorical tricks.

In a wide-ranging discussion on The Joe Rogan Experience from January 2022 (found exclusively on Spotify for anyone somehow unaware of Rogan’s influence in the public sphere), the thing that most irked me was Peterson’s take on the climate emergency. He described climate as too complex, with too many variables and unknowns, to embody in scientific models over extended periods of time. Seems to me Peterson has that entirely backwards. Weather (and extreme weather events) in the short term can’t be predicted very accurately, so daily/weekly/monthly forecasts give wide ranges of, say, cloud cover, temperature, and precipitation. But over substantial time (let’s start with a few decades, which is still a blink in geological time), trends and boundaries reveal themselves pretty reliably, which is why disturbances — such as burning enough fossil fuels to alter the chemical composition of the planet’s atmosphere — that upset the climate steady-state known as Garden Earth are not merely cause for serious concern but harbingers of doom. And then, as others often do, Peterson reframed the climate emergency largely in terms of economics (the same thing happened with the pandemic, though not by Peterson so far as I know), suggesting that the problem is characterized by inefficiencies and grass-roots policy that would be magically different if more people were raised out of poverty and could advocate for solutions rather than simply struggle to survive. Dude apparently hasn’t grasped that wealth in the modern world is an outgrowth of the very thing — fossil fuels — that is the root of the problem. Further, industrial civilization is a heat engine that binds us to a warming trend. That’s a thermodynamic principle flatly immune to half-baked economic theories and ecological advocacy. Peterson also gives no indication of ever having acknowledged Jevons Paradox.
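For what it’s worth, the weather-versus-climate distinction comes down to signal and noise over different timescales. Below is a toy simulation of my own devising (nothing to do with Peterson, Rogan, or any actual climate model, and every number in it is a made-up assumption) showing that while any single day is dominated by noise, a small persistent forcing becomes unmistakable once temperatures are averaged over decades:

```python
# Illustrative sketch only: synthetic "temperatures" with a yearly cycle,
# heavy day-to-day noise, and a small persistent warming trend.
import numpy as np

rng = np.random.default_rng(0)
years = 60
days = years * 365

t = np.arange(days)
seasonal = 10 * np.sin(2 * np.pi * t / 365)   # yearly cycle, degrees C
noise = rng.normal(0, 4, days)                # day-to-day weather noise
trend = 0.02 * (t / 365)                      # assumed +0.02 C per year forcing
daily_temp = 15 + seasonal + noise + trend

# Short term: the typical change from one day to the next is mostly noise.
day_to_day_swing = np.std(np.diff(daily_temp))

# Long term: averaging over decades washes out the noise and exposes the trend.
yearly_means = daily_temp.reshape(years, 365).mean(axis=1)
decade_means = yearly_means.reshape(6, 10).mean(axis=1)

print(f"typical day-to-day swing: {day_to_day_swing:.1f} C")
print("decade averages:", np.round(decade_means, 2))
```

The day-to-day swing dwarfs the annual forcing, yet the decade averages march steadily upward, which is the whole point about trends and boundaries revealing themselves over substantial time.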

So let me state somewhat emphatically: the climate emergency is in fact an existential crisis on several fronts (e.g., resource depletion and scarcity, ecological despoliation, extreme weather events, and loss of habitat, all resulting in civilizational collapse). The rate of species extinction — before human population has begun to collapse in earnest, and 8 Billion Day looms near — is several orders of magnitude greater than historical baselines. Humans are unlikely to survive to the end of the century even if we refrain from blowing ourselves up over pointless geopolitical squabbles. I’ll defer to Peterson in his area of expertise: personality inventories. I’ll also grant him space to explore myth and symbolism in Western culture. But speaking on climate, he sounds like an ignoramus — the dangerous sort who leads others astray. And when challenged by someone armed with knowledge of governing principles, grasp of detail, and thus analysis superior to what he can muster (such as when debating Richard Wolff about Marxism), Peterson frequently resorts to a series of motte-and-bailey assertions that confound inexpert interlocutors. “Well, that depends on what you mean by ….” His retreat to faux safety is sometimes so astonishingly complete that he resorts to questioning the foundation of reality: “Why the sun? Why this sky? Why these stars? Why not something else completely?” Likewise, Peterson’s penchant for insisting that the future is contingent and unknown, as though no predictions or models can offer more than a whisper of accuracy about future outcomes (this despite all indicators positively screaming to stop destroying our own habitat), is mere rhetoric to forestall losing an argument.

As I’ve asserted repeatedly, sufficiency is the crucible in which all decisions are formed because earnest information gathering cannot persist interminably. Tipping points (ecological ones, sure, but more importantly, psychological ones) actually exist, where one must act despite incomplete knowledge and unclear prognosis. Accordingly, every decision is on some level a leap into the unknown and/or an act of faith. That doesn’t mean every decision is a wild, reckless foray based on nothing. Rather, when the propitious moment arrives (if one has the wherewithal to recognize it), one has to go with what one’s got, knowing that mistakes will be made and corrections will be needed.

Peterson’s improvisational speaking style is both impressive and inscrutable. I’m sometimes reminded of Marshall McLuhan, whose purported Asperger’s Syndrome (low-grade autism, perhaps, I’m unsure) afforded him unique insights into the emerging field of media theory that were not easily distilled in speech. Greta Thunberg is another, more recent public figure whose cognitive character allows her to recognize rather acutely how human institutions have completely botched the job of keeping industrial civilization from consuming itself. Indeed, people from many diverse backgrounds, not hemmed in by the rigid dictates of politics, economics, and science, intuit through diverse ways of knowing (e.g., kinesthetic, aesthetic, artistic, religious, psychedelic) what I’ve written about repeatedly under the title “Life Out of Balance.” I’ve begun to understand Peterson as a mystic overwhelmed by the sheer beauty of existence but simultaneously horrified by unspeakably awful evils humans perpetrate on each other. Glimpses of both (and humor, a bit unexpectedly) often provoke cracks in his voice, sniffles, and tears as he speaks, clearly choking back emotions to keep his composure. Peterson’s essential message (if I can be so bold), like that of other mystics, is aspirational, transcendental, and charismatic. Such messages are impossible to express fully and are frankly ill-suited to 21st-century Western culture. That we’re severely out of balance, unable to regain an upright and righteous orientation, is plain to nearly everyone not already lost in the thrall of mass media and social media, but so long as the dominant culture remains preoccupied with wealth, consumption, celebrity, geopolitical violence, spectacle, and trash entertainment, I can’t envision any sort of return to piety and self-restraint. Plus, we can’t outrun the climate emergency bearing down on us.

Continuing from the previous blog post, lengthy credit scrolls at the ends of movies have become a favorite hiding place for bloopers and teasers. The purpose of this practice is unclear, since I can’t pretend (unlike many reckless opinionators) to inhabit the minds of filmmakers, but it has become a fairly reliable afterthought for film-goers willing to wait out the credits. Those who depart the theater, change the channel, or click away to other content may know they are relinquishing some last tidbit to be discovered, but there’s no way to know in advance if one is being punked or pleased, or indeed if there is anything at all there. Clickbait news often employs this same technique, teasing some newsbit in the headline to entice readers to wade (or skim) through a series of (ugh!) one-sentence paragraphs to find the desired content, which sometimes is not even provided. At least one film (Monty Python’s The Secret Policeman’s Other Ball (1982), if memory serves) pranked those in a rush to beat foot traffic out of the theater (back when film-going meant visiting the cinema) by having an additional thirty minutes of material after the (first) credit sequence.

This also put me in mind of Paul Harvey radio broadcasts ending with the sign-off tag line, “… the rest of the story.” Harvey supplemented the news with obscure yet interesting facts and analysis that tended to reshape one’s understanding of consensus narrative. Such reshaping is especially important as an ongoing process of clarification and revision. When served up in delectable chunks by winning personalities like Paul Harvey, supplemental material is easily absorbed. When material requires effort to obtain and/or challenges something one believes strongly, well, the default response is probably not to bother. However, those possessing intellectual integrity welcome challenging material and indeed seek it out. Indeed, invalidation of a thesis or hypothesis is fundamental to the scientific method, and no body of work can be sequestered from scrutiny and then be held as legitimately authoritative.

Yet that’s what happens routinely in the contemporary infosphere. A government press office or corporate public relations officer issues guidance or policy in direct conflict with earlier guidance or policy and in doing so seeks to place any resulting cognitive dissonance beyond examination and out of scope. Simple matters of adjustment are not what concern me. Rather, it’s wholesale brainwashing that is of concern, when something is clear within one’s memory or plainly documented in print/video yet brazenly denied, circumvented, and deflected in favor of a new directive. The American public has contended with this repeatedly as each new presidential administration demonizes the policies of its predecessors but typically without demonstrating the self-reflection and -examination to admit wrongdoing, responsibility, or error on anyone’s part. It’s a distinctly American phenomenon, though others have cottoned onto it and adopted the practice for themselves.

Exhaustion from separating the spin-doctored utterances of one malefactor or another from one’s own direct experience and sense-making drives many to simply give up. “Whatever you say, sir. Lemme go back to my entertainments.” The prospect of a never-ending slog through evidence and analysis only to arrive on unsteady ground, due to shift underfoot again and again with each new revelation, is particularly unsatisfactory. And as discussed before, those who nonetheless strain to achieve knowledge and understanding that reach temporary sufficiency yet remain permanently, intransigently provisional find themselves thwarted by those in the employ of organizations willing and eager to game information systems in the service of their not-even-hidden agendas. Alternative dangers for the muddled thinker include retreating into fixed ideology or collapsing into solipsism. Maybe none of it matters in the end. We can choose our beliefs from the buffet of available options without adherence to reality. We can create our own reality. Of course, that’s a description of madness, to which many have already succumbed. Why aren’t they wearing straitjackets?

Let me first restate axioms developed in previous blog posts. Narrative is the essential outward form of consciousness. Cognition has many preverbal and nonverbal subtleties, but the exchange of ideas occurs predominantly through narrative, and the story of self (told to oneself) can be understood as stream of consciousness: ongoing self-narration of sensations and events. The principal characteristic of narrative, at least that which is not pure fantasy, is in-the-moment sufficiency. Snap-judgment heuristics are merely temporary placeholders until, ideally at least, thoughtful reconsideration and revision that take time and discernment can be brought to bear. Stories we tell and are told, however, often do not reflect reality well, partly because our perceptual apparatuses are flawed, partly because individuals are untrained and unskilled in critical thinking (or overtrained and distorted), and partly because stories are polluted with emotions that make clear assessments impossible (to say nothing of malefactors with agendas). Some of us struggle to remove confabulation from narrative (as best we can) whereas others embrace it because it’s emotionally gratifying.

A good example of the reality principle is the recognition, as during the 1970s energy crisis, that energy supplies don’t magically appear simply by digging and drilling more of the stuff out of the ground. Those easy-to-get resources have been plundered already. The term peak oil refers to eventual decline in energy production (harvesting, really) when the easy stuff is more than half gone and undiminished (read: increasing) demand impels energy companies to go in search of more exotic supply (e.g., underwater or embedded in shale). If that reality is dissatisfying, a host of dreamt-up stories offer us deliverance from inevitable decline and reduction of lifestyle prerogatives by positing extravagant resources in renewables, hydrogen fuel cells, fusion (not to be confused with fission), or as-yet unexploited regions such as the Arctic National Wildlife Refuge. None of these represent plausible realities (except going into heretofore protected regions and bringing ecological devastation).
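Since peak oil keeps coming up, a quick sketch of the underlying logic may help. In the classic Hubbert-style picture (rendered here with made-up numbers for the recoverable total, growth rate, and peak year; this is my illustration, not anyone’s forecast), cumulative extraction follows a logistic curve toward a fixed recoverable total, so the production rate is bell-shaped and peaks when roughly half the resource is gone:

```python
# Illustrative sketch of a Hubbert-style production curve (assumed numbers).
import numpy as np

URR = 2000.0      # assumed ultimately recoverable resource, billions of barrels
k = 0.06          # assumed logistic growth rate per year
t_peak = 2005.0   # assumed peak year

years = np.arange(1900, 2101)
cumulative = URR / (1 + np.exp(-k * (years - t_peak)))   # logistic cumulative extraction
production = k * cumulative * (1 - cumulative / URR)     # annual production (its derivative)

i = np.argmax(production)
print(f"production peaks in {years[i]}, with {cumulative[i] / URR:.0%} of the resource extracted")
```

The particular numbers don’t matter; the shape does. Once the easy half is gone, the rate of extraction declines no matter how loudly demand insists otherwise.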

The relationship of fictional stories to reality is quite complex. For this blog post, a radically narrow description is that fiction is the imaginary space where ideas can be tried out and explored safely in preparation for implementation in reality. Science fiction (e.g., imagining interstellar space travel despite its flat impossibility in Newtonian physics) is a good example. Some believe humans can eventually accomplish what’s depicted in sci-fi, and in certain limited examples we already have. But many sci-fi stories simply don’t present a plausible reality. Taken as vicarious entertainment, they’re AOK superfine with me. But given that Western cultures (I can’t opine on cultures outside the West) have veered dangerously into rank ideation and believing their own hype, too many people believe fervently in aspirational futures that have no hope of ever instantiating. Just like giant pools of oil hidden under the Rocky Mountains (to cite something sent to me just today offering illusory relief from skyrocketing gasoline prices).

Among the many genres of narrative now on offer in fiction, there is no better example of sought-after power than the superhero story. Identifying with the technological and financial power of Iron Man and Batman or the god-powers of Thor and Wonder Woman is thrilling, perhaps, but again, these are not plausible realities. Yet these superrich, superstrong, superintelligent superheroes are everywhere in fiction, attesting to liminal awareness of lack of power and indeed frailty. Many superhero stories are couched as coming-of-age stories for girls, who with grit and determination can fight toe-to-toe with any man and dominate. (Too many BS examples to cite.) Helps, of course, if the girl has magic at her disposal. Gawd, do I tire of these stories, told as origins in suffering, acquisition of skills, and coming into one’s own with the mature ability to force one’s will on others, often in the form of straight-up killing and assassination. Judge, jury, and executioner all rolled into one but entirely acceptable vigilantism if done wearing a supersuit and claiming spurious, self-appointed moral authority.

There are better narratives that don’t conflate power with force or lack plausibility in the world we actually inhabit. In a rather complicated article by Adam Tooze entitled “John Mearsheimer and the Dark Origins of Realism” at The New Statesman, after a lengthy historical and geopolitical analysis of competing narratives, a mode of apprehending reality is described:

… adopting a realistic approach towards the world does not consist in always reaching for a well-worn toolkit of timeless verities, nor does it consist in affecting a hard-boiled attitude so as to inoculate oneself forever against liberal enthusiasm. Realism, taken seriously, entails a never-ending cognitive and emotional challenge. It involves a minute-by-minute struggle to understand a complex and constantly evolving world, in which we are ourselves immersed, a world that we can, to a degree, influence and change, but which constantly challenges our categories and the definitions of our interests. And in that struggle for realism – the never-ending task of sensibly defining interests and pursuing them as best we can – to resort to war, by any side, should be acknowledged for what it is. It should not be normalised as the logical and obvious reaction to given circumstances, but recognised as a radical and perilous act, fraught with moral consequences. Any thinker or politician too callous or shallow to face that stark reality, should be judged accordingly.

The phrase “all roads lead to Rome” is a way of saying that, at its height, Rome was the center of the Western world and multiple paths led to that eventual destination. That’s where the action was. Less obviously, the phrase also suggests that different approaches can lead to an identical outcome. Not all approaches to a given result are equal, however, but who’s splitting those hairs? Not too many, and not nearly enough. Instead, the public has been railroaded into false consensus on a variety of issues, the principal attributes being that the destination is predetermined and all rails lead there. Probably oughta be a new saying about being “railroaded to Rome” but I haven’t hit upon a formulation I like.

Ukraine

War drums have been beating for some time now about a hotly desired (by TPTB, who else?) regional war over Ukraine being aligned with Europe or part of Russia. Personal opinions (mine, yours) on whether Ukraine might join NATO or be annexed by Russia don’t really matter. Even a default aversion to war doesn’t matter. Accordingly, every step taken by the Russian government is reported as a provocation, escalation, and/or signal of imminent invasion. And in reverse, no interference, meddling, or manipulation undertaken by Western powers is cause for concern because it’s all good, clean, innocent business. Wasn’t a version of this maneuver executed in the run-up to the U.S.-led invasion of Iraq? (Be honest, it was a preemptive invasion that preempted nothing.) Legacy media have failed entirely (as I understand their reporting, anyway) to ask the right questions or conceive of any outcome that isn’t war with Russia. (Similar moves are being made regarding the China-Taiwan controversy.) After all, that’s where the action is.

Vaccines

Many observers were surprised at how quickly vaccines appeared after the current pandemic circled the world. Vaccines normally take many years to achieve sufficient safety and effectiveness to be approved for general use. However, Covid vaccines were apparently in development well before the pandemic hit because medical labs had been monkeying with the virus for years along with possible treatments. No matter that vaccine safety and effectiveness were never fully realized. The rush to market was a predetermined outcome that required emergency use authorization, redefinition of the term vaccine, and active suppression of other drug protocols. Mandates that everyone (EVERYONE!) be vaccinated and boosted (and now boosted again every few months to keep current) are in clear conflict with a host of rules, codes, and medical ethics, to say nothing of common sense. Creation of two-tier societies based on vaccination status (three-tier if RightThink is added) to force the unvaccinated into compliance is outright tyranny. This coercion lends false legitimacy to an emerging biosecurity state with wide application for the supposedly clean (vaccinated but not necessarily healthy) and unclean (unvaccinated but not necessarily unhealthy). After all, that’s where the action is.

Free Thought and Free Speech

The constant thrum of fearmongering legacy media has turned a large percentage of the public into cowering fools lest something turn out to be even mildly upsetting or give offense. “Save me, mommy! Protect me, daddy! Allow no one to introduce or discuss a fact or idea that conflicts with my cherished innocence. Words are violence!” Well, sorry, snowflake. The big, bad world is not set up to keep you safe, nor is history a series of nice, tidy affairs without disturbing incidents or periodic social madness. Moreover, governments, media, and scientists cannot rightfully claim to be final arbiters of truth. Truth seeking and truth telling are decidedly messy and have been throughout modern history. Despite attempts, no one can command or suppress thought in any but the most banal fashion (e.g., don’t think about polka-dot elephants); that’s not how cognition works except perhaps under extraordinary circumstances. Similarly, the scientific method doesn’t work without free, open scrutiny and reevaluation of scientific claims. Yet the risible notion that people can or should be railroaded into approved thought/speech via censorship and/or cancellation is back with a vengeance. Were lessons of the past (or the present in some regimes) never learned? Indeed, I must admit to being flabbergasted at how otherwise normal thinkers (no disability or brain damage) — especially reflexive rule-followers (sheeple?) who shrink from all forms of conflict — have accepted rather easily others quite literally telling them what to think/say/do. To resolve cognitive dissonance lurking beneath consciousness, rationalizations are sought to explain away the inability to think for oneself with integrity. Adoption of routine DoubleThink just floors me. But hey, that’s where the action is.

Addendum

Apologies for citing George Orwell as often as I do. Orwell got so much correct in his depiction of a dystopic future. Regrettably, and as others have pointed out, TPTB have mistaken 1984 (Orwell’s novel) as a playbook rather than a dire warning.

From a Substack author going by the pseudonym Moneycircus, describing the cult of paranoid preparedness (a subset of safetyism?), unnecessary paragraph breaks removed:

We should be alert to the suffering of children. For they are the most delicate in society, the point at which the bough breaks. Children should experience life one bright day at a time, bursting with colours, tastes and sounds. It is an experience so complete that they only have time for the present. Yet talking to children during the pandemic I see their time accelerates. They are already falling into remembrance. They ask questions that only adults should ask, and later in life: “Do you remember when … such and such? What was that place where …?” This means they are experiencing life at one remove. This is cruelty beyond measure.

As a sometimes presenter of aphorisms, felicitous and humorous turns of phrase and logic interest me as examples of heuristics aimed at parsimony and cognitive efficiency. Whether one recognizes those terms or not, everyone uses snap categorization and other shortcuts to manage crowded thinking and alleviate overwhelming demands on perception. Most of us, most of the time, use sufficiency as the primary decision-making mode, which boils down to “close enough for horseshoes and hand grenades.” Emotion is typically the trigger, not rational analysis. After enough repetition is established, unthinking habit takes over. Prior to habituation, however, the wisdom of sages has provided useful rubrics to save unnecessary and pointless labor over casuistry flung into one’s way to impede, convince, or gaslight. (I previously wrote about this effect here.)

As categories, I pay close attention to razors, rules, laws, principles, and Zuihitsu when they appear as aphorisms in the writing of those I read and follow online. Famous rules, laws, and principles include Occam’s Razor, (Finagle’s Corollary to) Murphy’s Law, Godwin’s Law, Jevons Paradox, and the Dunning-Kruger Effect (do your own searches if these escape you). Some are quite useful at dispelling faulty thinking and argumentation. Café Bedouin (see blogroll) has an ongoing series of Zuihitsu, which has grown quite long. Many ring fundamentally true; others are either highly situational or wrong on their face, perhaps revealing the cardinal weakness of reduction of ideas to short, quotable phrases.

I recently learned of Hitchens’ Razor (after Christopher Hitchens), usually given as “What can be asserted without evidence can also be dismissed without evidence.” According to the Wikipedia entry, it may well have been reconstituted, repurposed, or revived from other sources stretching back into antiquity. Caitlin Johnstone, a notable aphorist I’ve quoted numerous times, uses Hitchens’ Razor to put the lie to claims from the U.S. war machine and its dutiful media lapdogs that the “situation in Ukraine” (whatever that is) demands intervention by Western powers lest the utility bad guys of the moment, the Russians, be allowed to run roughshod over their neighbor Ukraine, which (significantly) used to be part of the now-defunct Soviet Union. As with many controversial, inflammatory claims and assertions continuously heaped like a dog pile on hapless U.S. citizens with little time, few resources, and no obligation to perform their own investigations and analyses, I have only weak opinions but very strong suspicions. That’s where Hitchens’ Razor comes in handy. Under its instruction, I can discard out of hand and in disbelief extraordinary claims designed to whip me and the wider public into an emotional frenzy and thus accept or support actions that shouldn’t just raise eyebrows but be met with considerable dissent, protest, and disobedience. Saves me a lot of time entertaining nonsense just because it gets repeated often enough to be accepted as truth (Bernays’ Principle).

I had that dream again. You know the one: I have to go take a final test in a class I forgot about, never attended, or dropped from my schedule. Most higher-ed students have this dream repeatedly, as do former students (or, for those who take the educational enterprise seriously as a life-long endeavor, perpetual students). The dream usually features open-ended anxiety because it’s all anticipation — one never steps into the classroom to sit for the test. But this time, the twist was that the final test transformed into a group problem-solving seminar. The subject matter was an arcane post-calculus specialty (maybe I’ve seen too many Big Bang Theory whiteboards strewn with undecipherable equations), and the student group was stumped trying to solve some sort of engineering problem. In heroic dream mode, I recontextualized the problem despite my lack of expertise, which propelled the group past its block. Not a true test of knowledge or understanding, since I hadn’t attended class and didn’t learn its subject matter, but a reminder that problem-solving is often not the straightforward application of factors easily set forth and manipulated.

Outside of the dream, in my morning twilight (oxymoron alert), I mused on the limitations of tackling social issues as though they were engineering problems, an approach that typically regards materials, processes, and personnel as mere resources to be marshaled and acted upon to achieve a goal but with little consideration — at least in the moment — of downstream effects or indeed human values. The Manhattan Project is a prime example, which (arguably) helped the Allied powers win WWII but launched the world into the Atomic Age, complete with its own Cold War and the awful specter of mutually assured destruction (MAD). Borrowing a term from economics, it’s easy to rationalize negative collateral effects in terms of creative destruction. I object: the modifier creative masks that the noun is still destruction (cracked eggs needed to make omelets, ya know). Otherwise, maybe the term would be destructive creation. Perhaps I misunderstand, but the breakthrough with the Manhattan Project came about through synthesis of knowledge that lay beyond the purview of most narrowly trained engineers.

That is precisely the problem with many social ills today, those that actually have solutions anyway. The political class, meant to manage and administer, views problems primarily through a political lens (read: campaigning) and is not especially motivated to solve anything. Similarly, charitable organizations aimed at eradicating certain problems (e.g., hunger, homelessness, crime, educational disadvantage) can’t actually solve any problems because that would be the end of their fundraising and/or government funding, meaning that the organizations themselves would cease. Synthetic knowledge needed to solve a problem and then terminate the project is anathema to how society now functions; better that problems persist.

Past blog posts on this topic include “Techies and Fuzzies” and “The Man Who Knew Too Little,” each of which has a somewhat different emphasis. I’m still absorbed by the conflict between generalists and specialists while recognizing that both are necessary for full effectiveness. That union is the overarching message, too, of Iain McGilchrist’s The Master and His Emissary (2010), the subject of many past blog posts.

Ask parents what ambitions they harbor for their child or children and among the most patterned responses is “I just want them to be happy.” I find such an answer thoughtless and disingenuous, and the insertion of the hedge just to make happiness sound like a small ask is a red herring. To begin with, for most kids still in their first decade, happiness and playfulness are relatively effortless and natural so long as a secure, loving environment is provided. Certainly not a default setting, but it’s still quite commonplace. As the dreamy style of childhood cognition is gradually supplanted by supposedly more logical, rational, adult thinking, and as children become acquainted with iniquities of both history and contemporary life, innocence and optimism become impossible to retain. Cue the sullen teenager confronting the yawning chasm between desire and reality. Indeed, few people seem to make the transition into adulthood knowing with much clarity how to be happy in the midst of widespread travail and suffering. Instead, young adults frequently substitute self-destructive, nihilistic hedonism, something learned primarily (says me) from the posturing of movie characters and the celebrities who portray them. (Never understood the trope of criminals hanging at nightclubs, surrounded by drug addicts, nymphos, other unsavory types, and truly awful music, where they can indulge their assholery before everything inevitably goes sideways.)

Many philosophies recommend simplicity, naturalness, and independence as paths to happiness and moral rectitude. Transcendentalism was one such response to social and political complexities that spoil and/or corrupt. Yet two centuries on, the world has only gotten more and more complex, pressing on everyone, especially with demands for information processing in a volume and at a sophistication that do not at all come naturally to most and are arguably not part of our evolutionary toolkit. Multiple social issues, if one is to engage them fairly, hinge on legalistic arguments and bewildering wordplay that render them fundamentally intractable. Accordingly, many wave away all nuance and adopt pro forma attitudes. Yet the airwaves, social media, the Internet, and even dinner conversations are suffused with the worst sorts of hypercomplexity and casuistry that confound even those who traffic regularly in such rhetoric. It’s a very long way from “I just want to be happy.”


/rant on

The ongoing epistemological crisis is getting no aid or relief from the chattering classes. Case in point: the Feb. 2021 issue of Harper’s Magazine has a special supplement devoted to “Life after Trump,” which divides recent history neatly into reality and unreality commencing from either the announcement of Trump’s candidacy, his unexpected success in the Republican primaries, his even less expected election (and inauguration), or now his removal from office following electoral defeat in Nov. 2020. Take your pick which signals the greatest deflection from history’s “proper” course before being derailed into a false trajectory. Charles Yu and Olivia Laing adopt the reality/unreality dichotomy in their contributions to the special supplement. Yu divides (as do many others) the nation into us and them: supporters of a supposed departure from reality/sanity and those whose clear perception penetrates the illusion. Laing bemoans the inability to distinguish fiction and fantasy from truth, unreality masquerading as your truth, my truth, anyone’s truth given repetition and persuasion sufficient to make it stick. Despite familiarity with these forced, unoriginal metaphors, I don’t believe them for a moment. Worse, they serve mainly to encourage siloed thinking and congratulate the “Resistance” for being on the putative correct side of the glaringly obvious schism in the voting populace. Their arguments support a false binary, perpetuating and reinforcing a distorted and decidedly unhelpful interpretation of recent history. Much better analyses than theirs are available.

So let me state emphatically: like the universe, infinity, and oddly enough consciousness, reality is all-encompassing and unitary. Sure, different aspects can be examined separately, but the whole is nonetheless indivisible. Reality is a complete surround, not something one can opt into or out of. That doesn’t mean one’s mind can’t go elsewhere, either temporarily or permanently, but that does not create or constitute an alternate reality. It’s merely dissociation. Considering the rather extreme limitations of human perceptual apparatuses, it’s frankly inevitable that each of us occupies a unique position, an individual perspective, within a much, much (much, much …) larger reality. Add just a couple more axes to the graph below for time (from nanoseconds to eons) and physical scale (from subatomic to cosmic), and the available portion of reality anyone can grasp is clearly infinitesimally small, yet that tiny, tiny portion is utterly everything for each individual. It’s a weird kind of solipsism.

I get that Harper’s is a literary magazine and that writers/contributors take advantage of the opportunity to flex for whatever diminishing readership has the patience to actually finish their articles. Indeed, in the course of the special supplement, more than a few felicitous concepts and turns of phrase appeared. However, despite commonplace protestations, the new chief executive at the helm of the ship of state has not in fact returned the American scene to normal reality after an awful but limited interregnum.

Aside: Citizens are asked to swallow the whopper that the current president, an elder statesman, the so-called leader of the free world, is in full control of his faculties. Funny how his handlers repeatedly erupt like a murder of crows at the first suggestion that a difficult, unvetted question might be posed, inviting the poor fellow to veer even slightly off the teleprompter script. Nope. Lest yet another foot-in-mouth PR disaster occur (too many already to count), he’s whisked away, out of range of cameras and mics before any lasting damage can be done. Everyone is supposed to pretend this charade is somehow normal. On the other hand, considering how many past presidents were plainly puppets, spokespersons, or charlatans (or at least denied the opportunity to enact an agenda), one could argue that the façade is normal. “Pay no attention to the man [or men] behind the curtain. I am the great and powerful Wizard of Oz!”

With some dismay, I admit that the tiny sliver of reality to which many attend incessantly is an even smaller subset of reality, served up via small, handheld devices that fit neatly in one’s pocket. One could say theirs is a pocket reality, mostly mass media controlled by Silicon Valley platforms and their censorious algorithms. Constrained by all things digital, and despite voluminous ephemera, that reality bears little resemblance to what digital refuseniks experience without the blue glare of screens washing all the color from their faces and their own authentic thoughts out of their heads. Instead, I recommend getting outside, into the open air and under the warm glow of the yellow sun, to experience life as an embodied being, not as a mere processor of yet someone else’s pocket reality. That’s how we all start out as children before getting sucked into the machine.

Weirdly, only when the screen size ramps up to 30 feet tall do consumers grow skeptical and critical of storytelling. At just the moment cinema audiences are invited to suspend disbelief, the Reality Principle and logic are applied to character, dialogue, plotting, and make-believe gadgetry, which often fail to ring true. Why does fiction come under such careful scrutiny while reality skates right on by, allowing the credulous to believe whatever they’re fed?

/rant off

Continuing from part 2. I’m so slow ….

If cognitive inertia (i.e., fear of change) used to manifest as technophobia, myriad examples demonstrate how technology has fundamentally penetrated the social fabric and shared mental space, essentially flipping the script to fear of missing out (FOMO) on whatever latest, greatest innovation comes down the pike (laden with fraud and deception — caveat emptor). With FOMO, a new phobia has emerged: fear of technological loss, or more specifically, inability to connect to the Internet. This is true especially among the young, born and bred after the onset of the computing and digital communications era. Who knows when, why, or how loss of connectivity might occur? Maybe a Carrington Event, maybe rolling blackouts due to wildfires (such as those in California and Oregon), maybe a ransomware attack on ISPs, or maybe a totalitarian clampdown by an overweening government after martial law is declared (coming soon to a neighborhood near you!). Or maybe something simpler: infrastructure failure. For some, inability to connect digitally, electronically, is tantamount to total isolation. Being cut off from the thoughts of others and left to one’s own thoughts, even in the short term, is thus roughly equivalent to the torture of solitary confinement. Forget the notion of digital detox.

/rant on

Cheerleaders for technocracy are legion, of course, while the mind boggles at how society might or necessarily will be organized differently when it all fails (as it must, if for no other reason than energy depletion). Among the bounties of the communications era is a surfeit of entertainments, movies and TV shows especially, that are essentially new stories to replace or supplant old stories. It’s no accident, however, that the new ones come wrapped up in the themes, iconography, and human psychology (is there any other kind, really?) of old ones. Basically, everything old is new again. And because new stories are delivered through hyperpalatable media — relatively cheap, on demand, and removed from face-to-face social contexts — they arguably cause as much disorientation as reorientation. See, for instance, the embedded video, which is rather long and rambling but nevertheless gets at how religious instincts manifest differently throughout the ages and are now embedded in the comic book stories and superheroes that have overtaken the entertainment landscape.

Mention is made that the secular age coincides roughly with the rise of video stores, a form of on-demand selection of content more recently made even simpler with ubiquitous streaming services. Did people really start hunkering down in their living rooms, eschewing group entertainments and civic involvements, only in the 1980s? The extreme lateness of that development in Western history is highly suspect, considering the death of god had been declared back in the middle of the 19th century. Moreover, the argument swings around to the religious instinct, a cry for meaning if you will, being blocked by organized churches and their endemic corruption and instead finding expression in so-called secular religions (oxymoron alert). Gawd, how I tire of everything that functions as psychological grounding being called a religion. Listen, pseudo-religious elements can be found in Cheerios if one twists up one’s mind sufficiently. That doesn’t make General Mills or Kellogg’s new secular-religious prophets.

Back to the main point. Like money grubbing, technophilia might quiet the desperate search for meaning temporarily, since there’s always more of both to acquire. Can’t get enough, ever. But after even partial acquisition, the soul feels strangely dissatisfied and disquieted. Empty, one might even say. So out roving into the public sphere one goes, seeking and pursuing something to fill one’s time and appetites. Curiously, many traditional solutions to this age-old problem taught the seeker to probe within as an alternative. Well screw that! In the hyper-connected 21st-century world, who has time for that measly self-isolation? More reified Cheerios!

/rant off