
The comic below struck a chord and reminded me of The Far Side, Gary Larson’s clumsily drawn but often trenchant comics on scientific subjects.

This one masquerades as science but is merely wordplay, i.e., puns, double entendres, and unexpectedly funny malapropisms (made famous by Yogi Berra, among others). Wordplay is also found in various cultural realms, including comic strips and stand-up comedy, advertising and branding, politics, and now Wokedom (a subset of grassroots politics, some might argue). Playing with words has gone from being a clever, sometimes enjoyable diversion (e.g., crossword puzzles) to fully deranging, weaponized language. Some might be inclined to wave away the seriousness of that contention using the childhood retort “sticks and stones ….” Indeed, I’m far less convinced of the psychological power of verbal nastiness than those who insist words are violence. But it’s equally wrong to say that words don’t matter (much) or have no effect whatsoever. Otherwise, why would those acting in bad faith work so tirelessly to control the narrative, often by restricting free speech (as though writing or out-loud speech were necessary for thoughts to form)?

It’s with some exasperation that I observe words no longer retain their meanings. Yeah, yeah … language is dynamic. But semantic shifts usually occur slowly as language evolves. Moreover, for communication to occur effectively, senders and receivers must be aligned in their understandings of words. If you and I have divergent understandings of, say, yellow, we won’t get very far in discussions of egg yolks and sunsets. The same is true of words such as liberal, fascist, freedom, and violence. A lack of shared understanding of terms, perhaps born of ignorance, bias, or agenda, leads to communication breakdown. But it’s gotten far worse than that. The meanings of words have been thrown wide open to PoMo reinterpretation that often inverts them in precisely the way George Orwell observed in his novel 1984 (published 1949): “War is peace. Freedom is slavery. Ignorance is strength.” Thus, earnest discussions of limitations on free speech, and actual restrictions on social media platforms (often via algorithmic identification of keywords that fails to account for irony, sarcasm, or context), fail to register that implementation of restrictive kludges already means free speech is essentially gone. The usual exceptions (obscenity, defamation, incitement, gag orders, secrecy, deceptive advertising, student speech, etc.) are not nearly as problematic because they have been adjudicated over several generations and accepted as established practice. Indeed, many exceptions have been relaxed considerably (e.g., obscenity that has become standard patois now fails to shock or offend), and slimy workarounds are now commonplace (e.g., using “people are saying …” to say something horrible while shielding oneself from having said it). Another gray area includes fighting words and offensive words, categories being expanded (out of a misguided campaign to sanitize?) to include many words with origins as clinical and scientific terms, and faux offense used to stifle speech.

Restrictions on free speech are working in many respects, as many choose to self-censor to avoid running afoul of various self-appointed watchdogs or roving Internet thought police (another Orwell prophecy come true) ready to pounce on some unapproved utterance. One can argue whether self-censorship is cowardly or judicious, I suppose. However, silence and the pretense of agreement only conceal thoughts harbored privately and left unexpressed, which is why restrictions on public speech are fool’s errands and strategic blunders. Maybe the genie can be bottled for a time, but that only produces resentment (not agreement), which boils over into seething rage (and worse) at some point.

At this particular moment in U.S. culture, however, restrictions are not my greatest concern. Rather, it’s the wholesale control of information gathering and reporting that misrepresents or removes from the public sphere ingredients needed to form coherent thoughts and opinions. It’s not happening only to hoi polloi; those in positions of power and control are manipulated, too. (How many lobbyists per member of Congress, industry after industry, whispering in their ears like so many Wormtongues?) And in extreme cases of fame and cult of personality, a leader or despot unwittingly surrounds him- or herself with a coterie of yes-men frankly afraid to tell the truth out of careerist self-interest or because of shoot-the-messenger syndrome. It’s lonely at the top, right?

Addendum: Mere minutes after publishing this post, I wandered over to Bracing Views (on my blogroll) and found this post saying some of the same things, namely, that choking information off at the source results in a degraded information landscape. Worth a read.

/rant on

One of my personal favorites among my own blog posts is my remapping of the familiar Rock, Paper, Scissors contest to Strong, Stupid, and Smart, respectively. In that post, I concluded (among other things) that, despite a supposedly equal three-way power dynamic, in the long run, nothing beats stupid. I’ve been puzzling recently over this weird dynamic in anticipation of a mass exodus of boomers from the labor force as they age into retirement (or simply die off). (Digression about the ruling gerontocracy withheld.) It’s not by any stretch clear that their younger cohorts, divided into not-so-cleverly named demographics, are prepared to bring competence or work ethic to bear on labor needs, which include job descriptions ranging across the spectrum from blue collar to white collar to bona fide expert.

Before I’m accused of ageism and boomerism, let me say that I don’t regard the issue as primarily a function of age but rather as a result of gradual erosion of educational standards that has now reached such a startling level of degradation that many American institutions are frankly unable to accomplish their basic missions for lack of qualified, competent, engaged workers and managers. See, for example, this Gallup poll showing how confidence in U.S. institutions is ebbing. Curious that the U.S. Congress is at the very bottom, followed closely and unsurprisingly by the TV news. Although the poll only shows year-over-year decline, it’s probably fair to say the overall consensus is that institutions simply cannot be relied upon anymore to do their jobs effectively. I’ve believed that for some time about Cabinet-level agencies that, administration after administration, never manage to improve the worsening conditions that are the very reason for their existence. Some of those failures are arguably because solutions to issues simply do not exist (such as with the renewed energy crisis or the climate emergency). But when addressing concerns below the level of global civilization, my suspicion is that failure is the result of a combination of corruption (including careerism) and sheer incompetence.

The quintessential example came to my attention in the embedded YouTube video, which spells out in gruesome detail how schools of education are wickedly distorted by ideologues pushing agendas that don’t produce either better educational results or social justice. Typically, it’s quite the reverse.

In short, school administrators and curriculum designers are incompetent boobs (much like many elected government officials) possessed of decision-making authority who have managed to quell dissent among the ranks precisely because many who know better are invested in careers and pension programs that would be sacrificed in order to call bullshit on insane things now being forced on everyone within those institutions. Those of us who attended college often witnessed how, over the course of several decades, faculties have essentially caved repeatedly on issues where administrators acted in contravention of respectable educational goals and policies. Fortitude to resist is not in abundance for reasons quite easy to understand. Here’s one cry from the wilderness by a college professor retiring early to escape the madness. One surmises that whoever is hired as a replacement will check a number of boxes, including compliance with administrative diktats.

Viewpoint diversity may well be the central value being jettisoned, along with the ability to think clearly. If cultism is truly the correct characterization, administrators have adopted an open agenda of social reform and appear to believe that they, uniquely, have arrived at the correct answers (final solutions?) to be brainwashed into both teachers and students as doctrine. Of course, revolutions are launched on the strength of such misguided convictions, often purging resistance violently and revealing that best intentions matter little in the hellscapes that follow. But in the short term, the basic program is to silence dissent, as though banishing disallowed thinking from the public sphere collapses viewpoint diversity. Nope, sorry. That’s not how cognition works except in totalitarian regimes that remove all awareness of the options and interpretations we in open societies currently take for granted. It’s barking mad, for instance, to think that all the propaganda flung at the American public about, say, the proxy war in Ukraine is truly capable of buffaloing the entire population into believing we (the U.S. and NATO) are the good guys in the conflict. (There are no good guys.) Even the Chinese government, with its restricted Internet and social credit system, can’t entirely squelch the yearning to think and breathe freely. Instead, that’s the domain of North Korea, which only despots hold up as a salutary model.

My point, which bears reiteration, is that poorly educated, miseducated, and uneducated ignoramuses (the ignorati, whose numbers have swelled) in positions of power and influence embody precisely the unmovable, unreachable, slow, grinding stupidity that can’t be overcome by knowledge, understanding, expertise, or appeals to reason and good faith. Stupid people don’t know just how stupid they are but sally forth with blind confidence in themselves, and their abject stupidity becomes like kryptonite used to weaken others. One can use smarts (scissors) once in a while to cut through stupidity (paper), but in the end, nothing beats stupid.

/rant off

This intended follow-up has been stalled (pt. 1 here) for one simple reason: the premise presented in the embedded YouTube video is (for me at least) easy to dismiss out of hand, and I haven’t wanted to revisit it. Nevertheless, here’s the blurb at the top of the comments at the webpage:

Is reality created in our minds, or are the things you can touch and feel all that’s real? Philosopher Bernardo Kastrup holds doctorates in both philosophy and computer science, and has made a name for himself by arguing for metaphysical idealism, the idea that reality is essentially a mental phenomenon.

Without going into point-by-point discussion, the top-level assertion, if I understand it correctly (not assured), is that material reality comes out of mental experience rather than the reverse. It’s a chicken-and-egg question, with materialism and idealism (fancy technical terms not needed) each vying for primacy. The string of conjectures (mental gymnastics, really, briefly impressive until one recognizes how quickly they lose correlation with how most of us think about and experience reality) that inverts the basic relationship of inner experience to outer reality is an example of waaaay overthinking a problem. No doubt quite a lot of erudition can be brought to bear on such questions, but even if those questions were resolved satisfactorily on an intellectual level and an internally coherent structure or system were developed or revealed, it doesn’t matter or lead anywhere. Humans are unavoidably embodied beings. Each individual existence also occupies a tiny sliver of time (the timeline extending in both directions to infinity). Suggesting that mental experience is briefly instantiated in personhood but is actually drawn out of some well of souls, collective consciousness, or panpsychism, which it rejoins in heaven, hell, or elsewhere upon expiration, is essentially a religious claim. It’s also an attractive supposition, granting each of us not permanence or immortality but rather something somehow better (?) though inscrutable because it lies beyond perception (but not conceptualization). Except for an eternity of torments in hell, I guess, if one deserves that awful fate.

One comment about Kastrup. He presents his perspective (his findings?) with haughty derision of others who can’t see or understand what is so (duh!) obvious. He falls victim to the very same over-intellectualized flim-flam he mentions when dismissing materialists who need miracles and shortcuts to smooth over holes in their scientific/philosophical systems. The very existence of earnest disagreement by those who occupy themselves with such questions might suggest some humility, as in “here’s my explanation, they have theirs, judge for yourself.” But there’s a third option: the great unwashed masses (including nearly all our ancestors) for whom such questions are never even fleeting thoughts. It’s all frankly immaterial (funnily, both the right and wrong word at once). Life is lived and experienced fundamentally on its surface — unless, for instance, one has been incubated too long within the hallowed halls of academia, lost touch with one’s brethren, and become preoccupied with cosmic investigations. Something quite similar happens to politicians and the wealthy, who typically hyperfocus on gathering power to themselves and then exercising that power over others (typically misunderstood as simply pulling the levers and operating the mechanisms of society). No wonder their pocket of reality looks so strikingly different.

Here’s a deal many people would take: you get to live in the First World and enjoy the ample albeit temporary benefits of a modern, post-industrial economy, but to enable you, people in the Third World must be brutally exploited, mostly out of sight and out of mind. (Dunno what to say about the Second World; comparisons are typically hi-lo. And besides, that Cold War nomenclature is probably badly out of date.) There’s no need to say “would take,” of course, because that’s already the default in the First World. Increasingly, that’s also the raw deal experienced by the lower/working class in the United States, which now resembles other failed states. People without means are driven into cycles of poverty or channeled into the prison-industrial complex to labor for a pittance. That’s not enough, though. The entirety of public health must be gamed as a profit center for Big Pharma, which wrings profit out of suffering just like the U.S. prison system. That’s one of the principal takeaways from the last two years of pandemic. Indeed, from a capitalist perspective, that’s what people are for: to feed the beast (i.e., produce profit for the ownership class). For this very reason — the inhumanity of exploiting and subjugating people — critics of capitalism believe the ruthlessness of the profit motive cannot be tempered and the economic system is ripe for replacement.

Arguments that, “yeah, sure, it’s a flawed system, but it’s still the best one on offer” are unconvincing. Rather, they’re a rationalization for a failure to imagine how a better, more equitable system might be developed and tried. Human nature, frankly as “animal” as any other animal’s, also discourages anyone from rising above social conditioning or breaking from the herd. Instead, history forces fundamental change only when decrepit systems no longer function. Late-stage capitalism, having reached nearly the full extent of easily exploitable resources (materials and labor), is creaking and groaning under the weight of its inbuilt perpetual growth imperative. The dynamic is nonnegotiable, as measures of gross national product (GNP) are acceptable only when positive, the higher the better. Whereas previous social/economic systems failed in fits and starts, transitioning gradually from one to the next, it’s doubtful capitalism can morph gracefully into a new system given its momentum and totalizing character.

For many millennia, slavery was the solution to labor needs; it became morally intolerable, especially in the 19th century, but was really only driven underground, never truly extinguished. That’s the point of the first paragraph above. Terminology and mechanisms have sometimes been swapped out, but the end result is scarcely less disagreeable for those on the bottom rungs. Globalization brought practically the entire world population into the money economy, which had been irrelevant to peasant and subsistence societies. Apologists often say that the spread of capitalism lifted those peoples out of poverty. That’s a ridiculous claim while wealth/income inequality continues to launch the ultrarich into the stratosphere (literally, in the infamous cases of at least a couple billionaires) compared to the masses. Yes, refrigerators and cell phones are now commonplace, but those are hardly the best measures of human wellbeing.

So what’s a person of conscience to do? Born into a socioeconomic system from which there is no escape — at least until it all collapses — is there any way not to be evil, not to exploit others? Hard to say, considering we all consume (in varying degrees) products and services obtained and provided through the machinations of large corporations exploiting humans and nature on our behalf. When it all does collapse in a heap of death and destruction, don’t look for high-minded reexamination of the dynamics that led to that endgame. Rather, it will be everyone for themselves in a futile attempt to preserve life against all odds.

I use the tag redux to signal that the topic of a previous blog post is being revisited, reinforced, and repurposed. The choice of title for this one could easily have gone instead to Your Brain on Postmodernism, Coping with the Post-Truth World, or numerous others. The one chosen, however, is probably the best fit, given that compounding crises continue pushing us along the path of self-annihilation. Once one crisis grows stale — at least in terms of novelty — another is rotated in to keep us shivering in fear, year after year. The date of civilizational collapse is still unknown, though collapse is really more a process anyway, one of unknown duration. Before reading what I’ve got to offer, perhaps wander over to Clusterfuck Nation and read James Howard Kunstler’s latest take on our current madness.

/rant on

So yeah, various cultures and subcultures are either in the process of going mad or have already achieved that sorry state. Because madness is inherently irrational and unrestrained, specific manifestations are unpredictable. However, the usual trigger for entire societies to lose their tether to reality is relatively clear: existential threat. And boy howdy are those threats multiplying and gaining intensity. Pick which of the Four Horsemen of the Apocalypse to ride with to the grave, I guess. Any one will do; all four are galloping simultaneously, plus a few other demonic riders not identified in that mythological taxonomy. Kunstler’s focus du jour is censorship and misinformation (faux disambiguation: disinformation, malinformation, dishonesty, gaslighting, propaganda, fake news, falsehood, lying, cozenage, confidence games, fraud, conspiracy theories, psyops, personal facts), about which I’ve blogged repeatedly under the tag epistemology. Although major concerns, censorship and misinformation are outgrowths of spreading madness, not the things that will kill anyone directly. Indeed, humans have shown a remarkable capacity to hold in mind crazy belief systems or stuff down discomfiting and disapproved thoughts even without significant threat. Now that significant threats spark the intuition that time is running perilously short, no wonder so many have fled reality into the false safety of ideation. Inability to think and express oneself freely or to detect and divine truth does, however, block what few solutions to problems remain to be discovered.

Among recent developments I find unsettling and dispiriting is news that U.S. officials, in their effort to — what? — defeat the Russians in a war we’re not officially fighting, are just making shit up and issuing statements to their dutiful stenographers in the legacy press to report. As I understand it, there isn’t even any pretense about it. So to fight phantoms, U.S. leaders conjure out of nothingness justifications for involvements, strategies, and actions that are the stuff of pure fantasy. This is fully, recognizably insane: to fight monsters, we must become monsters. It’s also maniacally stupid. Further, it’s never been clear to me that Russians are categorically baddies. They have dealt with state propaganda and existential threats (e.g., the Bolshevik Revolution, WWII, the Cold War, the Soviet collapse, being hemmed in by NATO countries) far more regularly than most Americans and know better than to believe blindly what they’re told. On a human level, who can’t empathize with their plights? (Don’t answer that question.)

In other denial-of-reality news, demand for housing in Sun Belt cities has driven rent increases of roughly 30% to 60% over the past two years, compared to well under 10% in many northern cities. Americans are migrating to the Sun Belt despite, for instance, catastrophic drought and wildfires. Lake Powell sits at an historically low level, threatening reductions in water and electrical power. What happens when desert cities in CA, AZ, NV, and NM become uninhabitable? Texas isn’t far behind. This trend has been visible for decades, yet many Americans (and immigrants, too) are positioning themselves directly in harm’s way.

I’ve been a doomsayer for over a decade now, reminding my two or three readers (on and off) that the civilization humans built for ourselves cannot stand much longer. Lots of people know this yet act as though concerns are overstated or irrelevant. It’s madness, no? Or is it one last, great hurrah before things crack up apocalyptically? On balance, what’s a person to do but to keep trudging on? No doubt the Absurdists got something correct.

/rant off

Continuing from the previous blog post, lengthy credit scrolls at the ends of movies have become a favorite hiding place for bloopers and teasers. The purpose of this practice is unclear, since I can’t pretend (unlike many reckless opinionators) to inhabit the minds of filmmakers, but it has become a fairly reliable afterthought for film-goers willing to wait out the credits. Those who depart the theater, change the channel, or click away to other content may know they are relinquishing some last tidbit to be discovered, but there’s no way to know in advance whether one is being punked or pleased, or indeed whether there is anything at all there. Clickbait news often employs this same technique, teasing some newsbit in the headline to entice readers to wade (or skim) through a series of (ugh!) one-sentence paragraphs to find the desired content, which sometimes is not even provided. At least one film (Monty Python’s The Secret Policeman’s Other Ball (1982), as memory serves) pranked those in a rush to beat foot traffic out of the theater (back when film-going meant visiting the cinema) by having an additional thirty minutes of material after the (first) credit sequence.

This also put me in mind of Paul Harvey radio broadcasts ending with the sign-off tag line, “… the rest of the story.” Harvey supplemented the news with obscure yet interesting facts and analysis that tended to reshape one’s understanding of consensus narrative. Such reshaping is especially important as an ongoing process of clarification and revision. When served up in delectable chunks by winning personalities like Paul Harvey, supplemental material is easily absorbed. When material requires effort to obtain and/or challenges something one believes strongly, well, the default response is probably not to bother. However, those possessing intellectual integrity welcome challenging material and indeed seek it out. Invalidation of a thesis or hypothesis is fundamental to the scientific method, and no body of work can be sequestered from scrutiny and still be held as legitimately authoritative.

Yet that’s what happens routinely in the contemporary infosphere. A government press office or corporate public relations officer issues guidance or policy in direct conflict with earlier guidance or policy and in doing so seeks to place any resulting cognitive dissonance beyond examination and out of scope. Simple matters of adjustment are not what concern me. Rather, it’s wholesale brainwashing that is of concern, when something is clear within one’s memory or plainly documented in print/video yet brazenly denied, circumvented, and deflected in favor of a new directive. The American public has contended with this repeatedly as each new presidential administration demonizes the policies of its predecessors, typically without demonstrating the self-reflection and -examination to admit wrongdoing, responsibility, or error on anyone’s part. It’s a distinctly American phenomenon, though others have cottoned onto it and adopted the practice for themselves.

Exhaustion from separating the spin-doctored utterances of one malefactor or another from one’s own direct experience and sense-making drives many to simply give up. “Whatever you say, sir. Lemme go back to my entertainments.” The prospect of a never-ending slog through evidence and analysis only to arrive on unsteady ground, due to shift underfoot again and again with each new revelation, is particularly unsatisfactory. And as discussed before, those who nonetheless strain to achieve knowledge and understanding that reach temporary sufficiency yet remain permanently, intransigently provisional find themselves thwarted by those in the employ of organizations willing and eager to game information systems in the service of their not-even-hidden agendas. Alternative dangers for the muddled thinker include retreating into fixed ideology or collapsing into solipsism. Maybe none of it matters in the end. We can choose our beliefs from the buffet of available options without adherence to reality. We can create our own reality. Of course, that’s a description of madness, to which many have already succumbed. Why aren’t they wearing straitjackets?

From Barrett Swanson’s article “The Anxiety of Influencers” in the June 2021 issue of Harper’s Magazine:

For the past thirteen years, I’ve taught a course called Living in the Digital Age, which mobilizes the techniques of the humanities—critical thinking, moral contemplation, and information literacy—to interrogate the version of personhood that is being propagated by … social networks. Occasionally, there have been flashes of student insight that rivaled moments from Dead Poets Society—one time a student exclaimed, “Wait, so on social media, it’s almost like I’m the product”—but it increasingly feels like a Sisyphean task, given that I have them for three hours a week and the rest of the time they are marinating in the jacuzzi of personalized algorithms.

As someone who suffers from Churchillian spells of depression, it was easy for me to connect this to the pervasive disquiet on campus. In the past ten years, my email correspondence has been increasingly given over to calming down students who are hyperventilating with anxiety—about grades, about their potential marketability, about their Instagram followings. The previous semester, for instance, during a class on creative non-fiction, twenty-four of my twenty-six students wrote about self-harm or suicidal ideation. Several of them had been hospitalized for anxiety or depression, and my office hours were now less occasions to discuss course concepts—James Baldwin’s narrative persona, say, or Joan Didion’s use of imagery—than they were de facto counseling sessions. Even students who seemed happy and neurologically stable—Abercrombie-clad, toting a pencil case and immaculate planner—nevertheless displayed unsettling in-class behavior: snacking incessantly during lectures, showing Victorian levels of repression. The number of emotional-support service animals had skyrocketed on campus. It seemed like every third person had a Fido in tow, and had you wandered into my lecture hall when we were still holding in-person classes, you might have assumed that my lessons were on obedience training or the virtues of dog-park etiquette. And while it seems clichéd even to mention it, the students were inexorably—compulsively—on their phones.

Further to this blog post, see this quote from Daniel Schwindt’s The Case Against the Modern World (2016), which will be the subject of a new book blogging project:

As Frank Herbert, the master of science fiction, once put it: “fear is the mind-killer.” And this is the precise truth, because a person acting in fear loses his capacity for judgment precisely insofar as he is affected by his fear. In fear, he does things that, in a peaceful frame of mind, he’d have found ridiculous. This is why we would expect that, if fear were to become a generalized condition in a civilization, knowledge itself would begin to deteriorate. [p. 35]

Ask parents what ambitions they harbor for their child or children and among the most patterned responses is “I just want them to be happy.” I find such an answer thoughtless and disingenuous, and the insertion of the hedge just to make happiness sound like a small ask is a red herring. To begin with, for most kids still in their first decade, happiness and playfulness are relatively effortless and natural so long as a secure, loving environment is provided. Certainly not a default setting, but it’s still quite commonplace. As the dreamy style of childhood cognition is gradually supplanted by supposedly more logical, rational, adult thinking, and as children become acquainted with iniquities of both history and contemporary life, innocence and optimism become impossible to retain. Cue the sullen teenager confronting the yawning chasm between desire and reality. Indeed, few people seem to make the transition into adulthood knowing with much clarity how to be happy in the midst of widespread travail and suffering. Instead, young adults frequently substitute self-destructive, nihilistic hedonism, something learned primarily (says me) from the posturing of movie characters and the celebrities who portray them. (Never understood the trope of criminals hanging at nightclubs, surrounded by drug addicts, nymphos, other unsavory types, and truly awful music, where they can indulge their assholery before everything inevitably goes sideways.)

Many philosophies recommend simplicity, naturalness, and independence as paths to happiness and moral rectitude. Transcendentalism was one such response to social and political complexities that spoil and/or corrupt. Yet two centuries on, the world has only gotten more and more complex, pressing on everyone, especially in its demands for information processing at volumes and levels of sophistication that do not come naturally to most and are arguably not part of our evolutionary toolkit. Multiple social issues, if one is to engage them fairly, hinge on legalistic arguments and bewildering wordplay that render them fundamentally intractable. Accordingly, many wave away all nuance and adopt pro forma attitudes. Yet the airwaves, social media, the Internet, and even dinner conversations are suffused with the worst sorts of hypercomplexity and casuistry, which confound even those who traffic regularly in such rhetoric. It’s a very long way from “I just want to be happy.”


/rant on

The ongoing epistemological crisis is getting no aid or relief from the chattering classes. Case in point: the Feb. 2021 issue of Harper’s Magazine has a special supplement devoted to “Life after Trump,” which divides recent history neatly into reality and unreality commencing from the announcement of Trump’s candidacy, his unexpected success in the Republican primaries, his even less expected election (and inauguration), or now his removal from office following electoral defeat in Nov. 2020. Take your pick which signals the greatest deflection from history’s “proper” course before being derailed into a false trajectory. Charles Yu and Olivia Laing adopt the reality/unreality dichotomy in their contributions to the special supplement. Yu divides the nation (as do many others) into us and them: supporters of a supposed departure from reality/sanity and those whose clear perception penetrates the illusion. Laing bemoans the inability to distinguish fiction and fantasy from truth, unreality masquerading as your truth, my truth, anyone’s truth given repetition and persuasion sufficient to make it stick. Despite familiarity with these forced, unoriginal metaphors, I don’t believe them for a moment. Worse, they serve mainly to encourage siloed thinking and to congratulate the “Resistance” for being on the putatively correct side of the glaringly obvious schism in the voting populace. Their arguments support a false binary, perpetuating and reinforcing a distorted and decidedly unhelpful interpretation of recent history. Much better analyses than theirs are available.

So let me state emphatically: like the universe, infinity, and, oddly enough, consciousness, reality is all-encompassing and unitary. Sure, different aspects can be examined separately, but the whole is nonetheless indivisible. Reality is a complete surround, not something one can opt into or out of. That doesn’t mean one’s mind can’t go elsewhere, either temporarily or permanently, but that does not create or constitute an alternate reality. It’s merely dissociation. Considering the rather extreme limitations of human perceptual apparatuses, it’s frankly inevitable that each of us occupies a unique position, an individual perspective, within a much, much (much, much …) larger reality. Add just a couple more axes to the graph below for time (from nanoseconds to eons) and physical scale (from subatomic to cosmic), and the available portion of reality anyone can grasp is clearly infinitesimally small, yet that tiny, tiny portion is utterly everything for each individual. It’s a weird kind of solipsism.

I get that Harper’s is a literary magazine and that writers/contributors take advantage of the opportunity to flex for whatever diminishing readership has the patience to actually finish their articles. Indeed, in the course of the special supplement, more than a few felicitous concepts and turns of phrase appeared. However, despite commonplace protestations, the new chief executive at the helm of the ship of state has not in fact returned the American scene to normal reality after an awful but limited interregnum.

Aside: Citizens are asked to swallow the whopper that the current president, an elder statesman, the so-called leader of the free world, is in full control of his faculties. Funny how his handlers repeatedly erupt like a murder of crows at the first suggestion that a difficult, unvetted question might be posed, inviting the poor fellow to veer even slightly off the teleprompter script. Nope. Lest yet another foot-in-mouth PR disaster occur (too many already to count), he’s whisked away, out of range of cameras and mics, before any lasting damage can be done. Everyone is supposed to pretend this charade is somehow normal. On the other hand, considering how many past presidents were plainly puppets, spokespersons, or charlatans (or at least denied the opportunity to enact an agenda), one could argue that the façade is normal. “Pay no attention to the man [or men] behind the curtain. I am the great and powerful Wizard of Oz!”

With some dismay, I admit that the tiny sliver of reality to which many attend incessantly is an even smaller subset of reality, served up via small, handheld devices that fit neatly in one’s pocket. One could say theirs is a pocket reality, mostly mass media controlled by Silicon Valley platforms and their censorious algorithms. Constrained by all things digital, and despite voluminous ephemera, that reality bears little resemblance to what digital refuseniks experience without the blue glare of screens washing all the color from their faces and their own authentic thoughts out of their heads. Instead, I recommend getting outside, into the open air and under the warm glow of the yellow sun, to experience life as an embodied being, not as a mere processor of someone else’s pocket reality. That’s how we all start out as children before getting sucked into the machine.

Weirdly, only when the screen size ramps up to 30 feet tall do consumers grow skeptical and critical of storytelling. At just the moment cinema audiences are invited to suspend disbelief, the Reality Principle and logic are applied to character, dialogue, plotting, and make-believe gadgetry, which often fail to ring true. Why does fiction come under such careful scrutiny while reality skates right on by, allowing the credulous to believe whatever they’re fed?

/rant off