Archive for the ‘Narrative’ Category

Like another show I dubbed SufferFest, Blonde is one to skip unless you enjoy watching people suffer; if you haven’t yet surrendered the nearly three-hour run time, save your effort. This pointless, highly selective biopic on Marilyn Monroe depicts her solely as struggling under the so-called tyranny of the lurid male gaze. (No real acknowledgement of her participation in that dynamic.) Yeah, it’s been long established that the old Hollywood studio system and the devouring public can’t avoid destroying their most iconic creations — especially the females. That’s the bitter end of the hype cycle if one is unfortunate enough to succumb. Oddly, the film omits Marilyn Monroe’s death at a relatively young age under questionable circumstances.

To its extremely modest credit, the film separates Norma Jeane Mortensen (or Baker) from Marilyn Monroe, meaning that the blonde bombshell character was an act, a Hollywood fiction — at least until it stopped being one. Whereas most would probably prefer to be remembered in a positive light (e.g., an Irish wake), this film insists (and insists and insists and then insists some more) on its subject being a victim and sufferer, which makes the audience into voyeurs and accomplices. A couple of brief moments suggesting her exceptional, incandescent talent on multiple levels are reframed as yet more suffering. The film probably fits the contemporary Woke narrative of victimhood and is wholly unenjoyable. I regret having watched it. Earns a negative recommendation: stay away.

The comic below struck a chord and reminded me of Gary Larson’s clumsily drawn but often trenchant Far Side comics on scientific subjects.

This one masquerades as science but is merely wordplay, i.e., puns, double entendres, and unexpectedly funny malapropisms (made famous by Yogi Berra, among others). Wordplay is also found in various cultural realms, including comic strips and stand-up comedy, advertising and branding, politics, and now Wokedom (a subset of grassroots politics, some might argue). Playing with words has gone from being a clever, sometimes enjoyable diversion (e.g., crossword puzzles) to fully deranged, weaponized language. Some might be inclined to wave away the seriousness of that contention using the childhood retort “sticks and stones ….” Indeed, I’m far less convinced of the psychological power of verbal nastiness than those who insist words are violence. But it’s equally wrong to say that words don’t matter (much) or have no effect whatsoever. Otherwise, why would those acting in bad faith work so tirelessly to control the narrative, often by restricting free speech (as though writing or out-loud speech were necessary for thoughts to form)?

It’s with some exasperation that I observe words no longer retain their meanings. Yeah, yeah … language is dynamic. But semantic shifts usually occur slowly as language evolves. Moreover, for communication to occur effectively, senders and receivers must be aligned in their understandings of words. If you and I have divergent understandings of, say, yellow, we won’t get very far in discussions of egg yolks and sunsets. The same is true of words such as liberal, fascist, freedom, and violence. A lack of shared understanding of terms, perhaps born of ignorance, bias, or agenda, leads to communication breakdown.

But it’s gotten far worse than that. The meanings of words have been thrown wide open to PoMo reinterpretation that often inverts their meanings in precisely the way George Orwell observed in his novel 1984 (published 1949): “War is peace. Freedom is slavery. Ignorance is strength.” Thus, earnest discussions of limitations on free speech, and actual restrictions on social media platforms (often via algorithmic identification of keywords that fails to account for irony, sarcasm, or context), fail to register that the implementation of restrictive kludges already means free speech is essentially gone. The usual exceptions (obscenity, defamation, incitement, gag orders, secrecy, deceptive advertising, student speech, etc.) are not nearly as problematic because they have been adjudicated for several generations and accepted as established practice. Indeed, many exceptions have been relaxed considerably (e.g., obscenity that has become standard patois now fails to shock or offend), and slimy workarounds are now commonplace (e.g., using “people are saying …” to say something horrible while shielding oneself from having said it). Another gray area includes fighting words and offensive words, which are being expanded (out of a misguided campaign to sanitize?) to include many words with origins as clinical and scientific terms, as well as faux offense used to stifle speech.
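
As a small illustration of why those keyword kludges are so blunt, here is a minimal sketch of naive keyword moderation (in Python, with a hypothetical blocklist and made-up messages; I’m not claiming any actual platform’s filter looks like this, only that the failure mode is the same): matching tokens instead of meaning flags negation and sarcasm while an unmistakable threat sails through.

```python
# Minimal sketch of naive keyword moderation. BLOCKLIST and the sample
# messages are hypothetical, for illustration only; real systems are more
# elaborate, but the core failure is the same: tokens match, context doesn't.
BLOCKLIST = {"violence", "attack"}

def flag(message: str) -> bool:
    """Flag a message if any blocklisted word appears, with no regard
    for irony, sarcasm, quotation, or negation."""
    tokens = {word.strip(".,!?\"'").lower() for word in message.split()}
    return not BLOCKLIST.isdisjoint(tokens)

samples = [
    "I would never condone violence.",           # flagged: negation ignored
    "Oh sure, a cross word is 'violence' now.",  # flagged: sarcasm ignored
    "Meet me out back and we'll settle this.",   # passes: threat, no keyword
]
for msg in samples:
    print(flag(msg), "->", msg)
```

The first two messages get flagged and the third sails through, which is the point: a filter like this restricts speech without understanding any of it.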

Restrictions on free speech are working in many respects, as many choose to self-censor to avoid running afoul of various self-appointed watchdogs or roving Internet thought police (another Orwell prophecy come true) ready to pounce on some unapproved utterance. One can argue whether self-censorship is cowardly or judicious, I suppose. However, silence and the pretense of agreement only conceal thoughts harbored privately and left unexpressed, which is why restrictions on public speech are fool’s errands and strategic blunders. Maybe the genie can be bottled for a time, but that only produces resentment (not agreement), which boils over into seething rage (and worse) at some point.

At this particular moment in U.S. culture, however, restrictions are not my greatest concern. Rather, it’s the wholesale control of information gathering and reporting that misrepresents or removes from the public sphere the ingredients needed to form coherent thoughts and opinions. It’s not happening only to the hoi polloi; those in positions of power and control are manipulated, too. (How many lobbyists per member of Congress, industry after industry, whispering in their ears like so many Wormtongues?) And in extreme cases of fame and cult of personality, a leader or despot unwittingly surrounds him- or herself with a coterie of yes-men frankly afraid to tell the truth out of careerist self-interest or because of shoot-the-messenger syndrome. It’s lonely at the top, right?

Addendum: Mere minutes after publishing this post, I wandered over to Bracing Views (on my blogroll) and found this post saying some of the same things, namely, that choking information off at the source results in a degraded information landscape. Worth a read.

For this blog post, let me offer short and long versions of the assertion and argument; the short version is one of Caitlin Johnstone’s many aphorisms:

Short version: Modern mainstream feminism is just one big celebration of the idea that women can murder, predate, oppress, and exploit for power and profit just as well as any man.

Long version: Depicting strength in terms of quintessentially masculine characteristics is ruining (fictional) storytelling. (Offenders in contemporary cinema and streaming will go unnamed, but examples abound now that the strong-female-lead meme has overwhelmed characters, plots, and stories. Gawd, I tire of it.) One could survey the past few decades to identify ostensibly strong women basically behaving like asshole men just to — what? — show that it can be done? Is this somehow better than misogynist depictions of characters using feminine wiles (euphemism alert) to get what they want? These options coexist today, plus some mixture of the two. However, the main reason the strong female lead fails as storytelling — punching, fighting, and shooting toe-to-toe with men — is that it bears little resemblance to reality.

In sports (combat sports especially), men and women are simply not equally equipped, for reasons physiological, not ideological. Running, jumping, throwing, swinging, and punching in any sport where speed and power are principal attributes favor male physiology. Exceptions under extraordinary conditions (e.g., ultradistance running) only demonstrate the rule. Sure, a well-trained and -conditioned female in her prime can defeat an untrained and poorly conditioned male. If one of those MMA females came after me, I’d be toast because I’m entirely untrained and I’m well beyond the age of a cage fighter. But that’s not what’s usually depicted onscreen. Instead, it’s one badass going up against another badass, trading blows until a victor emerges. If the female is understood as the righteous one, she is typically shown victorious despite the egregious mismatch.

Nonadherence to reality can be entertaining, I suppose, which might explain why the past two decades have delivered so many overpowered superheroes and strong female leads, both of which are quickly becoming jokes and producing backlash. Do others share my concern that, as fiction bleeds into reality, credulous women might be influenced by what they see onscreen to engage recklessly in fights with men (or for that matter, other women)? Undoubtedly, a gallant or chivalrous man would take a few shots before fighting back, but if not felled quickly, my expectation is that the fight is far more likely to go very badly for the female. Moreover, what sort of message does it communicate to have females engaging in violence and inflicting their will on others, whether in the service of justice or otherwise? That’s the patriarchy in a nutshell. Rebranding matriarchal social norms in terms of negative male characteristics, even for entertainment purposes, serves no one particularly well. I wonder if hindsight will prompt the question “what on Earth were we thinking?” Considering how human culture is stuck in permanent adolescence, I rather doubt it.

In sales and marketing (as I understand them), one of the principal techniques to close a sale is to generate momentum by getting the prospective mark (er, buyer) to agree to a series of minor statements (small sells) leading to the eventual purchasing decision (the big sell or final sale). It’s narrow to broad, the reverse of the broad-to-narrow paragraph form many of us were taught in school. Both organizational forms proceed through assertions that are easy to swallow before getting to the intended conclusion. That conclusion could be either an automotive purchase or adoption of some argument or ideology. When the product, service, argument, or ideology is sold effectively by a skilled salesman or spin doctor (um, narrative manager), that person may be recognized as a closer, as in sealing the deal.

Many learn to recognize the techniques of the presumptive closer and resist being drawn in too easily. One of those techniques is to reframe the additional price of something as equivalent to, say, one’s daily cup of coffee purchased at some overpriced coffee house. The presumption is that if one has the spare discretionary income to buy coffee every day, then one can put that coffee money instead toward a higher monthly payment. Suckers might fall for it — even if they don’t drink coffee — because the false equivalence is an easily grasped though bogus substitution. The canonical too-slick salesman no one trusts is the dude on the used car lot wearing some awful plaid jacket and sporting a pornstache. That stereotype, born of the 1970s, barely exists anymore but is kept alive by repetitive reinforcement in TV and movies set in that decade or at least citing the stereotype for cheap effect (just as I have). But how does one spot a skilled rhetorician, spewing social and political hot takes to drive custom narratives? Let me identify a few markers.

Thomas Sowell penned a brief article entitled “Point of No Return.” I surmise (admitting my lack of familiarity) that creators.com is a conservative website, which all by itself does not raise any flags. Indeed, in heterodox fashion, I want to read well-reasoned arguments with which I may not always agree. My previous disappointment that Sowell fails in that regard was only reinforced by the linked article. Take note that the entire article uses paragraphs reduced to bite-sized chunks of only one or two sentences. Those are small sells, inviting closure with every paragraph break.

Worse yet, only five (small) paragraphs in, Sowell succumbs to Godwin’s Law and cites Nazis recklessly to put the U.S. on a slippery slope toward tyranny. The obvious learned function of mentioning Nazis is to trigger a reaction, much like baseless accusations of racism, sexual misconduct, or baby eating. It puts everyone on the defensive without having to demonstrate the assertion responsibly, which is why the first mention of Nazis in argument is usually sufficient to disregard anything else written or said by the person in question. I might have agreed with Sowell in his more general statements, just as conservatism (as in conservation) appeals as more and more slips away while history wears on, but after writing “Nazi,” he lost me entirely (again).

Sowell also raises several straw men just to knock them down, assessing (correctly or incorrectly, who can say?) what the public believes as though there were monolithic consensus. I won’t defend the general public’s grasp of history, ideological placeholders, legal maneuvers, or cultural touchstones. Plenty of comedy bits demonstrate the deplorable level of awareness of individual members of society as though they were fully representative of the whole. Yet plenty of people pay attention and accordingly don’t make the cut when idiocy is offered up for entertainment. (What fun, ridiculing fools!) The full range of opinion on any given topic is not best characterized by however many idiots and ignoramuses can be found by walking down the street and shoving a camera and mic in their astonishingly unembarrassed faces.

So in closing, let me suggest that, in defiance of the title of this blog post, Thomas Sowell is in fact not a closer. Although he drops crumbs and morsels gobbled up credulously by those unable to recognize they’re being sold a line of BS, they do not make a meal. Nor should Sowell’s main point, i.e., the titular point of no return, be accepted when his burden of proof has not been met. That does not necessarily mean Sowell is wrong, in the sense that even a stopped clock tells the time correctly twice a day. The danger is that even if he’s partially correct some of the time, his perspective and program (nonpartisan freedom! whatever that may mean) must be considered with circumspection and disdain. Be highly suspicious before buying what Sowell is selling. Fundamentally, he’s a bullshit artist.

Continuing from the previous blog post, lengthy credit scrolls at the ends of movies have become a favorite hiding place for bloopers and teasers. The purpose of this practice is unclear, since I can’t pretend (unlike many reckless opinionators) to inhabit the minds of filmmakers, but it has become a fairly reliable afterthought for film-goers willing to wait out the credits. Those who depart the theater, change the channel, or click away to other content may know they are relinquishing some last tidbit to be discovered, but there’s no way to know in advance if one is being punked or pleased, or indeed if there is anything at all there. Clickbait news often employs this same technique, teasing some newsbit in the headline to entice readers to wade (or skim) through a series of (ugh!) one-sentence paragraphs to find the desired content, which sometimes is not even provided. At least one film (Monty Python’s The Secret Policeman’s Other Ball (1982), if memory serves) pranked those in a rush to beat foot traffic out of the theater (back when film-going meant visiting the cinema) by having an additional thirty minutes of material after the (first) credit sequence.

This also put me in mind of Paul Harvey radio broadcasts ending with the sign-off tag line, “… the rest of the story.” Harvey supplemented the news with obscure yet interesting facts and analysis that tended to reshape one’s understanding of consensus narrative. Such reshaping is especially important as an ongoing process of clarification and revision. When served up in delectable chunks by winning personalities like Paul Harvey, supplemental material is easily absorbed. When material requires effort to obtain and/or challenges something one believes strongly, well, the default response is probably not to bother. However, those possessing intellectual integrity welcome challenging material and indeed seek it out. Indeed, invalidation of a thesis or hypothesis is fundamental to the scientific method, and no body of work can be sequestered from scrutiny and then be held as legitimately authoritative.

Yet that’s what happens routinely in the contemporary infosphere. A government press office or corporate public relations officer issues guidance or policy in direct conflict with earlier guidance or policy and in doing so seeks to place any resulting cognitive dissonance beyond examination and out of scope. Simple matters of adjustment are not what concern me. Rather, it’s wholesale brainwashing that is of concern, when something is clear within one’s memory or plainly documented in print/video yet brazenly denied, circumvented, and deflected in favor of a new directive. The American public has contended with this repeatedly as each new presidential administration demonizes the policies of its predecessors but typically without demonstrating the self-reflection and -examination to admit wrongdoing, responsibility, or error on anyone’s part. It’s a distinctly American phenomenon, though others have cottoned onto it and adopted the practice for themselves.

Exhaustion from separating the spin-doctored utterances of one malefactor or another from one’s own direct experience and sense-making drives many to simply give up. “Whatever you say, sir. Lemme go back to my entertainments.” The prospect of a never-ending slog through evidence and analysis only to arrive on unsteady ground, due to shift underfoot again and again with each new revelation, is particularly unsatisfactory. And as discussed before, those who nonetheless strain to achieve knowledge and understanding that reach temporary sufficiency yet remain permanently, intransigently provisional find themselves thwarted by those in the employ of organizations willing and eager to game information systems in the service of their not-even-hidden agendas. Alternative dangers for the muddled thinker include retreating into fixed ideology or collapsing into solipsism. Maybe none of it matters in the end. We can choose our beliefs from the buffet of available options without adherence to reality. We can create our own reality. Of course, that’s a description of madness, to which many have already succumbed. Why aren’t they wearing straitjackets?

Let me first restate axioms developed in previous blog posts. Narrative is the essential outward form of consciousness. Cognition has many preverbal and nonverbal subtleties, but the exchange of ideas occurs predominantly through narrative, and the story of self (told to oneself) can be understood as stream of consciousness: ongoing self-narration of sensations and events. The principal characteristic of narrative, at least that which is not pure fantasy, is in-the-moment sufficiency. Snap-judgment heuristics are merely temporary placeholders until, ideally at least, thoughtful reconsideration and revision that take time and discernment can be brought to bear. Stories we tell and are told, however, often do not reflect reality well, partly because our perceptual apparatuses are flawed, partly because individuals are untrained and unskilled in critical thinking (or overtrained and distorted), and partly because stories are polluted with emotions that make clear assessments impossible (to say nothing of malefactors with agendas). Some of us struggle to remove confabulation from narrative (as best we can) whereas others embrace it because it’s emotionally gratifying.

A good example of the reality principle is the recognition, as during the 1970s energy crisis, that energy supplies don’t magically appear simply by digging and drilling more of the stuff out of the ground. Those easy-to-get resources have been plundered already. The term peak oil refers to eventual decline in energy production (harvesting, really) when the easy stuff is more than half gone and undiminished (read: increasing) demand impels energy companies to go in search of more exotic supply (e.g., underwater or embedded in shale). If that reality is dissatisfying, a host of dreamt-up stories offer us deliverance from inevitable decline and reduction of lifestyle prerogatives by positing extravagant resources in renewables, hydrogen fuel cells, fusion (not to be confused with fission), or as-yet unexploited regions such as the Arctic National Wildlife Refuge. None of these represent plausible realities (except going into heretofore protected regions and bringing ecological devastation).

The relationship of fictional stories to reality is quite complex. For this blog post, a radically narrow description is that fiction is the imaginary space where ideas can be tried out and explored safely in preparation for implementation in reality. Science fiction (e.g., imagining interstellar space travel despite its flat impossibility in Newtonian physics) is a good example. Some believe humans can eventually accomplish what’s depicted in sci-fi, and in certain limited examples we already have. But many sci-fi stories simply don’t present a plausible reality. Taken as vicarious entertainment, they’re AOK superfine with me. But given that Western cultures (I can’t opine on cultures outside the West) have veered dangerously into rank ideation and believing their own hype, too many people believe fervently in aspirational futures that have no hope of ever instantiating. Just like giant pools of oil hidden under the Rocky Mountains (to cite something sent to me just today offering illusory relief from skyrocketing gasoline prices).

Among the many genres of narrative now on offer in fiction, no better example of sought-after power exists than the superhero story. Identifying with the technological and financial power of Iron Man and Batman or the god-powers of Thor and Wonder Woman is thrilling, perhaps, but again, these are not plausible realities. Yet these superrich, superstrong, superintelligent superheroes are everywhere in fiction, attesting to liminal awareness of lack of power and indeed frailty. Many superhero stories are couched as coming-of-age stories for girls, who with grit and determination can fight toe-to-toe with any man and dominate. (Too many BS examples to cite.) Helps, of course, if the girl has magic at her disposal. Gawd, do I tire of these stories, told as origins in suffering, acquisition of skills, and coming into one’s own with the mature ability to force one’s will on others, often in the form of straight-up killing and assassination. Judge, jury, and executioner all rolled into one but entirely acceptable vigilantism if done wearing a supersuit and claiming spurious, self-appointed moral authority.

There are better narratives that don’t conflate power with force or lack plausibility in the world we actually inhabit. In a rather complicated article by Adam Tooze entitled “John Mearsheimer and the Dark Origins of Realism” at The New Statesman, after a lengthy historical and geopolitical analysis of competing narratives, a mode of apprehending reality is described:

… adopting a realistic approach towards the world does not consist in always reaching for a well-worn toolkit of timeless verities, nor does it consist in affecting a hard-boiled attitude so as to inoculate oneself forever against liberal enthusiasm. Realism, taken seriously, entails a never-ending cognitive and emotional challenge. It involves a minute-by-minute struggle to understand a complex and constantly evolving world, in which we are ourselves immersed, a world that we can, to a degree, influence and change, but which constantly challenges our categories and the definitions of our interests. And in that struggle for realism – the never-ending task of sensibly defining interests and pursuing them as best we can – to resort to war, by any side, should be acknowledged for what it is. It should not be normalised as the logical and obvious reaction to given circumstances, but recognised as a radical and perilous act, fraught with moral consequences. Any thinker or politician too callous or shallow to face that stark reality, should be judged accordingly.

As a sometimes presenter of aphorisms, felicitous and humorous turns of phrase and logic interest me as examples of heuristics aimed at parsimony and cognitive efficiency. Whether one recognizes those terms or not, everyone uses snap categorization and other shortcuts to manage crowded thinking and relieve the overwhelming demands on perception. Most of us, most of the time, use sufficiency as the primary decision-making mode, which boils down to “close enough for horseshoes and hand grenades.” Emotion is typically the trigger, not rational analysis. After enough repetition is established, unthinking habit takes over. Prior to habituation, however, the wisdom of sages has provided useful rubrics to save unnecessary and pointless labor over casuistry flung into one’s way to impede, convince, or gaslight. (I previously wrote about this effect here).

As categories, I pay close attention to razors, rules, laws, principles, and Zuihitsu when they appear as aphorisms in the writing of those I read and follow online. Famous rules, laws, and principles include Occam’s Razor, (Finagle’s Corollary to) Murphy’s Law, Godwin’s Law, the Jevons Paradox, and the Dunning-Kruger Effect (do your own searches if these escape you). Some are quite useful at dispelling faulty thinking and argumentation. Café Bedouin (see blogroll) has an ongoing series of Zuihitsu, which has grown quite long. Many ring fundamentally true; others are either highly situational or wrong on their face, perhaps revealing the cardinal weakness of reduction of ideas to short, quotable phrases.

I recently learned of Hitchens’ Razor (after Christopher Hitchens), usually given as “What can be asserted without evidence can also be dismissed without evidence.” According to the Wikipedia entry, it may well have been reconstituted, repurposed, or revived from other sources stretching back into antiquity. Caitlin Johnstone, a notable aphorist I’ve quoted numerous times, uses Hitchens’ Razor to put the lie to claims from the U.S. war machine and its dutiful media lapdogs that the “situation in Ukraine” (whatever that is) demands intervention by Western powers lest the utility bad guys of the moment, the Russians, be allowed to run roughshod over their neighbor Ukraine, which (significantly) used to be part of the now-defunct Soviet Union. As with many controversial, inflammatory claims and assertions continuously heaped like a dog pile on hapless U.S. citizens with little time, few resources, and no obligation to perform their own investigations and analyses, I have only weak opinions but very strong suspicions. That’s where Hitchens’ Razor comes in handy. Under its instruction, I can discard out of hand and in disbelief extraordinary claims designed to whip me and the wider public into an emotional frenzy and thus accept or support actions that shouldn’t just raise eyebrows but be met with considerable dissent, protest, and disobedience. Saves me a lot of time entertaining nonsense just because it gets repeated often enough to be accepted as truth (Bernays’ Principle).

Ask parents what ambitions they harbor for their child or children and among the most patterned responses is “I just want them to be happy.” I find such an answer thoughtless and disingenuous, and the insertion of the hedge just to make happiness sound like a small ask is a red herring. To begin with, for most kids still in their first decade, happiness and playfulness are relatively effortless and natural so long as a secure, loving environment is provided. Certainly not a default setting, but it’s still quite commonplace. As the dreamy style of childhood cognition is gradually supplanted by supposedly more logical, rational, adult thinking, and as children become acquainted with iniquities of both history and contemporary life, innocence and optimism become impossible to retain. Cue the sullen teenager confronting the yawning chasm between desire and reality. Indeed, few people seem to make the transition into adulthood knowing with much clarity how to be happy in the midst of widespread travail and suffering. Instead, young adults frequently substitute self-destructive, nihilistic hedonism, something learned primarily (says me) from the posturing of movie characters and the celebrities who portray them. (Never understood the trope of criminals hanging at nightclubs, surrounded by drug addicts, nymphos, other unsavory types, and truly awful music, where they can indulge their assholery before everything inevitably goes sideways.)

Many philosophies recommend simplicity, naturalness, and independence as paths to happiness and moral rectitude. Transcendentalism was one such response to social and political complexities that spoil and/or corrupt. Yet two centuries on, the world has only gotten more and more complex, pressing on everyone, especially with demands for information processing in volume and sophistication that do not come naturally to most and are arguably not part of our evolutionary toolkit. Multiple social issues, if one is to engage them fairly, hinge on legalistic arguments and bewildering wordplay that render them fundamentally intractable. Accordingly, many wave away all nuance and adopt pro forma attitudes. Yet the airwaves, social media, the Internet, and even dinner conversations are suffused with the worst sorts of hypercomplexity and casuistry that confound even those who traffic regularly in such rhetoric. It’s a very long way from “I just want to be happy.”


/rant on

The ongoing epistemological crisis is getting no aid or relief from the chattering classes. Case in point: the Feb. 2021 issue of Harper’s Magazine has a special supplement devoted to “Life after Trump,” which divides recent history neatly into reality and unreality, commencing variously from the announcement of Trump’s candidacy, his unexpected success in the Republican primaries, his even less expected election (and inauguration), or now his removal from office following electoral defeat in Nov. 2020. Take your pick which signals the greatest deflection from history’s “proper” course before being derailed into a false trajectory. Charles Yu and Olivia Laing adopt the reality/unreality dichotomy in their contributions to the special supplement. Yu divides (as do many others) the nation into us and them: supporters of a supposed departure from reality/sanity and those whose clear perception penetrates the illusion. Laing bemoans the inability to distinguish fiction and fantasy from truth, unreality masquerading as your truth, my truth, anyone’s truth given repetition and persuasion sufficient to make it stick. Despite familiarity with these forced, unoriginal metaphors, I don’t believe them for a moment. Worse, they do more to encourage siloed thinking and congratulate the “Resistance” for being on the putative correct side of the glaringly obvious schism in the voting populace. Their arguments support a false binary, perpetuating and reinforcing a distorted and decidedly unhelpful interpretation of recent history. Much better analyses than theirs are available.

So let me state emphatically: like the universe, infinity, and oddly enough consciousness, reality is all-encompassing and unitary. Sure, different aspects can be examined separately, but the whole is nonetheless indivisible. Reality is a complete surround, not something one can opt into or out of. That doesn’t mean one’s mind can’t go elsewhere, either temporarily or permanently, but that does not create or constitute an alternate reality. It’s merely dissociation. Considering the rather extreme limitations of human perceptual apparatuses, it’s frankly inevitable that each of us occupies a unique position, an individual perspective, within a much, much (much, much …) larger reality. Add just a couple more axes to the graph below for time (from nanoseconds to eons) and physical scale (from subatomic to cosmic), and the available portion of reality anyone can grasp is clearly infinitesimally small, yet that tiny, tiny portion is utterly everything for each individual. It’s a weird kind of solipsism.

I get that Harper’s is a literary magazine and that writers/contributors take advantage of the opportunity to flex for whatever diminishing readership has the patience to actually finish their articles. Indeed, in the course of the special supplement, more than a few felicitous concepts and turns of phrase appeared. However, despite commonplace protestations, the new chief executive at the helm of the ship of state has not in fact returned the American scene to normal reality after an awful but limited interregnum.

Aside: Citizens are asked to swallow the whopper that the current president, an elder statesman, the so-called leader of the free world, is in full control of his faculties. Funny how his handlers repeatedly erupt like a murder of crows at the first suggestion that a difficult, unvetted question might be posed, inviting the poor fellow to veer even slightly off the teleprompter script. Nope. Lest yet another foot-in-mouth PR disaster occur (too many already to count), he’s whisked away, out of range of cameras and mics before any lasting damage can be done. Everyone is supposed to pretend this charade is somehow normal. On the other hand, considering how many past presidents were plainly puppets, spokespersons, or charlatans (or at least denied the opportunity to enact an agenda), one could argue that the façade is normal. “Pay no attention to the man [or men] behind the curtain. I am the great and powerful Wizard of Oz!”

With some dismay, I admit that the tiny sliver of reality to which many attend incessantly is an even smaller subset of reality, served up via small, handheld devices that fit neatly in one’s pocket. One could say theirs is a pocket reality, mostly mass media controlled by Silicon Valley platforms and their censorious algorithms. Constrained by all things digital, and despite voluminous ephemera, that reality bears little resemblance to what digital refuseniks experience without the blue glare of screens washing all the color from their faces and their own authentic thoughts out of their heads. Instead, I recommend getting outside, into the open air and under the warm glow of the yellow sun, to experience life as an embodied being, not as a mere processor of someone else’s pocket reality. That’s how we all start out as children before getting sucked into the machine.

Weirdly, only when the screen size ramps up to 30 feet tall do consumers grow skeptical and critical of storytelling. At just the moment cinema audiences are invited to suspend disbelief, the Reality Principle and logic are applied to character, dialogue, plotting, and make-believe gadgetry, which often fail to ring true. Why does fiction come under such careful scrutiny while reality skates right on by, allowing the credulous to believe whatever they’re fed?

/rant off

/rant on

Remember when the War on Christmas meme materialized a few years ago out of thin air, even though no one in particular was on the attack? Might have to rethink that one. Seems that every holiday now carries an obligation to revisit the past, recontextualize origin stories, and confess guilt over tawdry details of how the holiday came to be celebrated. Nearly everyone who wants to know knows by now that Christmas is a gross bastardization of pagan celebrations of the winter solstice, co-opted by various organized churches (not limited to Christians!) before succumbing to the awesome marketing onslaught (thanks, Coca-Cola) that makes Xmas the “most wonderful time of the year” (as the tune goes) and returns the holiday to its secular roots. Thanksgiving is now similarly ruined, no longer able to be celebrated and enjoyed innocently (like a Disney princess story reinterpreted as a white or male savior story — or even worse, a white male) but instead used as an excuse to admit America’s colonial and genocidal past and extermination (er, mistreatment) of native populations as white Europeans encroached ever more persistently on lands the natives held more or less as a commons. Gone are the days when one could gather among family and friends, enjoy a nice meal and good company, and give humble, sincere thanks for whatever bounty fortuna had bestowed. Now it’s history lectures and acrimony and families rent asunder along generational lines, typically initiated by those newly minted graduates of higher education and their newfangled ideas about equity, justice, and victimhood. Kids these days … get off my lawn!

One need not look far afield to find alternative histories that position received wisdom about the past in the cross-hairs just to enact purification rituals that make it/them, what, clean? accurate? whole? I dunno what the real motivation is except perhaps to force whites to self-flagellate over sins of our ancestors. Damn us to hell for having cruelly visited iniquity upon everyone in the process of installing white, patriarchal Europeanness as the dominant Western culture. I admit all of it, though I’m responsible for none of it. Moreover, history stops for no man, no culture, no status quo. White, patriarchal Europeanness is in serious demographic retreat and has arguably already lost its grip on cultural dominance. The future is female (among other things), amirite? Indeed, whether intended or not, that was the whole idea behind the American experiment: the melting pot. Purity was never the point. Mass migration out of politically, economically, and ecologically ravaged regions means that the experiment is no longer uniquely American.

Interdisciplinary approaches to history, if academic rigidity can be overcome, regularly develop new understandings to correct the historical record. Accordingly, the past is remarkably dynamic. (I’ve been especially intrigued by Graham Hancock’s work on ancient civilizations, mostly misunderstood and largely forgotten except for megalithic projects left behind.) But the past is truly awful, with disaster piled upon catastrophe followed by calamity and cataclysm. Still waiting for the apocalypse. Peering too intently into the past is like staring at the sun: it scorches the retinas. Moreover, the entire history of history is replete with stories being told and retold, forgotten and recovered, transformed in each iteration from folklore into fable into myth into legend before finally passing entirely out of human memory systems. How many versions of Christmas are there across cultures and time? Or Thanksgiving, or Halloween, or any Hallmark® holiday that has crossed oceans and settled into foreign lands? What counts as the true spirit of any of them when their histories are so mutable?

/rant off