Archive for the ‘Consciousness’ Category

Watched Everything Everywhere All at Once (DVD version) at home on my TV, which is where I see most films these days. Very few inspire me to trek to the theater anymore to overpay for seats and popcorn. Was pleased to enjoy this film quite a bit — at least before turning an analytical eye toward it. Let me provide a fun, glossy assessment before getting bogged down in troublesome detail.

The film introduces and trades heavily on characters from a supposed multiverse (a multitude of parallel universes branching indiscriminately from arbitrary decision points into an infinity of possibilities) “verse-jumping” into our universe to repair damage done in one or more of the others. As plot devices go, this one is now quite commonplace and always (perhaps inevitably, given our preoccupation with ourselves) positions our universe (the only one we know until someone from outside intrudes) at the center of the others and as the linchpin in some grand plan to save the space-time continuum. It’s a worn trope yet allows storytellers immense freedom to conjure anything imaginable. Everything depicts disorienting alternative universes quite well, most of them (for no particular reason beyond having fun, I surmise) absurd variations of the familiar. Indeed, unlike most films where I sit in stone silence no matter what is presented, this one generated laugh-out-loud moments and gestures across the couch to the effect “did you see that?” In short, what that means is the film produced reflexive responses (it goosed me), which is quite unusual considering how most films, despite lots of overwrought action and drama, fail to register more than a checkbox “yup, got it.”

Actors portraying the three or four main characters do well in their respective jobs, playing several versions of themselves from different universes with diverse experiences. Most of the film is chase-and-evade, devolving at times into a familiar martial-arts punchfest that has frankly lost all possibility of making an impact in the era of overpowered, invulnerable superheroes and magical unpredictability. Why filmmakers believe audiences want to see more of this drivel is beyond me, but I guess the animal curiosity to find out which make-believe character will prevail in a battle royale never gets old with mouth-breathers. I’m quite over it. The central conflict, however, wasn’t about the strongest punch. Rather, it was about persisting in the face of revealed meaninglessness a/k/a nihilism.

So here’s where hindsight analysis kinda ruined things for me. Although I recognize storytelling as elemental to modern cognition and consciousness, I don’t regard most narrative forms as art. Cinema, because of its financial interests and collaborative nature, rarely rises to the level of art. There are simply too many diverse elements that must be assembled under a unified aesthetic vision for that to occur often. Cinema is thus more entertainment than art, just like sports and games are entertainment, not art. Impressive skill may be demonstrated, which often produces enjoyable results, but I don’t conflate skill or mere craft with artistry. (I also tire of everything that provides moral and epistemological orientation being conflated with religion). So when films introduce super-serious subjects that really trouble me (e.g., overpopulation, institutional corruption, the climate emergency) but treat them lightly, I’m bothered. Everything does that with philosophy.

Coming to grips with nihilism and the absurdity of existence is the central feature of more than one 20th-century philosophy (and their variants). Downstream (or parallel?) are artistic genres that also express the idea, though in far less overt terms. One can easily get lost down a hole, seeking the bottom (alternatively, the root of things) but finding only the abyss. For that very reason, I have acquaintance with philosophical themes but have not truly sunk into them deeply. Nihilism is not something to mess with, even as a thought experiment or intellectual inquiry — especially if one is inclined to connect strongly with those same things. In Everything, the nihilist conclusion (i.e., that nothing matters) manifests absurdly as a giant, black, everything bagel that can literally suck a person into its hole. Well and good enough; probably best not to overexplain that MacGuffin. But it demands a conclusion or resolution, which comes in the form of the mother rescuing the daughter. Ironically, it was the mother (from an alternative universe) who had introduced the daughter (also an alternative) to verse-jumping, who then (the daughter) got lost down the hole and threatened to collapse the multiverse into the everything bagel in a final gesture of despair. In effect, the mother had tinkered with powers well beyond her control, unwittingly created the daughter-monster with out-of-control feeling and unexpected powers, and had to clean up her own mess. How does she (the mother) do it? Through the power of love.

OK, fine. Love (especially unconditional love, as opposed to romantic or familial love) is a universal salve capable of healing all wounds. Except that it’s not. When the film finally depicts the rescue, saving the daughter and multiverse from destruction, it comes across as flat, obvious, and ineffectual (to me at least) and breaks the tone and pacing of the film. Lots of films resort to the power of love to save the day (typically just before the stroke of midnight), but they usually (not always) have better set-ups, which is to say, their film universes cohere and deliver cogent conclusions rather than waving a magic love-wand over everything to solve and resolve. The writers of this film are adept at the enjoyable absurd parts that launch and propel the story but could not stick the landing. Introducing (albeit comically) doomsday philosophy but then failing to treat it seriously enough left me deeply conflicted and dissatisfied. Perhaps it’s a case where my suspension of disbelief was not complete enough. Or maybe I brought too much into the film from outside, but we all have inescapable frames of reference. I wasn’t exactly triggered, merely frustrated. YMMV

Search the tag Counter-Enlightenment at the footer of this blog to find roughly ten disparate blog posts, all circling around the idea that intellectual history, despite all the obvious goodies trucked in with science and technology, is turning decidedly away from long-established Enlightenment values. A fair number of resources are available online and in book form exploring various movements against the Enlightenment over the past few centuries, none of which I have consulted. Instead, I picked up Daniel Schwindt’s The Case Against the Modern World: A Crash Course in Traditionalist Thought (2016), which was gifted to me. The book was otherwise unlikely to attract my attention considering that Schwindt takes Catholicism as a starting point whereas I’m an avowed atheist, though with no particular desire to proselytize or attempt to convince others of anything. However, The Case Against is suffused with curious ideas, so it is a good subject for a new book blogging project, which in characteristic fashion (for me) will likely proceed in fits and starts.

Two interrelated ideas Schwindt puts forward early in the book fit with multiple themes of this blog, namely, (1) the discovery and/or development of the self (I refer more regularly to consciousness) and (2) the reductive compartmentalization of thought and behavior. Let’s take them in order. Here’s a capsule of the first issue:


This intended follow-up has been stalled (pt. 1 here) for one simple reason: the premise presented in the embedded YouTube video is (for me at least) easy to dismiss out of hand and I haven’t wanted to revisit it. Nevertheless, here’s the blurb at the top of the comments at the webpage:

Is reality created in our minds, or are the things you can touch and feel all that’s real? Philosopher Bernardo Kastrup holds doctorates in both philosophy and computer science, and has made a name for himself by arguing for metaphysical idealism, the idea that reality is essentially a mental phenomenon.

Without going into point-by-point discussion, the top-level assertion, if I understand it correctly (not assured), is that material reality comes out of mental experience rather than the reverse. It’s a chicken-and-egg question with materialism and idealism (fancy technical terms not needed) each vying for primacy. The string of conjectures (mental gymnastics, really, briefly impressive until one recognizes how quickly they lose correlation with how most of us think about and experience reality) that inverts the basic relationship of inner experience to outer reality is an example of waaaay overthinking a problem. No doubt quite a lot of erudition can be brought to bear on such questions, but even if those questions were resolved satisfactorily on an intellectual level and an internally coherent structure or system were developed or revealed, it doesn’t matter or lead anywhere. Humans are unavoidably embodied beings. Each individual existence also occupies a tiny sliver of time (the timeline extending in both directions to infinity). Suggesting that mental experience is briefly instantiated in personhood but is actually drawn out of some well of souls, collective consciousness, or panpsychism and rejoins them in heaven, hell, or elsewhere upon expiration is essentially a religious claim. It’s also an attractive supposition, granting each of us not permanence or immortality but rather something somehow better (?) though inscrutable because it lies beyond perception (but not conceptualization). Except for an eternity of torments in hell, I guess, if one deserves that awful fate.

One comment about Kastrup. He presents his perspective (his findings?) with haughty derision of others who can’t see or understand what is so (duh!) obvious. He falls victim to the very same over-intellectualized flim-flam he mentions when dismissing materialists who need miracles and shortcuts to smooth over holes in their scientific/philosophical systems. The very existence of earnest disagreement by those who occupy themselves with such questions might suggest some humility, as in “here’s my explanation, they have theirs, judge for yourself.” But there’s a third option: the great unwashed masses (including nearly all our ancestors) for whom such questions are never even fleeting thoughts. It’s all frankly immaterial (funnily, both the right and wrong word at once). Life is lived and experienced fundamentally on its surface — unless, for instance, one has been incubated too long within the hallowed halls of academia, lost touch with one’s brethren, and become preoccupied with cosmic investigations. Something quite similar happens to politicians and the wealthy, who typically hyperfocus on gathering to themselves power and then exercising that power over others (typically misunderstood as simply pulling the levers and operating the mechanisms of society). No wonder their pocket of reality looks so strikingly different.

Let me first restate axioms developed in previous blog posts. Narrative is the essential outward form of consciousness. Cognition has many preverbal and nonverbal subtleties, but the exchange of ideas occurs predominantly through narrative, and the story of self (told to oneself) can be understood as stream of consciousness: ongoing self-narration of sensations and events. The principal characteristic of narrative, at least that which is not pure fantasy, is in-the-moment sufficiency. Snap-judgment heuristics are merely temporary placeholders until, ideally at least, thoughtful reconsideration and revision that take time and discernment can be brought to bear. Stories we tell and are told, however, often do not reflect reality well, partly because our perceptual apparatuses are flawed, partly because individuals are untrained and unskilled in critical thinking (or overtrained and distorted), and partly because stories are polluted with emotions that make clear assessments impossible (to say nothing of malefactors with agendas). Some of us struggle to remove confabulation from narrative (as best we can) whereas others embrace it because it’s emotionally gratifying.

A good example of the reality principle is recognition, similar to the 1970s energy crisis, that energy supplies don’t magically appear by simply digging and drilling more of the stuff out of the ground. Those easy-to-get resources have been plundered already. The term peak oil refers to eventual decline in energy production (harvesting, really) when the easy stuff is more than half gone and undiminished (read: increasing) demand impels energy companies to go in search of more exotic supply (e.g., underwater or embedded in shale). If that reality is dissatisfying, a host of dreamt-up stories offer us deliverance from inevitable decline and reduction of lifestyle prerogatives by positing extravagant resources in renewables, hydrogen fuel cells, fusion (not to be confused with fission), or as-yet unexploited regions such as The Arctic National Wildlife Refuge. None of these represent plausible realities (except going into heretofore protected regions and bringing ecological devastation).

The relationship of fictional stories to reality is quite complex. For this blog post, a radically narrow description is that fiction is the imaginary space where ideas can be tried out and explored safely in preparation for implementation in reality. Science fiction (i.e., imagining interstellar space travel despite its flat impossibility in Newtonian physics) is a good example. Some believe humans can eventually accomplish what’s depicted in sci-fi, and in certain limited examples we already have. But many sci-fi stories simply don’t present a plausible reality. Taken as vicarious entertainment, they’re AOK superfine with me. But given that Western cultures (I can’t opine on cultures outside the West) have veered dangerously into rank ideation and believing their own hype, too many people believe fervently in aspirational futures that have no hope of ever instantiating. Just like giant pools of oil hidden under the Rocky Mountains (to cite something sent to me just today offering illusory relief from skyrocketing gasoline prices).

Among the many genres of narrative now on offer in fiction, there is no better example of sought-after power than the superhero story. Identifying with the technological and financial power of Ironman and Batman or the god-powers of Thor and Wonder Woman is thrilling, perhaps, but again, these are not plausible realities. Yet these superrich, superstrong, superintelligent superheroes are everywhere in fiction, attesting to liminal awareness of lack of power and indeed frailty. Many superhero stories are couched as coming-of-age stories for girls, who with grit and determination can fight toe-to-toe with any man and dominate. (Too many BS examples to cite.) Helps, of course, if the girl has magic at her disposal. Gawd, do I tire of these stories, told as origins in suffering, acquisition of skills, and coming into one’s own with the mature ability to force one’s will on others, often in the form of straight-up killing and assassination. Judge, jury, and executioner all rolled into one but entirely acceptable vigilantism if done wearing a supersuit and claiming spurious, self-appointed moral authority.

There are better narratives that don’t conflate power with force or lack plausibility in the world we actually inhabit. In a rather complicated article by Adam Tooze entitled “John Mearsheimer and the Dark Origins of Realism” at The New Statesman, after a lengthy historical and geopolitical analysis of competing narratives, a mode of apprehending reality is described:

… adopting a realistic approach towards the world does not consist in always reaching for a well-worn toolkit of timeless verities, nor does it consist in affecting a hard-boiled attitude so as to inoculate oneself forever against liberal enthusiasm. Realism, taken seriously, entails a never-ending cognitive and emotional challenge. It involves a minute-by-minute struggle to understand a complex and constantly evolving world, in which we are ourselves immersed, a world that we can, to a degree, influence and change, but which constantly challenges our categories and the definitions of our interests. And in that struggle for realism – the never-ending task of sensibly defining interests and pursuing them as best we can – to resort to war, by any side, should be acknowledged for what it is. It should not be normalised as the logical and obvious reaction to given circumstances, but recognised as a radical and perilous act, fraught with moral consequences. Any thinker or politician too callous or shallow to face that stark reality, should be judged accordingly.

/rant on

The ongoing epistemological crisis is getting no aid or relief from the chattering classes. Case in point: the Feb. 2021 issue of Harper’s Magazine has a special supplement devoted to “Life after Trump,” which divides recent history neatly into reality and unreality commencing from either the announcement of Trump’s candidacy, his unexpected success in the Republican primaries, his even less expected election (and inauguration), or now his removal from office following electoral defeat in Nov. 2020. Take your pick which signals the greatest deflection from history’s “proper” course before being derailed into a false trajectory. Charles Yu and Olivia Laing adopt the reality/unreality dichotomy in their contributions to the special supplement. Yu divides (as do many others) the nation into us and them: supporters of a supposed departure from reality/sanity and those whose clear perception penetrates the illusion. Laing bemoans the inability to distinguish fiction and fantasy from truth, unreality masquerading as your truth, my truth, anyone’s truth given repetition and persuasion sufficient to make it stick. Despite familiarity with these forced, unoriginal metaphors, I don’t believe them for a moment. Worse, they do more to encourage siloed thinking and congratulate the “Resistance” for being on the putative correct side of the glaringly obvious schism in the voting populace. Their arguments support a false binary, perpetuating and reinforcing a distorted and decidedly unhelpful interpretation of recent history. Much better analyses than theirs are available.

So let me state emphatically: like the universe, infinity, and oddly enough consciousness, reality is all-encompassing and unitary. Sure, different aspects can be examined separately, but the whole is nonetheless indivisible. Reality is a complete surround, not something one can opt into or out of. That doesn’t mean one’s mind can’t go elsewhere, either temporarily or permanently, but that does not create or constitute an alternate reality. It’s merely dissociation. Considering the rather extreme limitations of human perceptual apparatuses, it’s frankly inevitable that each of us occupies a unique position, an individual perspective, within a much, much (much, much …) larger reality. Add just a couple more axes to the graph below for time (from nanoseconds to eons) and physical scale (from subatomic to cosmic), and the available portion of reality anyone can grasp is clearly infinitesimally small, yet that tiny, tiny portion is utterly everything for each individual. It’s a weird kind of solipsism.

I get that Harper’s is a literary magazine and that writers/contributors take advantage of the opportunity to flex for whatever diminishing readership has the patience to actually finish their articles. Indeed, in the course of the special supplement, more than a few felicitous concepts and turns of phrase appeared. However, despite commonplace protestations, the new chief executive at the helm of the ship of state has not in fact returned the American scene to normal reality after an awful but limited interregnum.

Aside: Citizens are asked to swallow the whopper that the current president, an elder statesman, the so-called leader of the free world, is in full control of his faculties. Funny how his handlers repeatedly erupt like a murder of crows at the first suggestion that a difficult, unvetted question might be posed, inviting the poor fellow to veer even slightly off the teleprompter script. Nope. Lest yet another foot-in-mouth PR disaster occur (too many already to count), he’s whisked away, out of range of cameras and mics before any lasting damage can be done. Everyone is supposed to pretend this charade is somehow normal. On the other hand, considering how many past presidents were plainly puppets, spokespersons, or charlatans (or at least denied the opportunity to enact an agenda), one could argue that the façade is normal. “Pay no attention to the man [or men] behind the curtain. I am the great and powerful Wizard of Oz!”

With some dismay, I admit that the tiny sliver of reality to which many attend incessantly is an even smaller subset of reality, served up via small, handheld devices that fit neatly in one’s pocket. One could say theirs is a pocket reality, mostly mass media controlled by Silicon Valley platforms and their censorious algorithms. Constrained by all things digital, and despite voluminous ephemera, that reality bears little resemblance to what digital refuseniks experience without the blue glare of screens washing all the color from their faces and their own authentic thoughts out of their heads. Instead, I recommend getting outside, into the open air and under the warm glow of the yellow sun, to experience life as an embodied being, not as a mere processor of yet someone else’s pocket reality. That’s how we all start out as children before getting sucked into the machine.

Weirdly, only when the screen size ramps up to 30 feet tall do consumers grow skeptical and critical of storytelling. At just the moment cinema audiences are invited to suspend disbelief, the Reality Principle and logic are applied to character, dialogue, plotting, and make-believe gadgetry, which often fail to ring true. Why does fiction come under such careful scrutiny while reality skates right on by, allowing the credulous to believe whatever they’re fed?

/rant off

For more than a decade, I’ve had in the back of my mind a blog post called “The Power of Naming” to remark that bestowing a name gives something power, substance, and in a sense, reality. That post never really came together, but its inverse did. Anyway, here’s a renewed attempt.

The period of language acquisition in early childhood is suffused with learning the names of things, most of which is passive. Names of animals (associated closely with sounds they make) are often a special focus using picture books. The kitty, doggie, and horsie eventually become the cat, dog, and horse. Similarly, the moo-cow and the tweety-bird shorten to cow and bird (though songbird may be an acceptable holdover). Words in the abstract are signifiers of the actual things, aided by the text symbols learned in literate cultures to reinforce mere categories instead of examples grounded in reality. Multiply the names of things several hundred thousand times into adulthood and indeed throughout life and one can develop a formidable vocabulary supporting expressive and nuanced thought and speech. Do you know the differences between acute, right, obtuse, straight, and reflex angles? Does it matter? Does your knowledge of barware inform when to use a flute, coupe, snifter, shot (or shooter or caballito), nosing glass (or Glencairn), tumbler, tankard, goblet, sling, and stein? I’d say you’ve missed something by never having drunk dark beer (Ger.: Schwarzbier) from a frosted schooner. All these varieties developed for reasons that remain invisible to someone content to drink everything from the venerable red Solo cup. Funnily enough, the red Solo cup now comes in different versions, fooling precisely no one.

Returning to book blogging, Walter Ong (in Orality and Literacy) has curious comparisons between primarily oral cultures and literate cultures. For example:

Oral people commonly think of names (one kind of words) as conveying power over things. Explanations of Adam’s naming of the animals in Genesis 2:20 usually call condescending attention to this presumably quaint archaic belief. Such a belief is in fact far less quaint than it seems to unreflective chirographic and typographic folk. First of all, names do give human beings power over what they name: without learning a vast store of names, one is simply powerless to understand, for example, chemistry and to practice chemical engineering. And so with all other intellectual knowledge. Secondly, chirographic and typographic folk tend to think of names as labels, written or printed tags imaginatively affixed to an object named. Oral folk have no sense of a name as a tag, for they have no idea of a name as something that can be seen. Written or printed representations of words can be labels; real, spoken words cannot be. [p. 33]

This gets at something that has been developing over the past few decades, namely, that as otherwise literate (or functionally literate) people gather more and more information through electronic media (screens that serve broadcast and cable TV, YouTube videos, prerecorded news for streaming, and podcasts, and most importantly, audiobooks — all of which speak content to listeners), the spoken word (re)gains primacy and the printed word fades into disuse. Electronic media may produce a hybrid of orality/literacy, but words are no longer silent, internal, and abstract. Indeed, words — all by themselves — are understood as being capable of violence. Gone are the days when “sticks and stones ….” Now, fighting words incite and insults sting again.

Not so long ago, it was possible to provoke a duel with an insult or gesture, such as a glove across the face. Among some people, defense of honor never really disappeared (though dueling did). History has taken a strange turn, however. Proposed legislation to criminalize deadnaming (presumably to protect a small but growing number of transgender and nonbinary people who have redefined their gender identity and accordingly adopted different names) recognizes the violence of words but then tries to transmute the offense into an abstract criminal law. It’s deeply mixed up, and I don’t have the patience to sort it out.

More to say in later blog posts, but I’ll raise the Counter-Enlightenment once more to say that the nature of modern consciousness is shifting somewhat radically in response to stimuli and pressures that grew out of an information environment, roughly 70 years old now but transformed even more fundamentally in the last 25 years, that is substantially discontinuous from centuries-old traditions. Those traditions displaced even older traditions inherited from antiquity. Such is the way of the world, I suppose, and with the benefit of Walter Ong’s insights, my appreciation of the outlines is taking better shape.

Happy to report that humans have finally outgrown their adolescent fixation, obsession, and infatuation surrounding technology and gadgetry, especially those that blow up things (and people), part of a maladaptive desire to watch the world burn (like a disturbed 14-year-old playing with fire to test the boundaries of control while hoping for the boundary to be breached). We are now in the process of correcting priorities and fixing the damage done. We’re also free from the psychological prison in which we trapped ourselves through status seeking and insistence on rigid ego consciousness by recognizing instead that, as artifacts of a hypersocial species, human cognition is fundamentally distributed among us as each of us is for all intents and purposes a storyteller retelling, reinforcing, and embellishing stories told elsewhere — even though it’s not quite accurate to call it mass mind or collective consciousness — and that indeed all men are brothers (an admitted anachronism, since that phrase encompasses women/sisters, too). More broadly, humans also now understand that we are only one species among many (a relative late-comer in evolutionary time, as it happens) that coexist in a dynamic balance with each other and with the larger entity some call Mother Earth or Gaia. Accordingly, we have determined that our relationship can no longer be that of abuser (us) and abused (everything not us) if the dynamism built into that system is not to take us out (read: trigger human extinction, like most species suffered throughout evolutionary time). If these pronouncements sound too rosy, well, get a clue, fool!

Let me draw your attention to the long YouTube video embedded below. These folks have gotten the clues, though my commentary follows anyway, because SWOTI.

After processing all the hand-waving and calls to immediate action (with inevitable nods to fundraising), I was struck by two things in particular. First, XR’s co-founder Roger Hallam gets pretty much everything right despite an off-putting combination of alarm, desperation, exasperation, and blame. He argues that to achieve the global awakening needed to alter humanity’s course toward (self-)extinction, we actually need charismatic speakers and heightened emotionalism. Scientific dispassion and neutered, measured political discourse (such as the Intergovernmental Panel on Climate Change (IPCC) or as Al Gore attempted for decades before going Hollywood already fifteen years ago now) have simply failed to accomplish anything. (On inspection, what history has actually delivered is not characterized by the lofty rhetoric of statesmen and boosters of Enlightenment philosophy but rather resembles a sociologist’s nightmare of dysfunctional social organization, where anything that could possibly go wrong pretty much has.) That abysmal failure is dawning on people under the age of 30 or so quite strongly, whose futures have been not so much imperiled as actively robbed. (HOW DARE YOU!? You slimy, venal, incompetent cretins above the age of 30 or so!) So it’s not for nothing that Roger Hallam insists that the XR movement ought to be powered and led by young people, with old people stepping aside, relinquishing positions of power and influence they’ve already squandered.


Second, Chris Hedges, easily the most erudite and prepared speaker/contributor, describes his first-hand experience reporting on rebellion in Europe leading to (1) the collapse of governments and (2) disintegration of societies. He seems to believe that the first is worthwhile, necessary, and/or inevitable even though the immediate result is the second. Civil wars, purges, and genocides are not uncommon throughout history in the often extended periods preceding and following social collapse. The rapidity of governmental collapse once the spark of citizen rebellion becomes inflamed is, in his experience, evidence that driving irresponsible leaders from power is still possible. Hedges’ catchphrase is “I fight fascists because they’re fascists,” which as an act of conscience allows him to sleep at night. A corollary is that fighting may not necessarily be effective, at least in the short term, or be undertaken without significant sacrifice but needs to be done anyway to imbue life with purpose and meaning, as opposed to anomie. Although Hedges may entertain the possibility that social disintegration and collapse will be far, far more serious and widespread once the armed-to-the-teeth American empire cracks up fully (already under way to many observers) than with the Balkan countries, conscientious resistance and rebellion are still recommended.

Much as my attitudes are aligned with XR, Hallam, and Hedges, I’m less convinced that we should all go down swinging. That industrial civilization is going down and all of us with it no matter what we do is to me an inescapable conclusion. I’ve blogged about this quite a bit. Does ethical behavior demand fighting to the bitter end? Or can we fiddle while Rome burns, so to speak? There’s a lot of middle ground between those extremes, including nihilistic mischief (euphemism alert) and a bottomless well of anticipated suffering to alleviate somehow. Rather than altering the inevitable, I’m more inclined to focus on forestalling eleventh-hour evil and finding some grace in how we ultimately, collectively meet species death.

Wanted to provide an update to the previous post in my book-blogging project on Walter Ong’s Orality and Literacy to correct something that wasn’t clear to me at first. The term chirographic refers to writing, but I conflated writing more generally with literacy. Ong actually distinguishes chirographic (writing) from typographic (type or print) and includes another category: electronic media.

Jack Goody … has convincingly shown how shifts hitherto labeled as shifts from magic to science, or from the so-called ‘prelogical’ to the more and more ‘rational’ state of consciousness, or from Lévi-Strauss’s ‘savage’ mind to domesticated thought, can be more economically and cogently explained as shifts from orality to various stages of literacy … Marshall McLuhan’s … cardinal gnomic saying, ‘The medium is the message’, registered his acute awareness of the importance of the shift from orality through literacy and print to electronic media. [pp. 28–29]

So the book’s primary contrast is between orality and literacy, but literacy has a sequence of historical developments: chirographic, typographic, and electronic media. These stages are not used interchangeably by Ong. Indeed, they exist simultaneously in the modern world and all contribute to overall literacy while each possesses unique characteristics. For instance, reading from handwriting (printing or cursive, the latter far less widely used now except for signatures) is different from reading from print on paper or on the screen. Further, writing by hand, typing on a typewriter, typing into a word-processor, and composing text on a smartphone each has its effects on mental processes and outputs. Ong also mentions remnants of orality that have not yet been fully extinguished. So the exact mindset or style of consciousness derived from orality vs. literacy is neither fixed nor established universally but contains aspects from each category and subcategory.

Ong also takes a swing at Julian Jaynes. Considering that Jaynes’ book The Origin of Consciousness in the Breakdown of the Bicameral Mind (1977) (see this overview) was published only seven years prior to Orality and Literacy (1982), the impact of Jaynes’ thesis must have still been felt quite strongly (as it is now among some thinkers). Yet Ong disposes of Jaynes rather parsimoniously, stating

… if attention to sophisticated orality-literacy contrasts is growing in some circles, it is still relatively rare in many fields where it could be helpful. For example, the early and late stages of consciousness which Julian Jaynes (1977) describes and relates to neuro-physiological changes in the bicameral mind would also appear to lend themselves largely to much simpler and more verifiable descriptions in terms of a shift from orality to literacy. [p. 29]

In light of the details above, it’s probably not accurate to say (as I did before) that we are returning to orality from literacy. Rather, the synthesis of characteristics is shifting, as it always has, in relation to new stimuli and media. Since the advent of cinema and TV — the first screens, now supplemented by the computer and smartphone — the way humans consume information has been undergoing yet another shift. Or perhaps it’s better to conclude that it’s always been shifting, not unlike how we have always been and are still evolving, though the timescales are usually too slow to observe without specialized training and analysis. Shifts in consciousness arguably occur far more quickly than biological evolution, and the rate at which new superstimuli are introduced into the information environment suggests radical discontinuity with even the recent past — something that used to be called the generation gap.

I’ve always wondered what media theorists such as McLuhan (d. 1980), Neil Postman (d. 2003), and now Ong (d. 2003) would make of the 21st century had they lived long enough to witness what has been happening, with 2014–2015 being the significant inflection point according to Jonathan Haidt. (No doubt there are other media theorists working on this issue who have not risen to my attention.) Numerous other analyses point instead to the early 20th century as the era when industrial civilization harnessed fossil fuels and turned the mechanisms and technologies of innovators decidedly against humanity. Pick your branching point.

The backblog at The Spiral Staircase includes numerous book reviews and three book-blogging projects — one completed and two others either abandoned or on semi-permanent hiatus. I’m launching a new project on Walter Ong’s Orality and Literacy: The Technologizing of the Word (1982), which comes highly recommended and appears quite interesting given my preoccupations with language, literacy, and consciousness. To keep my thinking fresh, I have not consulted any online reviews or synopses.

Early on, Ong provides curious (but unsurprising) definitions I suspect will contribute to the book’s main thesis. Here is one from the intro:

It is useful to approach orality and literacy synchronically, by comparing oral cultures and chirographic (i.e., writing) cultures that coexist at a given period of time. But it is absolutely essential to approach them also diachronically or historically, by comparing successive periods with one another. [p. 2]

I don’t recall reading the word chirographic before, but I blogged about the typographic mind (in which Ong’s analyses are discussed) and lamented that the modern world is moving away from literacy, back toward orality, which feels (to me at least) like retrogression and retreat. (Someone is certain to argue that a return to orality is actually progress.) As a result, Western institutions such as the independent press are decaying. Moreover, it’s probably fair to say that democracy in the West is by now only a remnant fiction, replaced by oligarchic rule and popular subscription to a variety of fantasy narratives easily dispelled by a modest inventory of what exists in actuality.

Here is another passage and definition:

A grapholect is a transdialectal language formed by deep commitment to writing. Writing gives a grapholect a power far exceeding that of any purely oral dialect. The grapholect known as standard English has accessible for use a recorded vocabulary of at least a million and a half words, of which not only the present meanings but also hundreds of thousands of past meanings are known. A simply oral dialect will commonly have resources of only a few thousand words, and its users will have virtually no knowledge of the real semantic history of any of these words. [p. 8]

My finding is that terms such as democracy, liberalism, social justice, etc. fail to mean anything (except perhaps to academics and committed readers) precisely because their consensus usage has shifted so wildly over time that common historical points of reference are impossible to establish in a culture heavily dominated by contemporary memes, slang, talking heads, and talking points — components of orality rather than literacy. And as part of a wider epistemological crisis, one can no longer rely on critical thinking to sort out competing truth claims because the modifier critical, bandied about recklessly in academia and now infecting the workplace and politics, has unironically reversed its meaning and requires uncritical doublethink to swallow what’s taught and argued. Let me stress, too, that playing word games (such as dissembling over what is means) is a commonplace tactic to put off criticism by distorting word meanings beyond recognition.

Although it’s unclear just yet (to me, obviously) what Ong argues in his book beyond the preliminary comparison and contrast of oral and chirographic cultures (or in terms of the title of the book, orality and literacy), I rather doubt he argues as I do that the modern world has swung around to rejection of literacy and the style of thought that flows from deep engagement with the written word. Frankly, it would surprise me if he did; the book predates the Internet, social media, and what’s now become omnimedia. The last decade in particular has demonstrated that placing a cheap, personal, 24/7/365 communications device in the hands of every individual from the age of 12 or so launched a radical social experiment that no one in particular designed — except that once the outlines of the experiment began to clarify, those most responsible (i.e., social media platforms in particular but also biased journalists and activist academics) have refused to admit that they are major contributors to the derangement of society. Cynics learned long ago that advertisers, PR hacks, and politicians should be discounted, which requires ongoing skepticism and resistance to omnipresent lures, cons, and propaganda. Call it waking up to reality or simply growing up and behaving responsibly in an information environment designed to be disorienting. Accordingly, the existence of counterweights — information networks derived from truth, authority, and integrity — has always been, um, well, critical. Their extinction presages much graver losses as information structures and even the memory of mental habits that society needs to function are simply swept aside.

I simply can’t keep up with all the reading, viewing, and listening in my queue. Waking hours are too few, and concentration dissipates long before sleep overtakes. Accordingly, it’s much easier to settle into couch-potato mode and watch some mindless drivel, such as the Netflix hit Bridgerton binged in two sittings. (Unlike cinema critics, I’m not bothered especially by continuity errors, plot holes, clunky dialogue, weak character motivation, gaps of logic, or glossy decadence of the fictional worlds. I am bothered by the Kafka trap sprung on anyone who notices casting decisions that defy time and place — an ill-advised but now commonplace historical revisionism like editing Mark Twain.) As a result, blog posts are less frequent than they might perhaps be as I pronounce upon American (or more broadly, Western) culture, trying vainly to absorb it as a continuously moving target. Calls to mind the phrase Après moi, le déluge, except that there is no need to wait. A deluge of entertainment, news, analysis, punditry, and trolling has buried everyone already. So rather than the more careful consideration I prefer to post, here are some hot takes.

The Irregular Aphorist. Caitlin Johnstone offers many trenchant observations in the form of aphorisms (some of which I’ve quoted before), all gathered under the subtitle Notes From The Edge Of The Narrative Matrix. The modifier irregular only means that aphorisms are a regular but not constant feature. Her site doesn’t have a tag to that effect but probably ought to. Here’s one in particular that caught my attention:

Everything our species has tried has led us to a dying world and a society that is stark raving mad, so nobody is in any position to tell you that you are wrong.

Twin truths here are (1) the dying world and (2) societal madness, both of which I’ve been describing for some time. Glad when others recognize them, too.

Piling on. Though few still are willing to admit it, nonpharmaceutical interventions (NPIs, e.g., distancing, masks, and lockdowns) to stall or reduce the spread of the virus failed to achieve their objectives according to this study. Instead, NPIs piled on suffering no one could forestall. I read somewhere (no link) that the world is approaching half of total, cumulative deaths/infections predicted had nothing been done to impede the pandemic running its course. Adding in deaths of despair (numbers not entirely up to date), we’re using the wrong tools to fight the wrong battle. Of course, interventions opened up giant opportunities for power grabs and vulture capitalism, so the cynic in me shrugs and wonders half aloud “what did you expect, really?”

Growth of the Managerial Bureaucracy. A blog called Easily Distracted by Timothy Burke (never on my blogroll) publishes only a few times per year, but his analysis is terrific — at least when it doesn’t wind up being overlong and inconclusive. Since a student debt jubilee is back in the news (plenty of arguments pro and con), unintended consequences are anticipated in this quote:

When you set out to create elaborate tiers that segregate the deserving poor from the comfortable middle-class and the truly wealthy, you create a system that requires a massive bureaucracy to administer and a process that forces people into petitionary humiliation in order to verify their eligibility. You create byzantine cutoff points that become business opportunities for predatory rentiers.

Something similar may well be occurring with stimulus checks being issued pro rata (has anyone actually gotten one?), but at least we’re spared any petitionary humiliations. We get whatever the algorithms (byzantine cutoff points) dictate. How those funds will be gamed and attached is not yet clear. Stay alert.

No Defense of Free Speech. Alan Jacobs often recommends deleting, unsubscribing, and/or ignoring social media accounts (after his own long love-hate relationship with them) considering how they have become wholly toxic to a balanced psyche as well as principal enablers of surveillance capitalism and narrative control. However, in an article about the manorial elite, he’s completely lost the plot: absolutism is required in defense of free speech. It’s not sufficient to be blasé or even relieved when 45 is kicked off Twitter permanently or when multiple parties conspire to kill Parler. Establishing your own turf beyond the reach of Silicon Valley censors is a nice idea but frankly impractical. Isn’t that what whoever ran Parler (or posted there) must have thought? And besides, fencing off the digital commons these very entities created has catapulted them into the unenviable position of undemocratic, unelected wielders of monopolistic power and co-conspirators to boot. That’s what needs to be curtailed, not free speech.

The Taxonomic Apocalypse. Although drawn from fiction and thus largely hypothetical, a new book (coming late 2021) by Adam Roberts called It’s the End of the World: But What Are We Really Afraid Of? surveys doomsday stories and categorizes different versions of how it all ends. Alan Jacobs (yeah, him again — must have an advance copy of the manuscript) recommends it as “a delightful and provocative little book” but fails to grok two things: (1) these stories are rehearsals-cum-preparations for the real thing, and (2) the real thing really is bearing down on us implacably and so is no longer a mere hypothetical to contemplate and categorize for shits and grins. Despite acceptance of the eventualities that await all of us, reading Roberts’ taxonomy is not something I would expect to find delightful. Skip.

Narrative Collapse. Ran Prieur (no link) sometimes makes statements revealing an unexpected god’s-eye view:

[45] is a mean rich kid who figured out that if he does a good Archie Bunker impression, every lost soul with an authoritarian father will think he’s the messiah. We’re lucky that he cares only about himself, instead of having some crazy utopian agenda. But the power, and the agency, is with the disaffected citizens of a declining empire, tasting barbarism.

This is all about people wanting to be part of a group that’s part of a story. Lately, some of the big group-stories have been dying: sky father religion, American supremacy, the conquest of nature, the virtue of wealth-seeking. In their place, young and clumsy group-stories struggle and rise.

Collapse of certain fundamental stories that animate our thinking is at the core of The Spiral Staircase (see About Brutus at top), though it’s often couched in terms of consciousness in transition. Getting through the transition (only temporarily, see previous item in list) probably means completion of the Counter-Enlightenment historical arc, which necessarily includes further descent into barbarism.

Hail Mary for Individualism. I always take special notice when someone cites Allan Bloom. Alan Jacobs (um, yeah, he’s prolific and I’m using his ideas again — sue me) cites Bloom to argue that individualism or the sovereign self, a product of the Enlightenment, is already dead. No doubt, the thought-world described so ably by Bloom no longer exists, but individualism has not yet died out by attrition or been fully dissolved in nonduality. Many of us born before the advent of the Internet retain selfhood and authenticity not yet coopted by or incorporated into mass mind. Moreover, ongoing struggles over identity (e.g., gender, sexual orientation, and race that are often used improperly to define the self) result from an inchoate sense that individualism is eroding precipitously, not that it’s already passé. Defiant attempts to (re)establish an authentic self (contravening all logic and becoming critical theory of one sort or another) in the face of this loss may well be a last-ditch effort to save the self, but it’s failing.