Archive for the ‘Cognition’ Category

Continuing my book-blogging project on Orality and Literacy, Ong provides context for the oral tradition that surrounded the two great Homeric classics: The Iliad and The Odyssey. According to Ong, it took decades for literary critics and sociologists to overcome their bias, born of literacy, and recognize how formulaic the two epics are. They are essentially pastiches of commonplace plots, phrases, and sayings of the time, which was a notable strength when oral delivery based on memorization was how epic poetry was transmitted. In a literate era, such clichés are to be avoided (like the plague).

Aside: my review of David Sirota’s Back to Our Future mentions the dialect he and his brother developed, filled with one-liners and catchphrases from entertainment media, especially TV and movies. The three-word (also three-syllable) form seems to be optimal: “Beam me up” (Star Trek), “Use the Force” (Star Wars), “Make my day” (Dirty Harry), “I’ll be back” (The Terminator), etc. This construction is short, punchy, and memorable. The first holder of high office in the U.S. to attempt to govern by catchphrase was probably Ronald Reagan, followed (of course) by Arnold Schwarzenegger and then Donald Trump. Mustn’t overlook that all three (and others) came to prominence via the entertainment industry rather than through earnest (Kennedyesque) public service. Trump’s numerous three-word phrases (shtick, really) lend themselves especially well to being chanted by adoring crowds at his pep rallies, swept up in groupthink, with a recognizable beat-beat-beat-(silence) structure. The rock band Queen stumbled upon this same elemental rhythm with its famous stomp-stomp-clap-(wait) from the anthem “We Are the Champions,” consciously intended for audience participation (as I understand it).

Further aside: “We Are the Champions” combines its iconic rhythm with a recitation tone sourced in antiquity. Make of that what you will.

Ong goes on to provide a discussion of the psychodynamics of orality, which I list here without substantive discussion (read for yourself):

  • orality is additive rather than subordinative
  • orality is aggregative rather than analytic
  • orality is redundant or copious
  • orality is conservative or traditionalist
  • orality is close to the human lifeworld
  • orality is agonistically toned
  • orality is empathetic and participatory rather than objectively distanced
  • orality is homeostatic
  • orality is situational rather than abstract

Of particular interest is Ong’s description of how language functions within oral cultures distinctly from literate cultures, which is the source of the bias mentioned above. To wit:

Fully literate persons can only with great difficulty imagine what a primary oral culture is like, that is, a culture with no knowledge whatsoever of writing or even the possibility of writing … In a primary oral culture, the expression ‘to look up something’ is an empty phrase … [w]ithout writing, words as such have no visual presence, even when the objects they represent are visual … [for] ‘primitive’ (oral) people … language is a mode of action and not simply a countersign of thought — oral people commonly, and probably universally, consider words to have great power. [pp. 31–32]

If this sounds conspicuously reminiscent of this previous post, well, congratulations on connecting the dots. The whole point, according to a certain perspective, is that words are capable of violence, which is (re)gaining adherents as our mental frameworks undergo continuous revision. It’s no small thing that slurs, insults, and fighting words (again) provoke offense and violent response and that mere verbal offense equates to violence. Not long ago, nasty words were reclaimed, nullified, and thus made impotent (with varying levels of irrational rules of usage). Well, now they sting again and are used as ammo to cancel (a form of administrative violence, often undertaken anonymously, bureaucratically, and with the assistance of the digital mob) anyone with improper credentials to deploy them.

Let me draw another connection. Here’s a curious quote by Walter Pater, though not well known:

All art constantly aspires towards the condition of music. For while in all other kinds of art it is possible to distinguish the matter from the form, and the understanding can always make this distinction, yet it is the constant effort of art to obliterate it.

Put another way, the separation of signifier from signified, an abstraction conditioned by literacy and rationalism (among other things), is removed (“obliterated”) by music, which connects to emotion more directly than representational art. Similarly, speech within primary oral cultures exists purely as sound and possesses an ephemeral, even effervescent (Ong’s term) quality only experienced in the flow of time. (Arguably, all of human experience takes place within the flow of time.) Music and “primitive” speech are accordingly dynamic and cannot be reduced to static snapshots, that is, fixed on a page as text or committed to a canvas or photograph as a still image (hence, the strange term still life). That’s why a three-word, three-syllable chant, or better yet, the Queen rhythm or the Wave in sports arenas (a gesture requiring everyone’s participation), can possess inherent power, especially as individuals are entrained in groupthink. Music and words-as-violence get inside us and are nearly wholly subjective, not objective — something we all experience organically in early childhood before being taught to read and write (if in fact those skills are learned beyond functional literacy). Does that mean culture is reverting to an earlier stage of development, more primitive, childlike, and irrational?

For more than a decade, I’ve had in the back of my mind a blog post called “The Power of Naming” to remark that bestowing a name gives something power, substance, and in a sense, reality. That post never really came together, but its inverse did. Anyway, here’s a renewed attempt.

The period of language acquisition in early childhood is suffused with learning the names of things, most of which is passive. Names of animals (associated closely with sounds they make) are often a special focus using picture books. The kitty, doggie, and horsie eventually become the cat, dog, and horse. Similarly, the moo-cow and the tweety-bird shorten to cow and bird (though songbird may be an acceptable holdover). Words in the abstract are signifiers of the actual things, aided by the text symbols learned in literate cultures to reinforce mere categories instead of examples grounded in reality. Multiply the names of things several hundred thousand times into adulthood and indeed throughout life and one can develop a formidable vocabulary supporting expressive and nuanced thought and speech. Do you know the differences between acute, right, obtuse, straight, and reflex angles? Does it matter? Does your knowledge of barware inform when to use a flute, coupe, snifter, shot (or shooter or caballito), nosing glass (or Glencairn), tumbler, tankard, goblet, sling, and Stein? I’d say you’ve missed something by never having drunk dark beer (Ger.: Schwarzbier) from a frosted schooner. All these varieties developed for reasons that remain invisible to someone content to drink everything from the venerable red Solo cup. Funnily enough, the red Solo cup now comes in different versions, fooling precisely no one.

Returning to book blogging, Walter Ong (in Orality and Literacy) has curious comparisons between primarily oral cultures and literate cultures. For example:

Oral people commonly think of names (one kind of words) as conveying power over things. Explanations of Adam’s naming of the animals in Genesis 2:20 usually call condescending attention to this presumably quaint archaic belief. Such a belief is in fact far less quaint than it seems to unreflective chirographic and typographic folk. First of all, names do give human beings power over what they name: without learning a vast store of names, one is simply powerless to understand, for example, chemistry and to practice chemical engineering. And so with all other intellectual knowledge. Secondly, chirographic and typographic folk tend to think of names as labels, written or printed tags imaginatively affixed to an object named. Oral folk have no sense of a name as a tag, for they have no idea of a name as something that can be seen. Written or printed representations of words can be labels; real, spoken words cannot be. [p. 33]

This gets at something that has been developing over the past few decades, namely, that as otherwise literate (or functionally literate) people gather more and more information through electronic media (screens that serve broadcast and cable TV, YouTube videos, prerecorded news for streaming, podcasts, and most importantly, audiobooks — all of which speak content to listeners), the spoken word (re)gains primacy and the printed word fades into disuse. Electronic media may produce a hybrid of orality/literacy, but words are no longer silent, internal, and abstract. Indeed, words — all by themselves — are understood as being capable of violence. Gone are the days when “sticks and stones ….” Now, fighting words incite and insults sting again.

Not so long ago, it was possible to provoke a duel with an insult or gesture, such as a glove across the face. Among some people, defense of honor never really disappeared (though dueling did). History has taken a strange turn, however. Proposed legislation to criminalize deadnaming (presumably to protect a small but growing number of transgender and nonbinary people who have redefined their gender identity and accordingly adopted different names) recognizes the violence of words but then tries to transmute the offense into an abstract criminal law. It’s deeply mixed up, and I don’t have the patience to sort it out.

More to say in later blog posts, but I’ll raise the Counter-Enlightenment once more to say that the nature of modern consciousness is shifting somewhat radically in response to stimuli and pressures that grew out of an information environment, roughly 70 years old now but transformed even more fundamentally in the last 25 years, that is substantially discontinuous from centuries-old traditions. Those traditions displaced even older traditions inherited from antiquity. Such is the way of the world, I suppose, and with the benefit of Walter Ong’s insights, my appreciation of the outlines is taking better shape.

Happy to report that humans have finally outgrown their adolescent fixation, obsession, and infatuation surrounding technology and gadgetry, especially those that blow up things (and people), part of a maladaptive desire to watch the world burn (like a disturbed 14-year-old playing with fire to test the boundaries of control while hoping for the boundary to be breached). We are now in the process of correcting priorities and fixing the damage done. We’re also free from the psychological prison in which we trapped ourselves through status seeking and insistence on rigid ego consciousness by recognizing instead that, as artifacts of a hypersocial species, human cognition is fundamentally distributed among us as each of us is for all intents and purposes a storyteller retelling, reinforcing, and embellishing stories told elsewhere — even though it’s not quite accurate to call it mass mind or collective consciousness — and that indeed all men are brothers (an admitted anachronism, since that phrase encompasses women/sisters, too). More broadly, humans also now understand that we are only one species among many (a relative late-comer in evolutionary time, as it happens) that coexist in a dynamic balance with each other and with the larger entity some call Mother Earth or Gaia. Accordingly, we have determined that our relationship can no longer be that of abuser (us) and abused (everything not us) if the dynamism built into that system is not to take us out (read: trigger human extinction, like most species suffered throughout evolutionary time). If these pronouncements sound too rosy, well, get a clue, fool!

Let me draw your attention to the long YouTube video embedded below. These folks have gotten the clues, though my commentary follows anyway, because SWOTI.

After processing all the hand-waving and calls to immediate action (with inevitable nods to fundraising), I was struck by two things in particular. First, Extinction Rebellion (XR) co-founder Roger Hallam gets pretty much everything right despite an off-putting combination of alarm, desperation, exasperation, and blame. He argues that to achieve the global awakening needed to alter humanity’s course toward (self-)extinction, we actually need charismatic speakers and heightened emotionalism. Scientific dispassion and neutered, measured political discourse (such as the Intergovernmental Panel on Climate Change (IPCC) or as Al Gore attempted for decades before going Hollywood already fifteen years ago now) have simply failed to accomplish anything. (On inspection, what history has actually delivered is not characterized by the lofty rhetoric of statesmen and boosters of Enlightenment philosophy but rather resembles a sociologist’s nightmare of dysfunctional social organization, where anything that could possibly go wrong pretty much has.) That abysmal failure is dawning quite strongly on people under the age of 30 or so, whose futures have been not so much imperiled as actively robbed. (HOW DARE YOU!? You slimy, venal, incompetent cretins above the age of 30 or so!) So it’s not for nothing that Roger Hallam insists that the XR movement ought to be powered and led by young people, with old people stepping aside, relinquishing positions of power and influence they’ve already squandered.

Second, Chris Hedges, easily the most erudite and prepared speaker/contributor, describes his first-hand experience reporting on rebellion in Europe leading to (1) the collapse of governments and (2) disintegration of societies. He seems to believe that the first is worthwhile, necessary, and/or inevitable even though the immediate result is the second. Civil wars, purges, and genocides are not uncommon throughout history in the often extended periods preceding and following social collapse. The rapidity of governmental collapse once the spark of citizen rebellion becomes inflamed is, in his experience, evidence that driving irresponsible leaders from power is still possible. Hedges’ catchphrase is “I fight fascists because they’re fascists,” which as an act of conscience allows him to sleep at night. A corollary is that fighting may not necessarily be effective, at least in the short term, or be undertaken without significant sacrifice but needs to be done anyway to imbue life with purpose and meaning, as opposed to anomie. Although Hedges may entertain the possibility that social disintegration and collapse will be far, far more serious and widespread once the armed-to-the-teeth American empire cracks up fully (already under way to many observers) than with the Balkan countries, conscientious resistance and rebellion is still recommended.

Much as my attitudes are aligned with XR, Hallam, and Hedges, I’m less convinced that we should all go down swinging. That industrial civilization is going down and all of us with it no matter what we do is to me an inescapable conclusion. I’ve blogged about this quite a bit. Does ethical behavior demand fighting to the bitter end? Or can we fiddle while Rome burns, so to speak? There’s a lot of middle ground between those extremes, including nihilistic mischief (euphemism alert) and a bottomless well of anticipated suffering to alleviate somehow. Rather than altering the inevitable, I’m more inclined to focus on forestalling eleventh-hour evil and finding some grace in how we ultimately, collectively meet species death.

Wanted to provide an update to the previous post in my book-blogging project on Walter Ong’s Orality and Literacy to correct something that wasn’t clear to me at first. The term chirographic refers to writing, but I conflated writing more generally with literacy. Ong actually distinguishes chirographic (writing) from typographic (type or print) and includes another category: electronic media.

Jack Goody … has convincingly shown how shifts hitherto labeled as shifts from magic to science, or from the so-called ‘prelogical’ to the more and more ‘rational’ state of consciousness, or from Lévi-Strauss’s ‘savage’ mind to domesticated thought, can be more economically and cogently explained as shifts from orality to various stages of literacy … Marshall McLuhan’s … cardinal gnomic saying, ‘The medium is the message’, registered his acute awareness of the importance of the shift from orality through literacy and print to electronic media. [pp. 28–29]

So the book’s primary contrast is between orality and literacy, but literacy has a sequence of historical developments: chirographic, typographic, and electronic media. These stages are not used interchangeably by Ong. Indeed, they exist simultaneously in the modern world and all contribute to overall literacy while each possesses unique characteristics. For instance, reading from handwriting (printing or cursive, the latter far less widely used now except for signatures) is different from reading from print on paper or on the screen. Further, writing by hand, typing on a typewriter, typing into a word-processor, and composing text on a smartphone each has its effects on mental processes and outputs. Ong also mentions remnants of orality that have not yet been fully extinguished. So the exact mindset or style of consciousness derived from orality vs. literacy is neither fixed nor established universally but contains aspects from each category and subcategory.

Ong also takes a swing at Julian Jaynes. Considering that Jaynes’ book The Origin of Consciousness in the Breakdown of the Bicameral Mind (1977) (see this overview) was published only five years prior to Orality and Literacy (1982), the impact of Jaynes’ thesis must have still been felt quite strongly (as it is now among some thinkers). Yet Ong disposes of Jaynes rather parsimoniously, stating

… if attention to sophisticated orality-literacy contrasts is growing in some circles, it is still relatively rare in many fields where it could be helpful. For example, the early and late stages of consciousness which Julian Jaynes (1977) describes and relates to neuro-physiological changes in the bicameral mind would also appear to lend themselves largely to much simpler and more verifiable descriptions in terms of a shift from orality to literacy. [p. 29]

In light of the details above, it’s probably not accurate to say (as I did before) that we are returning to orality from literacy. Rather, the synthesis of characteristics is shifting, as it always has, in relation to new stimuli and media. Since the advent of cinema and TV — the first screens, now supplemented by the computer and smartphone — the way humans consume information is undergoing yet another shift. Or perhaps it’s better to conclude that it’s always been shifting, not unlike how we have always been and are still evolving, though the timescales are usually too slow to observe without specialized training and analysis. Shifts in consciousness arguably occur far more quickly than biological evolution, and the rate at which new superstimuli are introduced into the information environment suggests radical discontinuity with even the recent past — something that used to be called the generation gap.

I’ve always wondered what media theorists such as McLuhan (d. 1980), Neil Postman (d. 2003), and now Ong (d. 2003) would make of the 21st century had they lived long enough to witness what has been happening, with 2014–2015 being the significant inflection point according to Jonathan Haidt. (No doubt there are other media theorists working on this issue who have not risen to my attention.) Numerous other analyses point instead to the early 20th century as the era when industrial civilization harnessed fossil fuels and turned the mechanisms and technologies of innovators decidedly against humanity. Pick your branching point.

/rant on

The self-appointed Thought Police continue their rampage through the public sphere, campaigning to disallow certain thoughts and fence off unacceptable, unsanitary, unhygienic, unhealthy utterances lest they spread, infect, and distort their host thinkers. Entire histories are being purged from, well, history, to pretend they either never happened or will never happen again, because (doncha know?) attempting to squeeze disreputable thought out of existence can’t possibly result in those forbidden fruits blossoming elsewhere, in the shadows, in all their overripe color and sweetness. The restrictive impulse — policing free speech and free thought — is as old as it is stupid. For example, it’s found in the use of euphemisms that pretend to mask the true nature of all manner of unpleasantness, such as death, racial and national epithets, unsavory ideologies, etc. However, farting is still farting, and calling it “passing wind” does nothing to reduce its stink. Plus, we all fart, just like we all inevitably traffic in ideas from time to time that are unwholesome. Manners demand some discretion when broaching some topics (just as when farting), but the point is that one must learn how to handle such difficulty responsibly rather than attempting to hold it in, that is, drive it out of thought entirely, which simply doesn’t work. No one knows automatically how to navigate through these minefields.

Considering that the body and mind possess myriad inhibitory-excitatory mechanisms that push and/or pull (i.e., good/bad, on/off, native/alien), a wise person might recognize that both directions are needed to achieve balance. For instance, exposure to at least some hardship has the salutary effect of building character, whereas constant indulgence results in spoiled children (later, adults). Similarly, the biceps/triceps operate in tandem and opposition and need each other to function properly. However, most inhibitory-excitatory mechanisms aren’t nearly so binary as our language tends to imply but rather rely on an array of inputs. Sorting them all out is like trying to answer the nature/nurture question. Good luck with that.

Here’s a case in point: student and professional athletes in the U.S. are often prohibited from kneeling in dissent during the playing of the national anthem. The prohibition does nothing to ameliorate the roots of dissent but only suppresses its expression under narrow, temporary circumstances. Muzzling speech (ironically in the form of silent behavior) prior to sports contests may actually boomerang to inflame it. Some athletes knuckle under and accept the deal they’re offered (STFU! or lose your position — note the initialism used to hide the curse word) while others take principled stands (while kneeling, ha!) against others attempting to police thought. Some might argue that the setting demands good manners and restraint, while others argue that, by not stomping around the playing field carrying placards, gesticulating threateningly, or chanting slogans, restraint is being used. Opinions differ, obviously, and so the debate goes on. In a free society, that’s how it works. Societies with too many severe restrictions, often bordering on or going fully into fascism and totalitarianism, are intolerable to many of us fed current-day jingoism regarding democracy, freedom, and self-determination.

Many members of the U.S. Congress, sworn protectors of the U.S. Constitution, fundamentally misunderstand the First Amendment, or at least they conveniently pretend to. (I suspect it’s the former.) Here it is for reference:

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

Defending the First Amendment against infringement requires character and principles. What we have instead, both in Congress and in American society, are ideologues and authorities who want to make some categories flatly unthinkable and subject others to prosecution. Whistleblowing falls into the latter category. They are aided by the gradual erosion of educational achievement and shift away from literacy to orality, which robs language of its richness and polysemy. If words are placed out of bounds, made unutterable (but not unthinkable), the very tools of thought and expression are removed. The thoughts themselves may be driven underground or reduced to incoherence, but that’s not a respectable goal. Only under the harshest conditions (Orwell depicted them) can specific thoughts be made truly unthinkable, which typically impoverishes and/or breaks the mind of the thinker or at least results in pro forma public assent while private dissent gets stuffed down. To balance and combat truly virulent notions, exposure and discussion is needed, not suppression. But because many public figures have swallowed a bizarre combination of incoherent notions and are advocating for them, the mood is shifting away from First Amendment protection. Even absolutists like me are forced to reconsider, as for example with this article. The very openness to consideration of contrary thinking may well be the vulnerability being exploited by crypto-fascists.

Calls to establish a Ministry of Truth have progressed beyond the Silicon Valley tech platforms’ arbitrary and (one presumes) algorithm-driven removal of huge swaths of user-created content to a new bill introduced in the Colorado State Senate to establish a Digital Communications Regulation commission (summary here). Maybe this first step toward hammering out a legislative response to surveillance capitalism will rein in the predatory behaviors of big tech. The cynic in me harbors doubts. Instead, resulting legislation is far more likely to be aimed at users of those platforms.

/rant off

The backblog at The Spiral Staircase includes numerous book reviews and three book-blogging projects — one completed and two others either abandoned or on semi-permanent hiatus. I’m launching a new project on Walter Ong’s Orality and Literacy: The Technologizing of the Word (1982), which comes highly recommended and appears quite interesting given my preoccupations with language, literacy, and consciousness. To keep my thinking fresh, I have not consulted any online reviews or synopses.

Early on, Ong provides curious (but unsurprising) definitions I suspect will contribute to the book’s main thesis. Here is one from the intro:

It is useful to approach orality and literacy synchronically, by comparing oral cultures and chirographic (i.e., writing) cultures that coexist at a given period of time. But it is absolutely essential to approach them also diachronically or historically, by comparing successive periods with one another. [p. 2]

I don’t recall reading the word chirographic before, but I blogged about the typographic mind (in which Ong’s analyses are discussed) and lamented that the modern world is moving away from literacy, back toward orality, which feels (to me at least) like retrogression and retreat. (Someone is certain to argue return to orality is actually progress.) As a result, Western institutions such as the independent press are decaying. Moreover, it’s probably fair to say that democracy in the West is by now only a remnant fiction, replaced by oligarchic rule and popular subscription to a variety of fantasy narratives easily dispelled by a modest inventory of what exists in actuality.

Here is another passage and definition:

A grapholect is a transdialectal language formed by deep commitment to writing. Writing gives a grapholect a power far exceeding that of any purely oral dialect. The grapholect known as standard English has accessible for use a recorded vocabulary of at least a million and a half words, of which not only the present meanings but also hundreds of thousands of past meanings are known. A simply oral dialect will commonly have resources of only a few thousand words, and its users will have virtually no knowledge of the real semantic history of any of these words. [p. 8]

My finding is that terms such as democracy, liberalism, social justice, etc. fail to mean anything (except perhaps to academics and committed readers) precisely because their consensus usage has shifted so wildly over time that common historical points of reference are impossible to establish in a culture heavily dominated by contemporary memes, slang, talking heads, and talking points — components of orality rather than literacy. And as part of a wider epistemological crisis, one can no longer rely on critical thinking to sort out competing truth claims because the modifier critical, bandied about recklessly in academia and now infecting the workplace and politics, has unironically reversed its meaning and requires uncritical doublethink to swallow what’s taught and argued. Let me stress, too, that playing word games (such as dissembling over what is means) is a commonplace tactic to put off criticism by distorting word meanings beyond recognition.

Although it’s unclear just yet (to me, obviously) what Ong argues in his book beyond the preliminary comparison and contrast of oral and chirographic cultures (or in terms of the title of the book, orality and literacy), I rather doubt he argues as I do that the modern world has swung around to rejection of literacy and the style of thought that flows from deep engagement with the written word. Frankly, it would surprise me if he did; the book predates the Internet, social media, and what’s now become omnimedia. The last decade in particular has demonstrated that by placing a cheap, personal, 24/7/365 communications device in the hands of every individual from the age of 12 or so, a radical social experiment was launched that no one in particular designed — except that once the outlines of the experiment began to clarify, those most responsible (i.e., social media platforms in particular but also biased journalists and activist academics) have refused to admit that they are major contributors to the derangement of society. Cynics learned long ago to expect that advertisers, PR hacks, and politicians should be discounted, which requires ongoing skepticism and resistance to omnipresent lures, cons, and propaganda. Call it waking up to reality or simply growing up and behaving responsibly in an information environment designed to be disorienting. Accordingly, the existence of counterweights — information networks derived from truth, authority, and integrity — has always been, um, well, critical. Their extinction presages much graver losses as information structures and even the memory of mental habits that society needs to function are simply swept aside.

So far, this multipart blog post has trafficked in principles and generalities. Let me try now to be more specific, starting with an excerpt from Barry Lynn’s article in Harper’s Magazine titled “The Big Tech Extortion Racket” (Sept. 2020):

… around the middle of the nineteenth century, Americans began to develop technologies that could not be broken into component pieces. This was especially true of the railroad and the telegraph … Such corporations [railroad and telegraph companies] posed one overarching challenge: they charged some people more than others to get to market. They exploited their control over an essential service in order to extort money, and sometimes political favors … Americans found the answer to this problem in common law. For centuries, the owners of ferries, stagecoaches, and inns had been required to serve all customers for the same price and in the order in which they arrived. In the late nineteenth century, versions of such “common carrier” rules were applied to the new middleman corporations.

Today we rightly celebrate the Sherman Antitrust Act of 1890, which gave Americans the power to break apart private corporations. But in many respects, the Interstate Commerce Act of 1887 was the more important document. This act was based on the understanding that monopoly networks like the railroad and the telegraph could be used to influence the actions of people who depend on them, and hence their power must be carefully restricted …

For a century and a half, Americans used common carrier policies to ensure the rule of law in activities that depended on privately held monopolies … regulations freed Americans to take full advantage of every important network technology introduced during these years, including telephones, water and electrical services, energy pipelines, and even large, logistics-powered retailers. Citizens did not have to worry that the men who controlled the technologies involved would exploit their middleman position to steal other people’s business or disrupt balances of power.

I appreciate that Barry Lynn brings up the Interstate Commerce Act. If this legal doctrine appeared in the net neutrality debate a few years ago, it must have escaped my notice. While Internet Service Providers (ISPs) enable network access and connectivity, those utilities have not yet exhibited let’s-be-evil characteristics. Similarly, phone companies (including cell phones) and public libraries may well be eavesdropping and/or monitoring activities of the citizenry, but the real action lies elsewhere, namely, on social media networks and with online retailers. Evil is arguably concentrated in the FANG (or FAANG) corporations but has now grown to be ubiquitous in all social networks (e.g., Twitter) operating as common carriers (Zoom? Slack?) and across academe, nearly all of which have succumbed to moral panic. They are interpreting correctly, sad to observe, demands to censor and sanitize others’ no-longer-free speech appearing on their networks or within their realms. How much deeper it goes toward shaping politics and social engineering is quasi-conspiratorial and impossible for me to assess.

Much as I would prefer to believe that individuals possess the good sense to shift their activities away from social networks or turn their attention from discomfiting information sources, that does not appear to be the case. Demands for trigger warnings and safe spaces commonplace a few years ago on college campuses have instead morphed into censorious removal, deplatforming, and cancellation from the entire public sphere. Those are wrong responses in free societies, but modern institutions and technologies have gotten out of hand and outstripped the limits of normal human cognition. In short, we’re a society gone mad. So rather than accept responsibility for sorting out information overflow themselves, many are demanding that others do it for them, and evil private corporations are complying (after a fashion). Moreover, calls for creation of an Orwellian Ministry of Truth, rebranded as a Truth Commission and Reality Czar, could hardly be more chillingly and fascistically bizarre. Do people really need someone to brainwash, er, decide for them what is real? Has anyone at the New York Times actually read Orwell’s dystopian novel 1984 and taken to heart its lessons?

I have observed various instances of magical thinking in mainstream culture, especially here, which I find problematical. Although it’s not my ambition to disabuse anyone of magical thinking, which extends far beyond, say, religious thought, I was somewhat taken aback at the suggestion found in the comic at this link (not embedded). For those not familiar with Questionable Content (one of two online comics I read regularly), the comic presents an extended cast of characters, mostly in their early 20s, living in a contemporary New England college town. Those characters are supplemented by a few older parents and lots of AIs (in robot bodies). The AIs are not particularly futuristic but are simply accepted as a normal (if curious) part of the world of the comic. Major story arcs involve characters and AIs (the AIs are characters, I suppose) in the process of discovering and establishing themselves as they (the humans, anyway) transition into early adulthood. There are no great political themes or intrusions into life in a college town. Rather, the comic is largely about acceptance of difference. Often, that means washing away meaningful difference in the name of banal tolerance. Real existential struggle is almost entirely absent.

In the linked comic, a new character comes along and offers advice to an established character struggling with sexual attractions and orientation. The dialogue includes this exchange:

Character A: If tarot or astrology or religion halps you make sense of the world and your place in it, then why not use them?
Character B: But they’re not real. [emphasis in original]
Character A: It doesn’t matter, if you use them constructively!

There it is in a nutshell: believe whatever you want if it, um, halps. I’ve always felt that being wrong (i.e., using unreal or make-believe things) was a sufficient injunction against anchoring oneself to notions widely known to be false. Besides, isn’t it often remarked that the biggest fool is one who fools himself? (Fiction as a combination of entertainment and building a worldview is quite normative, but it’s understood as fiction, or to a lesser degree, as life imitating art and its inverse. Exceptions abound, which are regarded as psychopathy.) The instruction in that dialogue (part object lesson, part lesson in cognition) is not that it’s OK to make mistakes but that knowingly believing something false has worthwhile advantages.

Surveying examples where promulgating false beliefs has constructive and destructive effects is too large a project. Well short of that, nasty categories include fraud, gaslighting, and propaganda, which are criminal in many cases and ought to be in most others (looking at you, MSM! — or not, since I neither trust nor watch). One familiar benevolent category is expressed in the phrase fake it til you make it, often recommended to overcome a lack of confidence. Of course, a swindle is also known as a confidence game (or by its diminutive, a con), so beware overconfidence when asked by another to pay for something (e.g., tarot or astrology readings), take risks, or accept an ideology without question.

As philosophy, willful adoption of falsity for its supposed benefits is half-baked. Though impossible to quantify, my suspicion is that instances of positive outcomes are overbalanced by negative ones. Maybe living in a constructed reality or self-reinforcing fantasy is what people want. The comic discussed is certainly in line with that approach. However, while we dither and delude ourselves with happy, aspirational stories based on silliness, the actual world around us, including all the human institutions that used to serve us but no longer do, falls to tatters. Is it better to go through life, and eventually to one’s grave, refusing to see that reality? Should childlike wonder and innocence be retained in spite of what is easily observable just by poking one’s head up and dismissing comforting lies? Decide for yourself.

I simply can’t keep up with all the reading, viewing, and listening in my queue. Waking hours are too few, and concentration dissipates long before sleep overtakes. Accordingly, it’s much easier to settle into couch-potato mode and watch some mindless drivel, such as the Netflix hit Bridgerton binged in two sittings. (Unlike cinema critics, I’m not bothered especially by continuity errors, plot holes, clunky dialogue, weak character motivation, gaps of logic, or glossy decadence of the fictional worlds. I am bothered by the Kafka trap sprung on anyone who notices casting decisions that defy time and place — an ill-advised but now commonplace historical revisionism like editing Mark Twain.) As a result, blog posts are less frequent than they might perhaps be as I pronounce upon American (or more broadly, Western) culture, trying vainly to absorb it as a continuously moving target. Calls to mind the phrase Après moi, le déluge, except that there is no need to wait. A deluge of entertainment, news, analysis, punditry, and trolling has buried everyone already. So rather than the more careful consideration I prefer to post, here are some hot takes.

The Irregular Aphorist. Caitlin Johnstone offers many trenchant observations in the form of aphorisms (some of which I’ve quoted before), all gathered under the subtitle Notes From The Edge Of The Narrative Matrix. The modifier irregular only means that aphorisms are a regular but not constant feature. Her site doesn’t have a tag to that effect but probably ought to. Here’s one in particular that caught my attention:

Everything our species has tried has led us to a dying world and a society that is stark raving mad, so nobody is in any position to tell you that you are wrong.

Twin truths here are (1) the dying world and (2) societal madness, both of which I’ve been describing for some time. Glad when others recognize them, too.

Piling on. Though few still are willing to admit it, nonpharmaceutical interventions (NPIs, e.g., distancing, masks, and lockdowns) to stall or reduce the spread of the virus failed to achieve their objectives according to this study. Instead, NPIs piled on suffering no one could forestall. I read somewhere (no link) that the world is approaching half of total, cumulative deaths/infections predicted had nothing been done to impede the pandemic running its course. Adding in deaths of despair (numbers not entirely up to date), we’re using the wrong tools to fight the wrong battle. Of course, interventions opened up giant opportunities for power grabs and vulture capitalism, so the cynic in me shrugs and wonders half aloud “what did you expect, really?”

Growth of the Managerial Bureaucracy. A blog called Easily Distracted by Timothy Burke (never on my blogroll) publishes only a few times per year, but his analysis is terrific — at least when it doesn’t wind up being overlong and inconclusive. Since a student debt jubilee is back in the news (plenty of arguments pro and con), unintended consequences are anticipated in this quote:

When you set out to create elaborate tiers that segregate the deserving poor from the comfortable middle-class and the truly wealthy, you create a system that requires a massive bureaucracy to administer and a process that forces people into petitionary humiliation in order to verify their eligibility. You create byzantine cutoff points that become business opportunities for predatory rentiers.

Something similar may well be occurring with stimulus checks being issued pro rata (has anyone actually gotten one?), but at least we’re spared any petitionary humiliations. We get whatever the algorithms (byzantine cutoff points) dictate. How those funds will be gamed and attached is not yet clear. Stay alert.

No Defense of Free Speech. Alan Jacobs often recommends deleting, unsubscribing, and/or ignoring social media accounts (after his own long love-hate relationship with them), considering how they have become wholly toxic to a balanced psyche as well as principal enablers of surveillance capitalism and narrative control. However, in an article about the manorial elite, he’s completely lost the plot: absolutism is required in defense of free speech. It’s not sufficient to be blasé or even relieved when 45 is kicked off Twitter permanently or when multiple parties conspire to kill Parler. Establishing your own turf beyond the reach of Silicon Valley censors is a nice idea but frankly impractical. Isn’t that what whoever ran Parler (or posted there) must have thought? And besides, fencing off the digital commons these very entities created has catapulted them into the unenviable position of undemocratic, unelected wielders of monopolistic power and co-conspirators to boot. That’s what needs to be curtailed, not free speech.

The Taxonomic Apocalypse. Although drawn from fiction and thus largely hypothetical, a new book (coming late 2021) by Adam Roberts called It’s the End of the World: But What Are We Really Afraid Of? surveys doomsday stories and categorizes different versions of how it all ends. Alan Jacobs (yeah, him again — must have an advance copy of the manuscript) recommends it as “a delightful and provocative little book” but fails to grok two things: (1) these stories are rehearsals-cum-preparations for the real thing, and (2) the real thing really is bearing down on us implacably and so is no longer a mere hypothetical to contemplate and categorize for shits and grins. Despite acceptance of the eventualities that await all of us, reading Roberts’ taxonomy is not something I would expect to find delightful. Skip.

Narrative Collapse. Ran Prieur (no link) sometimes makes statements revealing an unexpected god’s-eye view:

[45] is a mean rich kid who figured out that if he does a good Archie Bunker impression, every lost soul with an authoritarian father will think he’s the messiah. We’re lucky that he cares only about himself, instead of having some crazy utopian agenda. But the power, and the agency, is with the disaffected citizens of a declining empire, tasting barbarism.

This is all about people wanting to be part of a group that’s part of a story. Lately, some of the big group-stories have been dying: sky father religion, American supremacy, the conquest of nature, the virtue of wealth-seeking. In their place, young and clumsy group-stories struggle and rise.

Collapse of certain fundamental stories that animate our thinking is at the core of The Spiral Staircase (see About Brutus at top), though it’s often couched in terms of consciousness in transition. Getting through the transition (only temporarily, see previous item in list) probably means completion of the Counter-Enlightenment historical arc, which necessarily includes further descent into barbarism.

Hail Mary for Individualism. I always take special notice when someone cites Allan Bloom. Alan Jacobs (um, yeah, he’s prolific and I’m using his ideas again — sue me) cites Bloom to argue that individualism or the sovereign self, a product of the Enlightenment, is already dead. No doubt, the thought-world described so ably by Bloom no longer exists, but individualism has not yet died out by attrition or been fully dissolved in nonduality. Many of us born before the advent of the Internet retain selfhood and authenticity not yet coopted by or incorporated into mass mind. Moreover, ongoing struggles over identity (e.g., gender, sexual orientation, and race that are often used improperly to define the self) result from an inchoate sense that individualism is eroding precipitously, not that it’s already passé. Defiant attempts to (re)establish an authentic self (contravening all logic and becoming critical theory of one sort or another) in the face of this loss may well be a last-ditch effort to save the self, but it’s failing.

I’ve been holding in mind for five months now the article at this link (an informal interview with neuroscientist and psychologist Oliver J. Robinson), waiting for conditions when I could return to forms of media consumption I prefer, namely, reading books, magazines, and long-form journalism. When I try to read something substantive these days, I find myself going over the same paragraph repeatedly, waiting in vain for it to register. Regrettably, the calm, composure, and concentration needed for deep reading have been effectively blocked since March 2020 as we wait (also in vain) for the pandemic to burn itself out. (I could argue that the soul-destroying prospect of industrial collapse and near-term human extinction has been having the same effect for much longer.) So my attention and media habits have been resignedly diverted to crap news gathering, mostly via video, and cheap entertainments, mostly streaming TV (like everyone else, though others may complain less). The lack of nourishment is noticeable. Considering we’re only weeks away from the U.S. presidential election, stress levels are ratcheting up further, and civil authorities prepare for “election riots” (is that a new term?), which I can only assume means piling violence upon violence under the pretense of keeping-the-peace or law-and-order or some other word string rendered meaningless now that the police are widely acknowledged to be significant contributors to the very problems they are meant to address. These unresolved issues (pandemic, police violence, civil unrest) give rise to pathological anxiety, which explains (according to Robinson, disclaimers notwithstanding) why it’s so hard to read.

To say we live in unprecedented times is both obvious and banal. Unique stresses of modernity have led multiple times to widespread madness and conflict, as well as attempts to recapture things lost in previous shifts from other styles of social organization. Let me not mince words regarding what’s now happening: we’re in an era of repudiation of the Enlightenment, or a renewed Counter-Enlightenment. I’ve stated this before, and I’m not the only one making this diagnosis (just learned it’s a rather old idea — I’m always late to the party). For instance, Martin Jay’s essay “Dialectic of Counter-Enlightenment” appears to have been floating around in various forms since 2011. Correlation of this renewal of Counter-Enlightenment fervor with literacy seems clear. Although basic literacy as a skill has improved widely worldwide over the past two centuries, especially in the developing world, deep literacy is eroding:

Beyond self-inflicted attention deficits, people who cannot deep read – or who do not use and hence lose the deep-reading skills they learned – typically suffer from an attenuated capability to comprehend and use abstract reasoning. In other words, if you can’t, or don’t, slow down sufficiently to focus quality attention – what Wolf calls “cognitive patience” – on a complex problem, you cannot effectively think about it.

Considering deep literacy is absolutely critical to clear thinking (or critical thought, if you prefer, not to be confused with the Frankfurt School’s critical theory discussed in Jay’s essay), its erosion threatens fundamental institutions (e.g., liberal democracy and the scientific method) that constitute the West’s primary cultural inheritance from the Enlightenment. The reach of destruction wrought by reversing course via the Counter-Enlightenment cannot be overstated. Yet many among us, completely unable to construct coherent ideas, are rallying behind abandonment of Enlightenment traditions. They’re ideologues who actively want to return to the Dark Ages (while keeping modern tech, natch). As with many aspects of unavoidable cultural, social, environmental, and civilizational collapse, I have difficulty knowing quite what to hope for. So I won’t condemn retrograde thinking wholly. In fact, I feel empathy toward calls to return to simpler times, such as with German Romanticism or American Transcendentalism, both examples of cultural and aesthetic movements leading away from the Enlightenment.

Long before these ideas coalesced for me, I had noted (see here, here, and here) how literacy is under siege and a transition back toward a predominantly oral culture is underway. The Counter-Enlightenment is either a cause or an effect; I can’t assess which. At the risk of being a Cassandra, let me suggest that, if these times aren’t completely different from dark episodes of the past, we are now crossing the threshold of a new period of immense difficulty that makes pathological anxiety blocking the ability to read and think a minor concern. Indeed, that has been my basic assessment since crafting the About Brutus blurb way back in 2006. Indicators keep piling up. So far, I have a half dozen entry points from other cultural commentators exploring this theme to process and digest, though they typically don’t adopt wide enough historical or cultural perspectives. Like the last time I failed to synthesize my ideas into a multipart blog series, I don’t have a snazzy title, and this time, I don’t even have planned installment titles. But I will do my best to roll out in greater detail over several blog posts some of the ways the Counter-Enlightenment is manifesting anew.