Having grown up in an ostensibly free, open society animated by liberal Western ideology, it’s fair to say in hindsight that I internalized a variety of assumptions (and illusions) regarding the role of the individual vis-à-vis society. The operative word here is ostensibly, owing to the fact that society has always restricted pure expressions of individuality to some degree through socialization and pressure to conform, so freedom has always been constrained. That was one of the takeaways from my reading (long ago in high school) of Albert Camus’ novel The Stranger (1942) (British: The Outsider; French: L’Étranger), namely, that no matter how free one might believe oneself to be, if one refuses (radically, absurdly) to play by society’s rules and expectations, one will be destroyed. The basic, irresolvable conflict is also present in the concerto principle in classical music, which presents the soloist in dialogue with or in antithesis to the ensemble. Perhaps no work exemplifies this better than the 2nd movement of Ludwig van Beethoven’s Concerto No. 4 for piano and orchestra. A similar dialogue is found in the third movement of Gustav Mahler’s Symphony No. 3, though dialogue there might be better understood as man vs. nature. The significant point of similarity is not the musical style or themes but how the individual/man is ultimately subdued or absorbed by society/nature.

Aside: A broader examination of narrative conflict would include four traditional categories: (1) man vs. man, (2) man vs. nature, (3) man vs. self, and (4) man vs. society. Updated versions, often offered as tips for aspiring writers, sometimes include breakout conflicts (i.e., subcategories): (1) person vs. fate/god, (2) person vs. self, (3) person vs. person, (4) person vs. society, (5) person vs. nature, (6) person vs. supernatural, and (7) person vs. technology. Note that modern sensibilities demand use of person instead of man.

My reason for bringing up such disparate cultural artifacts is to provide context. Relying on my appreciation of the Zeitgeist, I sense that liberal Western ideology is undergoing a radical rethinking, with Woke activists in particular pretending to emancipate oppressed people when flattening of society is probably the hidden objective. Thus, Wokesters are not really freeing anyone, and flattening mechanisms are pulling people down, not building people up. On top of that, they are targeting the wrong oppressors. Though leveling is meant to occur along various markers of identity (race, sexual and gender orientation, religion, political affiliation, nationality, etc.), the true conflict in the modern era has always been socioeconomic, i.e., the ownership class against all others. Sure, differences along identitarian lines have been used to oppress, but oppressors are merely using a surface characteristic to distract from their excessive power. The dispossessed oddly fail to recognize their true enemies, projecting animus instead on those with whom grievances are shared. Similarly, Wokesters are busy exploiting their newfound (albeit momentary) power to question the accepted paradigm and force RightThink on others. Yet traditional power holders are not especially threatened by squabbles among the oppressed masses. Moreover, it’s not quite accurate to say that the identitarian left is rethinking the established order. Whatever is happening is arguably occurring at a deeper, irrational level than any thoughtful, principled, political action meant to benefit a confluence of interest groups (not unlike the impossible-to-sort confluence of identities everyone has).

Although I haven’t read Howard Zinn’s A People’s History of the United States (1980), I gather that Zinn believed history should not be told from the winners’ perspective (i.e., that of the ownership and ruling classes, significant overlap acknowledged), or from the top down, but instead through the lens of the masses (i.e., the people, a large segment of whom are oppressed and/or dispossessed), or from the bottom up. This reorientation applies not only within a given society or political entity but among nations. (Any guess which countries are the worst oppressors at the moment? Would be a long list.) Moreover, counter to the standard or accepted histories most of us learn, the preparation of the U.S. Constitution and indeed quite a lot of U.S. history are deeply corrupt and oppressive by design. It should be obvious that the state (or nation, if one prefers), with its insistence on personal property and personal freedom (though only for a narrow class of landed gentry back in the day, plutocrats and corporatists today), systematically rolled over everyone else — none so egregiously as Native Americans, African slaves, and immigrants. Many early institutions in U.S. political history were in fact created as bulwarks against various forms of popular resistance, notably slave revolts. Thus, tensions and conflicts that might be mistakenly chalked up as man vs. society can be better characterized as man vs. the state, with the state having been erected specifically to preserve the prerogatives of the ownership class.

More to come in part 2 and beyond.

From Alan Jacobs’s Breaking Bread with the Dead (2020):

The German sociologist Gerd-Günter Voss outlined the development, over many centuries, of three forms of the “conduct of life.” The first is the traditional: in this model your life takes the forms that the lives of people in your culture and class have always taken, at least for as long as anyone remembers. The key values in the traditional conduct of life are “security and regularity.” The second model is the strategic: people who follow this model have clear goals in mind (first, to get into an elite university; later, to become a radiologist or own their own company or retire at fifty) and form a detailed strategic plan to achieve those goals. But, Voss suggests, those two models, while still present in various parts of the world, are increasingly being displaced by a third model for the conduct of life: the situational.

The situational model has arisen in recent social orders that are unprecedentedly dynamic and fluid. People are less likely to plan to be radiologists when they hear that radiologists may be replaced by computers. They are less likely to plan to own a company when whatever business they’re inclined toward may not exist in a decade … they are less likely to plan to have children … They might not even want to plan to have dinner with a friend a week from Friday …

… the situational conduct of life is … a way of coping with social acceleration. But it’s also, or threatens to be, an abandonment of serious reflection on what makes life good. You end up just managing the moment … The feeling of being at a “frenetic standstill” is highly characteristic of the depressed person.

Happy to report that humans have finally outgrown their adolescent fixation, obsession, and infatuation surrounding technology and gadgetry, especially those that blow up things (and people), part of a maladaptive desire to watch the world burn (like a disturbed 14-year-old playing with fire to test the boundaries of control while hoping for the boundary to be breached). We are now in the process of correcting priorities and fixing the damage done. We’re also free from the psychological prison in which we trapped ourselves through status seeking and insistence on rigid ego consciousness by recognizing instead that, as members of a hypersocial species, our cognition is fundamentally distributed among us, each of us being for all intents and purposes a storyteller retelling, reinforcing, and embellishing stories told elsewhere — even though it’s not quite accurate to call it mass mind or collective consciousness — and that indeed all men are brothers (an admitted anachronism, since that phrase encompasses women/sisters, too). More broadly, humans also now understand that we are only one species among many (a relative latecomer in evolutionary time, as it happens) that coexist in a dynamic balance with each other and with the larger entity some call Mother Earth or Gaia. Accordingly, we have determined that our relationship can no longer be that of abuser (us) and abused (everything not us) if the dynamism built into that system is not to take us out (read: trigger human extinction, as most species have suffered throughout evolutionary time). If these pronouncements sound too rosy, well, get a clue, fool!

Let me draw your attention to the long YouTube video embedded below. These folks have gotten the clues, though my commentary follows anyway, because SWOTI.

After processing all the hand-waving and calls to immediate action (with inevitable nods to fundraising), I was struck by two things in particular. First, XR’s co-founder Roger Hallam gets pretty much everything right despite an off-putting combination of alarm, desperation, exasperation, and blame. He argues that to achieve the global awakening needed to alter humanity’s course toward (self-)extinction, we actually need charismatic speakers and heightened emotionalism. Scientific dispassion and neutered, measured political discourse (such as that of the Intergovernmental Panel on Climate Change (IPCC), or as Al Gore attempted for decades before going Hollywood already fifteen years ago now) have simply failed to accomplish anything. (On inspection, what history has actually delivered is not characterized by the lofty rhetoric of statesmen and boosters of Enlightenment philosophy but rather resembles a sociologist’s nightmare of dysfunctional social organization, where anything that could possibly go wrong pretty much has.) That abysmal failure is dawning quite strongly on people under the age of 30 or so, whose futures have been not so much imperiled as actively robbed. (HOW DARE YOU!? You slimy, venal, incompetent cretins above the age of 30 or so!) So it’s not for nothing that Roger Hallam insists that the XR movement ought to be powered and led by young people, with old people stepping aside, relinquishing positions of power and influence they’ve already squandered.


Second, Chris Hedges, easily the most erudite and prepared speaker/contributor, describes his first-hand experience reporting on rebellion in Europe leading to (1) the collapse of governments and (2) the disintegration of societies. He seems to believe that the first is worthwhile, necessary, and/or inevitable even though the immediate result is the second. Civil wars, purges, and genocides are not uncommon throughout history in the often extended periods preceding and following social collapse. The rapidity of governmental collapse once the spark of citizen rebellion becomes inflamed is, in his experience, evidence that driving irresponsible leaders from power is still possible. Hedges’ catchphrase is “I fight fascists because they’re fascists,” which as an act of conscience allows him to sleep at night. A corollary is that fighting may not necessarily be effective, at least in the short term, or be undertaken without significant sacrifice, but it needs to be done anyway to imbue life with purpose and meaning, as opposed to anomie. Although Hedges may entertain the possibility that social disintegration and collapse will be far, far more serious and widespread once the armed-to-the-teeth American empire cracks up fully (already under way to many observers) than they were in the Balkan countries, conscientious resistance and rebellion are still recommended.

Much as my attitudes are aligned with XR, Hallam, and Hedges, I’m less convinced that we should all go down swinging. That industrial civilization is going down and all of us with it no matter what we do is to me an inescapable conclusion. I’ve blogged about this quite a bit. Does ethical behavior demand fighting to the bitter end? Or can we fiddle while Rome burns, so to speak? There’s a lot of middle ground between those extremes, including nihilistic mischief (euphemism alert) and a bottomless well of anticipated suffering to alleviate somehow. Rather than trying to alter the inevitable, I’m more inclined to focus on forestalling eleventh-hour evil and finding some grace in how we ultimately, collectively meet species death.

Wanted to provide an update to the previous post in my book-blogging project on Walter Ong’s Orality and Literacy to correct something that wasn’t clear to me at first. The term chirographic refers to writing, but I conflated writing more generally with literacy. Ong actually distinguishes chirographic (writing) from typographic (type or print) and includes another category: electronic media.

Jack Goody … has convincingly shown how shifts hitherto labeled as shifts from magic to science, or from the so-called ‘prelogical’ to the more and more ‘rational’ state of consciousness, or from Lévi-Strauss’s ‘savage’ mind to domesticated thought, can be more economically and cogently explained as shifts from orality to various stages of literacy … Marshall McLuhan’s … cardinal gnomic saying, ‘The medium is the message’, registered his acute awareness of the importance of the shift from orality through literacy and print to electronic media. [pp. 28–29]

So the book’s primary contrast is between orality and literacy, but literacy has a sequence of historical developments: chirographic, typographic, and electronic media. Ong does not use these terms interchangeably. Indeed, the stages exist simultaneously in the modern world and all contribute to overall literacy while each possesses unique characteristics. For instance, reading from handwriting (printing or cursive, the latter far less widely used now except for signatures) is different from reading from print on paper or on the screen. Further, writing by hand, typing on a typewriter, typing into a word-processor, and composing text on a smartphone each has its own effects on mental processes and outputs. Ong also mentions remnants of orality that have not yet been fully extinguished. So the exact mindset or style of consciousness derived from orality vs. literacy is neither fixed nor established universally but contains aspects from each category and subcategory.

Ong also takes a swing at Julian Jaynes. Considering that Jaynes’ book The Origin of Consciousness in the Breakdown of the Bicameral Mind (1977) (see this overview) was published only five years prior to Orality and Literacy (1982), the impact of Jaynes’ thesis must have still been felt quite strongly (as it is now among some thinkers). Yet Ong disposes of Jaynes rather parsimoniously, stating

… if attention to sophisticated orality-literacy contrasts is growing in some circles, it is still relatively rare in many fields where it could be helpful. For example, the early and late stages of consciousness which Julian Jaynes (1977) describes and relates to neuro-physiological changes to the bicameral mind would also appear to lend themselves largely to much simpler and more verifiable descriptions in terms of a shift from orality to literacy. [p. 29]

In light of the details above, it’s probably not accurate to say (as I did before) that we are returning to orality from literacy. Rather, the synthesis of characteristics is shifting, as it always has, in relation to new stimuli and media. Since the advent of cinema and TV — the first screens, now supplemented by the computer and smartphone — the way humans consume information has been undergoing yet another shift. Or perhaps it’s better to conclude that it’s always been shifting, not unlike how we have always been and are still evolving, though the timescales are usually too slow to observe without specialized training and analysis. Shifts in consciousness arguably occur far more quickly than biological evolution, and the rate at which new superstimuli are introduced into the information environment suggests radical discontinuity with even the recent past — something that used to be called the generation gap.

I’ve always wondered what media theorists such as McLuhan (d. 1980), Neil Postman (d. 2003), and now Ong (d. 2003) would make of the 21st century had they lived long enough to witness what has been happening, with 2014–2015 being the significant inflection point according to Jonathan Haidt. (No doubt there are other media theorists working on this issue who have not risen to my attention.) Numerous other analyses point instead to the early 20th century as the era when industrial civilization harnessed fossil fuels and turned the mechanisms and technologies of innovators decidedly against humanity. Pick your branching point.

/rant on

The self-appointed Thought Police continue their rampage through the public sphere, campaigning to disallow certain thoughts and fence off unacceptable, unsanitary, unhygienic, unhealthy utterances lest they spread, infect, and distort their host thinkers. Entire histories are being purged from, well, history, to pretend they either never happened or will never happen again, because (doncha know?) attempting to squeeze disreputable thought out of existence can’t possibly result in those forbidden fruits blossoming elsewhere, in the shadows, in all their overripe color and sweetness. The restrictive impulse — policing free speech and free thought — is as old as it is stupid. For example, it’s found in the use of euphemisms that pretend to mask the true nature of all manner of unpleasantness, such as death, racial and national epithets, unsavory ideologies, etc. However, farting is still farting, and calling it “passing wind” does nothing to reduce its stink. Plus, we all fart, just like we all inevitably traffic in ideas from time to time that are unwholesome. Manners demand some discretion when broaching some topics (as with farting), but the point is that one must learn how to handle such difficulty responsibly rather than attempting to hold it in or drive it out of thought entirely, which simply doesn’t work. No one knows automatically how to navigate through these minefields.

Considering that the body and mind possess myriad inhibitory-excitatory mechanisms that push and/or pull (i.e., good/bad, on/off, native/alien), a wise person might recognize that both directions are needed to achieve balance. For instance, exposure to at least some hardship has the salutary effect of building character, whereas constant indulgence results in spoiled children (later, adults). Similarly, the biceps/triceps operate in tandem and opposition and need each other to function properly. However, most inhibitory-excitatory mechanisms aren’t nearly so binary as our language tends to imply but rather rely on an array of inputs. Sorting them all out is like trying to answer the nature/nurture question. Good luck with that.

Here’s a case in point: student and professional athletes in the U.S. are often prohibited from kneeling in dissent during the playing of the national anthem. The prohibition does nothing to ameliorate the roots of dissent but only suppresses its expression under narrow, temporary circumstances. Muzzling speech (ironically, in the form of silent behavior) prior to sports contests may actually boomerang and inflame it. Some athletes knuckle under and accept the deal they’re offered (STFU! or lose your position — note the initialism used to hide the curse word) while others take principled stands (while kneeling, ha!) against others attempting to police thought. Some might argue that the setting demands good manners and restraint, while others argue that, by not stomping around the playing field carrying placards, gesticulating threateningly, or chanting slogans, restraint is being used. Opinions differ, obviously, and so the debate goes on. In a free society, that’s how it works. Societies with too many severe restrictions, often bordering on (or going fully into) fascism and totalitarianism, are intolerable to many of us fed on current-day jingoism regarding democracy, freedom, and self-determination.

Many members of the U.S. Congress, sworn protectors of the U.S. Constitution, fundamentally misunderstand the First Amendment, or at least they conveniently pretend to. (I suspect it’s the former.) Here it is for reference:

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

Defending the First Amendment against infringement requires character and principles. What we have instead, both in Congress and in American society, are ideologues and authorities who want to make some categories of thought flatly unthinkable and subject others to prosecution. Whistleblowing falls into the latter category. They are aided by the gradual erosion of educational achievement and the shift away from literacy toward orality, which robs language of its richness and polysemy. If words are placed out of bounds, made unutterable (but not unthinkable), the very tools of thought and expression are removed. The thoughts themselves may be driven underground or reduced to incoherence, but that’s not a respectable goal. Only under the harshest conditions (Orwell depicted them) can specific thoughts be made truly unthinkable, which typically impoverishes and/or breaks the mind of the thinker or at least results in pro forma public assent while private dissent gets stuffed down. To balance and combat truly virulent notions, exposure and discussion are needed, not suppression. But because many public figures have swallowed a bizarre combination of incoherent notions and are advocating for them, the mood is shifting away from First Amendment protection. Even absolutists like me are forced to reconsider, as for example with this article. The very openness to consideration of contrary thinking may well be the vulnerability being exploited by crypto-fascists.

Calls to establish a Ministry of Truth have progressed beyond the Silicon Valley tech platforms’ arbitrary and (one presumes) algorithm-driven removal of huge swaths of user-created content to a new bill introduced in the Colorado State Senate to establish a Digital Communications Regulation commission (summary here). Maybe this first step toward hammering out a legislative response to surveillance capitalism will rein in the predatory behaviors of big tech. The cynic in me harbors doubts. Instead, resulting legislation is far more likely to be aimed at users of those platforms.

/rant off

The backblog at The Spiral Staircase includes numerous book reviews and three book-blogging projects — one completed and two others either abandoned or on semi-permanent hiatus. I’m launching a new project on Walter Ong’s Orality and Literacy: The Technologizing of the Word (1982), which comes highly recommended and appears quite interesting given my preoccupations with language, literacy, and consciousness. To keep my thinking fresh, I have not consulted any online reviews or synopses.

Early on, Ong provides curious (but unsurprising) definitions I suspect will contribute to the book’s main thesis. Here is one from the intro:

It is useful to approach orality and literacy synchronically, by comparing oral cultures and chirographic (i.e., writing) cultures that coexist at a given period of time. But it is absolutely essential to approach them also diachronically or historically, by comparing successive periods with one another. [p. 2]

I don’t recall reading the word chirographic before, but I blogged about the typographic mind (in which Ong’s analyses are discussed) and lamented that the modern world is moving away from literacy, back toward orality, which feels (to me at least) like retrogression and retreat. (Someone is certain to argue that a return to orality is actually progress.) As a result, Western institutions such as the independent press are decaying. Moreover, it’s probably fair to say that democracy in the West is by now only a remnant fiction, replaced by oligarchic rule and popular subscription to a variety of fantasy narratives easily dispelled by a modest inventory of what exists in actuality.

Here is another passage and definition:

A grapholect is a transdialectal language formed by deep commitment to writing. Writing gives a grapholect a power far exceeding that of any purely oral dialect. The grapholect known as standard English has accessible for use a recorded vocabulary of at least a million and a half words, of which not only the present meanings but also hundreds of thousands of past meanings are known. A simply oral dialect will commonly have resources of only a few thousand words, and its users will have virtually no knowledge of the real semantic history of any of these words. [p. 8]

My finding is that terms such as democracy, liberalism, social justice, etc. fail to mean anything (except perhaps to academics and committed readers) precisely because their consensus usage has shifted so wildly over time that common historical points of reference are impossible to establish in a culture heavily dominated by contemporary memes, slang, talking heads, and talking points — components of orality rather than literacy. And as part of a wider epistemological crisis, one can no longer rely on critical thinking to sort out competing truth claims because the modifier critical, now bandied about recklessly in academia and infecting the workplace and politics, has unironically reversed its meaning and requires uncritical doublethink to swallow what’s taught and argued. Let me stress, too, that playing word games (such as dissembling over what “is” means) is a commonplace tactic to put off criticism by distorting word meanings beyond recognition.

Although it’s unclear just yet (to me, obviously) what Ong argues in his book beyond the preliminary comparison and contrast of oral and chirographic cultures (or in terms of the title of the book, orality and literacy), I rather doubt he argues as I do that the modern world has swung around to rejection of literacy and the style of thought that flows from deep engagement with the written word. Frankly, it would surprise me if he did; the book predates the Internet, social media, and what’s now become omnimedia. The last decade in particular has demonstrated that by placing a cheap, personal, 24/7/365 communications device in the hands of every individual from the age of 12 or so, a radical social experiment was launched that no one in particular designed — except that once the outlines of the experiment began to clarify, those most responsible (i.e., social media platforms in particular but also biased journalists and activist academics) have refused to admit that they are major contributors to the derangement of society. Cynics learned long ago that advertisers, PR hacks, and politicians should be discounted, which requires ongoing skepticism and resistance to omnipresent lures, cons, and propaganda. Call it waking up to reality or simply growing up and behaving responsibly in an information environment designed to be disorienting. Accordingly, the existence of counterweights — information networks derived from truth, authority, and integrity — has always been, um, well, critical. Their extinction presages much graver losses as information structures and even the memory of mental habits that society needs to function are simply swept aside.

On the heels of a series of snowstorms, ice storms, and deep freezes (mid-Feb. 2021) that have inundated North America and knocked out power to millions of households and businesses, I couldn’t help but notice inane remarks and single-panel comics to the effect of “wish we had some global warming now!” Definitely, things are looking distinctly apocalyptic as folks struggle with deprivation, hardship, and existential threats. However, the common mistake here is to substitute one thing for another, failing to distinguish weather from climate.

National attention is focused on Texas, expected to be declared a disaster zone by Pres. Biden once he visits (a flyover, one suspects) to survey and assess the damage. It’s impossible to say that current events are without precedent. Texas has been in the cross-hairs for decades, suffering repeated droughts, floods, fires, and hurricanes that used to be prefixed by 50-year or 100-year. One or another is now occurring practically every year, which is exactly what climate chaos delivers. And in case the deep freeze and busted water pipes all over Texas appear to have been unpredictable, this very thing happened in Arizona in 2011. Might have been a shot across the bow for Texas to learn from and prepare, but its self-reliant, gun-totin’, freedom-lovin’ (fuck, yeah!), secessionist character is instead demonstrated by having its own electrical grid covering most of the state, separated from other North American power grids, ostensibly to skirt federal regulations. Whether that makes Texas’ grid more or less vulnerable to catastrophic failure is an open question, but events of the past week tested it sorely. It failed badly. People literally froze to death as a result. Some reports indicate Texas was mere moments away from an even greater failure that would have meant months to rebuild and reestablish electrical service. A substantial diaspora would have ensued, essentially meaning more climate refugees.

So where’s the evil in this? Well, let me tell you. Knowledge that we humans are on track to extirpate ourselves via ongoing industrial activity has been reported and ignored for generations. Guy McPherson’s essay “Extinction Foretold, Extinction Ignored” has this to say at the outset:

The warnings I will mention in this short essay were hardly the first ones about climate catastrophe likely to result from burning fossil fuels. A little time with your favorite online search engine will take you to George Perkins Marsh sounding the alarm in 1847, Svante Arrhenius’s relevant journal article in 1896, Richard Nixon’s knowledge in 1969, and young versions of Al Gore, Carl Sagan, and James Hansen testifying before the United States Congress in the 1980s. There is more, of course, all ignored for a few dollars in a few pockets. [links in original]

My personal acquaintance with this large body of knowledge began accumulating in 2007 or so. Others with decision-making capacity have known for much, much longer. Yet short-term motivations shoved aside responsible planning and preparation, which are precisely the warrant of governments at all levels, especially, say, the U.S. Department of Energy. Sure, climate change is reported as controversy, or worse, as conspiracy, but in my experience, only a few individuals are willing to speak the obvious truth. They are often branded kooks. Institutions dither, distract, and even issue gag orders to, oh, I dunno, prop up real estate values in south Florida soon to be underwater. I’ve suggested repeatedly that U.S. leaders and institutions should be acting to manage contraction and alleviate suffering as best as possible, knowing that civilization will fail anyway. To pretend otherwise and guarantee — no — drive us toward worst-case scenarios is just plain evil. Of course, the megalomania of a few tech billionaires who mistakenly believe they can engineer around society’s biggest problems is just as bad.

Writ small (there’s a phrase no one uses denoting narrowing scope), meaning at a scale less than anthropogenic climate change (a/k/a unwitting geoengineering), American society has struggled to prioritize guns vs. butter for over a century. The profiteering military-industrial complex has clearly won that debate, leaving infrastructure projects, such as bridge and road systems and public utilities, woefully underfunded and extremely vulnerable to market forces. Refusal to recognize public health as a right or public good demanding a national health system (like other developed countries have) qualifies as well. As inflated Pentagon budgets reveal, the U.S. never lacks money to oppress, fight, and kill those outside the U.S. Inside the U.S., however, cities and states fall into ruin, and American society is allowed to slowly unwind for lack of support. Should we withdraw militarily from the world stage and focus on domestic needs, such as homelessness and joblessness? Undoubtedly. Would that leave us open to attack or invasion (other than the demographic invasion of immigrants seeking refuge in the U.S.)? Highly doubtful. Other countries have their own domestic issues to manage and would probably appreciate a cessation of interference and intervention from the U.S. One might accuse me of substituting one thing for another, as I accused others of doing at the top, but the guns-vs.-butter debate is well established. Should be obvious that it’s preferable to prioritize caring for our own society rather than devoting so much of our limited time and resources to destroying others.

So far, this multipart blog post has trafficked in principles and generalities. Let me try now to be more specific, starting with an excerpt from Barry Lynn’s article in Harper’s Magazine titled “The Big Tech Extortion Racket” (Sept. 2020):

… around the middle of the nineteenth century, Americans began to develop technologies that could not be broken into component pieces. This was especially true of the railroad and the telegraph … Such corporations [railroad and telegraph companies] posed one overarching challenge: they charged some people more than others to get to market. They exploited their control over an essential service in order to extort money, and sometimes political favors … Americans found the answer to this problem in common law. For centuries, the owners of ferries, stagecoaches, and inns had been required to serve all customers for the same price and in the order in which they arrived. In the late nineteenth century, versions of such “common carrier” rules were applied to the new middleman corporations.

Today we rightly celebrate the Sherman Antitrust Act of 1890, which gave Americans the power to break apart private corporations. But in many respects, the Interstate Commerce Act of 1887 was the more important document. This act was based on the understanding that monopoly networks like the railroad and the telegraph could be used to influence the actions of people who depend on them, and hence their power must be carefully restricted …

For a century and a half, Americans used common carrier policies to ensure the rule of law in activities that depended on privately held monopolies … regulations freed Americans to take full advantage of every important network technology introduced during these years, including telephones, water and electrical services, energy pipelines, and even large, logistics-powered retailers. Citizens did not have to worry that the men who controlled the technologies involved would exploit their middleman position to steal other people’s business or disrupt balances of power.

I appreciate that Barry Lynn brings up the Interstate Commerce Act. If this legal doctrine appeared in the net neutrality debate a few years ago, it must have escaped my notice. While Internet Service Providers (ISPs) enable network access and connectivity, those utilities have not yet exhibited let’s-be-evil characteristics. Similarly, phone companies (including cell phones) and public libraries may well be eavesdropping and/or monitoring activities of the citizenry, but the real action lies elsewhere, namely, on social media networks and with online retailers. Evil is arguably concentrated in the FANG (or FAANG) corporations but has now grown to be ubiquitous in all social networks (e.g., Twitter) operating as common carriers (Zoom? Slack?) and across academe, nearly all of which have succumbed to moral panic. They are interpreting correctly, sad to observe, demands to censor and sanitize others’ no-longer-free speech appearing on their networks or within their realms. How much deeper it goes toward shaping politics and social engineering is quasi-conspiratorial and impossible for me to assess.

Much as I would prefer to believe that individuals possess the good sense to shift their activities away from social networks or turn their attention from discomfiting information sources, that does not appear to be the case. Demands for trigger warnings and safe spaces commonplace a few years ago on college campuses have instead morphed into censorious removal, deplatforming, and cancellation from the entire public sphere. Those are the wrong responses in free societies, but modern institutions and technologies have gotten out of hand and outstripped the limits of normal human cognition. In short, we’re a society gone mad. So rather than accept responsibility to sort out information overflow oneself, many are demanding that others do it for them, and evil private corporations are complying (after a fashion). Moreover, calls for creation of an Orwellian Ministry of Truth, rebranded as a Truth Commission and Reality Czar, could hardly be any more chillingly and fascistically bizarre. People really need someone to decide for them (read: brainwash them about) what is real? Has anyone at the New York Times actually read Orwell’s dystopian novel 1984 and taken to heart its lessons?

From an article in the Sept. 2020 issue (I’m lagging in my reading) of Harper’s Magazine by Laurent Dubreuil titled “Nonconforming”:

American academia is a hotbed of proliferating identities supported and largely shaped by the higher ranks of administrators, faculty, student groups, alumni, and trustees. Not all identities are equal in dignity, history, or weight. Race, gender, and sexual orientation were the three main dimensions of what in the 1970s began to be called identity politics. These traits continue to be key today. But affirmed identities are mushrooming.

… identity politics as now practiced does not put an end to racism, sexism, or other sorts of exclusion or exploitation. Ready-made identities imprison us in stereotyped narratives of trauma. In short, identity determinism has become an additional layer of oppression, one that fails to address the problems it clumsily articulates.

Considering the acceleration of practically everything in the late-modern world (postmodern refers to something quite different), which makes planning one’s higher education somewhat fraught if the subject matter studied is rendered flatly out-of-date or moribund by the time of either graduation or entry into the workforce, I’ve heard it recommended that expertise in any particular subject area may be less important than developing expertise in at least one subject that takes a systems approach. That system might be language and communications, mathematics (or any other hard science), history, economics and finance, business administration, computer coding, law and governance, etc. So long as a rigorous understanding of procedures and rules is developed, a structuralist mindset can be repeated and transferred into other subject areas. Be careful, however, not to conflate this approach with a liberal arts education, which is sometimes described as learning how to learn and is widely applicable across disciplines. The liberal arts have fallen distinctly out of favor in the highly technological and technocratic world, which cares little for human values resistant to quantification. Problem is, Western societies in particular are based on liberal democratic institutions now straining due to their sclerotic old age. And because a liberal arts education is scarcely undertaken anymore, civics and citizenship are no longer taught. Even the study of English has now been corrupted (postmodern does apply here) to the point that the basic liberal arts skill of critical thinking is being lost through attrition. Nowhere is that more abundantly clear than in bristling debate over free speech and censorship.

Aside. Although society tinkers and refines itself (sometimes declines) over time, a great body of cultural inheritance informs how things are done properly within an ideology or system. When tinkering and refinement become outright intransigence and defiance of an established order, it’s commonplace to hear the objection “but that’s not how _______ works.” For instance, debate over climate science or the utility of vaccines often has one party proclaiming “trust [or believe] the science.” However, that’s not how science works (i.e., through unquestioning trust or belief). The scientific method properly understood includes verification, falsification, and revision when results and assertions fail to establish reasonable certainty (not the same as consensus). Similarly, critical thinking includes a robust falsification check before “facts” can be accepted at face value. So-called “critical studies” (a/k/a grievance studies), like religious faith, typically positions bald assertions beyond the reach of falsification. Well, sorry, that’s not how critical thinking works.

Being older and educated before critical studies were fully legitimized (or gave rise to things as risible as feminist glaciology), I have always understood free speech and other rights to be absolutes that cannot be sliced and diced into bits. That way lies casuistry, where law founders frequently. Thus, if one wishes, say, to trample or burn the U.S. flag in protest, no law can be passed or constitutional amendment enacted to carve out an exception disallowing that instance of dissenting free speech. A lesser example is kneeling silently rather than participating in singing the national anthem before a sporting event. Though offensive to certain individuals’ sensibilities, silencing speech is far worse according to liberal democratic values. Whatever our ideological or political differences are, we cannot work them out when one party has the power to place topics out of bounds or remove others from discussion entirely. The point at which spirited debate crosses over into inciting violence or fomenting insurrection is a large gray area, which is the subject of the second impeachment of 45. Civil law covers such contingencies, so abridging free speech, deplatforming, and adopting the formulation “language is violence” are highly improper responses under the liberal form of government codified in the U.S. Constitution, which includes the Bill of Rights (originally omitted but quickly added to articulate those rights fully).

Liberal democratic ideology arose in mercantile, substantially agrarian Western societies before the scientific, industrial, and capitalist revolutions built a full head of steam, so to speak. Considering just how much America has developed since the Colonial Period, it’s no surprise society has outgrown its own founding documents. More pointedly, the intellectual commons was a much smaller environment, often restricted to a soapbox in the town square and the availability of books, periodicals, and broadsides. Today, the public square has moved online to a bewildering array of social media platforms that enable publication of one’s ideas well beyond the sound of one’s voice over a crowd or the bottleneck of a publisher’s printing press. It’s an entirely new development, and civil law has not kept pace. Whether Internet communications are regulated like the airwaves or nationalized like the U.S. military, it’s clear that the Wild West uber-democratic approach (where anyone can basically say anything) has failed. Demands for regulation (restrictions on free speech) are being taken seriously and acted upon by the private corporations that run social media platforms. During this interim phase, it’s easy for me, as a subscriber to liberal democratic values, to insist reflexively on free speech absolutism. The apparent mood of the public lies elsewhere.