Posts Tagged ‘Counter-Enlightenment’

Back to the initial question: is (any of) the West’s cultural inheritance worth preserving? Part of the question goes to deep culture, inherited from Enlightenment philosophy, which is foundational to liberal democracy. Examples include natural rights of speech, religion, conscience, self-determination, and equality before the law, all of which position the individual or citizen at the center of political focus while limiting the purview of government. Alternative political orientations found outside the West are typically variations of collectivism (multiple –isms from which to choose). Unexpectedly, according to a wide-ranging Cato Institute survey about a CBDC (central bank digital currency), 14% of Americans would support in-home government surveillance (some version of Orwell’s Big Brother, already present with the Google Nest, Amazon Echo, and other Internet-of-Things devices), “to reduce domestic violence, abuse, and other illegal activity.” The number jumps to 53% among those who also support a CBDC. Guess those folks never learned the cautionary lessons of dystopian fiction and actual history, are complacent (or worse) about the inverted relationship of the U.S. citizen to government, and are willing to jettison what vestiges of privacy remain (though in truth, privacy was probably gone more than a decade ago).

The ongoing assault on free speech and sovereign thought is particularly concerning to me. The principle used to be readily understood simply by citing the example of the ACLU rising to the defense of the right of neo-Nazis to march in Skokie, IL, way back in 1978. More fully, the worry was that if heinous and deeply offensive speech, protest, and dissent were infringed, the state would inevitably expand categories of suppression to include relatively innocuous opinion that ran counter to official policy. Most Americans no longer grok the concept, one result of decades of eroding educational standards that have left large swaths of the public unable to form ideas clearly or develop a vision for life independently of whatever they are told. An educated public, able to understand their shifting positions in the world and in history, would have been worth preserving. Indeed, seasoned perspectives based on the humanities are already disappearing from living memory as the digital life makes entire generations into insufferable, mouth-breathing twits face-planted in their phones. Citizens have become workers and consumers, and cultural heroes and thought leaders are now arch-capitalists and tech bros who may stumble accidentally into defense of rights but whose primary focus is directed at a different target.

Category creep is precisely what happened in the Covid era, when the open public debate needed to support scientific inquiry was stifled. Make no mistake, the scale of social disruption demanded widespread scrutiny, discussion, and review by qualified experts; what the world got instead was bogus political consensus, orthodoxy, and mandates that only a minority of people, organizations, and countries were able to resist effectively. Now that the so-called Twitter Files have revealed that the U.S. government has been working hand-in-glove with Silicon Valley and legacy journalism for some years to restrict speech, all presumed speech rights are lost. Independent journalists examining the Twitter Files discovered that hundreds of offices and agencies within the government are surveilling public utterance and determining who should be thwarted. That entails thousands of bureaucrats and careerists carrying water for others. Tight control of narratives extends as well to, say, the Russo-Ukrainian proxy war and the past week’s French street rebellion. Who can guess how many other topics have been subjected to propaganda or completely excised from the news because legacy journalists decided (or were told) not to report? For instance, in late August 2023, a controversial European Union law goes into effect that commands Silicon Valley social media operators to regulate speech to combat “disinformation,” however that may be defined by some Orwellian Ministry of Truth. Who knows about this? Until yesterday, I didn’t. From this link:

“I am the enforcer,” European Internal Market Commissioner Thierry Breton told Politico ahead of his planned journey to visit American tech companies to “stress test” them for compliance with the Digital Services Act (DSA), which goes into effect this summer. “I represent the law, which is the will of the state and the people.” [links removed]

I’d like to say that Breton’s claim to represent the will of the people is an outright fabrication, but as the Cato Institute survey above suggests, plenty of folks are just fine giving away their autonomy and critical thinking in favor of efficiency and empty promises of safety. (I grant that he represents the law and state in all their iniquity.) Any way one might try to (mis-)shape recent history, coordinated assaults on free speech and the availability of truthful information are marked betrayals of the cultural inheritance of the West and support my contention (from earlier blog posts) that a Counter-Enlightenment movement is underway from both above and below. I haven’t yet answered the question “what’s worth preserving?” directly, having instead pointed to what’s been lost. Stay tuned.

Search the tag Counter-Enlightenment at the footer of this blog to find roughly ten disparate blog posts, all circling around the idea that intellectual history, despite all the obvious goodies trucked in with science and technology, is turning decidedly away from long-established Enlightenment values. A fair number of resources are available online and in book form exploring various movements against the Enlightenment over the past few centuries, none of which I have consulted. Instead, I picked up Daniel Schwindt’s The Case Against the Modern World: A Crash Course in Traditionalist Thought (2016), which was gifted to me. The book was otherwise unlikely to attract my attention considering that Schwindt takes Catholicism as a starting point whereas I’m an avowed atheist, though with no particular desire to proselytize or attempt to convince others of anything. However, The Case Against is suffused with curious ideas, so it is a good subject for a new book blogging project, which in characteristic fashion (for me) will likely proceed in fits and starts.

Two interrelated ideas Schwindt puts forward early in the book fit with multiple themes of this blog, namely, (1) the discovery and/or development of the self (I refer more regularly to consciousness) and (2) the reductive compartmentalization of thought and behavior. Let’s take them in order.


Ask parents what ambitions they harbor for their child or children and among the most patterned responses is “I just want them to be happy.” I find such an answer thoughtless and disingenuous, and the insertion of the hedge “just” to make happiness sound like a small ask is a red herring. To begin with, for most kids still in their first decade, happiness and playfulness are relatively effortless and natural so long as a secure, loving environment is provided. Such an environment is certainly not a default setting, but it’s still quite commonplace. As the dreamy style of childhood cognition is gradually supplanted by supposedly more logical, rational, adult thinking, and as children become acquainted with the iniquities of both history and contemporary life, innocence and optimism become impossible to retain. Cue the sullen teenager confronting the yawning chasm between desire and reality. Indeed, few people seem to make the transition into adulthood knowing with much clarity how to be happy in the midst of widespread travail and suffering. Instead, young adults frequently substitute self-destructive, nihilistic hedonism, something learned primarily (says me) from the posturing of movie characters and the celebrities who portray them. (Never understood the trope of criminals hanging out at nightclubs, surrounded by drug addicts, nymphos, other unsavory types, and truly awful music, where they can indulge their assholery before everything inevitably goes sideways.)

Many philosophies recommend simplicity, naturalness, and independence as paths to happiness and moral rectitude. Transcendentalism was one such response to social and political complexities that spoil and/or corrupt. Yet two centuries on, the world has only grown more complex, pressing on everyone demands for information processing at a volume and sophistication that do not come naturally to most and are arguably not part of our evolutionary toolkit. Multiple social issues, if one is to engage them fairly, hinge on legalistic arguments and bewildering wordplay that render them fundamentally intractable. Accordingly, many wave away all nuance and adopt pro forma attitudes. Yet the airwaves, social media, the Internet, and even dinner conversations are suffused with the worst sorts of hypercomplexity and casuistry, confounding even those who traffic regularly in such rhetoric. It’s a very long way from “I just want to be happy.”


A quick search revealed that across 15 years of blog posts, the word macrohistory has been used only once. On reflection, macrohistory is something in which I’ve been involved for some time — mostly as a dilettante. Several book reviews and three book-blogging series (one complete, two either on hiatus or fully abandoned) concern macrohistory, and several of my own multi-part blog posts connect disparate dots over broader topics (if not quite history in the narrow sense). My ambition, as with macrohistory, is to tease out better (if only slightly) understandings of ourselves (since humans and human culture are obviously the most captivating thing evar). Scientists direct similar fascination to the inner workings of nonhuman systems — or at least larger systems in which humans are embedded. Thus, macrohistory can be distinguished from natural history by their objects of study. Relatedly, World-Systems Theory, associated with Immanuel Wallerstein, and The Fourth Turning (the 1997 book by William Strauss and Neil Howe) take similarly broad perspectives and attempt to identify historical dynamics and patterns not readily apparent. Other examples undoubtedly exist.

This is all preliminary to discussing a rather depressing article from the December 2020 issue of Harper’s Magazine: Rana Dasgupta’s disquieting (ahem) essay “The Silenced Majority” (probably behind a paywall). The subtitle poses the question, “Can America still afford democracy?” This innocuous line begs the question whether the U.S. (America and the United States of America [and its initialisms U.S. and U.S.A.] being sloppily equivalent almost everywhere, whereas useful distinctions describe the United Kingdom, Great Britain, and England) actually has or practices democracy anymore, to which many would answer flatly “nope.” The essay is an impressive exercise, short of book length, in macrohistory, though it’s limited to Western cultures, which is often the case with history told from inside the bubble. Indeed, if (as the aphorism goes) history is written/told primarily by the victors, one might expect to hear only of an ongoing series of victories and triumphs with all the setbacks, losses, and discontinuities excised like some censored, er, curated Twitter or Facebook, er, Meta discussion. One might also wonder how that same history reads when told from the perspective of non-Western countries, especially those in transitional regions such as Poland, Ukraine, Turkey, and Iran or those with histories long predating the rise of the West roughly 500 years ago, e.g., China, Japan, Egypt, and the lost cultures of Central America. The resentments of the Islamic world, which has been eclipsed by the West, are a case in point. My grasp of world history is insufficient to entertain those perspectives. I note, however, that with globalism, the histories of all regions of the world are now intimately interconnected even while perspectives differ.

Dasgupta describes foundational Enlightenment innovations that animate Western thinking, even though the ideas are often poorly contextualized or understood. To wit:

In the seventeenth century, England was an emerging superpower. Supremacy would come from its invention of a world principle of property. This principle was developed following contact with the Americas, where it became possible to conjure vast new English properties “out of nothing”—in a way that was impracticable, for instance, in the militarized, mercantile societies of India. Such properties were created by a legal definition of ownership designed so that it could be applied only to the invaders. “As much land as a man tills, plants, improves, cultivates, and can use the product of,” John Locke wrote in 1689, “so much is his property.” When combined with other new legal categories such as “the savage” and “the state of nature,” this principle of property engendered societies such as Carolina, where Locke’s patron, the first earl of Shaftesbury, was a lord proprietor.

Obvious, isn’t it, that by imposing the notion of private property on the indigenous inhabitants of North America, colonialists established ownership rights over territories where none had previously existed? Many consider that straightforward theft (again, begging the question) or at least fencing the commons. (Attempts to do the same in the open oceans and in space [orbit] will pick up as technology allows, I surmise.) In addition, the extension of property ownership to human trafficking, i.e., slavery and its analogues still practiced today, has an exceptionally long history and was imported to the Americas, though the indigenous population proved to be poor candidates for subjugation. Accordingly, others were brought to North America in a slave trade that extended across four centuries.

Dasgupta goes on:

From their pitiless opposition to the will of the people, we might imagine that British elites were dogmatic and reactionary. (Period dramas depicting stuck-up aristocrats scandalized by eccentricity and innovation flatter this version of history.) The truth is that they were open-minded radicals. They had no sentimentality about the political order, cutting the head off one king and sending another into exile. They could invent financial and legal structures (such as the Bank of England, founded in 1694) capable of releasing unprecedented market energies. Even their decision to exploit American land with African labor demonstrated their world-bending pursuit of wealth. Their mines and plantations would eventually supply the capital for the first industrial revolution. They loved fashion and technology, they believed in rationality, progress, and transparency. They were the “founding fathers” of our modern world.

And yet they presided over a political system as brutal as it was exclusive. Why? The answer is simple. They could not afford democracy, but also, crucially, they did not need it. [emphasis in original]

So much for the awe and sacred respect in which Enlightenment philosophers and the Founders are held — or used to be. Statues of these dudes (always dudes, natch) are being pulled down all the time. Moreover, associating liberal democracy with the 17th century is a fundamental mistake, though neoliberalism (another poorly defined and understood term) aims to shift backward to a former or hybrid state of human affairs some are beginning to call digital feudalism.

The article goes on to discuss the balancing act and deals struck over the course of centuries to maintain economic and political control by the ownership class. It wasn’t until the 1930s and the postwar economic boom in the U.S. that democracy as commonly understood took root significantly. The labor movement in particular was instrumental in forcing FDR’s New Deal social programs, even though populism and socialism as political movements had been successfully beaten back. Interestingly, the hallowed American nuclear family (limited in its scope racially), an ahistorical formation that enjoyed a roughly 30-year heyday from 1945 to 1975, coincides with the rise of the American middle class and now-aged democratic institutions. They’re all connected with widely distributed wealth and prosperity. But since the oil crisis and stagflation of the mid-1970s, gains enjoyed by the middle class have steadily eroded and/or been actively beaten back (again!), so that dominant themes today are austerity imposed on the masses and inequality coughing up hundy-billionaires with increasing frequency. Estimates are that 30–40% of the American citizenry lives in poverty, bumping up against failed-state territory. Inequality has returned to Gilded Age levels, if not exceeded them. Dasgupta fails to cite perhaps the major underlying cause of this shift away from affordable democracy, back toward the brutal world principle of property: falling EROI (energy returned on energy invested). Cheap foreign labor, productivity gains, and the creation of a giant debtor society have simply not offset the disappearance of cheap energy.

Dasgupta’s further discussion of an emerging two-tier economy along with the Silicon Valley technocracy follows, but I’ll stop short here and encourage readers instead to investigate and think for themselves. Lots of guides and analyses help to illuminate the macrohistory, though I find the conclusions awful in their import. Dasgupta drives home the prognosis:

The neoliberal revolution aimed to restore the supremacy of capital after its twentieth-century subjugation by nation-states, and it has succeeded to an astonishing degree. As states compete and collude with gargantuan new private powers, a new political world arises. The principle of labor, which dominated the twentieth century—producing the industrious, democratic society we have come to regard, erroneously, as the norm—is once again being supplanted by a principle of property, the implications and consequences of which we know only too well from our history books.

Continuing from part 2. I’m so slow ….

If cognitive inertia (i.e., fear of change) used to manifest as technophobia, myriad examples demonstrate how technology has fundamentally penetrated the social fabric and shared mental space, essentially flipping the script to fear of missing out (FOMO) on whatever latest, greatest innovation comes down the pike (laden with fraud and deception — caveat emptor). Alongside FOMO, a new phobia has emerged: fear of technological loss, or more specifically, of inability to connect to the Internet. This is true especially among the young, born and bred after the onset of the computing and digital communications era. Who knows when, why, or how loss of connectivity might occur? Maybe a Carrington Event, maybe rolling blackouts due to wildfires (such as those in California and Oregon), maybe a ransomware attack on ISPs, or maybe a totalitarian clampdown by an overweening government after martial law is declared (coming soon to a neighborhood near you!). Or maybe something simpler: infrastructure failure. For some, inability to connect digitally, electronically, is tantamount to total isolation. Being cut off from the thoughts of others and left to one’s own, even in the short term, is thus roughly equivalent to the torture of solitary confinement. Forget the notion of digital detox.

/rant on

Cheerleaders for technocracy are legion, of course, while the mind boggles at how society might (or necessarily will) be organized differently when it all fails (as it must, if for no other reason than energy depletion). Among the bounties of the communications era is a surfeit of entertainments, movies and TV shows especially, that are essentially new stories meant to replace or supplant old stories. It’s no accident, however, that the new ones come wrapped up in the themes, iconography, and human psychology (is there any other kind, really?) of the old ones. Basically, everything old is new again. And because new stories are delivered through hyperpalatable media — relatively cheap, on demand, and removed from face-to-face social contexts — they arguably cause as much disorientation as reorientation. See, for instance, the embedded video, which is rather long and rambling but nevertheless gets at how religious instincts manifest differently throughout the ages and are now embedded in the comic book stories and superheroes that have overtaken the entertainment landscape.

Mention is made that the secular age coincides roughly with the rise of video stores, a form of on-demand selection of content more recently made even simpler by ubiquitous streaming services. Did people really start hunkering down in their living rooms, eschewing group entertainments and civic involvements, only in the 1980s? The extreme lateness of that development in Western history is highly suspect, considering the death of god had been declared back in the middle of the 19th century. Moreover, the argument swings around to the religious instinct, a cry for meaning if you will, being blocked by organized churches and their endemic corruption and instead finding expression in so-called secular religions (oxymoron alert). Gawd, how I tire of everything that functions as psychological grounding being called a religion. Listen, pseudo-religious elements can be found in Cheerios if one twists up one’s mind sufficiently. That doesn’t make General Mills or Kellogg’s new secular-religious prophets.

Back to the main point. Like money grubbing, technophilia might quiet the desperate search for meaning temporarily, since there’s always more of both to acquire. Can’t get enough, ever. But after even partial acquisition, the soul feels strangely dissatisfied and disquieted. Empty, one might even say. So out roving into the public sphere one goes, seeking and pursuing something to fill one’s time and appetites. Curiously, many traditional solutions to this age-old problem taught the seeker to probe within as an alternative. Well, screw that! In the hyper-connected 21st-century world, who has time for that measly self-isolation? More reified Cheerios!

/rant off

Guy McPherson used to say in his presentations that we’re all born into bondage, meaning that there is no escape from Western civilization and its imperatives, including especially participation in the money economy. The oblique reference to chattel slavery is clumsy, perhaps, but the point is nonetheless clear. For all but a very few, civilization functions like Tolkien’s One Ring, bringing everyone ineluctably under its dominion. Enlightenment cheerleaders celebrate that circumstance and the undisputed material and technological (same thing, really) bounties of the industrial age, but Counter-Enlightenment thinkers recognize reasons for profound discontent. Having blogged at intervals about the emerging Counter-Enlightenment and what’s missing from modern technocratic society, I find my gnawing guilt over forced participation in the planet-killing enterprise of industrial civilization growing intolerable. Skipping past the conclusion drawn by many doomers that collapse and ecocide due to unrestrained human consumption of resources (and the waste stream that follows) have already launched a mass extinction that will extirpate most species (including large mammals such as humans), let me focus instead on gross dysfunction occurring at levels falling more readily within human control.

An Empire of War

The long overdue U.S. troop withdrawal from Afghanistan has already yielded Taliban resurgence, which was a foregone conclusion at whatever point U.S. troops left (and before them, Soviets). After all, the Taliban lives there and had only to wait. Distasteful and inhumane as it may be to Westerners, a powerful faction (religious fanatics) truly wants to live under a 7th-century style of patriarchy. Considering how long the U.S. occupied the country, a new generation of wannabe patriarchs came to adulthood — an unbroken intergenerational descent. Of course, the U.S. (and others) keeps arming them. Indeed, I heard that the U.S. military is considering bombing raids to destroy the war machines left behind as positions were so swiftly abandoned. Oops, too late! This is the handiest example of how failed U.S. military escapades extending over decades net nothing of value to anyone besides weapons and ordnance manufacturers and miserable careerists within various government branches and agencies. The costs (e.g., money, lives, honor, sanity) are incalculable and spread to each country where the American Empire engages. Indeed, the military-industrial complex chooses intervention and war over peace at nearly every opportunity (though careful not to poke them bears too hard). And although the American public’s inability to affect policy (unlike during the Vietnam War era) hardly equates with willing participation, the notion that it’s a government of the people deposits some of the blame on our heads anyway. My frustration is that nothing is learned and the same mistakes (war crimes, really) keep being committed by maniacs who ought to know better.

Crony and Vulture Capitalism

Critics of capitalism are being proven correct far more often than are apologists and earnest capitalists. The two subcategories I most deplore are crony capitalism and vulture capitalism, both of which typically accrue to the benefit of those in no real need of financial assistance. Crony capitalism is deeply embedded within our political system and tilts the economic playing field heavily in favor of those willing to both pay for and grant favors rather than let markets sort themselves out. Vulture capitalism extracts value out of dead hosts, er, vulnerable resource pools by attacking and often killing them off (e.g., Microsoft, Walmart, Amazon), or more charitably, absorbing them to create monopolies, often by hostile takeover at steep discounts. Distressed mortgage holders forced into short sales, default, and eviction are the contemporary example. Predatory behavior is regularly rationalized as mere competition.

Other historical economic systems had similarly skewed hierarchies, but none have reached quite the same heartless, absurd levels of inequality as late-stage capitalism. Pointing to competing systems and the rising tide that lifts all boats misdirects people into making ahistorical comparisons. Human psychology normally restricts one’s points of comparison to contemporaries in the same country/region. Under such narrow comparison, the rank injustice of hundred-billionaires (or even simply billionaires) existing at the same time as giant populations of political/economic/climate refugees and the unhoused (the new, glossy euphemism for the homeless) demonstrates the soul-forfeiting callousness of the top quintile and/or 1% — an ancient lesson never learned. Indeed, aspirational nonsense repackages suffering and sells it back to the underclass, which as a matter of definition will always exist but need not live as though on an entirely different planet from Richistan.

Human Development

Though I’ve never been a big fan of behaviorism, the idea that a hypercomplex stew of influences, inputs, and stimuli leads to better or worse individual human development, especially in critical childhood years but also throughout life, is pretty undeniable. As individuals aggregate into societies, the health and wellbeing of a given society is linked to the health and wellbeing of those very individuals who are understood metaphorically as the masses. Behaviorism would aim to optimize conditions (as if such a thing were possible), but because American institutions and social systems have been so completely subordinated to capitalism and its distortions, society has stumbled and fumbled from one brand of dysfunction to another, barely staying ahead of revolution or civil war (except that one time …). Indeed, as the decades have worn on from, say, the 1950s (a nearly idyllic postwar reset that looms large in the memories of today’s patrician octogenarians), it’s difficult to imagine how conditions could have deteriorated any further short of a third world war.

Look no further than the U.S. educational system, both K–12 and higher ed. As with other institutions, education has had its peaks and valleys. However, the crazy, snowballing race to the bottom witnessed in the last few decades is utterly astounding. Stick a pin in it: it’s done. Obviously, some individuals manage to get educated (some doing quite well, even) despite the minefield that must be navigated, but the exception does not prove the rule. Countries that value quality education (e.g., Finland, China, Singapore, Japan, South Korea) in deed, not just in the empty words trotted out predictably by every presidential campaign, routinely trounce the decidedly middling results in the U.S. and reveal that dysfunctional U.S. political systems and agencies (federal, state, municipal) just can’t get the job done properly anymore. (Exceptions are always tony suburbs populated by high-earning and -achieving parents who create opportunities and unimpeded pathways for their kids.) Indeed, the giant babysitting project that morphs into underclass school-to-prison and school-to-military service (cannon fodder) pipelines is what education has actually become for many. The opportunity cost of failing to invest in education (or by proxy, American youth) is already having follow-on effects. The low-information voter is not a fiction, and the problem extends to every American institution that requires clarity to see through the fog machine operated by the mainstream media.

As an armchair social critic, I often struggle to reconcile how history unfolds without a plan, and similarly, how society self-organizes without a plan. Social engineering gets a bad rap for reasons: it doesn’t work (small exceptions exist) and subverts the rights and freedoms of individuals. However, the rank failure to achieve progress (in human terms, not technological terms) does not suggest stasis. By many measures, the conditions in which we live are cratering. For instance, Dr. Gabor Maté discusses the relationship of stress to addiction in a startling interview at Democracy Now! Just how bad is it for most people?

… it never used to be that children grew up in a stressed nuclear family. That wasn’t the normal basis for child development. The normal basis for child development has always been the clan, the tribe, the community, the neighborhood, the extended family. Essentially, post-industrial capitalism has completely destroyed those conditions. People no longer live in communities which are still connected to one another. People don’t work where they live. They don’t shop where they live. The kids don’t go to school, necessarily, where they live. The parents are away most of the day. For the first time in history, children are not spending most of their time around the nurturing adults in their lives. And they’re spending their lives away from the nurturing adults, which is what they need for healthy brain development.

Does that not sound like self-hobbling? A similar argument can be made about human estrangement from the natural world, considering how rural-to-urban migration (largely completed in the U.S. but accelerating in the developing world) has rendered many Americans flatly unable to cope with, say, bugs and dirt and labor (or indeed most any discomfort). Instead, we’ve trapped ourselves within a society that is, as a result of its organizing principles, slowly grinding down everyone and everything. How can any of us (at least those of us without independent wealth) choose not to participate in this wretched concatenation? Nope, we’re all guilty.

“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signaling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but to human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently (albeit temporarily) live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations and funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coinages. For instance, multiple colors of pill (red, blue, white, and black, to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” refers to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms, or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety, as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are ineffective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. It used to be said that the Internet is forever, so wait a reasonable duration before posting or tweeting, allowing irresponsible behavior (opinion and trolling, mostly) to be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control, specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established, and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

For more than a decade, I’ve had in the back of my mind a blog post called “The Power of Naming” to remark that bestowing a name gives something power, substance, and in a sense, reality. That post never really came together, but its inverse did. Anyway, here’s a renewed attempt.

The period of language acquisition in early childhood is suffused with learning the names of things, most of which is passive. Names of animals (associated closely with the sounds they make) are often a special focus using picture books. The kitty, doggie, and horsie eventually become the cat, dog, and horse. Similarly, the moo-cow and the tweety-bird shorten to cow and bird (though songbird may be an acceptable holdover). Words in the abstract are signifiers of the actual things, aided by the text symbols learned in literate cultures to reinforce mere categories instead of examples grounded in reality. Multiply the names of things several hundred thousand times into adulthood and indeed throughout life and one can develop a formidable vocabulary supporting expressive and nuanced thought and speech. Do you know the differences between acute, right, obtuse, straight, and reflex angles? Does it matter? Does your knowledge of barware inform when to use a flute, coupe, snifter, shot (or shooter or caballito), nosing glass (or Glencairn), tumbler, tankard, goblet, sling, and stein? I’d say you’ve missed something by never having drunk dark beer (Ger.: Schwarzbier) from a frosted schooner. All these varieties developed for reasons that remain invisible to someone content to drink everything from the venerable red Solo cup. Funnily enough, the red Solo cup now comes in different versions, fooling precisely no one.

Returning to book blogging, Walter Ong (in Orality and Literacy) has curious comparisons between primarily oral cultures and literate cultures. For example:

Oral people commonly think of names (one kind of words) as conveying power over things. Explanations of Adam’s naming of the animals in Genesis 2:20 usually call condescending attention to this presumably quaint archaic belief. Such a belief is in fact far less quaint than it seems to unreflective chirographic and typographic folk. First of all, names do give human beings power over what they name: without learning a vast store of names, one is simply powerless to understand, for example, chemistry and to practice chemical engineering. And so with all other intellectual knowledge. Secondly, chirographic and typographic folk tend to think of names as labels, written or printed tags imaginatively affixed to an object named. Oral folk have no sense of a name as a tag, for they have no idea of a name as something that can be seen. Written or printed representations of words can be labels; real, spoken words cannot be. [p. 33]

This gets at something that has been developing over the past few decades, namely, that as otherwise literate (or functionally literate) people gather more and more information through electronic media (screens that serve broadcast and cable TV, YouTube videos, prerecorded news for streaming, podcasts, and most importantly, audiobooks — all of which speak content to listeners), the spoken word (re)gains primacy and the printed word fades into disuse. Electronic media may produce a hybrid of orality/literacy, but words are no longer silent, internal, and abstract. Indeed, words — all by themselves — are understood as being capable of violence. Gone are the days of “sticks and stones ….” Now, fighting words incite and insults sting again.

Not so long ago, it was possible to provoke a duel with an insult or gesture, such as a glove across the face. Among some people, defense of honor never really disappeared (though dueling did). History has taken a strange turn, however. Proposed legislation to criminalize deadnaming (presumably to protect a small but growing number of transgender and nonbinary people who have redefined their gender identity and accordingly adopted different names) recognizes the violence of words but then tries to transmute the offense into an abstract criminal law. It’s deeply mixed up, and I don’t have the patience to sort it out.

More to say in later blog posts, but I’ll raise the Counter-Enlightenment once more to say that the nature of modern consciousness is shifting somewhat radically in response to stimuli and pressures that grew out of an information environment, roughly 70 years old now but transformed even more fundamentally in the last 25 years, that is substantially discontinuous from centuries-old traditions. Those traditions displaced even older traditions inherited from antiquity. Such is the way of the world, I suppose, and with the benefit of Walter Ong’s insights, my appreciation of the outlines is taking better shape.

Having grown up in an ostensibly free, open society animated by liberal Western ideology, it’s fair to say in hindsight that I internalized a variety of assumptions (and illusions) regarding the role of the individual vis-à-vis society. The operative word here is ostensibly owing to the fact that society has always restricted pure expressions of individuality to some degree through socialization and pressure to conform, so freedom has always been constrained. That was one of the takeaways from my reading (long ago in high school) of Albert Camus’ novel The Stranger (1942) (British: The Outsider; French: L’Étranger), namely, that no matter how free one might believe oneself to be, if one refuses (radically, absurdly) to play by society’s rules and expectations, one will be destroyed. The basic, irresolvable conflict is also present in the concerto principle in classical music, which presents the soloist in dialogue with or in antithesis to the ensemble. Perhaps no work exemplifies this better than the 2nd movement of Ludwig van Beethoven’s Concerto No. 4 for piano and orchestra. A similar dialogue is found in the third movement of Gustav Mahler’s Symphony No. 3, though dialogue there might be better understood as man vs. nature. The significant point of similarity is not the musical style or themes but how the individual/man is ultimately subdued or absorbed by society/nature.

Aside: A broader examination of narrative conflict would include four traditional categories: (1) man vs. man, (2) man vs. nature, (3) man vs. self, and (4) man vs. society. Updated versions, often offered as tips for aspiring writers, sometimes include breakout conflicts (i.e., subcategories): (1) person vs. fate/god, (2) person vs. self, (3) person vs. person, (4) person vs. society, (5) person vs. nature, (6) person vs. supernatural, and (7) person vs. technology. Note that modern sensibilities demand use of person instead of man.

My reason for bringing up such disparate cultural artifacts is to provide context. Relying on my appreciation of the Zeitgeist, I sense that liberal Western ideology is undergoing a radical rethinking, with Woke activists in particular pretending to emancipate oppressed people when a flattening of society is probably the hidden objective. Thus, Wokesters are not really freeing anyone, and flattening mechanisms are pulling people down, not building people up. On top of that, they are targeting the wrong oppressors. If leveling is meant to occur along various markers of identity (race, sexual and gender orientation, religion, political affiliation, nationality, etc.), the true conflict in the modern era has always been socioeconomic, i.e., the ownership class against all others. Sure, differences along identitarian lines have been used to oppress, but oppressors are merely using a surface characteristic to distract from their excessive power. The dispossessed oddly fail to recognize their true enemies, projecting animus instead onto those with whom grievances are shared. Similarly, Wokesters are busy exploiting their newfound (albeit momentary) power to question the accepted paradigm and force RightThink on others. Yet traditional power holders are not especially threatened by squabbles among the oppressed masses. Moreover, it’s not quite accurate to say that the identitarian left is rethinking the established order. Whatever is happening is arguably occurring at a deeper, more irrational level than any thoughtful, principled political action meant to benefit a confluence of interest groups (not unlike the impossible-to-sort confluence of identities everyone has).

Although I haven’t read Howard Zinn’s A People’s History of the United States (1980), I gather that Zinn believed history should not be told from the winners’ perspective (i.e., that of the ownership and ruling classes, significant overlap acknowledged), or from the top down, but instead through the lens of the masses (i.e., the people, a large segment of whom are oppressed and/or dispossessed), or from the bottom up. This reorientation applies not only within a given society or political entity but among nations. (Any guess which countries are the worst oppressors at the moment? Would be a long list.) Moreover, counter to the standard or accepted histories most of us learn, the preparation of the U.S. Constitution and indeed quite a lot of U.S. history were deeply corrupt and oppressive by design. It should be obvious that the state (or nation, if one prefers), with its insistence on personal property and personal freedom (though only for a narrow class of landed gentry back in the day, plutocrats and corporatists today), systematically rolled over everyone else — none so egregiously as Native Americans, African slaves, and immigrants. Many early institutions in U.S. political history were in fact created as bulwarks against various forms of popular resistance, notably slave revolts. Thus, tensions and conflicts that might be mistakenly chalked up as man vs. society can be better characterized as man vs. the state, with the state having been erected specifically to preserve the prerogatives of the ownership class.

More to come in part 2 and beyond.

The backblog at The Spiral Staircase includes numerous book reviews and three book-blogging projects — one completed and two others either abandoned or on semi-permanent hiatus. I’m launching a new project on Walter Ong’s Orality and Literacy: The Technologizing of the Word (1982), which comes highly recommended and appears quite interesting given my preoccupations with language, literacy, and consciousness. To keep my thinking fresh, I have not consulted any online reviews or synopses.

Early on, Ong provides curious (but unsurprising) definitions I suspect will contribute to the book’s main thesis. Here is one from the intro:

It is useful to approach orality and literacy synchronically, by comparing oral cultures and chirographic (i.e., writing) cultures that coexist at a given period of time. But it is absolutely essential to approach them also diachronically or historically, by comparing successive periods with one another. [p. 2]

I don’t recall reading the word chirographic before, but I blogged about the typographic mind (in which Ong’s analyses are discussed) and lamented that the modern world is moving away from literacy, back toward orality, which feels (to me at least) like retrogression and retreat. (Someone is certain to argue that a return to orality is actually progress.) As a result, Western institutions such as the independent press are decaying. Moreover, it’s probably fair to say that democracy in the West is by now only a remnant fiction, replaced by oligarchic rule and popular subscription to a variety of fantasy narratives easily dispelled by a modest inventory of what exists in actuality.

Here is another passage and definition:

A grapholect is a transdialectal language formed by deep commitment to writing. Writing gives a grapholect a power far exceeding that of any purely oral dialect. The grapholect known as standard English has accessible for use a recorded vocabulary of at least a million and a half words, of which not only the present meanings but also hundreds of thousands of past meanings are known. A simply oral dialect will commonly have resources of only a few thousand words, and its users will have virtually no knowledge of the real semantic history of any of these words. [p. 8]

My finding is that terms such as democracy, liberalism, social justice, etc. fail to mean anything (except perhaps to academics and committed readers) precisely because their consensus usage has shifted so wildly over time that common historical points of reference are impossible to establish in a culture heavily dominated by contemporary memes, slang, talking heads, and talking points — components of orality rather than literacy. And as part of a wider epistemological crisis, one can no longer rely on critical thinking to sort out competing truth claims because the modifier critical, bandied about recklessly in academia and now infecting the workplace and politics, has unironically reversed its meaning and requires uncritical doublethink to swallow what’s taught and argued. Let me stress, too, that playing word games (such as dissembling over what is means) is a commonplace tactic to put off criticism by distorting word meanings beyond recognition.

Although it’s unclear just yet (to me, obviously) what Ong argues in his book beyond the preliminary comparison and contrast of oral and chirographic cultures (or in terms of the title of the book, orality and literacy), I rather doubt he argues as I do that the modern world has swung around to rejection of literacy and the style of thought that flows from deep engagement with the written word. Frankly, it would surprise me if he did; the book predates the Internet, social media, and what’s now become omnimedia. The last decade in particular has demonstrated that by placing a cheap, personal, 24/7/365 communications device in the hands of every individual from the age of 12 or so, a radical social experiment was launched that no one in particular designed — except that once the outlines of the experiment began to clarify, those most responsible (i.e., social media platforms in particular but also biased journalists and activist academics) have refused to admit that they are major contributors to the derangement of society. Cynics learned long ago that advertisers, PR hacks, and politicians should be discounted, which requires ongoing skepticism and resistance to omnipresent lures, cons, and propaganda. Call it waking up to reality or simply growing up and behaving responsibly in an information environment designed to be disorienting. Accordingly, the existence of counterweights — information networks grounded in truth, authority, and integrity — has always been, um, well, critical. Their extinction presages much graver losses as information structures and even the memory of the mental habits that society needs to function are simply swept aside.