Archive for the ‘Intellectual History’ Category

Search the tag Counter-Enlightenment at the footer of this blog to find roughly ten disparate blog posts, all circling around the idea that intellectual history, despite all the obvious goodies trucked in with science and technology, is turning decidedly away from long-established Enlightenment values. A fair number of resources are available online and in book form exploring various movements against the Enlightenment over the past few centuries, none of which I have consulted. Instead, I picked up Daniel Schwindt’s The Case Against the Modern World: A Crash Course in Traditionalist Thought (2016), which was gifted to me. The book was otherwise unlikely to attract my attention considering that Schwindt takes Catholicism as a starting point whereas I’m an avowed atheist, though with no particular desire to proselytize or attempt to convince others of anything. However, The Case Against is suffused with curious ideas, so it is a good subject for a new book-blogging project, which in characteristic fashion (for me) will likely proceed in fits and starts.

Two interrelated ideas Schwindt puts forward early in the book fit with multiple themes of this blog, namely, (1) the discovery and/or development of the self (I refer more regularly to consciousness) and (2) the reductive compartmentalization of thought and behavior. Let’s take them in order. Here’s a capsule of the first issue:

(more…)

After a hiatus due to health issues, Jordan Peterson has reappeared in the public sphere. Good for him. I find him one of the most stimulating public intellectuals to appear thus far into the 21st century, though several others (unnamed) spring to mind who have stronger claims on my attention. Yet I’m wary of Peterson as an effective evaluator of every development coughed up for public consideration. It’s simply not necessary or warranted for him to opine recklessly about every last damn thing. (Other podcasters are doing the same, and although I don’t want to instruct anyone to stay in their lane, I also recognize that Joe “Talkity-Talk” Blow’s hot take or rehash on this, that, and every other thing really isn’t worth my time.) With the inordinate volume of text in his books, video on his YouTube channel (classroom lectures, podcasts, interviews) and as a guest on others’ podcasts, and third-party writing about him (like mine), it’s inevitable that Peterson will run afoul of far better analysis than he himself can bring to bear. However, he declares his opinions forcefully and with overbearing confidence, then decamps to obfuscation and reframing whenever someone pushes back effectively (which isn’t often, at least in direct communication). With exasperation, I observe that he’s basically up to his old rhetorical tricks.

In a wide-ranging discussion on The Joe Rogan Experience from January 2022 (found exclusively on Spotify for anyone somehow unaware of Rogan’s influence in the public sphere), the thing that most irked me was Peterson’s take on the climate emergency. He described climate as too complex, with too many variables and unknowns, to embody in scientific models over extended periods of time. Seems to me Peterson has that entirely backwards. Weather (and extreme weather events) in the short term can’t be predicted very accurately, so daily/weekly/monthly forecasts give wide ranges of, say, cloud cover, temperature, and precipitation. But over substantial time (let’s start with a few decades, which is still a blink in geological time), trends and boundaries reveal themselves pretty reliably, which is why disturbances — such as burning enough fossil fuels to alter the chemical composition of the planet’s atmosphere — that upset the climate steady-state known as Garden Earth are not merely cause for serious concern but harbingers of doom. And then, as others often do, Peterson reframed the climate emergency largely in terms of economics (the same thing happened with the pandemic, though not by Peterson so far as I know), suggesting that the problem is characterized by inefficiencies and grass-roots policy that would be magically different if more people were raised out of poverty and could advocate for solutions rather than simply struggle to survive. Dude apparently hasn’t grasped that wealth in the modern world is an outgrowth of the very thing — fossil fuels — that is the root of the problem. Further, industrial civilization is a heat engine that binds us to a warming trend. That’s a thermodynamic principle flatly immune to half-baked economic theories and ecological advocacy. Peterson also gives no indication of ever having acknowledged Jevons Paradox.
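To make the weather-versus-climate asymmetry concrete, here is a minimal toy sketch of my own (the parameters are made up for demonstration and taken from no climate dataset or model): treat the daily temperature anomaly as a small warming trend buried in large day-to-day noise. No single day can be forecast well, but decade-scale averages recover the trend reliably.

```python
import random

# Toy illustration only: daily anomaly = slow warming trend + big weather noise.
# Both parameters below are assumed for demonstration, not drawn from data.
TREND_PER_DAY = 0.02 / 365.0   # ~0.02 C of warming per year (assumed)
NOISE_SD = 5.0                 # several degrees of day-to-day swing (assumed)

random.seed(42)
days = 50 * 365  # fifty years of simulated daily observations
temps = [TREND_PER_DAY * d + random.gauss(0.0, NOISE_SD) for d in range(days)]

# Short term: tomorrow's value is dominated by noise, so forecasts are poor.
print(f"daily noise: +/- {NOISE_SD} C vs. daily trend: {TREND_PER_DAY:.5f} C")

# Long term: averaging a decade of days washes out the noise, exposing the trend.
for decade in range(5):
    chunk = temps[decade * 3650:(decade + 1) * 3650]
    print(f"decade {decade + 1}: mean anomaly {sum(chunk) / len(chunk):+.2f} C")
```

Run it and the decade means climb steadily even though no individual day is predictable, which is the sense in which long-horizon climate trends are more tractable, not less, than weather forecasts.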

So let me state somewhat emphatically: the climate emergency is in fact an existential crisis on several fronts (e.g., resource depletion and scarcity, ecological despoliation, extreme weather events, and loss of habitat, all resulting in civilizational collapse). The rate of species extinction — before human population has even begun to collapse in earnest, though 8 Billion Day looms near — is several orders of magnitude greater than historical background rates. Humans are unlikely to survive to the end of the century even if we refrain from blowing ourselves up over pointless geopolitical squabbles. I’ll defer to Peterson in his area of expertise: personality inventories. I’ll also grant him space to explore myth and symbolism in Western culture. But speaking on climate, he sounds like an ignoramus — the dangerous sort who leads others astray. And when challenged by someone armed with knowledge of governing principles, grasp of detail, and thus analysis superior to what he can muster (such as when debating Richard Wolff about Marxism), Peterson frequently resorts to a series of motte-and-bailey assertions that confound inexpert interlocutors. “Well, that depends on what you mean by ….” His retreat to faux safety is sometimes so astonishingly complete that he resorts to questioning the foundation of reality: “Why the sun? Why this sky? Why these stars? Why not something else completely?” Also, Peterson’s penchant for pointing out that the future is contingent and unknown (as though no prediction or model can achieve more than a whisper of accuracy about future outcomes, despite all indicators positively screaming that we stop destroying our own habitat) is mere rhetoric meant to forestall losing an argument.

As I’ve asserted repeatedly, sufficiency is the crucible in which all decisions are forged because earnest information gathering cannot persist interminably. Tipping points (ecological ones, sure, but more importantly, psychological ones) actually exist, where one must act despite incomplete knowledge and unclear prognosis. Accordingly, every decision is on some level a leap into the unknown and/or an act of faith. That doesn’t mean every decision is a wild, reckless foray based on nothing. Rather, when the propitious moment arrives (if one has the wherewithal to recognize it), one has to go with what one’s got, knowing that mistakes will be made and corrections will be needed.

Peterson’s improvisational speaking style is both impressive and inscrutable. I’m sometimes reminded of Marshall McLuhan, whose purported Asperger’s Syndrome (low-grade autism, perhaps, I’m unsure) afforded him unique insights into the emerging field of media theory that were not easily distilled into speech. Greta Thunberg is another, more recent public figure whose cognitive character allows her to recognize rather acutely how human institutions have completely botched the job of keeping industrial civilization from consuming itself. Indeed, people from many diverse backgrounds, not hemmed in by the rigid dictates of politics, economics, and science, intuit through diverse ways of knowing (e.g., kinesthetic, aesthetic, artistic, religious, psychedelic) what I’ve written about repeatedly under the title “Life Out of Balance.” I’ve begun to understand Peterson as a mystic overwhelmed by the sheer beauty of existence but simultaneously horrified by unspeakably awful evils humans perpetrate on each other. Glimpses of both (and humor, a bit unexpectedly) often provoke cracks in his voice, sniffles, and tears as he speaks, clearly choking back emotions to keep his composure. Peterson’s essential message (if I can be so bold), like that of other mystics, is aspirational, transcendental, and charismatic. Such messages are impossible to express fully and are frankly ill-suited to 21st-century Western culture. That we’re severely out of balance, unable to regain an upright and righteous orientation, is plain to nearly everyone not already lost in the thrall of mass media and social media, but so long as the dominant culture remains preoccupied with wealth, consumption, celebrity, geopolitical violence, spectacle, and trash entertainment, I can’t envision any sort of return to piety and self-restraint. Plus, we can’t outrun the climate emergency bearing down on us.

There’s a Joseph Conrad title with which I’ve always struggled, not having read the short story: The Secret Sharer (1910). The problem for me is the modifier secret. Is a secret being shared or is someone sharing in secret? Another ambivalent term came up recently at Macro-Futilism (on my blogroll) regarding the term animal farm (not the novel by George Orwell). Is the animal farming or is the animal being farmed? Mention was made that ants and termites share with humans the characteristic of farming. Apparently, several other species do as well. Omission of humans from the linked list is a frustratingly commonplace failure to observe, whether out of ignorance or stupid convention, that humans are animals, too. I also recalled ant farms from boyhood, and although I never had one (maybe because I never had one), I mistakenly believed that the ants themselves were doing the farming, as opposed to the keeper of the kit farming the ants.

The additional detail at Macro-Futilism that piqued my curiosity, citing John Gowdy’s book Ultrasocial: The Evolution of Human Nature and the Quest for a Sustainable Future (2021), is the contention that animals that farm organize themselves into labor hierarchies (e.g., worker/drone, soldier, and gyne/queen). Whether those hierarchies are a knowing choice (at least on the part of humans) or merely blind adaptation to the needs of agriculturalism is not clearly stated in the blog post or quotations, nor is the possibility of exceptions to hierarchy formation among the other farming species listed. (Is there a jellyfish hierarchy?) However, lumping together humans, ants, and termites as ultrasocial agricultural species rather suggests that social and/or cultural evolution is driving their inner stratification, not foresight or planning. Put more plainly, humans are little or no different from insects after the discovery and adoption of agriculture, except for the obviously much higher complexity of human society compared with other animal farms.

I’ve suggested many times on this blog that humans are not really choosing the course of history (human or otherwise) as it unfolds around us, and further, that trying to drive or channel history in a chosen direction is futile. Rather, history is like a headless (thus, mindless) beast, and humans are mostly along for the ride. Gowdy’s contention regarding agricultural species supports the idea that no one is or can be in charge and that we’re all responding to survival pressure and adapting at unconscious levels. We’re not mindless, like insects, but neither are we able to choose our path in the macro-historical sense. The humanist in me — an artifact of Enlightenment liberalism, perhaps (more to say about that in forthcoming posts) — clings still to the assertion that we have agency, meaning choices to make. But those choices typically operate at a far more mundane level than human history. Perhaps political leaders and industrial tycoons have greater influence over human affairs by virtue of armies, weapons, and machinery, but my fear is that those decision-makers can really only dominate and destroy, not preserve or create in ways that allow for human flourishing.

Does this explain away scourges like inequality, exploitation, institutional failure, rank incompetence, and corruption, given that each of us responds to a radically different set of influences and available options? Impossible question to answer.

Ask parents what ambitions they harbor for their child or children and among the most patterned responses is “I just want them to be happy.” I find such an answer thoughtless and disingenuous, and the insertion of the hedge just to make happiness sound like a small ask is a red herring. To begin with, for most kids still in their first decade, happiness and playfulness are relatively effortless and natural so long as a secure, loving environment is provided. Certainly not a default setting, but it’s still quite commonplace. As the dreamy style of childhood cognition is gradually supplanted by supposedly more logical, rational, adult thinking, and as children become acquainted with iniquities of both history and contemporary life, innocence and optimism become impossible to retain. Cue the sullen teenager confronting the yawning chasm between desire and reality. Indeed, few people seem to make the transition into adulthood knowing with much clarity how to be happy in the midst of widespread travail and suffering. Instead, young adults frequently substitute self-destructive, nihilistic hedonism, something learned primarily (says me) from the posturing of movie characters and the celebrities who portray them. (Never understood the trope of criminals hanging at nightclubs, surrounded by drug addicts, nymphos, other unsavory types, and truly awful music, where they can indulge their assholery before everything inevitably goes sideways.)

Many philosophies recommend simplicity, naturalness, and independence as paths to happiness and moral rectitude. Transcendentalism was one such response to social and political complexities that spoil and/or corrupt. Yet two centuries on, the world has only grown more and more complex, demanding of everyone information processing at a volume and sophistication that comes naturally to very few and is arguably not part of our evolutionary toolkit. Multiple social issues, if one is to engage them fairly, hinge on legalistic arguments and bewildering wordplay that render them fundamentally intractable. Accordingly, many wave away all nuance and adopt pro forma attitudes. Yet the airwaves, social media, the Internet, and even dinner conversations are suffused with the worst sorts of hypercomplexity and casuistry, confounding even those who traffic regularly in such rhetoric. It’s a very long way from “I just want to be happy.”

(more…)

A quick search revealed that over 15 years of blog posts, the word macrohistory has been used only once. On reflection, macrohistory is something in which I’ve been involved for some time — mostly as a dilettante. Several book reviews and three book-blogging series (one complete, two either on hiatus or fully abandoned) concern macrohistory, and several of my own multi-part posts connect disparate dots over broader topics (if not quite history in the narrow sense). My ambition, as with macrohistory, is to tease out better (if only slightly) understandings of ourselves (since humans and human culture are obviously the most captivating thing evar). Scientists direct similar fascination to the inner workings of nonhuman systems — or at least larger systems in which humans are embedded. Thus, macrohistory can be distinguished from natural history by their respective objects of study. Relatedly, World-Systems Theory, associated with Immanuel Wallerstein, and The Fourth Turning (a 1997 book by William Strauss and Neil Howe) take similarly broad perspectives and attempt to identify historical dynamics and patterns not readily apparent. Other examples undoubtedly exist.

This is all preliminary to discussing a rather depressing article from the December 2020 issue of Harper’s Magazine: Rana Dasgupta’s disquieting (ahem) essay “The Silenced Majority” (probably behind a paywall). The subtitle poses the question, “Can America still afford democracy?” This innocuous line begs the question whether the U.S. (America and the United States of America [and its initialisms U.S. and U.S.A.] being sloppily equivalent almost everywhere, whereas useful distinctions describe the United Kingdom, Great Britain, and England) actually has or practices democracy anymore, to which many would answer flatly “nope.” The essay is an impressive exercise, short of book length, in macrohistory, though it’s limited to Western cultures, which is often the case with history told from inside the bubble. Indeed, if (as the aphorism goes) history is written/told primarily by the victors, one might expect to hear only of an ongoing series of victories and triumphs with all the setbacks, losses, and discontinuities excised, like some curated (read: censored) discussion on Twitter or Meta (née Facebook). One might also wonder how that same history reads when told from the perspective of non-Western countries, especially those in transitional regions such as Poland, Ukraine, Turkey, and Iran or those with histories long predating the rise of the West roughly 500 years ago, e.g., China, Japan, Egypt, and the lost cultures of Central America. The resentments of the Islamic world, eclipsed as it has been by the West, are a case in point. My grasp of world history is insufficient to entertain those perspectives. I note, however, that with globalism, the histories of all regions of the world are now intimately interconnected even while perspectives differ.

Dasgupta describes foundational Enlightenment innovations that animate Western thinking, even though the ideas are often poorly contextualized or understood. To wit:

In the seventeenth century, England was an emerging superpower. Supremacy would come from its invention of a world principle of property. This principle was developed following contact with the Americas, where it became possible to conjure vast new English properties “out of nothing”—in a way that was impracticable, for instance, in the militarized, mercantile societies of India. Such properties were created by a legal definition of ownership designed so that it could be applied only to the invaders. “As much land as a man tills, plants, improves, cultivates, and can use the product of,” John Locke wrote in 1689, “so much is his property.” When combined with other new legal categories such as “the savage” and “the state of nature,” this principle of property engendered societies such as Carolina, where Locke’s patron, the first earl of Shaftesbury, was a lord proprietor.

Obvious, isn’t it, that by imposing the notion of private property on indigenous inhabitants of North America, colonialists established ownership rights over territories where none had previously existed? Many consider that straightforward theft (again, begging the question) or at least fencing the commons. (Attempts to do the same in the open oceans and in space [orbit] will pick up as technology allows, I surmise.) In addition, extension of property ownership to human trafficking, i.e., slavery and its analogues still practiced today, has an exceptionally long history and was imported to the Americas, though the indigenous population proved to be poor candidates for subjugation. Accordingly, others were brought to North America in a slave trade that extended across four centuries.

Dasgupta goes on:

From their pitiless opposition to the will of the people, we might imagine that British elites were dogmatic and reactionary. (Period dramas depicting stuck-up aristocrats scandalized by eccentricity and innovation flatter this version of history.) The truth is that they were open-minded radicals. They had no sentimentality about the political order, cutting the head off one king and sending another into exile. They could invent financial and legal structures (such as the Bank of England, founded in 1694) capable of releasing unprecedented market energies. Even their decision to exploit American land with African labor demonstrated their world-bending pursuit of wealth. Their mines and plantations would eventually supply the capital for the first industrial revolution. They loved fashion and technology, they believed in rationality, progress, and transparency. They were the “founding fathers” of our modern world.

And yet they presided over a political system as brutal as it was exclusive. Why? The answer is simple. They could not afford democracy, but also, crucially, they did not need it. [emphasis in original]

So much for the awe and sacred respect in which Enlightenment philosophers and the Founders are held — or used to be. Statues of these dudes (always dudes, natch) are being pulled down all the time. Moreover, association of liberal democracy with the 17th century is a fundamental mistake, though neoliberalism (another poorly defined and understood term) aims to shift backwards to a former or hybrid state of human affairs some are beginning to call digital feudalism.

The article goes on to discuss the balancing act and deals struck over the course of centuries to maintain economic and political control by the ownership class. It wasn’t until the 1930s and the postwar economic boom in the U.S. that democracy as commonly understood took root significantly. The labor movement in particular was instrumental in forcing FDR’s New Deal social programs, even though populism and socialism as political movements had been successfully beaten back. Interestingly, the hallowed American nuclear family (limited in its scope racially), an ahistorical formation that enjoyed a roughly 30-year heyday from 1945 to 1975, coincides with the rise of the American middle class and now-aged democratic institutions. They’re all connected with widely distributed wealth and prosperity. But after the oil crisis and stagflation of the middle 1970s, gains enjoyed by the middle class have steadily eroded and/or been actively beaten back (again!) so that dominant themes today are austerity imposed on the masses and inequality coughing up hundy-billionaires with increasing frequency. Estimates are that 30-40% of the American citizenry lives in poverty, bumping up against failed-state territory. Inequality has returned to, if not exceeded, Gilded Age levels. Dasgupta fails to cite perhaps the major underlying cause of this shift away from affordable democracy, back toward the brutal world principle of property: falling EROI (energy returned on energy invested). Cheap foreign labor, productivity gains, and creation of a giant debtor society have simply not offset the disappearance of cheap energy.
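For readers unfamiliar with the acronym, EROI is the ratio of energy returned to energy invested, and the arithmetic of its decline is brutally nonlinear. Here is a back-of-envelope sketch (the EROI values are illustrative assumptions of mine, not Dasgupta’s figures):

```python
# Net energy available to society shrinks nonlinearly as EROI falls.
# The EROI values below are assumed for illustration, not measured data.
for eroi in (100, 30, 10, 5, 2):
    net_fraction = 1 - 1 / eroi   # share of gross energy left over for society
    overhead = 1 / (eroi - 1)     # energy invested per unit of net energy delivered
    print(f"EROI {eroi:>3}:1 -> surplus {net_fraction:.0%}, "
          f"overhead {overhead:.2f} invested per net unit")
```

Falling from 100:1 to 10:1 barely dents the surplus, but each step below that consumes a rapidly growing share of gross output, which is one way to read the claim that democracy has become unaffordable.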

Dasgupta’s further discussion of an emerging two-tier economy along with the Silicon Valley technocracy follows, but I’ll stop short here and encourage readers instead to investigate and think for themselves. Lots of guides and analyses help to illuminate the macrohistory, though I find the conclusions awful in their import. Dasgupta drives home the prognosis:

The neoliberal revolution aimed to restore the supremacy of capital after its twentieth-century subjugation by nation-states, and it has succeeded to an astonishing degree. As states compete and collude with gargantuan new private powers, a new political world arises. The principle of labor, which dominated the twentieth century—producing the industrious, democratic society we have come to regard, erroneously, as the norm—is once again being supplanted by a principle of property, the implications and consequences of which we know only too well from our history books.

From Ran Prieur (no link, note nested reply):


I was heavily into conspiracy theory in the 90’s. There was a great paper magazine, Kenn Thomas’s Steamshovel Press, that always had thoughtful and well-researched articles exploring anomalies in the dominant narrative.

Another magazine, Jim Martin’s Flatland, was more dark and paranoid but still really smart. A more popular magazine, Paranoia, was stupid but fun.

At some point, conspiracy culture shifted to grand narratives about absolute evil. This happened at the same time that superhero movies (along with Harry Potter and Lord of the Rings) took over Hollywood. The more epic and the more black-and-white the story, the more humans are drawn to it.

This is my half-baked theory: It used to be that ordinary people would accept whatever the TV said — or before that, the church. Only a few weirdos developed the skill of looking at a broad swath of potential facts, and drawing their own pictures.

It’s like seeing shapes in the clouds. It’s not just something you do or don’t do — it’s a skill you can develop, to see more shapes more easily. And now everyone is learning it.

Through the magic of the internet, everyone is discovering that they can make reality look like whatever they want. They feel like they’re finding truth, when really they’re veering off into madness.

SamuraiBeanDog replies: Except that the real issue with the current conspiracy crisis is that people are just replacing the old TV and church sources with social media and YouTube. The masses of conspiracy culture aren’t coming up with their own realities, they’re just believing whatever shit they’re told by conspiracy influencers.

Something that’s rarely said about influencers, and propaganda in general, is that they can’t change anyone’s mind — they have to work with what people already feel good about believing.

“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signalling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but to human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently albeit temporarily live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations and funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coinage. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are ineffective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. It used to be said that the Internet is forever, so wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) can be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control, specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established, and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

Having grown up in an ostensibly free, open society animated by liberal Western ideology, it’s fair to say in hindsight that I internalized a variety of assumptions (and illusions) regarding the role of the individual vis-à-vis society. The operative word here is ostensibly owing to the fact that society has always restricted pure expressions of individuality to some degree through socialization and pressure to conform, so freedom has always been constrained. That was one of the takeaways from my reading (long ago in high school) of Albert Camus’ novel The Stranger (1942) (British: The Outsider; French: L’Étranger), namely, that no matter how free one might believe oneself to be, if one refuses (radically, absurdly) to play by society’s rules and expectations, one will be destroyed. The basic, irresolvable conflict is also present in the concerto principle in classical music, which presents the soloist in dialogue with or in antithesis to the ensemble. Perhaps no work exemplifies this better than the 2nd movement of Ludwig van Beethoven’s Concerto No. 4 for piano and orchestra. A similar dialogue is found in the third movement of Gustav Mahler’s Symphony No. 3, though the dialogue there might be better understood as man vs. nature. The significant point of similarity is not the musical style or themes but how the individual/man is ultimately subdued or absorbed by society/nature.

Aside: A broader examination of narrative conflict would include four traditional categories: (1) man vs. man, (2) man vs. nature, (3) man vs. self, and (4) man vs. society. Updated versions, often offered as tips for aspiring writers, sometimes include breakout conflicts (i.e., subcategories): (1) person vs. fate/god, (2) person vs. self, (3) person vs. person, (4) person vs. society, (5) person vs. nature, (6) person vs. supernatural, and (7) person vs. technology. Note that modern sensibilities demand use of person instead of man.

My reason for bringing up such disparate cultural artifacts is to provide context. Relying on my appreciation of the Zeitgeist, I sense that liberal Western ideology is undergoing a radical rethinking, with Woke activists in particular pretending to emancipate oppressed people when flattening of society is probably the hidden objective. Thus, Wokesters are not really freeing anyone, and flattening mechanisms are pulling people down, not building people up. On top of that, they are targeting the wrong oppressors. If leveling is meant to occur along various markers of identity (race, sexual and gender orientation, religion, political affiliation, nationality, etc.), the true conflict in the modern era has always been socioeconomic, i.e., the ownership class against all others. Sure, differences along identitarian lines have been used to oppress, but oppressors are merely using a surface characteristic to distract from their excessive power. The dispossessed oddly fail to recognize their true enemies, projecting animus instead on those with whom grievances are shared. Similarly, Wokesters are busy exploiting their newfound (albeit momentary) power to question the accepted paradigm and force RightThink on others. Yet traditional power holders are not especially threatened by squabbles among the oppressed masses. Moreover, it’s not quite accurate to say that the identitarian left is rethinking the established order. Whatever is happening is arguably occurring at a deeper, irrational level than any thoughtful, principled, political action meant to benefit a confluence of interest groups (not unlike the impossible-to-sort confluence of identities everyone has).

Although I haven’t read Howard Zinn’s A People’s History of the United States (1980), I gather that Zinn believed history should not be told from the winners’ perspective (i.e., that of the ownership and ruling classes, significant overlap acknowledged), or from the top down, but instead through the lens of the masses (i.e., the people, a large segment of whom are oppressed and/or dispossessed), or from the bottom up. This reorientation applies not only within a given society or political entity but among nations. (Any guess which countries are the worst oppressors at the moment? Would be a long list.) Moreover, counter to the standard or accepted histories most of us learn, preparation of the U.S. Constitution and indeed quite a lot of U.S. history are deeply corrupt and oppressive by design. It should be obvious that the state (or nation, if one prefers), with its insistence on personal property and personal freedom (though only for a narrow class of landed gentry back in the day, plutocrats and corporatists today), systematically rolled over everyone else — none so egregiously as Native Americans, African slaves, and immigrants. Many early institutions in U.S. political history were in fact created as bulwarks against various forms of popular resistance, notably slave revolts. Thus, tensions and conflicts that might be mistakenly chalked up as man vs. society can be better characterized as man vs. the state, with the state having been erected specifically to preserve prerogatives of the ownership class.

More to come in part 2 and beyond.

The backblog at The Spiral Staircase includes numerous book reviews and three book-blogging projects — one completed and two others either abandoned or on semi-permanent hiatus. I’m launching a new project on Walter Ong’s Orality and Literacy: The Technologizing of the Word (1982), which comes highly recommended and appears quite interesting given my preoccupations with language, literacy, and consciousness. To keep my thinking fresh, I have not consulted any online reviews or synopses.

Early on, Ong provides curious (but unsurprising) definitions I suspect will contribute to the book’s main thesis. Here is one from the intro:

It is useful to approach orality and literacy synchronically, by comparing oral cultures and chirographic (i.e., writing) cultures that coexist at a given period of time. But it is absolutely essential to approach them also diachronically or historically, by comparing successive periods with one another. [p. 2]

I don’t recall reading the word chirographic before, but I blogged about the typographic mind (in which Ong’s analyses are discussed) and lamented that the modern world is moving away from literacy, back toward orality, which feels (to me at least) like retrogression and retreat. (Someone is certain to argue return to orality is actually progress.) As a result, Western institutions such as the independent press are decaying. Moreover, it’s probably fair to say that democracy in the West is by now only a remnant fiction, replaced by oligarchic rule and popular subscription to a variety of fantasy narratives easily dispelled by a modest inventory of what exists in actuality.

Here is another passage and definition:

A grapholect is a transdialectal language formed by deep commitment to writing. Writing gives a grapholect a power far exceeding that of any purely oral dialect. The grapholect known as standard English has accessible for use a recorded vocabulary of at least a million and a half words, of which not only the present meanings but also hundreds of thousands of past meanings are known. A simply oral dialect will commonly have resources of only a few thousand words, and its users will have virtually no knowledge of the real semantic history of any of these words. [p. 8]

My finding is that terms such as democracy, liberalism, social justice, etc. fail to mean anything (except perhaps to academics and committed readers) precisely because their consensus usage has shifted so wildly over time that common historical points of reference are impossible to establish in a culture heavily dominated by contemporary memes, slang, talking heads, and talking points — components of orality rather than literacy. And as part of a wider epistemological crisis, one can no longer rely on critical thinking to sort out competing truth claims because the modifier critical, now bandied about recklessly in academia and infecting the workplace and politics, has unironically reversed its meaning and requires uncritical doublethink to swallow what’s taught and argued. Let me stress, too, that playing word games (such as dissembling over what is means) is a commonplace tactic to put off criticism by distorting word meanings beyond recognition.

Although it’s unclear just yet (to me, obviously) what Ong argues in his book beyond the preliminary comparison and contrast of oral and chirographic cultures (or in terms of the title of the book, orality and literacy), I rather doubt he argues as I do that the modern world has swung around to rejection of literacy and the style of thought that flows from deep engagement with the written word. Frankly, it would surprise me if he did; the book predates the Internet, social media, and what’s now become omnimedia. The last decade in particular has demonstrated that by placing a cheap, personal, 24/7/365 communications device in the hands of every individual from the age of 12 or so, a radical social experiment was launched that no one in particular designed — except that once the outlines of the experiment began to clarify, those most responsible (i.e., social media platforms in particular but also biased journalists and activist academics) have refused to admit that they are major contributors to the derangement of society. Cynics learned long ago that advertisers, PR hacks, and politicians should be discounted, which requires ongoing skepticism and resistance to omnipresent lures, cons, and propaganda. Call it waking up to reality or simply growing up and behaving responsibly in an information environment designed to be disorienting. Accordingly, the existence of counterweights — information networks derived from truth, authority, and integrity — has always been, um, well, critical. Their extinction presages much graver losses as information structures and even the memory of mental habits that society needs to function are simply swept aside.

Evil exists in the world. History and current events both bear this out amply. Pseudo-philosophers might argue that, like emotions and other immaterial sensations, good and evil are merely reified concepts, meaning they are human constructs with no palpable external reality. Go tell that to victims of evildoers. Human suffering can’t be anonymized, rationalized, or philosophized away quite so handily.

It was sort of refreshing, back in the day, when Google’s motto and/or corporate code of conduct was simple: “Don’t Be Evil.” It acknowledged the potential for being or becoming evil (like any of the Bigs: Big Tobacco, Big Soda, Big Oil, Big Pharma, Big Media, Big Agriculture, etc.) and presumably aspired to resist obvious temptations. That was then (2000 to 2018); this is now (2021 until death take us — soon enough, I fear). But like all entities possessed of absurd levels of wealth and power, Google (now reorganized as a subsidiary of Alphabet, but who actually refers to it that way?) and its Silicon Valley brethren have succumbed to temptation and become straight-up evil.

One might charitably assess this development as something unbidden, unanticipated, and unexpected, but that’s no excuse, really. I certainly don’t envy celebrity executives now experiencing difficulty from having created unmanageable behemoths, loosed on a public and polity unable to recognize the beast’s fangs until already clamped on their necks. As often occurs, dystopian extrapolations are explored in fiction, sometimes satirically. The dénouement of the HBO show Silicon Valley depicts tech mogul wannabes succeeding in creating an AI (or merely a sophisticated algorithm? doesn’t matter …) that would in time become far too powerful in blind execution of its inner imperative. In the show, characters recognize what they had done and kill their own project rather than allow it to destroy the world. In reality, multiple developers of computer tech platforms (and the dynamics embedded in them, including that wildly unhelpful albeit accurate term algorithm) lacked the foresight to anticipate awful downstream effects of their brainchildren. Yet now that those effects are manifesting recognizably, these corporations continue to operate and wreak havoc.

Silicon Valley shows an extended software development period of bungling ineptitude punctuated by brilliant though momentary breakthroughs. Characters are smart, flawed people laughably unable to get out of the way of their own success. The pièce de résistance was yoking one so-called “learning machine” to another and initiating what would become a runaway doomsday process (either like ecological collapse, building slowly then making the biosphere uninhabitable all at once, or like the gray goo problem, progressively “processing” biomass at the molecular level until all that remains is lifeless goo). It was a final act of bumbling that demanded the characters’ principled, ethical response before the window of opportunity closed. Real Silicon Valley tech platforms are in the (ongoing) process of rending the social fabric, which is no laughing matter. The issue du jour surrounds free speech and its inverse, censorship. More broadly, real Silicon Valley succeeded in gaming human psychology for profit in at least two aspects (could be more as yet unrecognized): (1) mining behavioral data as an exploitable resource, and (2) delivering inexhaustible streams of extremely divisive content (not its own) to drive persistent engagement with its platforms. Yoked together, they operate to drive society mad, and yet mounting evidence of this development has not produced even an inkling that maybe the damned doomsday devices ought to be shut off. As with the environment, we operate with freedom enough to destroy ourselves. Instead, politicians issue stunningly ineffectual calls for regulation or break-up of monopolies. In the meantime, ever more absurd wealth and power are concentrated in the hands of a few executives who have clearly punted and decided “let’s be evil.” No restraints on their behavioral experimentation across whole societies exist.

Much more to say on this topic in additional parts to come.