So far, this multipart blog post has trafficked in principles and generalities. Let me try now to be more specific, starting with an excerpt from Barry Lynn’s article in Harper’s Magazine titled “The Big Tech Extortion Racket” (Sept. 2020):

… around the middle of the nineteenth century, Americans began to develop technologies that could not be broken into component pieces. This was especially true of the railroad and the telegraph … Such corporations [railroad and telegraph companies] posed one overarching challenge: they charged some people more than others to get to market. They exploited their control over an essential service in order to extort money, and sometimes political favors … Americans found the answer to this problem in common law. For centuries, the owners of ferries, stagecoaches, and inns had been required to serve all customers for the same price and in the order in which they arrived. In the late nineteenth century, versions of such “common carrier” rules were applied to the new middleman corporations.

Today we rightly celebrate the Sherman Antitrust Act of 1890, which gave Americans the power to break apart private corporations. But in many respects, the Interstate Commerce Act of 1887 was the more important document. This act was based on the understanding that monopoly networks like the railroad and the telegraph could be used to influence the actions of people who depend on them, and hence their power must be carefully restricted …

For a century and a half, Americans used common carrier policies to ensure the rule of law in activities that depended on privately held monopolies … regulations freed Americans to take full advantage of every important network technology introduced during these years, including telephones, water and electrical services, energy pipelines, and even large, logistics-powered retailers. Citizens did not have to worry that the men who controlled the technologies involved would exploit their middleman position to steal other people’s business or disrupt balances of power.

I appreciate that Barry Lynn brings up the Interstate Commerce Act. If this legal doctrine appeared in the net neutrality debate a few years ago, it escaped my notice. While Internet Service Providers (ISPs) enable network access and connectivity, those utilities have not yet exhibited let’s-be-evil characteristics. Similarly, phone companies (including cell phone carriers) and public libraries may well be eavesdropping on and/or monitoring the activities of the citizenry, but the real action lies elsewhere, namely, on social media networks and with online retailers. Evil is arguably concentrated in the FANG (or FAANG) corporations but has now grown ubiquitous across social networks (e.g., Twitter), services operating as common carriers (Zoom? Slack?), and academe, nearly all of which have succumbed to moral panic. Sad to observe, they are correctly interpreting demands to censor and sanitize others’ no-longer-free speech appearing on their networks or within their realms. How much deeper it goes toward shaping politics and social engineering is quasi-conspiratorial and impossible for me to assess.

Much as I would prefer to believe that individuals possess the good sense to shift their activities away from social networks or turn their attention from discomfiting information sources, that does not appear to be the case. Demands for trigger warnings and safe spaces, commonplace a few years ago on college campuses, have instead morphed into censorious removal, deplatforming, and cancellation from the entire public sphere. Those are wrong responses in free societies, but modern institutions and technologies have gotten out of hand and outstripped the limits of normal human cognition. In short, we’re a society gone mad. So rather than accept responsibility to sort out information overflow oneself, many are demanding that others do it for them, and evil private corporations are complying (after a fashion). Moreover, calls for creation of an Orwellian Ministry of Truth, rebranded as a Truth Commission and Reality Czar, could hardly be more chillingly, fascistically bizarre. Do people really need someone to decide for them what is real? Has anyone at the New York Times actually read Orwell’s dystopian novel 1984 and taken its lessons to heart?

I have observed various instances of magical thinking in mainstream culture, especially here, which I find problematical. Although it’s not my ambition to disabuse anyone of magical thinking, which extends far beyond, say, religious thought, I was somewhat taken aback at the suggestion found in the comic at this link (not embedded). For those not familiar with Questionable Content (one of two online comics I read regularly), the comic presents an extended cast of characters, mostly in their early 20s, living in a contemporary New England college town. Those characters are supplemented by a few older parents and lots of AIs (in robot bodies). The AIs are not particularly futuristic but are simply accepted as a normal (if curious) part of the world of the comic. Major story arcs involve characters and AIs (the AIs are characters, I suppose) in the process of discovering and establishing themselves as they (the humans, anyway) transition into early adulthood. There are no great political themes or intrusions into life in a college town. Rather, the comic is largely about acceptance of difference. Often, that means washing away meaningful difference in the name of banal tolerance. Real existential struggle is almost entirely absent.

In the linked comic, a new character comes along and offers advice to an established character struggling with sexual attractions and orientation. The dialogue includes this exchange:

Character A: If tarot or astrology or religion halps you make sense of the world and your place in it, then why not use them?
Character B: But they’re not real. [emphasis in original]
Character A: It doesn’t matter, if you use them constructively!

There it is in a nutshell: believe whatever you want if it, um, halps. I’ve always felt that being wrong (i.e., using unreal or make-believe things) was a sufficient injunction against anchoring oneself to notions widely known to be false. Besides, isn’t it often remarked that the biggest fool is one who fools himself? (Fiction as a combination of entertainment and building a worldview is quite normative, but it’s understood as fiction, or to a lesser degree, as life imitating art and its inverse. Exceptions abound, which are regarded as psychopathy.) The instruction in that dialogue (part object lesson, part lesson in cognition) is not that it’s OK to make mistakes but that knowingly believing something false has worthwhile advantages.

Surveying examples where promulgating false beliefs has constructive or destructive effects is too large a project. Well short of that, nasty categories include fraud, gaslighting, and propaganda, which are criminal in many cases and ought to be in most others (looking at you, MSM! — or not, since I neither trust nor watch). One familiar benevolent category is expressed in the phrase fake it ’til you make it, often recommended to overcome a lack of confidence. Of course, a swindle is also known as a confidence game (or by its diminutive, a con), so beware overconfidence when asked by another to pay for something (e.g., tarot or astrology readings), take risks, or accept an ideology without question.

As philosophy, willful adoption of falsity for its supposed benefits is half-baked. Though impossible to quantify, my suspicion is that instances of positive outcomes are overbalanced by negative ones. Maybe living in a constructed reality or self-reinforcing fantasy is what people want. The comic discussed is certainly in line with that approach. However, while we dither and delude ourselves with happy, aspirational stories based on silliness, the actual world around us, including all the human institutions that used to serve us but no longer do, falls to tatters. Is it better going through life and eventually to one’s grave refusing to see that reality? Should childlike wonder and innocence be retained in spite of what is easily observable just by poking one’s head up and dismissing comforting lies? Decide for yourself.

Evil exists in the world. History and current events both bear this out amply. Pseudo-philosophers might argue that, like emotions and other immaterial sensations, good and evil are merely reified concepts, meaning they are human constructs with no palpable external reality. Go tell that to victims of evildoers. Human suffering can’t be anonymized, rationalized, or philosophized away quite so handily.

It was sort of refreshing, back in the day, when Google’s motto and/or corporate code of conduct was simple: “Don’t Be Evil.” It acknowledged the potential for being or becoming evil (like any of the Bigs: Big Tobacco, Big Soda, Big Oil, Big Pharma, Big Media, Big Agriculture, etc.) and presumably aspired to resist obvious temptations. That was then (from 2000 to 2018), this is now (2021 until death take us — soon enough, I fear). But like all entities possessed of absurd levels of wealth and power, Google (now reorganized as a subsidiary of Alphabet, but who actually refers to it that way?) and its Silicon Valley brethren have succumbed to temptation and become straight-up evil.

One might charitably assess this development as something unbidden, unanticipated, and unexpected, but that’s no excuse, really. I certainly don’t envy celebrity executives the difficulties resulting from having created unmanageable behemoths loosed on a public and polity unable to recognize the beastly fangs until already clamped on their necks. As often occurs, dystopian extrapolations are explored in fiction, sometimes satirically. The dénouement of the HBO show Silicon Valley depicts tech mogul wannabes succeeding in creating an AI (or merely a sophisticated algorithm? doesn’t matter …) that would in time become far too powerful in blind execution of its inner imperative. In the show, characters recognize what they have done and kill their own project rather than allow it to destroy the world. In reality, multiple developers of computer tech platforms (and their embedded dynamics, including the wildly unhelpful albeit accurate term algorithm) lacked the foresight to anticipate the awful downstream effects of their brainchildren. Yet now that those effects are manifesting recognizably, these corporations continue to operate and wreak havoc.

Silicon Valley shows an extended software development period of bungling ineptitude punctuated by brilliant though momentary breakthroughs. Characters are smart, flawed people laughably unable to get out of the way of their own success. The pièce de résistance was yoking one so-called “learning machine” to another and initiating what would become a runaway doomsday process (either like ecological collapse, building slowly then making the biosphere uninhabitable all at once, or like the gray goo problem, progressively “processing” biomass at the molecular level until all that remains is lifeless goo). It was a final act of bumbling that demanded the characters’ principled, ethical response before the window of opportunity closed. Real Silicon Valley tech platforms are in the (ongoing) process of rending the social fabric, which is no laughing matter. The issue du jour surrounds free speech and its inverse, censorship. More broadly, real Silicon Valley succeeded in gaming human psychology for profit in at least two aspects (could be more as yet unrecognized): (1) mining behavioral data as an exploitable resource, and (2) delivering inexhaustible streams of extremely divisive content (not its own) to drive persistent engagement with its platforms. Yoked together, they operate to drive society mad, and yet mounting evidence of this development has not produced even an inkling that maybe the damned doomsday devices ought to be shut off. As with the environment, we operate with freedom enough to destroy ourselves. Instead, politicians issue stunningly ineffectual calls for regulation or breakup of monopolies. In the meantime, ever more absurd wealth and power are concentrated in the hands of a few executives who have clearly punted and decided “let’s be evil.” No restraints exist on their behavioral experimentation across whole societies.
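To make those two mechanisms concrete, here is a deliberately crude sketch in Python. It is my own toy construction, not any platform’s actual code, and every name in it (Post, outrage_score, build_feed) is hypothetical; it only illustrates how mined behavioral data and divisive content yoke together in an engagement-maximizing feed.

from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    outrage_score: float  # stand-in for how divisive the content is, 0.0 to 1.0

def predicted_engagement(post, user_history):
    # user_history: mined behavioral data mapping topics to past dwell time
    affinity = user_history.get(post.topic, 0.1)
    # Divisive content reliably drives clicks, comments, and shares,
    # so here it multiplies, rather than merely adds to, expected engagement.
    return affinity * (1.0 + post.outrage_score)

def build_feed(posts, user_history):
    # Surface whatever is predicted to keep the user scrolling longest.
    return sorted(posts, key=lambda p: predicted_engagement(p, user_history), reverse=True)

Nothing in such a loop asks whether the content is true or socially corrosive; ranking by predicted engagement alone is sufficient to favor the most divisive material.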

Much more to say on this topic in additional parts to come.

I simply can’t keep up with all the reading, viewing, and listening in my queue. Waking hours are too few, and concentration dissipates long before sleep overtakes. Accordingly, it’s much easier to settle into couch-potato mode and watch some mindless drivel, such as the Netflix hit Bridgerton, binged in two sittings. (Unlike cinema critics, I’m not bothered especially by continuity errors, plot holes, clunky dialogue, weak character motivation, gaps of logic, or the glossy decadence of the fictional worlds. I am bothered by the Kafka trap sprung on anyone who notices casting decisions that defy time and place — an ill-advised but now commonplace historical revisionism like editing Mark Twain.) As a result, blog posts are less frequent than they might otherwise be as I pronounce upon American (or more broadly, Western) culture, trying vainly to absorb it as a continuously moving target. Calls to mind the phrase Après moi, le déluge, except that there is no need to wait. A deluge of entertainment, news, analysis, punditry, and trolling has buried everyone already. So rather than the more careful consideration I prefer to post, here are some hot takes.

The Irregular Aphorist. Caitlin Johnstone offers many trenchant observations in the form of aphorisms (some of which I’ve quoted before), all gathered under the subtitle Notes From The Edge Of The Narrative Matrix. The modifier irregular only means that aphorisms are a regular but not constant feature. Her site doesn’t have a tag to that effect but probably ought to. Here’s one in particular that caught my attention:

Everything our species has tried has led us to a dying world and a society that is stark raving mad, so nobody is in any position to tell you that you are wrong.

Twin truths here are (1) the dying world and (2) societal madness, both of which I’ve been describing for some time. Glad when others recognize them, too.

Piling on. Though few still are willing to admit it, nonpharmaceutical interventions (NPIs, e.g., distancing, masks, and lockdowns) to stall or reduce the spread of the virus failed to achieve their objectives according to this study. Instead, NPIs piled on suffering no one could forestall. I read somewhere (no link) that the world is approaching half of total, cumulative deaths/infections predicted had nothing been done to impede the pandemic running its course. Adding in deaths of despair (numbers not entirely up to date), we’re using the wrong tools to fight the wrong battle. Of course, interventions opened up giant opportunities for power grabs and vulture capitalism, so the cynic in me shrugs and wonders half aloud “what did you expect, really?”

Growth of the Managerial Bureaucracy. A blog called Easily Distracted by Timothy Burke (never on my blogroll) publishes only a few times per year, but his analysis is terrific — at least when it doesn’t wind up being overlong and inconclusive. Since a student debt jubilee is back in the news (plenty of arguments pro and con), unintended consequences are anticipated in this quote:

When you set out to create elaborate tiers that segregate the deserving poor from the comfortable middle-class and the truly wealthy, you create a system that requires a massive bureaucracy to administer and a process that forces people into petitionary humiliation in order to verify their eligibility. You create byzantine cutoff points that become business opportunities for predatory rentiers.

Something similar may well be occurring with stimulus checks being issued pro rata (has anyone actually gotten one?), but at least we’re spared any petitionary humiliations. We get whatever the algorithms (byzantine cutoff points) dictate. How those funds will be gamed and attached is not yet clear. Stay alert.
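To illustrate what those algorithmic cutoff points look like in practice, here is a minimal sketch in Python using the widely reported CARES Act phase-out for single filers ($1,200 base payment, reduced by $5 for every $100 of adjusted gross income above $75,000). The function is my own illustration of the mechanism, not official guidance, so treat the specific figures as assumptions.

def stimulus_check(agi, base=1200.0, threshold=75_000.0, rate=0.05):
    # Pro-rata phase-out: the payment shrinks by $5 per $100 of AGI
    # above the threshold until it reaches zero.
    reduction = max(0.0, agi - threshold) * rate
    return max(0.0, base - reduction)

print(stimulus_check(60_000))   # 1200.0 -- full payment
print(stimulus_check(87_000))   # 600.0  -- partially phased out
print(stimulus_check(99_000))   # 0.0    -- cut off entirely

Every such threshold is a business opportunity for the predatory rentiers Burke describes, since people near a cutoff will pay to be steered to the right side of it.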

No Defense of Free Speech. Alan Jacobs often recommends deleting, unsubscribing from, and/or ignoring social media accounts (after his own long love-hate relationship with them), considering how they have become wholly toxic to a balanced psyche as well as principal enablers of surveillance capitalism and narrative control. However, in an article about the manorial elite, he’s completely lost the plot: absolutism is required in defense of free speech. It’s not sufficient to be blasé or even relieved when 45 is kicked off Twitter permanently or when multiple parties conspire to kill Parler. Establishing your own turf beyond the reach of Silicon Valley censors is a nice idea but frankly impractical. Isn’t that what whoever ran Parler (or posted there) must have thought? Besides, fencing off the digital commons these very entities created has catapulted them into the unenviable position of undemocratic, unelected wielders of monopolistic power, and co-conspirators to boot. That’s what needs to be curtailed, not free speech.

The Taxonomic Apocalypse. Although its subject matter is drawn from fiction and thus largely hypothetical, a new book (coming late 2021) by Adam Roberts called It’s the End of the World: But What Are We Really Afraid Of? surveys doomsday stories and categorizes different versions of how it all ends. Alan Jacobs (yeah, him again — must have an advance copy of the manuscript) recommends it as “a delightful and provocative little book” but fails to grok two things: (1) these stories are rehearsals-cum-preparations for the real thing, and (2) the real thing really is bearing down on us implacably and so is no longer a mere hypothetical to contemplate and categorize for shits and grins. Despite acceptance of the eventualities that await all of us, reading Roberts’ taxonomy is not something I would expect to find delightful. Skip.

Narrative Collapse. Ran Prieur (no link) sometimes makes statements revealing an unexpected god’s-eye view:

[45] is a mean rich kid who figured out that if he does a good Archie Bunker impression, every lost soul with an authoritarian father will think he’s the messiah. We’re lucky that he cares only about himself, instead of having some crazy utopian agenda. But the power, and the agency, is with the disaffected citizens of a declining empire, tasting barbarism.

This is all about people wanting to be part of a group that’s part of a story. Lately, some of the big group-stories have been dying: sky father religion, American supremacy, the conquest of nature, the virtue of wealth-seeking. In their place, young and clumsy group-stories struggle and rise.

Collapse of certain fundamental stories that animate our thinking is at the core of The Spiral Staircase (see About Brutus at top), though it’s often couched in terms of consciousness in transition. Getting through the transition (only temporarily, see previous item in list) probably means completion of the Counter-Enlightenment historical arc, which necessarily includes further descent into barbarism.

Hail Mary for Individualism. I always take special notice when someone cites Allan Bloom. Alan Jacobs (um, yeah, he’s prolific and I’m using his ideas again — sue me) cites Bloom to argue that individualism, or the sovereign self, a product of the Enlightenment, is already dead. No doubt, the thought-world described so ably by Bloom no longer exists, but individualism has not yet died out by attrition or been fully dissolved in nonduality. Many of us born before the advent of the Internet retain selfhood and authenticity not yet coopted by or incorporated into the mass mind. Moreover, ongoing struggles over identity (e.g., gender, sexual orientation, and race, which are often used improperly to define the self) result from an inchoate sense that individualism is eroding precipitously, not that it’s already passé. Defiant attempts to (re)establish an authentic self (contravening all logic and becoming critical theory of one sort or another) in the face of this loss may well be a last-ditch effort to save the self, but it’s failing.

I’ve mentioned the precautionary principle several times, most notably here. Little of our approach to precautions has changed in the two years since that blog post. At the same time, climate change and Mother Nature batter us aggressively. Eventualities remain predictable. Different precautions are being undertaken with respect to the pandemic currently gripping the planet. Arguably, the pandemic is either a subset of Mother Nature’s fury or, if the virus was created in a lab, a self-inflicted wound. Proper pandemic precautions have been confounded by undermining of authority, misinformation, lack of coordination, and politically biased narratives. I’m as confused as the next poor sap. However, low-cost precautions such as wearing masks are entirely acceptable, notwithstanding refusals of many Americans to cooperate after authorities muddied the question of their effectiveness so completely. More significant precautions such as lockdowns and business shutdowns have morphed into received wisdom among government bodies yet are questioned widely as being a cure worse than the disease, not to mention administrative overreach (conspiratorial conjecture withheld).

Now comes evidence published in the New England Journal of Medicine on November 11, 2020, that costly isolation is flatly ineffective at stemming infection rates. Here are the results and conclusions from the abstract of the published study:

Results
A total of 1848 recruits volunteered to participate in the study; within 2 days after arrival on campus, 16 (0.9%) tested positive for SARS-CoV-2, 15 of whom were asymptomatic. An additional 35 participants (1.9%) tested positive on day 7 or on day 14. Five of the 51 participants (9.8%) who tested positive at any time had symptoms in the week before a positive qPCR test. Of the recruits who declined to participate in the study, 26 (1.7%) of the 1554 recruits with available qPCR results tested positive on day 14. No SARS-CoV-2 infections were identified through clinical qPCR testing performed as a result of daily symptom monitoring. Analysis of 36 SARS-CoV-2 genomes obtained from 32 participants revealed six transmission clusters among 18 participants. Epidemiologic analysis supported multiple local transmission events, including transmission between roommates and among recruits within the same platoon.
Conclusions
Among Marine Corps recruits, approximately 2% who had previously had negative results for SARS-CoV-2 at the beginning of supervised quarantine, and less than 2% of recruits with unknown previous status, tested positive by day 14. Most recruits who tested positive were asymptomatic, and no infections were detected through daily symptom monitoring. Transmission clusters occurred within platoons.

So an initial 0.9% tested positive, then an additional 1.9%. This 2.8% total compares with 1.7% among the recruits who declined to participate (tested on day 14 but not part of the study’s serial monitoring). Perhaps the experimental and control groups are a bit small (1848 and 1554, respectively), and it’s not clear why the participant infection rate is higher than that of the nonparticipants, but the evidence points to the uselessness of trying to limit the spread of the virus by quarantining and/or isolation. Once the virus is present in a population, it spreads despite precautions.
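For anyone inclined to check the arithmetic, the percentages follow directly from the counts quoted in the abstract. A minimal verification in Python (my own illustration, not part of the study):

study_n = 1848        # recruits who volunteered for the study
early_pos = 16        # positive within 2 days of arrival
later_pos = 35        # additional positives on day 7 or day 14
control_n = 1554      # nonparticipating recruits with available qPCR results
control_pos = 26      # of those, positive on day 14

print(f"initial: {early_pos / study_n:.1%}")                    # 0.9%
print(f"additional: {later_pos / study_n:.1%}")                 # 1.9%
print(f"study total: {(early_pos + later_pos) / study_n:.1%}")  # 2.8%
print(f"control: {control_pos / control_n:.1%}")                # 1.7%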

A mantra is circulating that we should “trust the science.” Are these results to be trusted? Can we call off all the lockdowns and closures? The virus has been raging throughout the U.S. for at least eight months. Although there might be some instances of isolated populations with no infection, the wider population has by now been exposed. Moreover, some individuals who self-isolated effectively may not have been exposed, but in all likelihood, most of us have been. Accordingly, renewed lockdowns, school and business closures, and destruction of entire industries are a pretense of control we never really had. Their costs are enormous and ongoing. A stay-at-home order (advisory, if you prefer) just went into effect for the City of Chicago on November 16, 2020. My anecdotal observation is that most Chicagoans are ignoring it and going about their business much as they did in the summer and fall months. It’s nothing like the ghost-town effect of March and April 2020. I daresay they may well be correct to reject the received wisdom of our civic leaders.

/rant on

Remember all those folks in the weeks and days preceding election day on November 3, 2020, who were buying guns, ammo, and other provisions in preparation for civil breakdown? (No one known personally, of course, and gawd no not actually any of us, either; just them other others who don’t read blogs or anything else.) Well, maybe they were correct to adopt the precautionary principle (notably absent from a host of other perils besetting us). But as of this writing, nothing remotely resembling widespread disruption — feared by some, hotly anticipated by others — has developed. But wait! There’s still time. Considering Americans were set up by both political parties to distrust the outcome of the presidential race no matter which candidate claimed to have prevailed, we now face weeks or months of legal challenges and the impatient formation of agitators (again, on both sides) demanding their candidate be declared the winner (now, dammit!) by the courts instead of either official ballot-counters or the liberal-biased MSM. To say that our institutions have failed us, and further, that political operatives all the way up to the sitting president have been openly fomenting violence in the streets, is a statement of the obvious.

Among my concerns more pressing than who gets to sit in the big chair, however, is the whipsawing stock market. Although no longer an accurate proxy of overall economic health or asset valuation, the stock market reacts thoroughly irrationally each day to every rumor of, say, a vaccine for the raging coronavirus or a resumption of full economic activity and profitability, despite widespread joblessness, renewed lockdowns, and a massive wave of homelessness in the offing due to bankruptcies, evictions, and foreclosures. None of this bodes well for the short-term future or the maintenance of, oh, I dunno, supply lines to grocery stores. Indeed, I suspect we are rapidly approaching our very own Minsky Moment, which Wikipedia describes as “a sudden, major collapse of asset values which marks the end of the growth phase of a cycle in credit markets or business activity” [underlying links omitted]. This is another prospective event (overdue, actually) for which the set-up has long been prepared. Conspiratorial types call it “the great reset” — something quite different from a debt jubilee.

For lazy thinkers, rhyming comparisons with the past frequently resort to calling someone a Nazi (or the new Hitler) or reminding everyone of U.S. chattel slavery. At the risk of being accused of similar stupidity, I suggest that we’re not on the eve of a 1929-style market crash and ensuing second great depression (though those could well happen, too, bread lines having already formed in 2020) but are instead poised at the precipice of hyperinflation and intense humiliation akin to the Weimar Republic in 1923 or so. American humiliation will result from recognition that the U.S. is now a failed state and doesn’t even pretend anymore to look after its citizens or the commonweal. Look no further than the two preposterous presidential candidates, neither of whom made any campaign promises to improve the lives of average Americans. Rather, the state has been captured by kleptocrats. Accordingly, no more American exceptionalism and no more lying to ourselves that we’re the model for the rest of the world to admire and emulate.

Like interwar Germany, the U.S. has also suffered military defeats and stagnation (perhaps by design) and currently demonstrates a marked inability to manage itself economically, politically, or culturally. Indeed, the American people may well be ungovernable at this point, nourished on a thin gruel of rugged individualism that forestalls our coming together to address adversity effectively. The possibility of another faux-populist savior arising out of necessity only to lead us over the edge (see the Great Man Theory of history) seems eerily likely, though the specific form that descent into madness would take is unclear. Recent history already indicates a deeply divided American citizenry having lost its collective mind but not yet having gone fully apeshit, flinging feces and destroying what remains of economically ravaged communities for the sheer sport of it. (I’ve never understood vandalism.) That’s what everyone was preparing for with emergency guns, ammo, and provisions. How narrowly we escaped catastrophe (or merely delayed it) should become clear in the fullness of time.

/rant off

I’ve been holding in mind for five months now the article at this link (an informal interview with neuroscientist and psychologist Oliver J. Robinson), waiting for conditions when I could return to the forms of media consumption I prefer, namely, reading books, magazines, and long-form journalism. When I try to read something substantive these days, I find myself going over the same paragraph repeatedly, waiting in vain for it to register. Regrettably, the calm, composure, and concentration needed for deep reading have been effectively blocked since March 2020 as we wait (also in vain) for the pandemic to burn itself out. (I could argue that the soul-destroying prospect of industrial collapse and near-term human extinction has been having the same effect for much longer.) So my attention and media habits have been resignedly diverted to crap news gathering, mostly via video, and cheap entertainments, mostly streaming TV (like everyone else, though others may complain less). The lack of nourishment is noticeable. Considering we’re only weeks away from the U.S. presidential election, stress levels are ratcheting up further, and civil authorities are preparing for “election riots” (is that a new term?), which I can only assume means piling violence upon violence under the pretense of keeping-the-peace or law-and-order or some other word string rendered meaningless now that the police are widely acknowledged to be significant contributors to the very problems they are meant to address. These unresolved issues (pandemic, police violence, civil unrest) give rise to pathological anxiety, which explains (according to Robinson, disclaimers notwithstanding) why it’s so hard to read.

To say we live in unprecedented times is both obvious and banal. Unique stresses of modernity have led multiple times to widespread madness and conflict, as well as to attempts to recapture things lost in previous shifts from other styles of social organization. Let me not mince words regarding what’s now happening: we’re in an era of repudiation of the Enlightenment, or a renewed Counter-Enlightenment. I’ve stated this before, and I’m not the only one making this diagnosis (just learned it’s a rather old idea — I’m always late to the party). For instance, Martin Jay’s essay “Dialectic of Counter-Enlightenment” appears to have been floating around in various forms since 2011. Correlation of this renewal of Counter-Enlightenment fervor with literacy seems clear. Although basic literacy has improved widely worldwide over the past two centuries, especially in the developing world, deep literacy is eroding:

Beyond self-inflicted attention deficits, people who cannot deep read – or who do not use and hence lose the deep-reading skills they learned – typically suffer from an attenuated capability to comprehend and use abstract reasoning. In other words, if you can’t, or don’t, slow down sufficiently to focus quality attention – what Wolf calls “cognitive patience” – on a complex problem, you cannot effectively think about it.

Considering deep literacy is absolutely critical to clear thinking (or critical thought, if you prefer, not to be confused with the Frankfurt School’s critical theory discussed in Jay’s essay), its erosion threatens fundamental institutions (e.g., liberal democracy and the scientific method) that constitute the West’s primary cultural inheritance from the Enlightenment. The reach of destruction wrought by reversing course via the Counter-Enlightenment cannot be overstated. Yet many among us, completely unable to construct coherent ideas, are rallying behind abandonment of Enlightenment traditions. They’re ideologues who actively want to return to the Dark Ages (while keeping modern tech, natch). As with many aspects of unavoidable cultural, social, environmental, and civilizational collapse, I have difficulty knowing quite what to hope for. So I won’t condemn retrograde thinking wholly. In fact, I feel empathy toward calls to return to simpler times, as with German Romanticism or American Transcendentalism, both examples of cultural and aesthetic movements leading away from the Enlightenment.

Long before these ideas coalesced for me, I had noted (see here, here, and here) how literacy is under siege and a transition back toward a predominantly oral culture is underway. The Counter-Enlightenment is either a cause or an effect; I can’t assess which. At the risk of being a Cassandra, let me suggest that, if these times aren’t completely different from dark episodes of the past, we are now crossing the threshold of a new period of immense difficulty that makes pathological anxiety blocking the ability to read and think a minor concern. Indeed, that has been my basic assessment since crafting the About Brutus blurb way back in 2006. Indicators keep piling up. So far, I have a half dozen points of entry by other cultural commentators exploring this theme to process and digest, though they typically don’t adopt wide enough historical or cultural perspectives. Like the last time I failed to synthesize my ideas into a multipart blog series, I don’t have a snazzy title, and this time, I don’t even have planned installment titles. But I will do my best to roll out in greater detail over several blog posts some of the ways the Counter-Enlightenment is manifesting anew.

Can’t let these two brief paragraphs pass by without reinforcing them, from Tom Lewis’ latest blog post:

… we have generations of privileged white people who cannot imagine being non white or unrich. We have whole populations who cannot imagine that an infectious disease is a real threat because they don’t have it, and neither does anyone they know. Who are unable to believe that global warming is real because where they are, it’s cold. Who cannot conceive of conditions that would cause a family to walk a thousand miles and seek a new life in another country. Who cannot think of any reason other than moral depravity that would lead a person into addiction.

People who are devoid of imagination find it easy to accept simplistic explanations of how things work: forests burn fiercely in California because they haven’t been raked; bleach could be good for COVID-19; all we have to do to have a better life is get rid of the (fill in the blank, one word only). 

Once in a while, when discussing current events and their interpretations and implications, a regular interlocutor of mine will impeach me, saying “What do you know, really?” I’m always forced to reply that I know only what I’ve learned through various media sources, faulty though they may be, not through first-hand observation. (Reports of anything I have observed personally tend to differ considerably from my own experience once the news media completes its work.) How, then, can I know, to take a very contemporary instance this final week of July 2020, what’s going on in Portland from my home in Chicago other than what’s reported? Makes no sense to travel there (or much of anywhere) in the middle of a public health crisis just to see a different slice of protesting, lawbreaking, and peacekeeping [sic] activities with my own eyes. Extending the challenge to its logical extremity, everything I think I know collapses into solipsism. The endpoint of that trajectory is rather, well, pointless.

If you read my previous post, you encountered an argument, not falsified any too handily, that what we understand about ourselves and the world we inhabit is actually a constructed reality. To which I reply: is there any other kind? That construction achieves a fair lot of consensus about basics, more than one might even guess, but that still leaves quite a lot of space for idiosyncratic and/or personal interpretations that conflict wildly. In the absence of stabilizing authority and expertise, it has become impossible to tease a coherent story out of the many voices pressing on us with their interpretations of how we ought to think and feel. Twin conspiracies foisted on us by the Deep State and MSM, known as RussiaGate and BountyGate, attest to this. I’ll have more to say about the inability to figure things out when I complete my post called Making Sense and Sensemaking.

In the meantime, the modern world has in effect constructed its own metaphorical Tower of Babel (borrowing from Jonathan Haidt — see below). It’s not so much different languages we speak (though it’s that, too) as the conflicting stories we tell. Democratization of media has given each of us — authorities, cranks, and everyone between — new platforms and vehicles for promulgating pet stories, interpretations, and conspiracies. Most of it is noise, and divining the worthwhile signal portion is a daunting task even for disciplined, earnest folks trying their best to penetrate the cacophony. No wonder so many simply turn away in disgust.

Ours is an era when individuals are encouraged to explore, amplify, and parade various attributes of their identities out in public, typically via social media. For those just coming of age and/or recently having entered adulthood, because identity is not yet fully formed, defining oneself is more nearly a demand. When identity is further complicated by unusual levels of celebrity, wealth, beauty, and athleticism (lots of overlap there), defining oneself is often an act of rebellion against the perceived demands of an insatiable public. Accordingly, it was unsurprising to me at least to learn of several well-known people unhappy with their lives and the burdens upon them.

Regular folks can’t truly relate to the glitterati, who are often held up as aspirational models. For example, many of us look upon the discomforts of Prince Harry and Meghan Markle with a combination of perverse fascination and crocodile tears. They were undoubtedly trapped in a strange, gilded prison before repudiating the duties expected of them as “senior royals,” attempting an impossible retreat to normalcy outside of England. Should be obvious that they will continue to be hounded while public interest in them persists. Similarly, Presley Gerber made news, fell out of the news, and then got back into the news as a result of his growing collection of tattoos. Were he simply some anonymous fellow, few would care. However, he has famous parents and had already launched a modeling career before his face tattoo announced his sense of being “misunderstood.” Pretty bold move. With all the presumed resources and opportunities at his disposal, many have wondered in comments and elsewhere whether another, better declaration of self might have been preferred.

Let me give these three the benefit of the doubt. Although they all have numerous enviable attributes, the accident of birth (or in Markle’s case, the decision to marry) landed them in exceptional circumstances. The percentage of celebrities who crack under the pressure of unrelenting attention and proceed to run off the rails is significant. Remaining grounded is no doubt easier if one attains celebrity (or absurdly immense wealth) after, say, the age of 25 or even later. (On some level, we’ve all lost essential groundedness in reality, but that’s another blog post.) Those who are children of celebrities or who become child prodigies may not all be consigned to character distortion or a life irrevocably out of balance, but such outcomes are at least so commonplace that the dangerous potential should be recognized and embraced only with wariness. I’ve heard of programs designed to help professional athletes who become sudden multimillionaires (and thus targets of golddiggers and scammers) make the transition. Good for them that structured support is available. Yet another way average folks can’t relate: we have to work things out for ourselves.

Here’s the example I don’t get: Taylor Swift. She was the subject of a Netflix biography called Miss Americana (2020) that paints her as, well, misunderstood. Thing is, Swift is a runaway success story, raking in money, fans, awards, attention, and on balance, detractors. That success is something she earnestly desired and struggled to achieve only to learn that the glossy, popstar image sold especially but nonexclusively to 14-year-old girls comes with a lot of heavy baggage. How can the tragic lives of so many musicians launched into superstardom from the late 1950s onward have escaped Swift’s awareness in our media-saturated world? Naming names is sorta tacky, so I demur, but there are lots of them. Swift obtained her heart’s desire, found her songwriting and political voice, maintains a high public profile, and shows no lack of productivity. Sure, it’s a life out of balance, not remotely normal the way most noncelebrities would understand. However, she signed up for it willingly (if naïvely) and by all accounts perpetuates it. She created her own distinctive gilded prison. I don’t envy her, nor do I particularly feel sorry for her, as the Netflix show appears to instruct.