Posts Tagged ‘Free Speech’

Continuing from the previous blog post, lengthy credit scrolls at the ends of movies have become a favorite hiding place for bloopers and teasers. The purpose of this practice is unclear, since I can’t pretend (unlike many reckless opinionators) to inhabit the minds of filmmakers, but it has become a fairly reliable afterthought for film-goers willing to wait out the credits. Those who depart the theater, change the channel, or click away to other content may know they are relinquishing some last tidbit to be discovered, but there’s no way to know in advance if one is being punked or pleased, or indeed if there is anything at all there. Clickbait news often employs this same technique, teasing some newsbit in the headline to entice readers to wade (or skim) through a series of (ugh!) one-sentence paragraphs to find the desired content, which sometimes is not even provided. At least one film (Monty Python’s The Secret Policeman’s Other Ball (1982) as memory serves) pranked those in a rush to beat foot traffic out of the theater (back when film-going meant visiting the cinema) by having an additional thirty minutes of material after the (first) credit sequence.

This also put me in mind of Paul Harvey radio broadcasts ending with the sign-off tag line, “… the rest of the story.” Harvey supplemented the news with obscure yet interesting facts and analysis that tended to reshape one’s understanding of consensus narrative. Such reshaping is especially important as an ongoing process of clarification and revision. When served up in delectable chunks by winning personalities like Paul Harvey, supplemental material is easily absorbed. When material requires effort to obtain and/or challenges something one believes strongly, well, the default response is probably not to bother. However, those possessing intellectual integrity welcome challenging material and indeed seek it out. Indeed, invalidation of a thesis or hypothesis is fundamental to the scientific method, and no body of work can be sequestered from scrutiny and then be held as legitimately authoritative.

Yet that’s what happens routinely in the contemporary infosphere. A government press office or corporate public relations officer issues guidance or policy in direct conflict with earlier guidance or policy and in doing so seeks to place any resulting cognitive dissonance beyond examination and out of scope. Simple matters of adjustment are not what concern me. Rather, it’s wholesale brainwashing that is of concern, when something is clear within one’s memory or plainly documented in print/video yet brazenly denied, circumvented, and deflected in favor of a new directive. The American public has contended with this repeatedly as each new presidential administration demonizes the policies of its predecessors but typically without demonstrating the self-reflection and -examination to admit wrongdoing, responsibility, or error on anyone’s part. It’s a distinctly American phenomenon, though others have cottoned onto it and adopted the practice for themselves.

Exhaustion from separating the spin-doctored utterances of one malefactor or another from one’s own direct experience and sense-making drives many to simply give up. “Whatever you say, sir. Lemme go back to my entertainments.” The prospect of a never-ending slog through evidence and analysis only to arrive on unsteady ground, due to shift underfoot again and again with each new revelation, is particularly unsatisfactory. And as discussed before, those who nonetheless strain to achieve knowledge and understanding that reach temporary sufficiency yet remain permanently, intransigently provisional find themselves thwarted by those in the employ of organizations willing and eager to game information systems in the service of their not-even-hidden agendas. Alternative dangers for the muddled thinker include retreating into fixed ideology or collapsing into solipsism. Maybe none of it matters in the end. We can choose our beliefs from the buffet of available options without adherence to reality. We can create our own reality. Of course, that’s a description of madness, to which many have already succumbed. Why aren’t they wearing straitjackets?

Here’s a term I daresay most won’t recognize: the purse seine. My introduction was as the title of a poem by Robinson Jeffers. More generally, the term refers to a net drawn between two fishing boats to encircle a school of fish. The poem captures something both beautiful and terrifying, drawing an analogy between a fishing net and government power over human populations gathered into cities (confined by economic necessity?) rather than subsisting more simply on the bounty of nature. Whether Jeffers intends a traditional agrarian setting or a deeper, ancestral, hunter-gatherer orientation is unclear and probably doesn’t matter. The obvious counterpoint he names plainly: Progress (capital P).

My own analogy to the purse seine is more pedestrian: cloth masks strung between two ears and drawn over the face to encircle the breath in futile hope of impeding the respiratory virus that has impacted everyone worldwide for the last two years (needs no name — are you living under a rock?). Like a seine allows water to flow through, cloth masks allow airflow so that one can breathe. Otherwise, we’d all be wearing gas masks and/or hazmat suits 24/7. And therein lies the problem: given the tiny particle size of the pathogen, cloth and paper masks are akin (yes, another analogy) to using a chain-link fence to hold back the wind. That’s not what fences (or face masks) are designed to do. More robust N95 masks do little better for the very same reason. Gotta be able to breathe. Other pandemic mitigation efforts such as social distancing, lockdowns, and vaccines suffer from similar lack of efficacy no matter how official pronouncements insist otherwise. The pandemic has come in similar, unstoppable, year-over-year waves in locations/states/nations that took few or no precautions and in those that imposed the most egregious authoritarian measures. The comparative numbers (those not purposely distorted beyond recognition, anyway) tell the story clearly, as anyone with a principled understanding of infectious disease could well have anticipated considering humans are a hypersocial species packed into dense population centers (compared to our agrarian past).

Although these are statements of the obvious, at least to me, I’ve broken my previous silence on the pandemic and surmise I’m probably tempting the censors and trolls. I’m not giving advice, and others can of course disagree; I’ve no particular issue with principled disagreement. Decide for yourself what to do. I do have a problem, however, with self-censorship (read: cowardice). So although this blog post is a rather oblique way of saying that the putative consensus narrative is a giant, shifting pile of horse pucky (disintegrating further into nothingness with each passing day), please exercise your synapses and evaluate the evidence as best you can despite official channels (and plenty of water carriers) herding and bullying everyone toward conclusions that make utterly no sense in terms of public health.

Although disinclined to take the optimistic perspective inhabited by bright-siders, I’m nonetheless unable to live in a state of perpetual fear that would to facile thinkers be more fitting for a pessimist. Yet unrelenting fear is the dominant approach, with every major media outlet constantly stoking a toxic combination of fear and hatred, as though activation and ongoing conditioning of the lizard brain (i.e., the amygdala — or maybe not) in everyone were worthy of the endeavor rather than it being a limited instinctual response, leaping to the fore only when immediate threat presents. I can’t guess the motivations of purveyors of constant fear to discern an endgame, but a few of the dynamics are clear enough to observe.

First thing that comes to mind is that the U.S. in the 1930s and 40s was pacifist and isolationist. Recent memory of the Great War was still keenly felt, and with the difficulties of the 1929 Crash and ensuing Great Depression still very much present, the prospect of engaging in a new, unlimited war (even over there) was not at all attractive to the citizenry. Of course, political leaders always regard (not) entering into war somewhat differently, maybe in terms of opportunity cost. Hard to say. Whether by hook or by crook (I don’t actually know whether advance knowledge of the Japanese attack on Pearl Harbor was suppressed), the U.S. was handily drawn into the war, and a variety of world-historical developments followed that promoted the U.S. (and its sprawling, unacknowledged empire) into the global hegemon, at least after the Soviet Union collapsed and before China rose from a predominantly peasant culture into world economic power. A not-so-subtle hindsight lesson was learned, namely, that against widespread public sentiment and at great cost, the war effort could (not would) provide substantial benefits (if ill-gotten and of questionable desirability).

None of the intervening wars (never declared) or Wars for Dummies (e.g., the war on poverty, the war on crime, the war on drugs) provided similar benefits except to government agencies and careerist administrators. Nor did the war on terror following the 9/11 attacks or subsequent undeclared wars and bombings in Afghanistan, Iraq, Syria, Libya, Yemen, and elsewhere provide benefits. All were massive boondoggles with substantial destruction and loss of life. Yet after 9/11, a body of sweeping legislation was enacted without much public debate or scrutiny — “smuggled in under cover of fear” one might say. The Patriot Act and The National Defense Authorization Act are among the most notable. The conditioned response by the citizenry to perceived but not actual existential fear was consistent: desperate pleading to keep everyone safe from threat (even if it originates in the U.S. government) and tacit approval to roll back civil liberties (even though the citizenry is not itself the threat). The wisdom of the old Benjamin Franklin quote, borne out of a very different era and now rendered more nearly as a bromide, has long been lost on many Americans.

The newest omnipresent threat, literally made-to-order (at least according to some — who can really know when it comes to conspiracy), is the Covid pandemic. Nearly every talking, squawking head in government and the mainstream media (the latter now practically useless except for obvious propaganda functions) is telling everyone who still watches (video and broadcast being the dominant modes) to cower in fear of each other, reduce or refuse human contact and social function, and most of all, take the vaccine-not-really-a-vaccine followed by what is developing into an ongoing series of boosters to maintain fear and anxiety if not indeed provide medical efficacy (no good way to measure and substantiate that, anyway). The drumbeat is loud and unabated, and a large, unthinking (or spineless) portion of the citizenry, cowed and cowering, has basically joined the drum circle, spreading a social consensus that is very, well, un-American. Opinions as to other nations on similar tracks are not ventured here. Running slightly ahead of the pandemic is the mind virus of wokery and its sufferers who demand, among other things, control over others’ thoughts and speech through threats and intimidation, censorship, and social cancellation — usually in the name of safety but without any evidence how driving independent thought underground or into hiding accomplishes anything worthwhile.

Again, motivations and endgame in all this are unclear, though concentration of power to compel seems to be exhilarating. In effect, regular folks are being told, “stand on one leg; good boy; now bark like a dog; very good boy; now get used to it because this shit is never going to end but will surely escalate to intolerability.” It truly surprises me to see police forces around the world harassing, beating, and terrorizing citizens for failing to do as told, however arbitrary or questionable the order or the underlying justification. Waiting for the moment to dawn on rank-and-file officers that their monopoly on use of force is serving and protecting the wrong constituency. (Not holding my breath.) This is the stuff of dystopic novels, except that it’s not limited to fiction and frankly never was. The hotspot(s) shift in terms of time and place, but totalitarian mind and behavioral control never seems to fade or invalidate itself as one might expect. Covid passports to grant full participation in society (signalling compliance, not health) are an early step already adopted by some countries. My repeated warnings over the years of creeping fascism (more coercive style than form of government) appear to be materializing before our very eyes. I’m afraid of what it portends, but with what remains of my intact mind, I can’t live in perpetual fear, come what may.

For more than a decade, I’ve had in the back of my mind a blog post called “The Power of Naming” to remark that bestowing a name gives something power, substance, and in a sense, reality. That post never really came together, but its inverse did. Anyway, here’s a renewed attempt.

The period of language acquisition in early childhood is suffused with learning the names of things, most of which is passive. Names of animals (associated closely with sounds they make) are often a special focus using picture books. The kitty, doggie, and horsie eventually become the cat, dog, and horse. Similarly, the moo-cow and the tweety-bird shorten to cow and bird (though songbird may be an acceptable holdover). Words in the abstract are signifiers of the actual things, aided by the text symbols learned in literate cultures to reinforce mere categories instead of examples grounded in reality. Multiply the names of things several hundred thousand times into adulthood and indeed throughout life and one can develop a formidable vocabulary supporting expressive and nuanced thought and speech. Do you know the differences between acute, right, obtuse, straight, and reflex angles? Does it matter? Does your knowledge of barware inform when to use a flute, coupe, snifter, shot (or shooter or caballito), nosing glass (or Glencairn), tumbler, tankard, goblet, sling, and Stein? I’d say you’ve missed something by never having drunk dark beer (Ger.: Schwarzbier) from a frosted schooner. All these varieties developed for reasons that remain invisible to someone content to drink everything from the venerable red Solo cup. Funnily enough, the red Solo cup now comes in different versions, fooling precisely no one.

Returning to book blogging, Walter Ong (in Orality and Literacy) has curious comparisons between primarily oral cultures and literate cultures. For example:

Oral people commonly think of names (one kind of words) as conveying power over things. Explanations of Adam’s naming of the animals in Genesis 2:20 usually call condescending attention to this presumably quaint archaic belief. Such a belief is in fact far less quaint than it seems to unreflective chirographic and typographic folk. First of all, names do give human beings power over what they name: without learning a vast store of names, one is simply powerless to understand, for example, chemistry and to practice chemical engineering. And so with all other intellectual knowledge. Secondly, chirographic and typographic folk tend to think of names as labels, written or printed tags imaginatively affixed to an object named. Oral folk have no sense of a name as a tag, for they have no idea of a name as something that can be seen. Written or printed representations of words can be labels; real, spoken words cannot be. [p. 33]

This gets at something that has been developing over the past few decades, namely, that as otherwise literate (or functionally literate) people gather more and more information through electronic media (screens that serve broadcast and cable TV, YouTube videos, prerecorded news for streaming, and podcasts, and most importantly, audiobooks — all of which speak content to listeners), the spoken word (re)gains primacy and the printed word fades into disuse. Electronic media may produce a hybrid of orality/literacy, but words are no longer silent, internal, and abstract. Indeed, words — all by themselves — are understood as being capable of violence. Gone are the days when “sticks and stones ….” Now, fighting words incite and insults sting again.

Not so long ago, it was possible to provoke a duel with an insult or gesture, such as a glove across the face. Among some people, defense of honor never really disappeared (though dueling did). History has taken a strange turn, however. Proposed legislation to criminalize deadnaming (presumably to protect a small but growing number of transgender and nonbinary people who have redefined their gender identity and accordingly adopted different names) recognizes the violence of words but then tries to transmute the offense into an abstract criminal law. It’s deeply mixed up, and I don’t have the patience to sort it out.

More to say in later blog posts, but I’ll raise the Counter-Enlightenment once more to say that the nature of modern consciousness is shifting somewhat radically in response to stimuli and pressures that grew out of an information environment, roughly 70 years old now but transformed even more fundamentally in the last 25 years, that is substantially discontinuous from centuries-old traditions. Those traditions displaced even older traditions inherited from antiquity. Such is the way of the world, I suppose, and with the benefit of Walter Ong’s insights, my appreciation of the outlines is taking better shape.

/rant on

The self-appointed Thought Police continue their rampage through the public sphere, campaigning to disallow certain thoughts and fence off unacceptable, unsanitary, unhygienic, unhealthy utterances lest they spread, infect, and distort their host thinkers. Entire histories are being purged from, well, history, to pretend they either never happened or will never happen again, because (doncha know?) attempting to squeeze disreputable thought out of existence can’t possibly result in those forbidden fruits blossoming elsewhere, in the shadows, in all their overripe color and sweetness. The restrictive impulse — policing free speech and free thought — is as old as it is stupid. For example, it’s found in the use of euphemisms that pretend to mask the true nature of all manner of unpleasantness, such as death, racial and national epithets, unsavory ideologies, etc. However, farting is still farting, and calling it “passing wind” does nothing to reduce its stink. Plus, we all fart, just like we all inevitably traffic in ideas from time to time that are unwholesome. Manners demand some discretion when broaching some topics, but the point is that one must learn how to handle such difficulty responsibly rather than attempting to drive it out of thought entirely, which simply doesn’t work. No one knows automatically how to navigate through these minefields.

Considering that the body and mind possess myriad inhibitory-excitatory mechanisms that push and/or pull (i.e., good/bad, on/off, native/alien), a wise person might recognize that both directions are needed to achieve balance. For instance, exposure to at least some hardship has the salutary effect of building character, whereas constant indulgence results in spoiled children (later, adults). Similarly, the biceps/triceps operate in tandem and opposition and need each other to function properly. However, most inhibitory-excitatory mechanisms aren’t nearly so binary as our language tends to imply but rather rely on an array of inputs. Sorting them all out is like trying to answer the nature/nurture question. Good luck with that.

Here’s a case in point: student and professional athletes in the U.S. are often prohibited from kneeling in dissent during the playing of the national anthem. The prohibition does nothing to ameliorate the roots of dissent but only suppresses its expression under narrow, temporary circumstances. Muzzling speech (ironically in the form of silent behavior) prior to sports contests may actually boomerang to inflame it. Some athletes knuckle under and accept the deal they’re offered (STFU! or lose your position — note the initialism used to hide the curse word) while others take principled stands (while kneeling, ha!) against others attempting to police thought. Some might argue that the setting demands good manners and restraint, while others argue that, by not stomping around the playing field carrying placards, gesticulating threateningly, or chanting slogans, restraint is being used. Opinions differ, obviously, and so the debate goes on. In a free society, that’s how it works. Societies with too many severe restrictions, often bordering on or going fully into fascism and totalitarianism, are intolerable to many of us fed current-day jingoism regarding democracy, freedom, and self-determination.

Many members of the U.S. Congress, sworn protectors of the U.S. Constitution, fundamentally misunderstand the First Amendment, or at least they conveniently pretend to. (I suspect it’s the former.) Here it is for reference:

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

Defending the First Amendment against infringement requires character and principles. What we have instead, both in Congress and in American society, are ideologues and authorities who want to make some categories flatly unthinkable and subject others to prosecution. Whistleblowing falls into the latter category. They are aided by the gradual erosion of educational achievement and the shift away from literacy to orality, which robs language of its richness and polysemy. If words are placed out of bounds, made unutterable (but not unthinkable), the very tools of thought and expression are removed. The thoughts themselves may be driven underground or reduced to incoherence, but that’s not a respectable goal. Only under the harshest conditions (Orwell depicted them) can specific thoughts be made truly unthinkable, which typically impoverishes and/or breaks the mind of the thinker or at least results in pro forma public assent while private dissent gets stuffed down. To balance and combat truly virulent notions, exposure and discussion are needed, not suppression. But because many public figures have swallowed a bizarre combination of incoherent notions and are advocating for them, the mood is shifting away from First Amendment protection. Even absolutists like me are forced to reconsider, as for example with this article. The very openness to consideration of contrary thinking may well be the vulnerability being exploited by crypto-fascists.

Calls to establish a Ministry of Truth have progressed beyond the Silicon Valley tech platforms’ arbitrary and (one presumes) algorithm-driven removal of huge swaths of user-created content to a new bill introduced in the Colorado State Senate to establish a Digital Communications Regulation commission (summary here). Maybe this first step toward hammering out a legislative response to surveillance capitalism will rein in the predatory behaviors of big tech. The cynic in me harbors doubts. Instead, resulting legislation is far more likely to be aimed at users of those platforms.

/rant off

So far, this multipart blog post has trafficked in principles and generalities. Let me try now to be more specific, starting with an excerpt from Barry Lynn’s article in Harper’s Magazine titled “The Big Tech Extortion Racket” (Sept. 2020):

… around the middle of the nineteenth century, Americans began to develop technologies that could not be broken into component pieces. This was especially true of the railroad and the telegraph … Such corporations [railroad and telegraph companies] posed one overarching challenge: they charged some people more than others to get to market. They exploited their control over an essential service in order to extort money, and sometimes political favors … Americans found the answer to this problem in common law. For centuries, the owners of ferries, stagecoaches, and inns had been required to serve all customers for the same price and in the order in which they arrived. In the late nineteenth century, versions of such “common carrier” rules were applied to the new middleman corporations.

Today we rightly celebrate the Sherman Antitrust Act of 1890, which gave Americans the power to break apart private corporations. But in many respects, the Interstate Commerce Act of 1887 was the more important document. This act was based on the understanding that monopoly networks like the railroad and the telegraph could be used to influence the actions of people who depend on them, and hence their power must be carefully restricted …

For a century and a half, Americans used common carrier policies to ensure the rule of law in activities that depended on privately held monopolies … regulations freed Americans to take full advantage of every important network technology introduced during these years, including telephones, water and electrical services, energy pipelines, and even large, logistics-powered retailers. Citizens did not have to worry that the men who controlled the technologies involved would exploit their middleman position to steal other people’s business or disrupt balances of power.

I appreciate that Barry Lynn brings up the Interstate Commerce Act. If this legal doctrine appeared in the net neutrality debate a few years ago, it must have escaped my notice. While Internet Service Providers (ISPs) enable network access and connectivity, those utilities have not yet exhibited let’s-be-evil characteristics. Similarly, phone companies (including cell phones) and public libraries may well be eavesdropping and/or monitoring activities of the citizenry, but the real action lies elsewhere, namely, on social media networks and with online retailers. Evil is arguably concentrated in the FANG (or FAANG) corporations but has now grown to be ubiquitous in all social networks (e.g., Twitter) operating as common carriers (Zoom? Slack?) and across academe, nearly all of which have succumbed to moral panic. They are interpreting correctly, sad to observe, demands to censor and sanitize others’ no-longer-free speech appearing on their networks or within their realms. How much deeper it goes toward shaping politics and social engineering is quasi-conspiratorial and impossible for me to assess.

Much as I would prefer to believe that individuals possess the good sense to shift their activities away from social networks or turn their attention from discomfiting information sources, that does not appear to be the case. Demands for trigger warnings and safe spaces commonplace a few years ago on college campuses have instead morphed into censorious removal, deplatforming, and cancellation from the entire public sphere. Those are wrong responses in free societies, but modern institutions and technologies have gotten out of hand and outstripped the limits of normal human cognition. In short, we’re a society gone mad. So rather than accept responsibility to sort out information overflow oneself, many are demanding that others do it for them, and evil private corporations are complying (after a fashion). Moreover, calls for creation of an Orwellian Ministry of Truth, rebranded as a Truth Commission and Reality Czar, could hardly be any more chillingly and fascistically bizarre. People really need someone to decide for them what is real? Has anyone at the New York Times actually read Orwell’s dystopian novel 1984 and taken to heart its lessons?

Considering the acceleration of practically everything in the late-modern world (postmodern refers to something quite different), which makes planning one’s higher education somewhat fraught if the subject matter studied is rendered flatly out-of-date or moribund by the time of either graduation or entry into the workforce, I’ve heard it recommended that expertise in any particular subject area may be less important than developing expertise in at least one subject that takes a systems approach. That system might be language and communications, mathematics (or any other hard science), history, economics and finance, business administration, computer coding, law and governance, etc. So long as a rigorous understanding of procedures and rules is developed, a structuralist mindset can be repeated and transferred into other subject areas. Be careful, however, not to conflate this approach with a liberal arts education, which is sometimes described as learning how to learn and is widely applicable across disciplines. The liberal arts have fallen distinctly out of favor in the highly technological and technocratic world, which cares little for human values resistant to quantification. Problem is, Western societies in particular are based on liberal democratic institutions now straining due to their sclerotic old age. And because a liberal arts education is scarcely undertaken anymore, civics and citizenship are no longer taught. Even the study of English has now been corrupted (postmodern does apply here) to the point that the basic liberal arts skill of critical thinking is being lost through attrition. Nowhere is that more abundantly clear than in bristling debate over free speech and censorship.

Aside. Although society tinkers and refines itself (sometimes declines) over time, a great body of cultural inheritance informs how things are done properly within an ideology or system. When tinkering and refinement become outright intransigence and defiance of an established order, it’s commonplace to hear the objection “but that’s not how _______ works.” For instance, debate over climate science or the utility of vaccines often has one party proclaiming “trust [or believe] the science.” However, that’s not how science works (i.e., through unquestioning trust or belief). The scientific method properly understood includes verification, falsification, and revision when results and assertions fail to establish reasonable certainty (not the same as consensus). Similarly, critical thinking includes a robust falsification check before “facts” can be accepted at face value. So-called “critical studies” (a/k/a grievance studies), like religious faith, typically position bald assertions beyond the reach of falsification. Well, sorry, that’s not how critical thinking works.

Being older and educated before critical studies were fully legitimized (or gave rise to things as risible as feminist glaciology), my understanding has always been that free speech and other rights are absolutes that cannot be sliced and diced into bits. That way lies casuistry, where law founders frequently. Thus, if one wishes, say, to trample or burn the U.S. flag in protest, no law can be passed or constitutional amendment enacted to carve out an exception disallowing that instance of dissenting free speech. A lesser example is kneeling silently rather than participating in singing the national anthem before a sporting event. Though offensive to certain individuals’ sensibilities, silencing speech is far worse according to liberal democratic values. Whatever our ideological or political differences are, we cannot work them out when one party has the power to place topics out of bounds or remove others from discussion entirely. The point at which spirited debate crosses over into inciting violence or fomenting insurrection is a large gray area, which is the subject of the second impeachment of 45. Civil law covers such contingencies, so abridging free speech, deplatforming, and adopting the formulation “language is violence” are highly improper responses under the liberal form of government codified in the U.S. Constitution, which includes the Bill of Rights originally omitted from the U.S. Constitution but quickly added to articulate the rights fully.

Liberal democratic ideology arose in mercantile, substantially agrarian Western societies before scientific, industrial, and capitalist revolutions built a full head of steam, so to speak. Considering just how much America has developed since the Colonial Period, it’s no surprise society has outgrown its own founding documents. More pointedly, the intellectual commons was a much smaller environment, often restricted to a soapbox in the town square and the availability of books, periodicals, and broadsides. Today, the public square has moved online to a bewildering array of social media platforms that enable publication of one’s ideas well beyond the sound of one’s voice over a crowd or the bottleneck of a publisher’s printing press. It’s an entirely new development, and civil law has not kept pace. Whether Internet communications are regulated like the airwaves or nationalized like the U.S. military, it’s clear that the Wild West uber-democratic approach (where anyone can basically say anything) has failed. Demands for regulation (restrictions on free speech) are being taken seriously and acted upon by the private corporations that run social media platforms. During this interim phase, it’s easy for me, as a subscriber to liberal democratic values, to insist reflexively on free speech absolutism. The apparent mood of the public lies elsewhere.

Evil exists in the world. History and current events both bear this out amply. Pseudo-philosophers might argue that, like emotions and other immaterial sensations, good and evil are merely reified concepts, meaning they are human constructs with no palpable external reality. Go tell that to victims of evildoers. Human suffering can’t be anonymized, rationalized, or philosophized away quite so handily.

It was sort of refreshing, back in the day, when Google’s motto and/or corporate code of conduct was simple: “Don’t Be Evil.” It acknowledged the potential for being or becoming evil (like any of the Bigs: Big Tobacco, Big Soda, Big Oil, Big Pharma, Big Media, Big Agriculture, etc.) and presumably aspired to resist obvious temptations. That was then (from 2000 to 2018), this is now (2021 until death take us — soon enough, I fear). But like all entities possessed of absurd levels of wealth and power, Google (now reorganized as a subsidiary of Alphabet, but who actually refers to it that way?) and its Silicon Valley brethren have succumbed to temptation and become straight-up evil.

One might charitably assess this development as something unbidden, unanticipated, and unexpected, but that’s no excuse, really. I certainly don’t envy celebrity executives experiencing difficulty resulting from having created unmanageable behemoths loosed on a public and polity unable to recognize beastly fangs until already clamped on their necks. As often occurs, dystopian extrapolations are explored in fiction, sometimes satirically. The dénouement of the HBO show Silicon Valley depicts tech mogul wannabes succeeding in creating an AI (or merely a sophisticated algorithm? doesn’t matter …) that would in time become far too powerful in blind execution of its inner imperative. In the show, characters recognize what they have done and kill their own project rather than allow it to destroy the world. In reality, multiple developers of computer tech platforms (and their embedded dynamics, including the wildly unhelpful albeit accurate term algorithm) lacked the foresight to anticipate awful downstream effects of their brainchildren. Yet now that those effects are manifesting recognizably, these corporations continue to operate and wreak havoc.

Silicon Valley shows an extended software development period of bungling ineptitude punctuated by brilliant though momentary breakthroughs. Characters are smart, flawed people laughably unable to get out of the way of their own success. The pièce de résistance was yoking one so-called “learning machine” to another and initiating what would become a runaway doomsday process (either like ecological collapse, building slowly then making the biosphere uninhabitable all at once, or like the gray goo problem, progressively “processing” biomass at the molecular level until all that remains is lifeless goo). It was a final act of bumbling that demanded the characters’ principled, ethical response before the window of opportunity closed. Real Silicon Valley tech platforms are in the (ongoing) process of rending the social fabric, which is no laughing matter. The issue du jour surrounds free speech and its inverse, censorship. More broadly, real Silicon Valley succeeded in gaming human psychology for profit in at least two aspects (could be more as yet unrecognized): (1) mining behavioral data as an exploitable resource, and (2) delivering inexhaustible streams of extremely divisive content (not its own) to drive persistent engagement with its platforms. Yoked together, they operate to drive society mad, and yet, mounting evidence of this development has not produced even an inkling that maybe the damned doomsday devices ought to be shut off. As with the environment, we operate with freedom enough to destroy ourselves. Instead, politicians issue stunningly ineffectual calls for regulation or break-up of monopolies. In the meantime, ever more absurd wealth and power are concentrated in the hands of a few executives who have clearly punted and decided “let’s be evil.” No restraints on their behavioral experimentation across whole societies exist.

Much more to say on this topic in additional parts to come.

I simply can’t keep up with all the reading, viewing, and listening in my queue. Waking hours are too few, and concentration dissipates long before sleep overtakes. Accordingly, it’s much easier to settle into couch-potato mode and watch some mindless drivel, such as the Netflix hit Bridgerton binged in two sittings. (Unlike cinema critics, I’m not bothered especially by continuity errors, plot holes, clunky dialogue, weak character motivation, gaps of logic, or glossy decadence of the fictional worlds. I am bothered by the Kafka trap sprung on anyone who notices casting decisions that defy time and place — an ill-advised but now commonplace historical revisionism like editing Mark Twain.) As a result, blog posts are less frequent than they might perhaps be as I pronounce upon American (or more broadly, Western) culture, trying vainly to absorb it as a continuously moving target. Calls to mind the phrase Après moi, le déluge, except that there is no need to wait. A deluge of entertainment, news, analysis, punditry, and trolling has buried everyone already. So rather than the more careful consideration I prefer to post, here are some hot takes.

The Irregular Aphorist. Caitlin Johnstone offers many trenchant observations in the form of aphorisms (some of which I’ve quoted before), all gathered under the subtitle Notes From The Edge Of The Narrative Matrix. The modifier irregular only means that aphorisms are a regular but not constant feature. Her site doesn’t have a tag to that effect but probably ought to. Here’s one in particular that caught my attention:

Everything our species has tried has led us to a dying world and a society that is stark raving mad, so nobody is in any position to tell you that you are wrong.

Twin truths here are (1) the dying world and (2) societal madness, both of which I’ve been describing for some time. Glad when others recognize them, too.

Piling on. Though few still are willing to admit it, nonpharmaceutical interventions (NPIs, e.g., distancing, masks, and lockdowns) to stall or reduce the spread of the virus failed to achieve their objectives according to this study. Instead, NPIs piled on suffering no one could forestall. I read somewhere (no link) that the world is approaching half the total, cumulative deaths/infections that were predicted had nothing been done to impede the pandemic running its course. Adding in deaths of despair (numbers not entirely up to date), we’re using the wrong tools to fight the wrong battle. Of course, interventions opened up giant opportunities for power grabs and vulture capitalism, so the cynic in me shrugs and wonders half aloud “what did you expect, really?”

Growth of the Managerial Bureaucracy. A blog called Easily Distracted by Timothy Burke (never on my blogroll) publishes only a few times per year, but his analysis is terrific — at least when it doesn’t wind up being overlong and inconclusive. Since a student debt jubilee is back in the news (plenty of arguments pro and con), unintended consequences are anticipated in this quote:

When you set out to create elaborate tiers that segregate the deserving poor from the comfortable middle-class and the truly wealthy, you create a system that requires a massive bureaucracy to administer and a process that forces people into petitionary humiliation in order to verify their eligibility. You create byzantine cutoff points that become business opportunities for predatory rentiers.

Something similar may well be occurring with stimulus checks being issued pro rata (has anyone actually gotten one?), but at least we’re spared any petitionary humiliations. We get whatever the algorithms (byzantine cutoff points) dictate. How those funds will be gamed and attached is not yet clear. Stay alert.

No Defense of Free Speech. Alan Jacobs often recommends deleting, unsubscribing, and/or ignoring social media accounts (after his own long love-hate relationship with them) considering how they have become wholly toxic to a balanced psyche as well as principal enablers of surveillance capitalism and narrative control. However, in an article about the manorial elite, he’s completely lost the plot, missing that absolutism is required in defense of free speech. It’s not sufficient to be blasé or even relieved when 45 is kicked off Twitter permanently or when multiple parties conspire to kill Parler. Establishing your own turf beyond the reach of Silicon Valley censors is a nice idea but frankly impractical. Isn’t that what whoever ran Parler (or posted there) must have thought? And besides, fencing off the digital commons these very entities created has catapulted them into the unenviable position of undemocratic, unelected wielders of monopolistic power and co-conspirators to boot. That’s what needs to be curtailed, not free speech.

The Taxonomic Apocalypse. Although drawn from fiction and thus largely hypothetical, a new book (coming late 2021) by Adam Roberts called It’s the End of the World: But What Are We Really Afraid Of? surveys doomsday stories and categorizes different versions of how it all ends. Alan Jacobs (yeah, him again — must have an advance copy of the manuscript) recommends it as “a delightful and provocative little book” but fails to grok two things: (1) these stories are rehearsals-cum-preparations for the real thing, and (2) the real thing really is bearing down on us implacably and so is no longer a mere hypothetical to contemplate and categorize for shits and grins. Despite acceptance of the eventualities that await all of us, reading Roberts’ taxonomy is not something I would expect to find delightful. Skip.

Narrative Collapse. Ran Prieur (no link) sometimes makes statements revealing an unexpected god’s-eye view:

[45] is a mean rich kid who figured out that if he does a good Archie Bunker impression, every lost soul with an authoritarian father will think he’s the messiah. We’re lucky that he cares only about himself, instead of having some crazy utopian agenda. But the power, and the agency, is with the disaffected citizens of a declining empire, tasting barbarism.

This is all about people wanting to be part of a group that’s part of a story. Lately, some of the big group-stories have been dying: sky father religion, American supremacy, the conquest of nature, the virtue of wealth-seeking. In their place, young and clumsy group-stories struggle and rise.

Collapse of certain fundamental stories that animate our thinking is at the core of The Spiral Staircase (see About Brutus at top), though it’s often couched in terms of consciousness in transition. Getting through the transition (only temporarily, see previous item in list) probably means completion of the Counter-Enlightenment historical arc, which necessarily includes further descent into barbarism.

Hail Mary for Individualism. I always take special notice when someone cites Allan Bloom. Alan Jacobs (um, yeah, he’s prolific and I’m using his ideas again — sue me) cites Bloom to argue that individualism or the sovereign self, a product of the Enlightenment, is already dead. No doubt, the thought-world described so ably by Bloom no longer exists, but individualism has not yet died out by attrition or been fully dissolved in nonduality. Many of us born before the advent of the Internet retain selfhood and authenticity not yet coopted by or incorporated into mass mind. Moreover, ongoing struggles over identity (e.g., gender, sexual orientation, and race that are often used improperly to define the self) result from an inchoate sense that individualism is eroding precipitously, not that it’s already passé. Defiant attempts to (re)establish an authentic self (contravening all logic and becoming critical theory of one sort or another) in the face of this loss may well be a last-ditch effort to save the self, but it’s failing.

Caveat: rather overlong for me, but I got rolling …

One of the better articles I’ve read about the pandemic is this one by Robert Skidelsky at Project Syndicate (a publication I’ve never heard of before). It reads as only slightly conspiratorial, purporting to reveal the true motivation for lockdowns and social distancing, namely, so-called herd immunity. If that’s the case, it’s basically a silent admission that no cure, vaccine, or inoculation is forthcoming and the spread of the virus can only be managed modestly until it has essentially raced through the population. Of course, the virus cannot be allowed to simply run its course unimpeded, but available impediments are limited. “Flattening the curve,” or distributing the infection and death rates over time, is the only attainable strategy and objective.

Wedding mathematical and biological insights, as well as the law of mass action in chemistry, into an epidemic model may seem obvious now, but it was novel roughly a century ago. We’re also now inclined, if scientifically oriented and informed, to understand the problem and the management of its potential solutions in terms of engineering rather than medicine (or maybe in terms of triage and palliation). Global response has also made the pandemic into a political issue as governments obfuscate and conceal true motivations behind their handling (bumbling in the U.S.) of the pandemic. Curiously, the article also mentions financial contagion, which is shaping up to be worse in both severity and duration than the viral pandemic itself.
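The century-old model alluded to above is the classic SIR (susceptible-infected-recovered) model, which borrows the mass-action idea: new infections are proportional to the product of susceptible and infected fractions. A minimal sketch (hypothetical parameter values, not drawn from the article) shows how lowering the transmission rate “flattens the curve,” trading a high early peak for a lower, later one:

```python
# Minimal SIR epidemic sketch using simple Euler integration.
# Parameters (beta, gamma, initial fractions) are illustrative only.

def sir_peak_infected(beta, gamma=0.1, days=500, dt=1.0):
    """Return the peak infected fraction for transmission rate beta
    and recovery rate gamma, over the given simulation horizon."""
    s, i, r = 0.999, 0.001, 0.0   # susceptible, infected, recovered fractions
    peak = i
    steps = int(days / dt)
    for _ in range(steps):
        new_infections = beta * s * i * dt   # mass-action contact term
        recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - recoveries
        r += recoveries
        peak = max(peak, i)
    return peak

unmitigated = sir_peak_infected(beta=0.4)   # no interventions
mitigated = sir_peak_infected(beta=0.2)     # NPIs halve the contact rate

print(f"peak infected, unmitigated: {unmitigated:.3f}")
print(f"peak infected, mitigated:   {mitigated:.3f}")
```

With these assumed numbers, the mitigated run produces a markedly lower peak infected fraction: the same epidemic spread out over more time, which is the whole point of “flattening the curve” when hospital capacity, not total infections, is the binding constraint.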
