Archive for the ‘Politics’ Category

Much ado over nothing was made this past week regarding a technical glitch (or control room error) during the first of two televised Democratic presidential debates where one pair of moderators’ mics was accidentally left on and extraneous, unintended speech leaked into the broadcast. It distracted the other pair of moderators enough to cause a modest procedural disruption. Big deal. This was not the modal case of a hot mic where someone, e.g., a politician, swears (a big no-no despite the shock value being almost completely erased in today’s media landscape) or accidentally reveals callous attitudes (or worse) thinking that no one important was listening or recording. Hot mics in the past have led to public outrage and criminal investigations. One recent example that still sticks in everyone’s craw was a novice political candidate who revealed he could use his fame and impudent nerve to “grab ’em by the pussy.” Turned out not to be the career killer everyone thought it would be.

The latest minor furor over a hot mic got me thinking, however, about inadvertent revelation of matters of genuine public interest. Three genres spring to mind: documentary films, whistle-blowing, and investigative journalism, the last including category outliers such as Wikileaks. Whereas a gaffe on a hot mic usually means the leaker/speaker exposes him- or herself and thus has no one else to blame, disclosures occurring in the other three categories are often against the will of those exposed. It’s obviously in the public interest to know about corruption, misbehavior, and malfeasance in corporate and political life, but the manner in which such information is made public is controversial. Those who expose others suffer harassment and persecution. Documentarians probably fare the best with respect to being left alone following release of information. Michael Moore, for all his absurd though entertaining theatrics, is free (so far as I know) to go about his business and do as he pleases. However, gestures to protect whistle-blowers are just that: gestures. Those who have leaked classified government information in particular, because they gained access to such information through security clearances and signed nondisclosure agreements (before knowing what secrets they were obliged to keep, which is frankly the way such obligations work), are especially prone to reprisal and prosecution. Such information is literally not theirs to disclose, but when keeping others’ secrets is heinous enough, some people feel their conscience and moral duty outweigh job security and other risks involved. Opinions vary, sometimes passionately. And now even journalists who uncover or merely come into possession of evidence of wrongdoing and later publish it — again, decidedly in the public interest — are subject to (malicious?) prosecution. Julian Assange is the current test case.

The free speech aspect of revealing someone else’s amoral and criminal acts is a fraught argument. However, it’s clear that as soon as damaging information comes to light, focus shifts away from the acts and their perpetrators to those who publish the information. Shifting the focus is a miserable yet well-established precedent by now, the result being that most folks who might consider coming forward to speak up now keep things to themselves rather than suffer entirely foreseeable consequences. In that light, when someone comes forward anyway, knowing that he or she will be hounded, vilified, arrested, and worse, that person deserves more respect for courage and self-sacrifice than is generally shown in the aftermath of disclosure. The flip side — condemnation, prosecution, and death threats — is already abundant in the public sphere.

Some time after reports of torture at Guantánamo, Abu Ghraib, and Bagram went public, a handful of low-level servicemen (“bad apples” used to deflect attention down the command hierarchy) were prosecuted, but high-level officials (e.g., former U.S. presidents Bush and Obama, anyone in their respective administrations, and commanding officers on site) were essentially immunized from prosecution. That example is not quite the same as going after truth-tellers, but it’s a rather egregious instance of bad actors going unprosecuted. I’m still incensed by it. And that’s why I’m blogging about the hot mic. Lots of awful things go on behind the scenes without public knowledge or sanction. Those who commit high crimes (including war crimes) clearly know what they’re doing is wrong. Claims of national security are often invoked and gag laws are legislated into existence on behalf of private industry. When leaks do inevitably occur, those accused immediately attack the accuser, often with the aid of others in the media. Denials may also be issued (sometimes not — why bother?), but most bad actors hide successfully behind the deflecting shift of focus. When will those acting in the shadows against the public interest and in defiance of domestic and international law ever be brought to justice? I daresay the soul of the nation is at stake, and as long as officialdom escapes all but temporary public relations problems to be spun, the pride everyone wants to take as Americans eludes us. In the meantime, there’s a lot to answer for, and it keeps piling up.

I put aside Harari’s book from the previous blog post in favor of Pankaj Mishra’s Age of Anger: A History of the Present (2017). Mishra’s sharp cultural criticism is far more convincing than Harari’s Panglossian perspective. Perhaps some of that is due to an inescapable pessimism in my own character. Either way, I’ve found the first 35 pages dense with observations of interest to me as a blogger and armchair cultural critic. Some while back, I published a post attempting to delineate (not very well, probably) what’s missing in the modern world despite its obvious material abundance. Reinforcing my own contentions, Mishra’s thesis (as I understand it so far) is this: we today share with others in the post-Enlightenment era an array of resentments and hatreds (Fr.: ressentiment) aimed incorrectly at scapegoats for the political and social failure to deliver the promises of progressive modernity equitably. For instance, Mishra describes

… flamboyant secular radicals in the nineteenth and early twentieth centuries: the aesthetes who glorified war, misogyny and pyromania; the nationalists who accused Jews and liberals of rootless cosmopolitanism and celebrated irrational violence; and the nihilists, anarchists and terrorists who flourished in almost every continent against a background of cosy political-financial alliances, devastating economic crises and obscene inequalities. [pp. 10–11]

Compare and/or contrast his assessment of the recent past:

Beginning in the 1990s, a democratic revolution of aspiration … swept across the world, sparking longings for wealth, status and power, in addition to ordinary desires for stability and contentment, in the most unpromising circumstances. Egalitarian ambition broke free of old social hierarchies … The culture of [frantic] individualism went universal … The crises of recent years have uncovered an extensive failure to realize the ideals of endless economic expansion and private wealth creation. Most newly created ‘individuals’ toil within poorly imagined social and political communities and/or states with weakening sovereignty … individuals with very different pasts find themselves herded by capitalism and technology into a common present, where grossly unequal distributions of wealth and power have created humiliating new hierarchies. This proximity … is rendered more claustrophobic by digital communications … [S]hocks of modernity were once absorbed by inherited social structures of family and community, and the state’s welfare cushions [something mentioned here, too]. Today’s individuals are directly exposed to them in an age of accelerating competition on uneven playing fields, where it is easy to feel that there is no such thing as either society or state, and that there is only a war of all against all. [pp. 12–14]

These long quotes (the second one cut together from longer paragraphs) are here because Mishra is remarkably eloquent in his diagnosis of globalized culture. Although I’ve only read the prologue, I expect to find support for my long-held contention that disorienting disruptions of modernity (using Anthony Giddens’ sociological definition rather than the modish use of the term Postmodern to describe only the last few decades) create unique and formidable challenges to the formation of healthy self-image and personhood. Foremost among these challenges is an unexpectedly oppressive information environment: the world forced into full view, inciting comparison, jealousy, envy, and hatred stemming from routine and ubiquitous frustrations and humiliations as we each struggle to get our personal share of attention, renown, and reward.

Another reason Mishra provides for our collective anger is a deep human yearning not for anarchism or radical freedom but rather for belonging and absorption within a meaningful social context. This reminds me of Erich Fromm’s book Escape from Freedom (1941), which I read long ago but can’t remember so well anymore. I do remember quite vividly how counter-intuitive was the suggestion that absolute freedom is actually burdensome as distinguished from the usual programming we get about breaking free of all restraints. (Freedom! Liberty!) Indeed, Mishra provides a snapshot of multiple cultural and intellectual movements from the past two centuries where abandoning oneself to a cause, any cause, was preferable to the boredom and nothingness of everyday life absent purpose other than mere existence. The modern substitute for larger purpose — commodity culture — is a mere shadow of better ways of spending one’s life. Maybe commodity culture is better than sacrificing one’s life fighting wars (a common fate) or destroying others, but that’s a much longer, more difficult argument.

More to follow as my reading progresses.

“Come with me if you want to live.” That’s among the quotable lines from the latest movie in the Terminator franchise, though it’s not nearly so succinct or iconic as “I’ll be back” from the first Terminator. Whereas the latter has the quality (in hindsight) of slow, implacable inevitability (considering the Terminator is literally a death-bringer), the former occurs within the context of a character having only just traveled back in time, not yet adequately reoriented, and forced to make a snap decision under duress. “I’ll be back” might be easy to brush off as harmless (temporary denial) since the threat recedes — except that it doesn’t, it’s merely delayed. “Come with me …” demands a leap of faith (or trust) because the danger is very real at that instant.

Which quote, I must ask, better characterizes the threat of climate change? My answer: both, but at different times. Three to four decades ago, it was the “I’ll be back” type: building slowly but inevitable given the underlying structure of industrial civilization. That structure was known even then by a narrow circle of experts (e.g., engineers for Big Oil and at the Dept. of Energy) to be a heat engine, meaning that we would ultimately cook our own goose by warming the planet, altering the climatic steady state under which our most recent civilization has flourished and producing a steady loss of biodiversity and biomass until our own human habitat (the entirety of the planet by now) becomes a hostile environment unable (unwilling if one anthropomorphizes Mother Nature) to support our swollen population. All that was if we stayed on course and took no corrective action. Despite foreknowledge and ample warning, that’s precisely what occurred (and continues today).

With the Intergovernmental Panel on Climate Change (IPCC) in particular, the threat has for roughly a decade shifted over to “Come with me ….” It’s no longer possible to put things off, yet we continue to dither well beyond the tipping point where/when we can still save ourselves from self-annihilation. Although scientists have been gathering data and evidence, forming an overwhelming consensus, and sounding the alarm, scientific illiteracy, realpolitik, journalistic malpractice, and corporate greed have all conspired to grant an illusion of time to react that we simply don’t have anymore (and truth be told, probably didn’t as of the early 1980s).

I’m aware of at least three journalists (relying on the far more authoritative work of scientific consensus) who have embraced the message: Dahr Jamail, Thom Hartmann, and David Wallace-Wells. None to my knowledge has been able to bring himself to admit that humanity is now a collection of dead men walking. They can’t muster the courage to give up hope (or to report truthfully), clinging to the possibility we may still have a fleeting chance to avert disaster. I heard Ralph Nader on his webcast say something to the same effect, namely, what good is it to rob others of hope? My personal values adhere to unstinting truth rather than illusion or self-deception, so I subscribe to Guy McPherson’s assessment that we face near-term human extinction (precise date unknown but soon if, for example, this is the year we get a blue ocean event). Simply put, McPherson is professor emeritus of natural resources and ecology and evolutionary biology at the University of Arizona [note my emphasis]. I trust his scholarship (summarizing the work of other scientists and drawing necessary though unpalatable conclusions) more than I trust journalistic shaping of the story for public consumption.

The obvious metaphor for what we face is a terminal medical diagnosis, or if one has hope, perhaps a death sentence about to be carried out but with the possibility of a last-minute stay of execution via phone call from the governor. Opinions vary as to whether one should hope/resist up to the final moment or make peace with one’s fate. By not telling the truth — using the “I’ll be back” characterization when it’s really “Come with me …” — the MSM has, I daresay, denied the public the second option. Various authors on the Web offer a better approximation of the truth (such as it can be known) and form a loose doomer network (a/k/a collapsniks). This blog is an (admittedly tiny) part of that doomersphere, which gives me no pleasure.

Throughout human history, the question “who should rule?” has been answered myriad ways. The most enduring answer is simple: he who can muster and deploy the most force of arms and then maintain control over those forces. Genghis Khan is probably the most outrageously successful example and is regarded by the West as a barbarian. Only slightly removed from barbarians is the so-called Big Man, who perhaps adds a layer of diplomacy by running a protection racket while selectively providing and distributing spoils. As societies move further away from subsistence and immediacy, various absolute rulers are established, often through hereditary title. Call them Caesar, chief, dear leader, emir, emperor (or empress), kaiser, king (or queen), pharaoh, premier, el presidente, sultan, suzerain, or tsar, they typically acquire power through the accident of birth and are dynastic. Some are female but most are male, and they typically extract tribute and sometimes demand loyalty oaths.

Post-Enlightenment, rulers are frequently democratically elected administrators (e.g., legislators, technocrats, autocrats, plutocrats, kleptocrats, and former military) ideally meant to be representative of common folks. In the U.S., members of Congress (and of course the President) are almost wholly drawn from the ranks of the wealthy (insufficient wealth being a de facto bar to office) and are accordingly estranged from the many different ways most of us experience American life. Below the top level of visible, elected leaders is a large, hidden apparatus of high-level bureaucratic functionaries (often appointees), the so-called Deep State, which is relatively stable and made up primarily of well-educated, white-collar careerists whose ambitions for themselves and the country are often at odds with those of the citizenry.

I began to think about this in response to a rather irrational reply to an observation I made here. Actually, it wasn’t even originally my observation but that of Thomas Frank, namely, that the Deep State is largely made up of the liberal professional class. The reply reinforced the notion: who better to rule than the “pros”? History makes the alternatives unthinkable. Thus, the Deep State’s response to the veritable one-man barbarian invasion of the Oval Office has been to seek removal of the interloper by hook or by crook. (High office in this case was won unexpectedly and with unnamed precedent by rhetorical force — base populism — rather than by military coup, making the current occupant a quasi-cult leader; similarly, extracted tribute is merely gawking attention rather than riches.)

History also reveals that all forms of political organization suffer endemic incompetence and corruption, lending truth to Winston Churchill’s witticism “Democracy is the worst form of government, except for all the others.” Indeed, recent rule by technocrats has been supremely awful, leading to periodic market crashes, extreme wealth inequality, social stigmatization, and forever wars. Life under such rule is arguably better than under various other political styles; after all, we gots our vaunted freedoms and enviable material comforts. But the exercise of those freedoms does not reliably deliver either the ontological security or the psychological certainty we humans crave. In truth, our current form of self-governance has let nothing get in the way of plundering the planet for short-term profit. That ongoing priority is making Earth uninhabitable not just for other species but for humans, too. In light of this fact, liberal technocratic democracy could be a far worse failure than most: it will have killed billions (an inevitability now, though delayed in effect).

Two new grassroots movements (to my knowledge) have appeared that openly question who should rule: the Sunrise Movement (SM) and the Extinction Rebellion (ER). SM is a youth political movement in the U.S. that acknowledges climate change and supports the Green New Deal as a way of prioritizing the desperate existential threat modern politics and society have become. For now at least, SM appears to be content with working within the system, replacing incumbents with candidates it supports. More intensely, ER is a global movement centered in the U.K. that also acknowledges that familiar modern forms of social and political organization (there are several) no longer function but in fact threaten all of us with, well, extinction. One of its unique demands is that legislatures be drawn via sortition from the general population to be more representative of the people. Further, sortition avoids the established pattern whereby those elected to lead representative governments are corrupted by the very process of seeking and attaining office.

I surmise attrition and/or replacement (the SM path) are too slow and leave candidates vulnerable to corruption. In addition, since no one relinquishes power willingly, current leaders will have to be forced out via open rebellion (the ER path). I’m willing to entertain either path but must sadly conclude that both are too little, too late to address climate change and near-term extinction effectively. Though difficult to establish convincingly, I suspect the time to act was in the 1970s (or even before) when the Ecology Movement arose in recognition that we cannot continue to despoil our own habitat without consequence. That message (social, political, economic, and scientific all at once) was as inert then as it is now. However, fatalism acknowledged, some other path forward is better than our current systems of rule.

I observed way back here that it was no longer a thing to have a black man portray the U.S. president in film. Such casting might draw a modest bit of attention, but it no longer raises a particularly arched eyebrow. These depictions in cinema were only slightly ahead of the actuality of the first black president. Moreover, we’ve gotten used to female heads of state elsewhere, and we now have in the U.S. a burgeoning field of presidential wannabes from all sorts of diverse backgrounds. Near as I can tell, no one really cares anymore that a candidate is a woman, or black, or a black woman. (Could be that I’m isolated and/or misreading the issue.)

In Chicago where I live, the recent mayoral election offered a choice among some ten candidates to succeed the current mayor, who is not seeking reelection. None of them got the required majority of votes, so a runoff between the top two candidates is about to occur. Both also happen to be black women. Although my exposure to the mainstream media and all the talking heads offering analysis is limited, I’ve yet to hear anyone remark disparagingly that Chicago will soon have its first black female mayor. This is as it should be: the field is open to all comers and no one can (or should) claim advantage or disadvantage based on identitarian politics.

Admittedly, extremists on both ends of the bogus left/right political spectrum still pay quite a lot of attention to identifiers. Academia in particular is currently destroying itself with bizarre claims and demands for equity — a nebulous doctrine that divides rather than unites people. Further, some conservatives can’t yet countenance a black, female, gay, atheist, or <insert other> politician, especially in the big chair. I’m nonetheless pleased to see that irrelevant markers matter less and less to many voters. Perhaps it’s a transition by sheer attrition and will take more time, but the current Zeitgeist outside of academia bodes well.

Everyone is familiar with the convention in entertainment media where characters speak without the use of recognizable language. (Not really related to the convention of talking animals.) The first instance I can recall (someone correct me if earlier examples are to be found) is the happy-go-lucky bird Woodstock from the old Peanuts cartoons (do kids still recognize that cast of characters?), whose dialogue was shown graphically as a series of vertical lines.

When the cartoon made its way onto TV for holiday specials, its creator Charles Schulz used the same convention to depict adults, never shown onscreen but with dialogue voiced by a Harmon-muted trombone. Roughly a decade later, two characters from the Star Wars franchise “spoke” in languages only other Star Wars characters could understand, namely, Chewbacca (Chewie) and R2-D2. More recently, the character Groot from Guardians of the Galaxy (known to me only through the Marvel movie franchise, not through comic books) speaks only one line of dialogue, “I am Groot,” which is understood as full speech by other Guardians characters. When behemoths larger than a school bus (King Kong, Godzilla, Jurassic dinosaurs, Cloverfield, Kaiju, etc.) appear, the characters are typically denied the power of speech beyond the equivalent of a lion’s roar. (True villains talk little or not at all as they go about their machinations — no monologuing! unless it’s a James Bond film. An exception notable for its failure to charm audiences is Ultron, who wouldn’t STFU. You can decide for yourself which is the worse kind of villainy.)

This convention works well enough for storytelling and has the advantage of allowing the reader/viewer to project onto otherwise blank speech. However, when imported into the real world, especially in politics, the convention founders. There is no Babel fish universal translator inserted in the ear to transform nonsense into coherence. The obvious example of babblespeech is 45, whose speech when off the teleprompter is a series of rambling non sequiturs, free associations, slogans, and sales pitches. Transcripts of anyone’s extemporaneous speech reveal lots of restarts and blind alleys; we all interrupt ourselves to redirect. However, the word salad that substitutes for meaningful content in 45’s case is tragicomic: alternately frustrating and comically entertaining, depending on one’s objective. Satirical news shows fall into the second category.

45 is certainly not the first. Sarah Palin in her time as a media darling (driver of ratings and butt of jokes — sound familiar?) had a knack for crazy speech combinations that were utter horseshit yet oddly effective for some credulous voters. She was even a hero to some (nearly a heartbeat away from being the very first PILF). We’ve also now been treated to a series of public interrogations where a candidate for a cabinet post or an accused criminal offers testimony before a congressional panel. Secretary of Education Betsy DeVos famously evaded simple yes/no questions during her confirmation hearing, and Supreme Court Justice Brett Kavanaugh similarly refused to provide direct answers to direct questions. Unexpectedly, sacrificial lamb Michael Cohen did give direct answers to many questions, but his interlocutors then didn’t quite know how to respond, given their experience and expectation that no one answers appropriately.

What all this demonstrates is that there is often a wide gulf between what is said and what is heard. In the absence of what might be understood as effective communication (honest, truthful, and forthright), audiences and voters fill in the blanks. Ironically, we also can’t handle hearing too much truth when confronted by its awfulness. None of this is a problem in storytelling, but when found in political narratives, it’s emblematic of how dysfunctional our communications have become, and with them, the clear thought and principled activity of governance.

First, a bit of history. The U.S. Constitution was ratified in 1788 and superseded the Articles of Confederation. The first ten Amendments, ratified in 1791 (rather quickly after the initial drafting and adoption of the main document — oops, forgot these obvious assumptions), are known as the Bill of Rights. The final amendment to date, the 27th Amendment, though proposed in 1789 along with others, was not ratified until 1992. A half dozen additional amendments approved by Congress have not yet been ratified, and a large number of other unapproved amendments have been proposed.

The received wisdom is that, by virtue of its lengthy service as the supreme law of the land, the U.S. Constitution has become sacrosanct and invulnerable to significant criticism and further amendment. That wisdom has begun to be questioned actively as a result of (at least) two factors: (1) recognition that the Federal government serves the common good and citizenry rather poorly, having become corrupt and dysfunctional, and (2) the Electoral College, an anachronism from the Revolutionary Era that skews voting power away from cities, handed two recent presidential elections to candidates who failed to win the popular vote yet won in the Electoral College. For a numerical analysis of how electoral politics is gamed to subvert public opinion, resulting in more government seats held by Republicans than voting (expressing the will of the people) would indicate, see this article by the Brookings Institution.

These are issues of political philosophy and ongoing public debate, spurred by dissatisfaction over periodic Federal shutdowns, power struggles between the executive and legislative branches that are tantamount to holding each other hostage, and income inequality that pools wealth and power in the hands of ever fewer people. The judicial branch (especially the U.S. Supreme Court) is also a significant point of contention; its newly appointed members are increasingly right wing but have not (yet) taken openly activist roles (e.g., reversing Roe v. Wade). As philosophy, questioning the wisdom of the U.S. Constitution requires considerable knowledge of history and comparative government to undertake with equanimity (as opposed to emotionalism). I don’t possess such expert knowledge but will observe that the U.S. is an outlier among nations in relying on a centuries-old constitution, which may not have been the expectation or intent of the drafters.

It might be too strong to suggest just yet that the public feels betrayed by its institutions. Better to say that, for instance, the U.S. Constitution is now regarded as a flawed document — not for its day (with limited Federal powers) but for the needs of today (where the Federal apparatus, including the giant military, has grown into a leviathan). This would explain renewed interest in direct democracy (as opposed to representative government), flirtations with socialism (expanded over the blended system we already have), and open calls for revolution to remove a de facto corporatocracy. Whether the U.S. Constitution can or should survive these challenges is the question.

Some while back, Scott Adams (my general disdain for him noted but unexpanded, since I’m not in the habit of shitting on people), using his knowledge of hypnosis, began selling the narrative that our Commander-in-Chief is cannily adept at the art of persuasion. I, for one, am persuaded by neither Adams nor 45 but must admit that many others are. Constant shilling for control of narratives by agents of all sorts could not be more transparent (for me at least), rendering the whole enterprise null. Similarly, when I see an advertisement (infrequently, I might add, since I use ad blockers and don’t watch broadcast TV or news programs), I’m rarely inclined to seek more information or make a purchase. Once in a long while, an ad creeps through my defenses and hits one of my interests, and even then, I rarely respond because, duh, it’s an ad.

In the embedded video below, Stuart Ewen describes how some learned to exploit a feature (not a bug) in human cognition, namely, appeals to emotion that overwhelm rational response. The most obvious, well-worn example is striking fear into people’s hearts and minds to convince them of an illusion of safety that necessitates relinquishing civil liberties and/or fighting foreign wars.

The way Ewen uses the term consciousness differs from the way I use it. He refers specifically to opinion- and decision-making (the very things vulnerable to manipulation) rather than the more generalized and puzzling property of having an individual identity or mind and with it self-awareness. In fact, Ewen uses the terms consciousness industry and persuasion industry instead of public relations and marketing to name those who spin information and thus public discourse. At some level, absolutely everyone is guilty of seeking to persuade others, which again is a basic feature of communication. (Anyone negotiating the purchase of, say, a new or used car faces the persuasion of the sales agent with some skepticism.) What turns it into something maniacal is using lies and fabrication to advance agendas against the public interest, especially where public opinion is already clear.

Ewen also points to early 20th-century American history, where political leaders and marketers were successful in manipulating mass psychology in at least three ways: (1) drawing the pacifist U.S. public into two world wars of European origin, (2) transforming citizens into consumers, thereby saving capitalism from its inherently self-destructive endgame (creeping up on us yet again), and (3) suppressing emergent collectivism, namely, socialism. Of course, unionism as a collectivist institution still gained considerable strength but only within the larger context of capitalism, e.g., achieving the American Dream in purely financial terms.

So getting back to Scott Adams’ argument, the notion that the American public is under some form of mass hypnosis (persuasion) and that 45 is the master puppeteer is perhaps half true. Societies do sometimes go mad and fall under the spell of a mania or cult leader. But 45 is not the driver of the current episode, merely the embodiment. I wouldn’t say that 45 figured out anything because that awards too much credit to presumed understanding and planning. Rather, he worked out (accidentally and intuitively — really by default considering his job in 2016) that his peculiar self-as-brand could be applied to politics by treating it all as reality TV, which by now everyone knows is its own weird unreality the same way professional wrestling is fundamentally unreal. (The term political theater applies here.) He demonstrated a knack (at best) for keeping the focus firmly on himself and driving ratings (abetted by the mainstream media that had long regarded him as a clown or joke), but those objectives were never really in service of a larger political vision. In effect, the circus brought to town offers its own bizarre constructed narrative, but its principal characteristic is gawking, slack-jawed, made-you-look narcissism, not any sort of proper guidance or governance.

As I reread what I wrote 2.5 years ago in my first blog on this topic, I surmise that the only update needed to my initial assessment is a growing pile of events that demonstrate my thesis: our corrupted information environment is too taxing on human cognition, with the result that a small but growing segment of society gets radicalized (wound up like a spring) and relatively random individuals inevitably pop, typically in a self-annihilating gush of violence. News reports bear this out periodically, as one lone-wolf kook after another takes it upon himself (are there any examples of females doing this?) to shoot or blow up some target, typically chosen irrationally or randomly though for symbolic effect. More journalists and bloggers are taking note of this activity and evolving or resurrecting nomenclature to describe it.

The earliest example I’ve found offering nomenclature for this phenomenon is a blog with a single post from 2011 (oddly, no follow-up) describing so-called stochastic terrorism. Other terms include syntactic violence, semantic violence, and epistemic violence, but they all revolve around the same point. Whether on the sending or receiving end of communications, some individuals are particularly adept at or sensitive to dog whistles that over time activate and exacerbate tendencies toward radical ideology and violence. Wired has a brief article from a few days ago discussing stochastic terrorism as jargon, which is basically what I’m doing here. Admittedly, the last of these terms, epistemic violence (alternative: epistemological violence), ranges farther afield from the end effect I’m calling wind-up toys. For instance, this article discussing structural violence is much more academic in character than when I blogged on the same term (one of a handful of “greatest hits” for this blog that return search-engine hits with some regularity). Indeed, just about any of my themes and topics can be given a dry, academic treatment. That’s not my approach (I gather opinions differ on this account, but I insist that real academic work is fundamentally different from my armchair cultural criticism), but it’s entirely valid despite being a bit remote for most readers. One can easily get lost down the rabbit hole of analysis.

If indeed it’s mere words and rhetoric that transform otherwise normal people into criminals and mass murderers, then I suppose I can understand the distorted logic of the far Left that equates words and rhetoric themselves with violence, followed by the demand that they be provided with warnings and safe spaces lest they be triggered by what they hear, read, or learn. As I understand it, the fear is not so much that vulnerable, credulous folks will be magically turned into automatons wound up and set loose in public to enact violent agendas but instead that virulent ideas and knowledge (including many awful truths of history) might cause discomfort and psychological collapse akin to what happens when targets of hate speech and death threats are reduced, say, to quivering agoraphobia. Desire for protection from harm is thus understandable. The problem with such logic, though, is that protections immediately run afoul of free speech, a hallowed but misunderstood American institution that preempts quite a few restrictions many would have placed on the public sphere. Protections also stall learning and truth-seeking straight out of the gate. And besides, preemption of preemption doesn’t work.

In information theory, the notion of a caustic idea taking hold of an unwilling person and having its wicked way with him or her is what’s called a mind virus or meme. The viral metaphor accounts for the infectious nature of ideas as they propagate through the culture. For instance, every once in a while, a charismatic cult emerges and inducts new members, a suicide cluster appears, or suburban housewives develop wildly disproportionate phobias about Muslims or immigrants (or worse, Muslim immigrants!) poised at their doorsteps with intentions of rape and murder. Inflaming these phobias, often done by pundits and politicians, is precisely the point of semantic violence. Everyone is targeted but only a few are affected to the extreme of acting out violently. Milder but still invalid responses include the usual bigotries: nationalism, racism, sexism, and all forms of tribalism, “othering,” or xenophobia that seek to insulate oneself safely among like folks.

Extending the viral metaphor, to protect oneself from infectious ideas requires exposure, not insulation. Think of it as a healthy immune system built up gradually, typically early in life, through slow, steady exposure to harm. The alternative is hiding oneself away from germs and disease, which has the ironic result of weakening the immune system. For instance, I learned recently that peanut allergies can be overcome by gradual exposure — a desensitization process — but are exacerbated by removal of peanuts from one’s environment and/or diet. This is what folks mean when they say the answer to hate speech is yet more (free) speech. The nasty stuff can’t be dealt with properly when it’s quarantined, hidden away, suppressed, or criminalized. Maybe there are exceptions. Science fiction entertains those dangers with some regularity, where minds are shunted aside to become hosts for invaders of some sort. That might be overstating the danger somewhat, but violent eruptions may lend it some credence.

Several politicians on the U.S. national stage have emerged in the past few years as firebrands of new politics and ideas about leadership — some salutary, others less so. Perhaps the quintessential example is Bernie Sanders, who identified himself as a Socialist within the Democratic Party, a tacit acknowledgement that there are no electable third-party candidates for high office thus far. Even 45’s emergence as a de facto independent candidate within the Republican Party points to the same effect (and at roughly the same time). Ross Perot and Ralph Nader came closest in recent U.S. politics to establishing viable candidacies outside the two-party system, but their ultimate failures only reinforce the rigidity of modern party politics; it’s a closed system.

Those infusing energy and new (OK, in truth, they’re old) ideas into this closed system are intriguing. By virtue of his immediate name/brand recognition, Bernie Sanders can now go by his single given name (same is true of Hillary, Donald, and others). Supporters of Bernie’s version of Democratic Socialism are thus known as Bernie Bros, though the term is meant pejoratively. Considering his age, however, Bernie is not widely considered a viable presidential candidate in the next election cycle. Among other firebrands, I was surprised to find Alexandria Ocasio-Cortez (often referred to simply as AOC) described in the video embedded below as a Democratic Socialist but without any reference to Bernie (“single-handedly galvanized the American people”).

Despite the generation(s) gap, young adults had no trouble supporting Bernie three years ago but appear to have shifted their ardent support to AOC. Yet Bernie is still relevant and makes frequent statements demonstrating how well he understands the failings of the modern state, its support of the status quo, and the cult of personality behind certain high-profile politicians.

As I reflect on history, it occurs to me that many of the major advances in society (e.g., abolition, suffrage, the labor movement, civil rights, equal rights and abortion, and the end of U.S. involvement in the Vietnam War) occurred not because our government led us to them but because the American people forced the issues. The most recent examples of the government yielding to the will of the people are gay marriage and cannabis/hemp legalization (still underway). I would venture that Johnson and Nixon were the last U.S. presidents who experienced palpable fear of the public. (Claims that Democrats are afraid of AOC ring hollow — so far.) As time has worn on, later presidents have been confident in their ability to buffalo the public or at least to use the power of the state to quell unrest (e.g., the Occupy movement). (Modern means of crowd control raise serious questions about the legitimacy of any government that would use them against its own citizens. I would include enemy combatants, but that is a separate issue.) In contrast with salutary examples of the public using its disruptive power over a recalcitrant government are arguably more examples where things went haywire rather badly. Looking beyond the U.S., the French Reign of Terror and the Bolsheviks are the two examples that leap immediately to mind, but there are plenty of others. The pattern appears to be a populist ideology that takes root and turns virulent and violent, followed by consolidation of power by those who manage to survive the inevitable purge of dissidents.

I bring this up because we’re in a period of U.S. history characterized by populist ideological possession on both sides (left/right) of the political divide, though politics ought to be better understood as a spectrum. Extremism has again found a home (or several), and although the early stages appear to be mild or harmless, I fear that a charismatic leader might unwittingly succeed in raising a mob. As the saying goes (from the Indiana Jones movie franchise), “You are meddling with forces you cannot possibly comprehend,” to which I would add cannot control. Positioning oneself at the head of a movement or rallying behind such an opportunist may feel like the right thing to do but could easily and quickly veer into wildly unintended consequences. How many times in history has that already occurred?