I put aside Harari’s book from the previous blog post in favor of Pankaj Mishra’s Age of Anger: A History of the Present (2017). Mishra’s sharp cultural criticism is far more convincing than Harari’s Panglossian perspective. Perhaps some of that is due to an inescapable pessimism in my own character. Either way, I’ve found the first 35 pages dense with observations of interest to me as a blogger and armchair cultural critic. Some while back, I published a post attempting to delineate (not very well, probably) what’s missing in the modern world despite its obvious material abundance. Reinforcing my own contentions, Mishra’s thesis (as I understand it so far) is this: we today share with others post-Enlightenment an array of resentments and hatreds (Fr.: ressentiment) aimed incorrectly at scapegoats for political and social failure to deliver the promises of progressive modernity equitably. For instance, Mishra describes

… flamboyant secular radicals in the nineteenth and early twentieth centuries: the aesthetes who glorified war, misogyny and pyromania; the nationalists who accused Jews and liberals of rootless cosmopolitanism and celebrated irrational violence; and the nihilists, anarchists and terrorists who flourished in almost every continent against a background of cosy political-financial alliances, devastating economic crises and obscene inequalities. [pp. 10–11]

Compare and contrast his assessment of the recent past:

Beginning in the 1990s, a democratic revolution of aspiration … swept across the world, sparking longings for wealth, status and power, in addition to ordinary desires for stability and contentment, in the most unpromising circumstances. Egalitarian ambition broke free of old social hierarchies … The culture of [frantic] individualism went universal … The crises of recent years have uncovered an extensive failure to realize the ideals of endless economic expansion and private wealth creation. Most newly created ‘individuals’ toil within poorly imagined social and political communities and/or states with weakening sovereignty … individuals with very different pasts find themselves herded by capitalism and technology into a common present, where grossly unequal distributions of wealth and power have created humiliating new hierarchies. This proximity … is rendered more claustrophobic by digital communications … [S]hocks of modernity were once absorbed by inherited social structures of family and community, and the state’s welfare cushions [something mentioned here, too]. Today’s individuals are directly exposed to them in an age of accelerating competition on uneven playing fields, where it is easy to feel that there is no such thing as either society or state, and that there is only a war of all against all. [pp. 12–14]

These long quotes (the second one cut together from longer paragraphs) are here because Mishra is remarkably eloquent in his diagnosis of globalized culture. Although I’ve only read the prologue, I expect to find support for my long-held contention that disorienting disruptions of modernity (using Anthony Giddens’ sociological definition rather than the modish use of the term Postmodern to describe only the last few decades) create unique and formidable challenges to the formation of healthy self-image and personhood. Foremost among these challenges is an unexpectedly oppressive information environment: the world forced into full view and inciting comparison, jealousy, envy, and hatred stemming from routine and ubiquitous frustrations and humiliations as we each struggle in life getting our personal share of attention, renown, and reward.

Another reason Mishra provides for our collective anger is a deep human yearning not for anarchism or radical freedom but rather for belonging and absorption within a meaningful social context. This reminds me of Erich Fromm’s book Escape from Freedom (1941), which I read long ago but can’t remember so well anymore. I do remember quite vividly how counter-intuitive was the suggestion that absolute freedom is actually burdensome as distinguished from the usual programming we get about breaking free of all restraints. (Freedom! Liberty!) Indeed, Mishra provides a snapshot of multiple cultural and intellectual movements from the past two centuries where abandoning oneself to a cause, any cause, was preferable to the boredom and nothingness of everyday life absent purpose other than mere existence. The modern substitute for larger purpose — commodity culture — is a mere shadow of better ways of spending one’s life. Maybe commodity culture is better than sacrificing one’s life fighting wars (a common fate) or destroying others, but that’s a much longer, more difficult argument.

More to follow as my reading progresses.


I started reading Yuval Harari’s book Homo Deus: A Brief History of Tomorrow (2017). Had expected to read Sapiens (2014) first, but its follow-up came into my possession instead. My familiarity with Harari’s theses and arguments stems from his gadfly presence on YouTube being interviewed or giving speeches promoting his books. He’s a compelling yet confounding thinker, and his distinctive voice in my mind’s ear lent my reading the quality of an audiobook. I’ve only read the introductory chapter (“A New Human Agenda”) so far, the main argument being this:

We have managed to bring famine, plague and war under control thanks largely to our phenomenal economic growth, which provides us with abundant food, medicine, energy and raw materials. Yet this same growth destabilises the ecological equilibrium of the planet in myriad ways, which we have only begun to explore … Despite all the talk of pollution, global warming and climate change, most countries have yet to make any serious economic or political sacrifices to improve the situation … In the twenty-first century, we shall have to do better if we are to avoid catastrophe. [p. 20]

“Do better”? Harari’s bland understatement of the catastrophic implications of our historical moment is risible. Yet as a consequence of having (at least temporarily) brought three major historical pestilences (no direct mention of the fabled Four Horsemen of the Apocalypse) under administrative, managerial, and technical control (I leave that contention unchallenged), Harari states rather over-confidently — forcefully even — that humankind is now turning its attention and ambitions toward different problems, namely, mortality (the fourth of the Four Horsemen and one of the defining features of the human condition), misery, and divinity.


“Come with me if you want to live.” That’s among the quotable lines from the latest movie in the Terminator franchise, though it’s not nearly so succinct or iconic as “I’ll be back” from the first Terminator. Whereas the latter has the quality (in hindsight) of slow, implacable inevitability (considering the Terminator is literally a death-bringer), the former occurs within the context of a character having only just traveled back in time, not yet adequately reoriented, and forced to make a snap decision under duress. “I’ll be back” might be easy to brush off as harmless (temporary denial) since the threat recedes — except that it doesn’t, it’s merely delayed. “Come with me …” demands a leap of faith (or trust) because the danger is very real at that instant.

Which quote, I must ask, better characterizes the threat of climate change? My answer: both, but at different times. Three to four decades ago, it was the “I’ll be back” type: building slowly but inevitable given the underlying structure of industrial civilization. That structure was known even then by a narrow circle of experts (e.g., engineers for Big Oil and at the Dept. of Energy) to be a heat engine, meaning that we would ultimately cook our own goose by warming the planet, altering the climatic steady state under which our most recent civilization has flourished and producing a steady loss of biodiversity and biomass until our own human habitat (the entirety of the planet by now) becomes a hostile environment unable (unwilling if one anthropomorphizes Mother Nature) to support our swollen population. All that was if we stayed on course and took no corrective action. Despite foreknowledge and ample warning, that’s precisely what occurred (and continues today).

With the Intergovernmental Panel on Climate Change (IPCC) in particular, the threat has for roughly a decade shifted over to “Come with me ….” It’s no longer possible to put things off, yet we continue to dither well beyond the tipping point where/when we can still save ourselves from self-annihilation. Although scientists have been gathering data and evidence, forming an overwhelming consensus, and sounding the alarm, scientific illiteracy, realpolitik, journalistic malpractice, and corporate greed have all conspired to grant the illusion that we still have time to react, time we simply don’t have anymore (and truth be told, probably didn’t as of the early 1980s).

I’m aware of at least three journalists (relying on the far more authoritative work of scientific consensus) who have embraced the message: Dahr Jamail, Thom Hartmann, and David Wallace-Wells. None to my knowledge has been able to bring himself to admit that humanity is now a collection of dead men walking. They can’t muster the courage to give up hope (or to report truthfully), clinging to the possibility we may still have a fleeting chance to avert disaster. I heard Ralph Nader on his webcast say something to the same effect, namely, what good is it to rob others of hope? My personal values adhere to unstinting truth rather than illusion or self-deception, so I subscribe to Guy McPherson’s assessment that we face near-term human extinction (precise date unknown but soon if, for example, this is the year we get a blue ocean event). Simply put, McPherson is professor emeritus of natural resources and ecology and evolutionary biology at the University of Arizona [note my emphasis]. I trust his scholarship (summarizing the work of other scientists and drawing necessary though unpalatable conclusions) more than I trust journalistic shaping of the story for public consumption.

The obvious metaphor for what we face is a terminal medical diagnosis, or if one has hope, perhaps a death sentence about to be carried out but with the possibility of a last-minute stay of execution via phone call from the governor. Opinions vary whether one should hope/resist up to the final moment or make peace with one’s fate. By not telling the truth, I daresay the MSM has not given the public the second option by using the “I’ll be back” characterization when it’s really “Come with me ….” Various authors on the Web offer a better approximation of the truth (such as it can be known) and form a loose doomer network (a/k/a collapsniks). This blog is (an admittedly tiny) part of that doomersphere, which gives me no pleasure.

Renewed twin memes Universal Basic Income (UBI) and Debt Jubilees (DJ) have been in the news recently. I write renewed because the two ideas are quite literally ancient, unlearnt lessons that are enjoying revitalized interest in the 21st century. Both are capable of sophisticated support from historical and contemporary study, which I admit I haven’t undertaken. However, others have done the work and make their recommendations with considerable authority. For instance, Andrew Yang, interviewed repeatedly as a 2020 U.S. presidential candidate, has made UBI the centerpiece of his policy proposals, whereas Michael Hudson has a new book out called … and forgive them their debts: Lending, Foreclosure and Redemption — From Bronze Age Finance to the Jubilee Year that offers a forgotten history of DJ.

Whenever UBI or DJ comes up in conversation, the most obvious, predictable response I hear (containing a kernel of truth) is that either proposal would reward the losers in today’s capitalist regime: those who earn too little or those who carry too much debt (often a combination of both). Never mind that quality education and economic opportunities have been steadily withdrawn over the past half century. UBI and DJ would thus be giveaways, and I daresay nothing offends a sense of fairness more than others getting something for nothing. Typical resentment goes, “I worked hard, played by the rules, and met my responsibilities; why should others who slacked, failed, or cheated get the benefit of my hard work?” It’s a commonplace “othering” response, failing to recognize that as societies we are completely interconnected and interdependent. Granting the winners in the capitalist contest a pass on fair play is also a major assumption. The most iconic supreme winners are all characterized by shark-like business practices: taking advantage of tax loopholes, devouring everything, and shrewdly understanding their predatory behavior not in terms of producing value but rather as gobbling or destroying competition to gain market share. More than a few companies these days are content to operate for years on venture capital, reporting one quarterly loss after another until rivals are vanquished. Amazon.com is the test case, though how many times its success can be repeated is unknown.

With my relative lack of economic study and sophistication, I take my lessons instead from the children’s game Monopoly. As an oversimplification of the dynamics of capital formation and ownership, Monopoly even for children reaches its logical conclusion well before its actual end, where one person “wins” everything. The balancing point when the game is no longer worth playing is debatable, but some have found through experience the answer is “before it starts.” It’s just no fun bankrupting other players utterly through rent seeking. The no-longer-fun point is analogous to late-stage capitalism, where the conclusion has not yet been fully reached but is nonetheless clear. The endgame is, in a word, monopoly — the significant element being “mono,” as in there can be only one winner. (Be careful what you wish for: it’s lonely and resentful at the top.) Others take a different, aspirational lesson from Monopoly, which is to figure out game dynamics, or game the game, so that the world can be taken by force. One’s growing stranglehold on others disallows fair negotiation and cooperation (social rather than capitalist values) precisely because one party holds all the advantages, leading to exploitation of the many for the benefit of a few (or one).

Another unlearnt ancient lesson is that nothing corrupts so easily or so much as success, power, fame, wealth. Many accept that corruption willingly; few take the lesson to heart. (Disclosure: I’ve sometimes embarked on the easy path to wealth by buying lottery tickets. Haven’t won, so I’m not yet corrupted. Another case of something for nearly nothing, or for those gambling away their rent and grocery money, nothing for something.) Considering that money makes the world go around, especially in the modern age, the dynamics of capitalism are inescapable and the internal contradictions of capitalism are well acknowledged. The ancient idea of DJ is essentially a reset button depressed before the endgame leads to rebellion and destruction of the ownership class. Franklin D. Roosevelt is credited in some accounts of history as having saved capitalism from that near endgame by transferring wealth back to the people through the New Deal and the war economy. Thus, progressives are calling for a Green New Deal, though it’s not clear they are aware that propping up capitalism only delays its eventual collapse through another couple cycles (reversals) of capital flow. Availability of cheap, plentiful energy that allowed economies (and populations) to balloon over the past two and a half centuries cannot continue for much longer, so even if we get UBI or DJ, the endgame remains unchanged.

There is something ironic and vaguely tragic about how various Internet platforms — mostly search engines and social media networks — have unwittingly been thrust into roles their creators never envisioned for themselves. Unless I’m mistaken, they launched under the same business model as broadcast media: create content, or better yet, crowd-source content, to draw in viewers and subscribers whose attention is then delivered to advertisers. Revenue is derived from advertisers while the basic services — i.e., search, job networking, encyclopedias and dictionaries, or social connection — are given away gratis. The modest inconveniences and irritations of having the screen littered and interrupted with ads are a trade-off most end users are happy to accept for free content.

Along the way, some platform operators discovered that user data itself could be both aggregated and individualized and subsequently monetized. This second step unwittingly created so-called surveillance capitalism, which Shoshana Zuboff writes about in her recently published book (blogged about previously here). Essentially, an Orwellian Big Brother (several of them, in fact) tracks one’s activity through smart phone apps and Web browsers, including GPS data revealing movement through real space, not just virtual spaces. This is also the domain of the national security state from local law enforcement to the various security branches of the Federal government: dragnet surveillance where everyone is watched continuously. Again, end users shrug off surveillance as either no big deal or too late to resist.

The most recent step is that, like the Internet itself, various platforms have been functioning for some time already as public utilities and accordingly fallen under demand for regulation with regard to authenticity, truth, and community standards of allowable speech. Thus, private corporations have been thrust unexpectedly into the role of regulating content. Problem is, unlike broadcast networks that create their own content and can easily enforce restrictive standards, crowd-sourced platforms enable the general population to upload its own content, often mere commentary in text form but increasingly as video content, without any editorial review. These platforms have parried by deploying and/or modifying their preexisting surveillance algorithms in search of objectionable content normally protected as free speech and taken steps to remove content, demonetize channels, and ban offending users indefinitely, typically without warning and without appeal.

If Internet entrepreneurs initially got into the biz to make a few (or a lot of) quick billions, which some few of them have, they have by virtue of the global reach of their platforms been transformed into censors. It’s also curious that by enabling end users to publish to their platforms, they’ve given voice to the masses in all their unwashed glory. Now, everyone’s crazy, radicalized uncle (or sibling or parent or BFF) formerly banished to obscurity railing against one thing or another at the local tavern, where he was tolerated as harmless so long as he kept his bar tab current, is proud to fly his freak flag anywhere and everywhere. Further, the anonymous coward who might issue death or bomb threats to denounce others has been given means to distribute hate across platforms and into the public sphere, where it gets picked up and maybe censored. Worst of all, the folks who monitor and decide what is allowed, functioning as modern-day thought police, are private citizens and corporations with no oversight or legal basis to act except for the fact that everything occurs on their respective platforms. This is a new aspect to the corporatocracy but not one anyone planned.

I’ve been on the sidelines of the Chicago Symphony Orchestra (CSO) musicians’ union labor action — a strike now extending into its second month with no apparent resolution in sight — and reluctant to take a strong position. This might be surprising considering that I’m a natural ally of the musicians in at least two respects: (1) my support for the labor movement in general, and (2) my sustained interest in classical music as both a listener and practitioner. On balance, I have two objections that hold me back: (1) difficulty empathizing with anyone already well compensated for his or her work (CSO base salary is more than $160K per year; many make considerably more), and (2) the argument that as a premier arts institution, the organization should take no heed of economic effects being felt universally and visited on many who actually suffer deprivations beyond lost prestige.

To buttress their position, the Musicians of the CSO (why do the musicians operate a website distinct from the organization as a whole?) issued a press release in late March 2019 (PDF link). I’ve no desire to analyze it paragraph-by-paragraph, but I want to bring a few bits forward:

For more than 50 years, the Chicago Symphony Orchestra has been touted as the nation’s finest – able to draw talent from across the globe. [emphasis added]

Music is not a championship endeavor despite the plethora of televised lip-syncing singing contests. No one orchestra can lay reasonable claim to being the best. Smacks of hubris. Simply change that to “as among the nation’s finest” and I’m OK with it.

In the last seven years the Orchestra’s salary has not kept up with inflation. Further, the Orchestra’s benefit package has fallen behind that of Los Angeles and San Francisco. Now, the Association is attempting to change a fundamental tenet of the security of the Orchestra – and American life – our pension plan.

Well boo hoo for you. Many of the fundamental tenets of American life have been steadily stripped away from the population over the past 40 years or so. The very existence of a pension plan is exceptional for many in the labor force, not to mention the handsome salary and other benefits, including a 20-hour workweek, that CSO musicians enjoy. (Admittedly, a lot of outside preparation is necessary to participate effectively.) I understand that comparison with sister institutions in LA, SF, and NYC provide context, but cost of living differences at the coasts ought to be part of that context, too. Keeping up with the Joneses in this instance is a fool’s errand. And besides, those three cities suffer considerably with homeless and destitute populations that line the sidewalks and alleys. Chicago has somehow managed to displace most of its homeless population (mostly through harassment, not humanitarian aid), though one cannot avoid a phalanx of panhandlers outside Chicago Symphony Center on concert nights. Still, it’s nothing compared to conditions in downtown SF, which have gotten so bad with people living, peeing, and shitting in the street that an infamous poop map is available to help pedestrians avoid the worst of it. (I’ve no idea what the sidewalk outside Davies Symphony Hall in SF is like, but the location appears to be in the area of greatest poop concentration.) LA’s skid row is another district straight out of hell.

With many of the musicians already vested, our concern is truly about the future of the Orchestra – its ability to retain and attract great talent – a concern shared by Maestro Muti, Daniel Barenboim, and many of the world’s other finest orchestras and leaders.

This is not a concern of mine in the slightest. Sure, musicians play musical chairs, swapping around from orchestra to orchestra as opportunities arise, just like other workers traipse from job to job throughout their working lives. So what? A performing position with the CSO has long been a terminal position from which many players retire after more than 50 years of service (if they’re so fortunate to be hired by the orchestra in their 20s). I cannot estimate how many top-tier musicians forego auditions for the CSO due to perceived inadequacies with compensation or working conditions. Maybe that explains the years-long inability to hire and/or retain personnel for certain principal chairs. Still, I’m not convinced at all by “we’re the best yet we can’t compete without excessive compensation” (or shouldn’t have to). Similar arguments for ridiculously inflated CEO pay to attract qualified individuals fall on deaf ears.

An overview of the musicians’ strike was published by Lawrence A. Johnson at Chicago Classical Review, which provides details regarding the musicians’ demands. According to Johnson, the public’s initial support of the strike has turned sour. Comments I’ve been reading and my own reaction have followed exactly this trajectory. Johnson also uses the term tone deaf to describe the musicians, though he’s diplomatic enough to avoid saying it himself, noting that the charge comes from commentators. I won’t be nearly so diplomatic. Musicians, stop this nonsense now! Demands far in excess of need, far in excess of typical workers’ compensation, and far in excess of your bargaining position do you no credit. In addition, although season ticket holders may express dismay at lost opportunities to hear certain concerts, soloists, and repertoire due to the work stoppage, the CSO is not a public utility that must keep working to maintain public wellbeing. Alternatives in greater Chicagoland can easily take up your slack for those in need of a classical music fix. Indeed, I haven’t been to a CSO concert in years because they’ve become anodyne. My CSO love affair is with the recorded legacy of the 1970s and 80s.

By striking, you’re creating a public relations nightmare that will drive people away, just as the baseball strike and take-a-knee controversy in football (and elsewhere) sent sports fans scrambling for the exits. You’re tone deaf regarding actual workplace and contract insufficiency many others confront regularly, as well as the economic realities of Chicago, Illinois, the U.S. and indeed the globe. Get over yourselves.

Throughout human history, the question “who should rule?” has been answered myriad ways. The most enduring answer is simple: he who can muster and deploy the most force of arms and then maintain control over those forces. Genghis Khan is probably the most outrageously successful example and is regarded by the West as a barbarian. Only slightly removed from barbarians is the so-called Big Man, who perhaps adds a layer of diplomacy by running a protection racket while selectively providing and distributing spoils. As societies move further away from subsistence and immediacy, various absolute rulers are established, often through hereditary title. Call them Caesar, chief, dear leader, emir, emperor (or empress), kaiser, king (or queen), pharaoh, premier, el presidente, sultan, suzerain, or tsar, they typically acquire power through the accident of birth and are dynastic. Some are female but most are male, and they typically extract tribute and sometimes demand loyalty oaths.

Post-Enlightenment, rulers are frequently democratically elected administrators (e.g., legislators, technocrats, autocrats, plutocrats, kleptocrats, and former military) ideally meant to be representative of common folks. In the U.S., members of Congress (and of course the President) are almost wholly drawn from the ranks of the wealthy (insufficient wealth being a de facto bar to office) and are accordingly estranged from American life in the many different ways most of us experience it. Below the top level of visible, elected leaders is a large, hidden apparatus of high-level bureaucratic functionaries (often appointees), the so-called Deep State, that is relatively stable and made up primarily of well-educated, white-collar careerists whose ambitions for themselves and the country are often at odds with the citizenry.

I began to think about this in response to a rather irrational reply to an observation I made here. Actually, it wasn’t even originally my observation but that of Thomas Frank, namely, that the Deep State is largely made up of the liberal professional class. The reply reinforced the notion who better to rule than the “pros”? History makes the alternatives unthinkable. Thus, the Deep State’s response to the veritable one-man barbarian invasion of the Oval Office has been to seek removal of the interloper by hook or by crook. (High office in this case was won unexpectedly and without precedent by rhetorical force — base populism — rather than by military coup, making the current occupant a quasi-cult leader; similarly, extracted tribute is merely gawking attention rather than riches.)

History also reveals that all forms of political organization suffer endemic incompetence and corruption, lending truth to Winston Churchill’s witticism “Democracy is the worst form of government, except for all the others.” Indeed, recent rule by technocrats has been supremely awful, leading to periodic market crashes, extreme wealth inequality, social stigmatization, and forever wars. Life under such rule is arguably better than under various other political styles; after all, we gots our vaunted freedoms and enviable material comforts. But the exercise of those freedoms does not reliably deliver the ontological security or psychological certainty we humans crave. In truth, our current form of self-governance has let nothing get in the way of plundering the planet for short-term profit. That ongoing priority is making Earth uninhabitable not just for other species but for humans, too. In light of this fact, liberal technocratic democracy could be a far worse failure than most: it will have killed billions (an inevitability now under delayed effect).

Two new grassroots movements (to my knowledge) have appeared that openly question who should rule: the Sunrise Movement (SM) and the Extinction Rebellion (ER). SM is a youth political movement in the U.S. that acknowledges climate change and supports the Green New Deal as a way of prioritizing the desperate existential threat modern politics and society have become. For now at least, SM appears to be content with working within the system, replacing incumbents with candidates it supports. More intensely, ER is a global movement centered in the U.K. that also acknowledges that familiar modern forms of social and political organization (there are several) no longer function but in fact threaten all of us with, well, extinction. One of its unique demands is that legislatures be drawn via sortition from the general population to be more representative of the people. Further, sortition avoids the established pattern whereby those elected to lead representational governments are corrupted by the very process of seeking and attaining office.

I surmise attrition and/or replacement (the SM path) are too slow and leave candidates vulnerable to corruption. In addition, since no one relinquishes power willingly, current leaders will have to be forced out via open rebellion (the ER path). I’m willing to entertain either path but must sadly conclude that both are too little, too late to address climate change and near-term extinction effectively. Though difficult to establish convincingly, I suspect the time to act was in the 1970s (or even before) when the Ecology Movement arose in recognition that we cannot continue to despoil our own habitat without consequence. That message (social, political, economic, and scientific all at once) was as inert then as it is now. However, fatalism acknowledged, some other path forward is better than our current systems of rule.

I observed way back here that it was no longer a thing to have a black man portray the U.S. president in film. Such casting might draw a modest bit of attention, but it no longer raises a particularly arched eyebrow. These depictions in cinema were only slightly ahead of the actuality of the first black president. Moreover, we’ve gotten used to female heads of state elsewhere, and we now have in the U.S. a burgeoning field of presidential wannabes from all sorts of diverse backgrounds. Near as I can tell, no one really cares anymore that a candidate is a woman, or black, or a black woman. (Could be that I’m isolated and/or misreading the issue.)

In Chicago where I live, the recent mayoral election offered a choice among some ten candidates to succeed the current mayor who is not seeking reelection. None of them got the required majority of votes, so a runoff between the top two candidates is about to occur. Both also happen to be black women. Although my exposure to the mainstream media and all the talking heads offering analysis is limited, I’ve yet to hear anyone remark disparagingly that Chicago will soon have its first black female mayor. This is as it should be: the field is open to all comers and no one can (or should) claim advantage or disadvantage based on identitarian politics.

Admittedly, extremists on both ends of the bogus left/right political spectrum still pay quite a lot of attention to identifiers. Academia in particular is currently destroying itself with bizarre claims and demands for equity — a nebulous doctrine that divides rather than unites people. Further, some conservatives can’t yet countenance a black, female, gay, atheist, or <insert other> politician, especially in the big chair. I’m nonetheless pleased to see that irrelevant markers matter less and less to many voters. Perhaps it’s a transition by sheer attrition and will take more time, but the current Zeitgeist outside of academia bodes well.

This is about to get weird.

I caught a good portion of a recent Joe Rogan podcast (sorry, no link or embedded video) with Alex Jones and Eddie Bravo (nearly 5 hours long instead of the usual 2 to 3) where the trio indulged themselves in a purported grand conspiracy to destroy civilization and establish a new post-human one. The more Jones rants (which is quite a lot), the more he sounds like a madman. But he insists he does so to serve the public. He sincerely wants people to know things he’s figured out about an evil cabal of New World Order types. So let me say at least this: “Alex Jones, I hear you.” But I’m unconvinced. Apologies to Alex Jones et al. if I got any details wrong. For instance, it’s not clear to me whether Jones believes this stuff himself or is merely reporting what others may believe.

The grand conspiracy is supposedly interdimensional beings operating at a subliminal range below or beyond normal human perception. Perhaps they revealed themselves to a few individuals (to the cognoscenti, ya know, or is that shared revelation how one is inducted into the cognoscenti?). Rogan believes that ecstatic states induced by drugs provide access to revelation, like tuning a radio to the correct (but secret) frequency. Whatever exists in that altered cognitive state appears like a dream and is difficult to understand or remember. The overwhelming impression Rogan reports as lasting is of a distinct nonhuman presence.

Maybe I’m not quite as barking mad as Jones or as credulous as Rogan and Bravo, but I have to point out that humans are interdimensional beings. We move through three dimensions of space and one unidirectional dimension of time. If that doesn’t quite make sense, then I refer readers to Edwin Abbott’s well-known book Flatland. Abbott describes what it might be like for conscious beings in only two dimensions of space (or one). Similarly, for most of nature outside of vertebrates, it’s understood that consciousness, if it exists at all (e.g., not in plants), is so rudimentary that there is no durable sense of time. Beings exist in an eternal now (could be several seconds long/wide/tall — enough to function) without memory or anticipation. With that in mind, the possibility of multidimensional beings in 5+ dimensions completely imperceptible to us doesn’t bother me in the least. The same is true of the multiverse or many-worlds interpretation. What bothers me is that such beings would bother with us, especially with a conspiracy to crash civilization.

The other possibility at which I roll my eyes is a post-human future: specifically, a future when one’s consciousness escapes its biological boundaries. The common trope is that one’s mind is uploaded to a computer to exist in the ether. Another is that one transcends death somehow with intention and purpose instead of simply ceasing to be (as atheists believe) or some variation of the far more common religious heaven/hell/purgatory myth. This relates as well to the supposition of strong AI about to spark (the Singularity): self-awareness and intelligent thought that can exist on some substrate other than human biology (the nervous system, really, including the brain). Sure, cognition can be simulated for some specific tasks like playing chess or go, and we humans can be fooled easily into believing we are communicating with a thought machine à la the Turing Test. But the rather shocking sophistication, range, utility, and adaptability of even routine human consciousness is so far beyond any current simulation that the usual solution to get engineers from where they are now to real, true, strong AI is always “and then a miracle happened.” The easy, obvious route/accident is typically a power surge (e.g., a lightning strike).

Why bother with mere humans is a good question if one is post-human or an interdimensional being. It could well be that existence in such a realm would make watching human interactions either impenetrable (news flash, they are already) or akin to watching through a dim screen. That familiar trope is the lost soul imprisoned in the spirit world, a parallel dimension that permits viewing from one side only but prohibits contact except perhaps through psychic mediums (if you believe in such folks — Rogan for one doesn’t).

The one idea worth repeating from the podcast is the warning not to discount all conspiracy theories out of hand as bunk. At least a few have been demonstrated to be true. Whether any of the sites behind that link are to be believed I leave you readers to judge.

Addendum: Although a couple comments came in, no one puzzled over the primary piece I had to add, namely, that we humans are interdimensional beings. The YouTube video below depicts a portion of the math/science behind my statement, showing how at least two topological surfaces behave paradoxically when limited to 2 or 3 dimensions but theoretically cohere in 4+ dimensions imperceptible to us.

Everyone is familiar with the convention in entertainment media where characters speak without the use of recognizable language. (Not really related to the convention of talking animals.) The first instance I can recall (someone correct me if earlier examples are to be found) is the happy-go-lucky bird Woodstock from the old Peanuts cartoons (do kids still recognize that cast of characters?), whose dialogue was shown graphically as a series of vertical lines.

When the cartoon made its way onto TV for holiday specials, its creator Charles Schulz used the same convention to depict adults, never shown onscreen but with dialogue voiced by a Harmon-muted trombone. Roughly a decade later, two characters from the Star Wars franchise “spoke” in languages only other Star Wars characters could understand, namely, Chewbacca (Chewie) and R2-D2. More recently, the character Groot from Guardians of the Galaxy (known to me only through the Marvel movie franchise, not through comic books) speaks only one line of dialogue, “I am Groot,” which is understood as full speech by other Guardians characters. When behemoths larger than a school bus (King Kong, Godzilla, Jurassic dinosaurs, Cloverfield, Kaiju, etc.) appear, the characters are typically denied the power of speech beyond the equivalent of a lion’s roar. (True villains talk little or not at all as they go about their machinations — no monologuing! unless it’s a James Bond film. An exception notable for its failure to charm audiences is Ultron, who wouldn’t STFU. You can decide for yourself which is the worse kind of villainy.)

This convention works well enough for storytelling and has the advantage of allowing the reader/viewer to project onto otherwise blank speech. However, when imported into the real world, especially in politics, the convention founders. There is no Babelfish universal translator inserted in the ear to transform nonsense into coherence. The obvious example of babblespeech is 45, whose speech when off the teleprompter is a series of rambling non sequiturs, free associations, slogans, and sales pitches. Transcripts of anyone’s extemporaneous speech reveal lots of restarts and blind alleys; we all interrupt ourselves to redirect. However, word salad that substitutes for meaningful content in 45’s case is tragicomic: alternately entirely frustrating or comically entertaining depending on one’s objective. Satirical news shows fall into the second category.

45 is certainly not the first. Sarah Palin in her time as a media darling (driver of ratings and butt of jokes — sound familiar?) had a knack for crazy speech combinations that were utter horseshit yet oddly effective for some credulous voters. She was even a hero to some (nearly a heartbeat away from being the very first PILF). We’ve also now been treated to a series of public interrogations where a candidate for a cabinet post or an accused criminal offers testimony before a congressional panel. Secretary of Education Betsy DeVos famously evaded simple yes/no questions during her confirmation hearing, and Supreme Court Justice Brett Kavanaugh similarly refused to provide direct answers to direct questions. Unexpectedly, sacrificial lamb Michael Cohen gave direct answers to many questions, but his interlocutors then didn’t quite know how to respond, considering their experience and expectation that no one answers appropriately.

What all this demonstrates is that there is often a wide gulf between what is said and what is heard. In the absence of what might be understood as effective communication (honest, truthful, and forthright), audiences and voters fill in the blanks. Ironically, we also can’t handle hearing too much truth when confronted by its awfulness. None of this is a problem in storytelling, but when found in political narratives, it’s emblematic of how dysfunctional our communications have become, and with them, the clear thought and principled activity of governance.