Posts Tagged ‘Culture’

Most of us are familiar with a grandpa, uncle, or father who eventually turns into a cranky old man during late middle age or in his dotage. (Why is it a mostly male phenomenon?) In the last three decades, Clint Eastwood typecast himself as a cranky old man, building on lone-wolf characters (mostly cops, criminals, and cowboys) established earlier in his career. In real life, these guys spout talking points absorbed from mainstream media and narrative managers, or if they are truly lazy and/or can’t articulate anything coherently on their own, merely forward agitprop via e-mail like chain mail of yore. They also demonstrate remarkably forgivable racism, sexism, and bigotry, such as Eastwood’s rather enjoyable and ultimately redeemed character in the film Gran Torino. If interaction with such a fellow is limited to Thanksgiving gatherings once per year, crankiness can be tolerated fairly easily. If interactions are ongoing, then a typical reaction is simply to delete e-mail messages unread, or in the case of unavoidable face-to-face interaction, to chalk it up: Well, that’s just Grandpa Joe or Uncle Bill or Dad. Let him rant; he’s basically harmless now that he’s so old he creaks.

Except that not all of them are so harmless. Only a handful of the so-called Greatest Generation (I tire of the term but it’s solidly established) remain in positions of influence. However, lots of Boomers still wield considerable power despite their advancing age, looming retirement (and death), and basic out-of-touchness with a culture that has left them behind. Nor are their rants and bluster necessarily wrong. See, for instance, this rant by Tom Engelhardt, which begins with these two paragraphs:

Let me rant for a moment. I don’t do it often, maybe ever. I’m not Donald Trump. Though I’m only two years older than him, I don’t even know how to tweet and that tells you everything you really need to know about Tom Engelhardt in a world clearly passing me by. Still, after years in which America’s streets were essentially empty, they’ve suddenly filled, day after day, with youthful protesters, bringing back a version of a moment I remember from my youth and that’s a hopeful (if also, given Covid-19, a scary) thing, even if I’m an old man in isolation in this never-ending pandemic moment of ours.

In such isolation, no wonder I have the urge to rant. Our present American world, after all, was both deeply unimaginable — before 2016, no one could have conjured up President Donald Trump as anything but a joke — and yet in some sense, all too imaginable …

If my own father (who doesn’t read this blog) could articulate ideas as well as Engelhardt, maybe I would stop deleting unread the idiocy he forwards via e-mail. Admittedly, I could well be following in my father’s footsteps, as the tag rants on this blog indicates, but at least I write my own screed. I’m far less accomplished at it than, say, Engelhardt, Andy Rooney (in his day), Ralph Nader, or Dave Barry, but then, I’m only a curmudgeon-in-training, not having fully aged (or elevated?) yet to cranky old manhood.

As the fall presidential election draws near (assuming that it goes forward), the choice in the limited U.S. two-party system is between one of two cranky old men, neither of whom is remotely capable of guiding the country through this rough patch at the doomer-anticipated end of human history. Oh, and BTW, echoing Engelhardt’s remark above, 45 has been a joke all of my life — a dark parody of success — and remains so despite occupying the Oval Office. Their primary opponent until only a couple of months ago was Bernie Sanders, himself a cranky old man but far more endearing at it. This is what passes for the best leadership on offer?

Many Americans are ready to move on to someone younger and more vibrant, able to articulate a vision for something, well, different from the past. Let’s skip right on past candidates (names withheld) who parrot the same worn-out ideas as our fathers and grandfathers. Indeed, a meme emerged recently to the effect that the Greatest Generation saved us from various early 20th-century scourges (e.g., Nazis and Reds) only for the Boomers to proceed in their turn to mess up the planet so badly nothing will survive new scourges already appearing. It may not be fair to hang such labels uniformly around the necks of either generation (or subsequent ones); each possesses unique characteristics and opportunities (some achieved, others squandered) borne out of their particular moment in history. But this much is clear: whatever happens with the election and whichever generational cohort assumes power, the future is gonna be remarkably different.

My information diet is, like most others, self-curated and biased. As a result, the news that finally makes its way through my filters (meaning that to which I give any attention) is incomplete. This I admit without reservation. However, it’s not only my filters at work. Nearly everyone with something to say, reveal, or withhold regarding civil unrest sparked in the U.S. and diffusing globally has an agenda. Here are some of the things we’re not hearing about but should expect to:

  • comparison of peaceful protest to violent protest, by percentage, say, at least until the police show up and things go sideways
  • incidence of aldermen, councilmen, mayors, congressmen, and other elected officials who side with protesters
  • incidence of police officers who side with protesters, take a knee, and decline to crack heads
  • examples of police units on the streets who do not look like they’re equipped like soldiers in a war zone — deployed against civilians with bottles and bricks (mostly)
  • incidents where it’s police rioting rather than protesters
  • situations where looters are left alone to loot while nearby protesters are harassed and arrested or worse

If the objective of those trying to control the narrative, meaning the MSM, the corpocracy, and municipal, state, and Federal PR offices, is to strike fear in the hearts of Americans as a means of rationalizing and justifying overweening use of state power (authoritarianism), then it makes sense to omit or de-emphasize evidence that protesters are acting on legitimate grievances. Indeed, if other legitimate avenues of petitioning government — you know, 1st Amendment stuff — have been thwarted, then it should be expected that massed citizen dissent might devolve into violence. Group psychology essentially guarantees it.

Such violence may well be misdirected, but that violence is being reflected back at protesters in what can only be described as further cycles of escalation. Misdirection upon misdirection. That is not at all the proper role of civil authority, yet the police have been cast in that role and have been largely compliant. Dystopian fiction in the middle of the 20th century predicted this state of human affairs pretty comprehensively, yet we find ourselves having avoided none of it.

Ours is an era when individuals are encouraged to explore, amplify, and parade various attributes of their identities out in public, typically via social media. For those just coming of age and/or recently having entered adulthood, because identity is not yet fully formed, defining oneself is more nearly a demand. When identity is further complicated by unusual levels of celebrity, wealth, beauty, and athleticism (lots of overlap there), defining oneself is often an act of rebellion against the perceived demands of an insatiable public. Accordingly, it was unsurprising to me at least to learn of several well-known people unhappy with their lives and the burdens upon them.

Regular folks can’t truly relate to the glitterati, who are often held up as aspirational models. For example, many of us look upon the discomforts of Prince Harry and Meghan Markle with a combination of perverse fascination and crocodile tears. They were undoubtedly trapped in a strange, gilded prison before repudiating the duties expected of them as “senior royals,” attempting an impossible retreat to normalcy outside of England. It should be obvious that they will continue to be hounded while public interest in them persists. Similarly, Presley Gerber made news, fell out of the news, and then got back into the news as a result of his growing collection of tattoos. Were he simply some anonymous fellow, few would care. However, he has famous parents and had already launched a modeling career before his face tattoo announced his sense of being “misunderstood.” Pretty bold move. With all the presumed resources and opportunities at his disposal, many have wondered in comments and elsewhere whether another, better declaration of self might have been preferred.

Let me give these three the benefit of the doubt. Although they all have numerous enviable attributes, the accident of birth (or in Markle’s case, the decision to marry) landed them in exceptional circumstances. The percentage of celebrities who crack under the pressure of unrelenting attention and proceed to run off the rails is significant. Remaining grounded is no doubt easier if one attains celebrity (or absurdly immense wealth) after, say, the age of 25 or even later. (On some level, we’ve all lost essential groundedness with reality, but that’s another blog post.) Those who are children of celebrities or who become child prodigies may not all be consigned to character distortion or a life irrevocably out of balance, but it’s at least so commonplace that the dangerous potential should be recognized and embraced only with wariness. I’ve heard of programs designed to help professional athletes who become sudden multimillionaires (and thus targets of gold diggers and scammers) make the transition. Good for them that structured support is available. Yet another way average folks can’t relate: we have to work things out for ourselves.

Here’s the example I don’t get: Taylor Swift. She was the subject of a Netflix biography called Miss Americana (2020) that paints her as, well, misunderstood. Thing is, Swift is a runaway success story, raking in money, fans, awards, attention, and on balance, detractors. That success is something she earnestly desired and struggled to achieve only to learn that the glossy, popstar image sold especially but nonexclusively to 14-year-old girls comes with a lot of heavy baggage. How can the tragic lives of so many musicians launched into superstardom from the late 1950s onward have escaped Swift’s awareness in our media-saturated world? Naming names is sorta tacky, so I demur, but there are lots of them. Swift obtained her heart’s desire, found her songwriting and political voice, maintains a high public profile, and shows no lack of productivity. Sure, it’s a life out of balance, not remotely normal the way most noncelebrities would understand. However, she signed up for it willingly (if naïvely) and by all accounts perpetuates it. She created her own distinctive gilded prison. I don’t envy her, nor do I particularly feel sorry for her, as the Netflix show appears to instruct.

Caveat: rather overlong for me, but I got rolling …

One of the better articles I’ve read about the pandemic is this one by Robert Skidelsky at Project Syndicate (a publication I hadn’t heard of before). It reads as only slightly conspiratorial, purporting to reveal the true motivation for lockdowns and social distancing, namely, so-called herd immunity. If that’s the case, it’s basically a silent admission that no cure, vaccine, or inoculation is forthcoming and the spread of the virus can only be managed modestly until it has essentially raced through the population. Of course, the virus cannot be allowed to simply run its course unimpeded, but available impediments are limited. “Flattening the curve,” or distributing the infection and death rates over time, is the only attainable strategy and objective.

Wedding mathematical and biological insights, as well as the law of mass action in chemistry, into an epidemic model may seem obvious now, but it was novel roughly a century ago. We’re also now inclined, if scientifically oriented and informed, to understand the problem and its potential management in terms of engineering rather than medicine (or maybe in terms of triage and palliation). Global response has also made the pandemic into a political issue as governments obfuscate and conceal true motivations behind their handling (bumbling in the U.S.) of the pandemic. Curiously, the article also mentions financial contagion, which is shaping up to be worse in both severity and duration than the viral pandemic itself.
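For the curious, the century-old model alluded to above (the SIR model of Kermack and McKendrick, which applies the law of mass action to contacts between the susceptible and the infected) can be sketched in a few lines. This is a toy illustration of “flattening the curve,” not anything from Skidelsky’s article; the parameter values are made up, not fitted to COVID-19.

```python
def sir_peak_infected(beta, gamma=0.1, days=500, n=1_000_000, i0=10):
    """Integrate the SIR equations with forward Euler (dt = 1 day);
    return the peak number simultaneously infected."""
    s, i, r = n - i0, i0, 0
    peak = i
    for _ in range(days):
        new_inf = beta * s * i / n   # mass action: infections ~ S x I contacts
        new_rec = gamma * i          # recoveries at rate gamma
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        peak = max(peak, i)
    return peak

unmitigated = sir_peak_infected(beta=0.30)  # no distancing, R0 = 3.0
mitigated = sir_peak_infected(beta=0.15)    # distancing halves contacts, R0 = 1.5
# the mitigated peak is far lower: the same epidemic, spread over more time
```

Lowering the contact rate beta (the whole point of lockdowns and social distancing) doesn’t necessarily prevent the epidemic; it stretches it out and lowers the peak so that it races through the population more slowly.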


I’m aware of at least two authors who describe American character in less than glowing terms: Alexis de Tocqueville and Morris Berman. Tocqueville’s book Democracy in America (two vols., 1835 and 1840) is among the most cited, least read (by 21st-century Americans, anyway) books about America. (I admit to not having read it.) Berman’s American trilogy (titles unnamed, all of which I’ve read) is better known by contemporary Americans (those who read, anyway) and is unflinching in its denunciation of, among other things, our prideful stupidity. Undoubtedly, others have taken a crack at describing American character.

American identity, OTOH, if such a thing even exists, is somewhat more elusive for a variety of reasons. For instance, Americans lack the formative centuries or millennia of history Europeans and Asians have at their disposal. Moreover, Americans (except for Native Americans — multiple synonyms and euphemisms available) are immigrants (or their descendants) drawn from all around the globe. Accordingly, we lack a coherent unifying narrative about who we are. The American melting pot may come closest but is insufficient in its generality. National identity may well be fraying in other societies as each loses its distinctiveness over time. Nonetheless, two influential factors in the formation of a loose American identity are negative identity (defining oneself as against others, e.g., adolescent rebellion, rather fitting for a young nation) and borrowed identity (better known as cultural appropriation). The latter has been among the chief complaints of social justice warriors.


The old saw goes that acting may be just fine as a creative endeavor, but given the opportunity, most actors really want to direct. A similar remark is often made of orchestral musicians, namely, that most rank-and-file players would really rather conduct. Directing and conducting may not be the central focus of creative work in their respective genres. After all, directors don’t normally appear onscreen and conductors make no sound. Instead, they coordinate the activities of an array of creative folks, putting each in a unique position to bring about a singular vision in otherwise collaborative work. A further example is the Will to Power (Friedrich Nietzsche’s notion, with roots in Arthur Schopenhauer’s will) characteristic of those who wish to rule (as distinguished from those who wish to serve) such as regents, dictators, and autocrats. All of this sprang to mind because, despite the outward appearance of a free, open society in the U.S., recent history demonstrates that the powers that be have instituted a directed election and directed economy quite at odds with democracy or popular opinion.

The nearest analogy is probably the directed verdict, where a judge removes the verdict from the hands or responsibility of the jury by directing the jury to return a particular verdict. In short, the judge decides the case for the jury, making the jury moot. I have no idea how commonplace directed verdicts are in practice.

Directed Election

Now that progressive candidates have been run out of the Democratic primaries, the U.S. presidential election boils down to which stooge to install (or retain) in November. Even if Biden is eventually swapped out for another Democrat in a brokered nominating convention (highly likely according to many), it’s certain to be someone fully amenable to entrenched corporate/financial interests. Accordingly, the deciders won’t be the folks who dutifully showed up and voted in their state primaries and caucuses but instead party leaders. One could try to argue that as elected representatives of the people, party leaders act on behalf of their constituencies (governing by consent of the people), but some serious straining is needed to arrive at that view. Votes cast in the primaries thus far demonstrate persistent desire for something distinctly other than the status quo, at least in the progressive wing of the Democratic party. Applying the cinematic metaphor of the top paragraph, voters are a cast of millions being directed within a larger political theater toward a predetermined result.

Anyone paying attention knows that voters are rarely given options that aren’t in fact different flavors of the same pro-corporate agenda. Thus, no matter whom we manage to elect in November, the outcome has already been engineered. This is true not only by virtue of the narrow range of candidates able to maneuver successfully through the electoral gauntlet but also because of perennial distortions of the balloting process such as gerrymandering, voter suppression, and election fraud. Claims that both sides (really just one side) indulge in such practices so everything evens out don’t convince me.

Directed Economy

Conservative economists and market fundamentalists never seem to tire of arguments in the abstract that capitalist mechanisms of economics, left alone (unregulated, laissez-faire) to work their magic, deliver optimal outcomes when it comes to social and economic justice. Among the primary mechanisms is price discovery. However, economic practice never even remotely approaches the purity of abstraction because malefactors continuously distort and game economic systems out of greed. Price discovery is broken and equitable economic activity is made fundamentally fictitious. For example, the market for gemstones is famously inflated by a narrow consortium of sellers having successfully directed consumers to adopt a cultural standard of spending three months’ wages/salary for a wedding band as a demonstration of one’s love and devotion. In the opposite direction, precious metal spot prices are suppressed despite very high demand and nearly nonexistent supply. Current quoted premiums over spot silver price, even though no delivery is contemplated, range from roughly 20% to an absurd 2,000%. Supply and demand curves no longer function to aid in true price discovery (if such a thing ever existed). In a more banal sense, what people are willing to pay for a burger at a fast food joint or a loaf of bread at the grocery may affect the price charged more directly.

Nowhere is it more true that we’ve shifted to a directed economy than with the stock market (i.e., Wall Street vs. Main Street). As with the housing market, a real-world application with which many people have personal experience, if a buyer of a property or asset fails to appear within a certain time frame (longer for housing, shorter for stocks, bonds, and other financial instruments), the seller is generally obliged to lower the price until a buyer finally appears. Some housing markets extraordinarily flush with money (e.g., Silicon Valley and Manhattan) trigger wild speculation and inflated prices that drive out all but the wealthiest buyers. Moreover, when the eventual buyer turns out to be a bank, corporation, or government entity willing to overpay for the property or asset using someone else’s money, the market becomes wholly artificial. This has been the case with the stock market for the last twelve years, with cheap money being injected nonstop via bailouts and quantitative easing to keep asset prices inflated. When fundamental instabilities began dragging the stock market down last fall, accelerating precipitously in early spring of this year and resulting in yet another crash (albeit brief), the so-called Plunge Protection Team sprang into action and wished trillions of dollars (taxpayer debt, actually, and over the objections of taxpayers in a classic fool-me-once scenario) into existence to perpetuate the casino economy and keep asset prices inflated for the foreseeable future, which isn’t very long.

The beneficiaries of this largesse are the same as they have always been when tax monies and public debt are concerned: corporations, banks, and the wealthy. Government economic supports are directed to these entities, leaving all others in the lurch. Claims that bailouts are needed to keep large corporate entities and wealthy individuals whole so that the larger economy doesn’t seize up and fail catastrophically are preposterous because the larger economy has already seized up and failed catastrophically while the population is mostly quarantined, throwing many individuals out of work and shuttering many businesses. A reasonable expectation of widespread insolvency and bankruptcy lingers, waiting for the workouts and numbers to mount up.

The power of the purse possessed by the U.S. Congress hasn’t been used to help the citizenry since the New Deal era of FDR. Instead, military budgets and debts expand enormously while entitlements and services to the needy and vulnerable are whittled away. Citizen rebellions are already underway in small measure, mostly aimed at the quarantines. When bankruptcies, evictions, and foreclosures start to swell, watch out. Our leaders’ fundamental mismanagement of human affairs is unlikely to be swallowed quietly.

Purpose behind consumption of different genres of fiction varies. For most of us, it’s about responding to stimuli and experiencing emotions vicariously, which is to say, safely. For instance, tragedy and horror can be enjoyed, if that’s the right word, in a fictional context to tweak one’s sensibilities without significant effect outside the story frame. Similarly, fighting crime, prosecuting war, or repelling an alien invasion in a video game can be fun but is far removed from actually doing those things in real life (not fun). For less explicitly narrative forms, such as music, feelings evoked are aesthetic and artistic in nature, which makes a sad song or tragic symphony enjoyable on its own merits without bleeding far into real sadness or tragedy. Cinema (now blurred with broadcast TV and streaming services) is the preeminent storytelling medium that provokes all manner of emotional response. After reaching a certain age (middle to late teens), emotional detachment from depiction of sexuality and violent mayhem makes possible digestion of such stimulation for the purpose of entertainment — except in cases where prior personal trauma is triggered. Before that age, nightmare-prone children are prohibited.

Dramatic conflict is central to driving plot and story forward, and naturally, folks are drawn to some stories while avoiding others. Although I’m detached enough not to be upset by, say, zombie films where people and zombies alike are dispatched horrifically, I wouldn’t say I enjoy gore or splatter. Similarly, realistic portrayals of war (e.g., Saving Private Ryan) are not especially enjoyable for me despite the larger story, whether based on true events or entirely made up. The primary reason I leave behind a movie or TV show partway through is because I simply don’t enjoy watching suffering.

Another category bugs me even more: when fiction intrudes on reality to remind me too clearly of actual horrors (or is it the reverse: reality intruding on fiction?). It doesn’t happen often. One of the first instances I recall was in Star Trek: The Next Generation when the story observed that (fictional) warp travel produced some sort of residue akin to pollution. The reminder that we humans are destroying the actual environment registered heavily on me and ruined my enjoyment of the fictional story. (I also much prefer the exploration and discovery aspects of Star Trek that hew closer to Gene Roddenberry’s original vision than the militaristic approach now central to Star Trek.) A much more recent intrusion occurs in the rather adolescent TV show The 100, where a global nuclear exchange launched by an artificial intelligence has the follow-on effect a century later of remaining nuclear sites going critical, melting down, and irradiating the Earth, making it uninhabitable. This bothers me because that’s my expectation of what happens in reality, probably not too long (decades) after industrial civilization collapses and most or all of us are dead. This prospect served up as fiction is simply too close to reality for me to enjoy vicariously.

Another example of fiction intruding too heavily on my doomer appreciation of reality occurred retroactively. As high-concept science fiction, I especially enjoyed the first Matrix movie. Like Star Trek, the sequels degraded into run-of-the-mill war stories. But what was provocative about the original was the matrix itself: a computer-generated fiction situated within a larger reality. Inside the matrix was pleasant enough (though not without conflict), but reality outside the matrix was truly awful. It was a supremely interesting narrative and thought experiment when it came out in 1999. Now twenty-one years later, it’s increasingly clear that we are living in a matrix-like, narrative-driven hyperreality intent on deluding us into perceiving a pleasant equilibrium that simply isn’t in evidence. In fact, as societies and as a civilization, we’re careening out of control, no brakes, no steering. Caitlin Johnstone explores this startling after-the-fact realization in an article at Medium.com, which I found only a couple days ago. Reality is in fact far worse than the constructed hyperreality. No wonder no one wants to look at it.

This unwritten blog post has been sitting in my drafts folder since October 2019. The genesis, the kernel, is that beyond the ongoing collapse of the ecosystem, the natural world that provides all the resources upon which we humans and other organisms rely for life and survival, all other concerns are secondary. Now 5–6 months later, we’re faced with a short- to mid-term crisis that has transfixed and paralyzed us, riveting all attention on immediate pressures, not least of which is ample supplies of paper with which to wipe our asses. Every day brings news of worsening conditions: rising numbers of infection; growing incidence of death; sequestering and quarantining of entire cities, states, and countries; business shutdowns; financial catastrophe; and the awful foreknowledge that we have a long way to go before we emerge (if ever) back into daylight and normalcy. The Age of Abundance (shared unequally) may be gone forever.

Are we mobilizing fully enough to stop or at least ameliorate the pandemic? Are our democratically elected leaders [sic] up to the task of marshaling us through the (arguably) worst global crisis in living memory? Are regular folks rising to the occasion, shouldering loss and being decent toward one another in the face of extraordinary difficulties? So far, my assessment would indicate that the answers are no, no, and somewhat. (OK, some municipal and state leaders have responded late but admirably; I’m really thinking of the early executive response that wasn’t). But let me remind: as serious as the immediate health crisis may be, the even larger civilizational collapse underway (alongside the current extinction process) has not yet been addressed. Sure, lots of ink and pixels have been devoted to studies, reports, books, articles, speeches, and blog posts about collapse, but we have blithely and intransigently continued to inhabit the planet as though strong continuity of our living arrangements will persist until — oh, I dunno — the end of the century or so. Long enough away that very few of us now alive (besides Greta Thunberg) care enough what happens then to forestall much of anything. Certainly not any of the real decision-makers. Collapse remains hypothetical, contingent, theoretical, postulated, and suppositional until … well … it isn’t anymore.

While we occupy ourselves indoors at a social distance for some weeks or months to avoid exposure to the scourge, I’d like to believe that we have the intelligence to recognize that, even in the face of a small (by percentage) reduction of global human population, all other concerns are still secondary to dealing with the prospect (or certainty, depending on one’s perspective) of collapse. However, we’re not disciplined or wise enough to adopt that view. Moreover, it’s unclear what can or should be done, muddying the issue sufficiently to further delay action being taken. Fellow blogger The Compulsive Explainer summarizes handily:

We have been in an emergency mode for some time, and are now just recognizing it. This time it is a virus that is killing us, but we have been dying for a long time, from many causes. So many causes, they cannot be enumerated individually.

So for the near term, life goes on; for the farther term, maybe not.

As we prepare to hunker down for the Long Emergency (using Kunstler’s apt term), there has been a veritable stampede for the exits, which takes multiple forms as the U.S. anticipates an exponential rise in the viral epidemic, roughly a week behind Italy’s example. It wouldn’t surprise me to see curfews and/or martial law enacted before long. But then, I’m an avowed doomer and have expected something wild and woolly to transpire for some years now. It was always futile to predict either what or when with any specificity. The number of possible scenarios is simply too great. But the inevitability of some major disruption was (to me at least) quite obvious. Whether the COVID-19 pandemic develops into a megadeath pulse remains to be seen. I cannot predict any better than most.

In the meantime, panic buying of toilet paper (an irrational essential I joked about here) and prophylactics such as surgical masks and alcohol swabs; widespread cancellation of concerts, sports events, school sessions, and church services; press releases by every public-facing corporate entity as to their hygienic response to the virus; crazy fluctuations in the U.S. and international stock markets; and exhortations to stay home if at all possible attest to the seriousness of the threat. The velocity of the stock market crash in particular points to a mad stampede to get out before being crushed. Our collective response seems to me exaggerated, but perhaps it’s necessary to forestall the worst-case scenario of letting things run rampant. It’s possible that quarantines and a major economic slowdown will do more damage than the virus, making the cure worse than the disease. That’s a hypothetical to which we will probably never know the answer with certainty, though the United Kingdom may be running that very experiment. Also, Guy McPherson suggests that a 20% reduction in industrial activity will be enough to trigger an abrupt rise in global average temperature, further negatively affecting habitat. However, it’s a Catch-22 precisely because sustained industrial activity is already destroying habitat.

In nature, there are several familiar waves far too powerful to stop or control: earthquakes, tsunamis, and hurricanes. I suppose we should now acknowledge another: pandemic diseases. While it’s sensible to seek to understand what’s happening even as it happens, I can’t help but wonder whether resistance is futile and letting the wave crash over us is roughly equivalent to before-the-fact mobilization. Pop psychology would have us do something, not nothing, as an antidote to despair, and indeed, abandoning people to their fates has a callous feel to it — the sort of instrumental logic characteristic of tyrants. I’m not recommending it. On the upside, after the initial panic at the sight of the approaching wave, and shortly after the wave hits, we humans demonstrate a remarkable capacity to set aside differences and pull together to offer aid and comfort. We rediscover our common humanity. Maybe Mad Max-style dystopias are just fiction.

The crisis consists precisely in the fact that the old is dying and the new
cannot be born; in this interregnum a great variety of morbid symptoms appear.

Antonio Gramsci

As a kid, I was confused when during some TV drama I heard the phrase “The king is dead; long live the king!” I was interpreting events too literally: the king had just died, so how could his subjects proclaim for him long life? Only when age awarded me greater sophistication (probably not wisdom, though) did I realize that the phrase connotes the end of one era and the start of another. Old regent dies; new regent assumes power. We’re in the midst of such a transition from one era to the next, though it isn’t marked clearly by the death of a leader. Indeed, when I launched this blog in 2006, that was what I sensed and said so plainly in the About Brutus link at top, which hasn’t changed since then except to correct my embarrassing typos. I initially thought the transition would be about an emerging style of consciousness. Only slightly later, I fell down the rabbit hole regarding climate change (an anthropogenic, nonlinear, extinction-level process). I still believe my intuitions and/or conclusions on both subjects, but I’ve since realized that consciousness was always a moving target and climate change could unfold slowly enough to allow other fundamental shifts to occur alongside. No promises, though. We could also expire rather suddenly if things go awry quickly and unexpectedly. At this point, however, and in a fit of overconfidence, I’m willing to offer that another big transition has finally come into focus despite its being underway as I write. Let me explain. In his book America: The Farewell Tour (2018), Chris Hedges writes this:

Presently, 42 percent of the U.S. public believes in creationism … [and] nearly a third of the population, 94 million people, consider themselves evangelical. Those who remain in a reality-based universe do not take seriously the huge segment of the public, mostly white and working-class, who because of economic distress have primal yearnings for vengeance, new glory, and moral renewal and are easily seduced by magical thinking … The rational, secular forces, those that speak in the language of fact and reason, are hated and feared, for they seek to pull believers back into “the culture of death” that nearly destroyed them. The magical belief system, as it was for impoverished German workers who flocked to the Nazi Party, is an emotional life raft. It is all that supports them. [pp. 50–51]

That’s where we are now, retreating into magical thinking we supposedly left behind in the wake of the Enlightenment. Call it the Counter-Enlightenment (or Un-Enlightenment). We’re on this track for a variety of reasons but primarily because the bounties of the closing Age of Abundance have been gobbled up by a few plutocrats. Most of the rest of the population, formerly living frankly precarious lives (thus, the precariat), have now become decidedly unnecessary (thus, the unnecessariat). The masses know that they have been poorly served by their own social, political, and cultural institutions, which have been systematically hijacked and diverted into service of the obscenely, absurdly rich.

Three developments occurring right now, this week, indicate that we’re not just entering an era of magical thinking (and severely diminishing returns) but that we’ve lost our shit, gone off the deep end, and sought escape valves to release intolerable pressures. It’s the same madness of crowds writ large — something that periodically overtakes whole societies, as noted above by Chris Hedges. Those developments are (1) the U.S. stock market (and those worldwide?) seesawing wildly on every piece of news, (2) deranged political narratives and brazenly corrupt machinations that attempt to, among other things, install the preferred Democratic presidential candidate to defeat 45, and (3) widespread panic over the Covid-19 virus. Disproportionate response to the virus is already shutting down entire cities and regions even though the growing epidemic so far in the U.S. has killed fewer people than, say, traffic accidents. Which will wreak the worst mayhem is a matter of pointless conjecture since the seriousness of the historical discontinuity will require hindsight to assess. Meanwhile, the king is dead. Long live the king!