Archive for the ‘Corporatism’ Category

On the heels of a series of snowstorms, ice storms, and deep freezes (mid-Feb. 2021) that have inundated North America and knocked out power to millions of households and businesses, I couldn’t help but notice inane remarks and single-pane comics to the effect of “wish we had some global warming now!” Admittedly, things are looking distinctly apocalyptic as folks struggle with deprivation, hardship, and existential threats. However, the common mistake here is to substitute one thing for another, failing to distinguish weather from climate.

National attention is focused on Texas, expected to be declared a disaster zone by Pres. Biden once he visits (a flyover, one suspects) to survey and assess the damage. Yet current events are hardly without precedent. Texas has been in the cross-hairs for decades, suffering repeated droughts, floods, fires, and hurricanes that used to carry the prefix 50-year or 100-year. One or another now occurs practically every year, which is exactly what climate chaos delivers. And in case the deep freeze and busted water pipes all over Texas appear to have been unpredictable, this very thing happened in Arizona in 2011. That might have been a shot across the bow for Texas to learn from and prepare for, but its self-reliant, gun-totin’, freedom-lovin’ (fuck, yeah!), secessionist character is instead demonstrated by having its own electrical grid covering most of the state, separated from other North American power grids, ostensibly to skirt federal regulations. Whether that makes Texas’ grid more or less vulnerable to catastrophic failure is an open question, but events of the past week tested it sorely. It failed badly. People literally froze to death as a result. Some reports indicate Texas was mere moments away from an even greater failure that would have meant months to rebuild and reestablish electrical service. A substantial diaspora would have ensued, essentially meaning more climate refugees.

So where’s the evil in this? Well, let me tell you. Knowledge that we humans are on track to extirpate ourselves via ongoing industrial activity has been reported and ignored for generations. Guy McPherson’s essay “Extinction Foretold, Extinction Ignored” has this to say at the outset:

The warnings I will mention in this short essay were hardly the first ones about climate catastrophe likely to result from burning fossil fuels. A little time with your favorite online search engine will take you to George Perkins Marsh sounding the alarm in 1847, Svante Arrhenius’s relevant journal article in 1896, Richard Nixon’s knowledge in 1969, and young versions of Al Gore, Carl Sagan, and James Hansen testifying before the United States Congress in the 1980s. There is more, of course, all ignored for a few dollars in a few pockets. [links in original]

My personal acquaintance with this large body of knowledge began accumulating in 2007 or so. Others with decision-making capacity have known for much, much longer. Yet short-term motivations shoved aside the responsible planning and preparation that are precisely the warrant of governments at all levels, especially, say, the U.S. Department of Energy. Sure, climate change is reported as controversy, or worse, as conspiracy, but in my experience, only a few individuals are willing to speak the obvious truth. They are often branded kooks. Institutions dither, distract, and even issue gag orders to, oh, I dunno, prop up real estate values in south Florida soon to be underwater. I’ve suggested repeatedly that U.S. leaders and institutions should be acting to manage contraction and alleviate suffering as best as possible, knowing that civilization will fail anyway. To pretend otherwise and guarantee — no — drive us toward worst-case scenarios is just plain evil. Of course, the megalomania of a few tech billionaires who mistakenly believe they can engineer around society’s biggest problems is just as bad.

Writ small (there’s a phrase no one uses denoting narrowing scope), meaning at a scale less than anthropogenic climate change (a/k/a unwitting geoengineering), American society has struggled to prioritize guns vs. butter for over a century. The profiteering military-industrial complex has clearly won that debate, leaving infrastructure projects, such as bridge and road systems and public utilities, woefully underfunded and extremely vulnerable to market forces. Refusal to recognize public health as a right or public good demanding a national health system (like other developed countries have) qualifies as well. As inflated Pentagon budgets reveal, the U.S. never lacks money to oppress, fight, and kill those outside the U.S. Inside the U.S., however, cities and states fall into ruin, and American society is allowed to slowly unwind for lack of support. Should we withdraw militarily from the world stage and focus on domestic needs, such as homelessness and joblessness? Undoubtedly. Would that leave us open to attack or invasion (other than the demographic invasion of immigrants seeking refuge in the U.S.)? Highly doubtful. Other countries have their own domestic issues to manage and would probably appreciate a cessation of interference and intervention from the U.S. One might accuse me of substituting one thing for another, as I accused others of doing at the top, but the guns-vs.-butter debate is well established. It should be obvious that it’s preferable to prioritize caring for our own society rather than devoting so much of our limited time and resources to destroying others.

So far, this multipart blog post has trafficked in principles and generalities. Let me try now to be more specific, starting with an excerpt from Barry Lynn’s article in Harper’s Magazine titled “The Big Tech Extortion Racket” (Sept. 2020):

… around the middle of the nineteenth century, Americans began to develop technologies that could not be broken into component pieces. This was especially true of the railroad and the telegraph … Such corporations [railroad and telegraph companies] posed one overarching challenge: they charged some people more than others to get to market. They exploited their control over an essential service in order to extort money, and sometimes political favors … Americans found the answer to this problem in common law. For centuries, the owners of ferries, stagecoaches, and inns had been required to serve all customers for the same price and in the order in which they arrived. In the late nineteenth century, versions of such “common carrier” rules were applied to the new middleman corporations.

Today we rightly celebrate the Sherman Antitrust Act of 1890, which gave Americans the power to break apart private corporations. But in many respects, the Interstate Commerce Act of 1887 was the more important document. This act was based on the understanding that monopoly networks like the railroad and the telegraph could be used to influence the actions of people who depend on them, and hence their power must be carefully restricted …

For a century and a half, Americans used common carrier policies to ensure the rule of law in activities that depended on privately held monopolies … regulations freed Americans to take full advantage of every important network technology introduced during these years, including telephones, water and electrical services, energy pipelines, and even large, logistics-powered retailers. Citizens did not have to worry that the men who controlled the technologies involved would exploit their middleman position to steal other people’s business or disrupt balances of power.

I appreciate that Barry Lynn brings up the Interstate Commerce Act. If this legal doctrine appeared in the net neutrality debate a few years ago, it must have escaped my notice. While Internet Service Providers (ISPs) enable network access and connectivity, those utilities have not yet exhibited let’s-be-evil characteristics. Similarly, phone companies (including cell phones) and public libraries may well be eavesdropping and/or monitoring activities of the citizenry, but the real action lies elsewhere, namely, on social media networks and with online retailers. Evil is arguably concentrated in the FANG (or FAANG) corporations but has now grown to be ubiquitous in all social networks (e.g., Twitter) operating as common carriers (Zoom? Slack?) and across academe, nearly all of which have succumbed to moral panic. Sad to observe, they are correctly interpreting demands to censor and sanitize others’ no-longer-free speech appearing on their networks or within their realms. How much deeper it goes toward shaping politics and social engineering is quasi-conspiratorial and impossible for me to assess.

Much as I would prefer to believe that individuals possess the good sense to shift their activities away from social networks or turn their attention from discomfiting information sources, that does not appear to be the case. Demands for trigger warnings and safe spaces commonplace a few years ago on college campuses have instead morphed into censorious removal, deplatforming, and cancellation from the entire public sphere. Those are wrong responses in free societies, but modern institutions and technologies have gotten out of hand and outstripped the limits of normal human cognition. In short, we’re a society gone mad. So rather than accept responsibility to sort out information overflow oneself, many are demanding that others do it for them, and evil private corporations are complying (after a fashion). Moreover, calls for creation of an Orwellian Ministry of Truth, rebranded as a Truth Commission and Reality Czar, could hardly be any more chillingly and fascistically bizarre. People really need someone to decide (read: brainwash) for them what is real? Has anyone at the New York Times actually read Orwell’s dystopian novel 1984 and taken to heart its lessons?

The acceleration of practically everything in the late-modern world (postmodern refers to something quite different) makes planning one’s higher education somewhat fraught: the subject matter studied may be rendered flatly out-of-date or moribund by the time of graduation or entry into the workforce. Accordingly, I’ve heard it recommended that expertise in any particular subject area may be less important than developing expertise in at least one subject that takes a systems approach. That system might be language and communications, mathematics (or any other hard science), history, economics and finance, business administration, computer coding, law and governance, etc. So long as a rigorous understanding of procedures and rules is developed, a structuralist mindset can be repeated and transferred into other subject areas. Be careful, however, not to conflate this approach with a liberal arts education, which is sometimes described as learning how to learn and is widely applicable across disciplines. The liberal arts have fallen distinctly out of favor in the highly technological and technocratic world, which cares little for human values resistant to quantification. Problem is, Western societies in particular are based on liberal democratic institutions now straining due to their sclerotic old age. And because a liberal arts education is scarcely undertaken anymore, civics and citizenship are no longer taught. Even the study of English has now been corrupted (postmodern does apply here) to the point that the basic liberal arts skill of critical thinking is being lost through attrition. Nowhere is that more abundantly clear than in bristling debate over free speech and censorship.

Aside. Although society tinkers and refines itself (sometimes declines) over time, a great body of cultural inheritance informs how things are done properly within an ideology or system. When tinkering and refinement become outright intransigence and defiance of an established order, it’s commonplace to hear the objection “but that’s not how _______ works.” For instance, debate over climate science or the utility of vaccines often has one party proclaiming “trust [or believe] the science.” However, that’s not how science works (i.e., through unquestioning trust or belief). The scientific method properly understood includes verification, falsification, and revision when results and assertions fail to establish reasonable certainty (not the same as consensus). Similarly, critical thinking includes a robust falsification check before “facts” can be accepted at face value. So-called “critical studies” (a/k/a grievance studies), like religious faith, typically position bald assertions beyond the reach of falsification. Well, sorry, that’s not how critical thinking works.

Being older and educated before critical studies were fully legitimized (or gave rise to things as risible as feminist glaciology), my understanding has always been that free speech and other rights are absolutes that cannot be sliced and diced into bits. That way lies casuistry, where law founders frequently. Thus, if one wishes, say, to trample or burn the U.S. flag in protest, no law can be passed or constitutional amendment enacted to carve out an exception disallowing that instance of dissenting free speech. A lesser example is kneeling silently rather than participating in singing the national anthem before a sporting event. Though offensive to certain individuals’ sensibilities, silencing speech is far worse according to liberal democratic values. Whatever our ideological or political differences are, we cannot work them out when one party has the power to place topics out of bounds or remove others from discussion entirely. The point at which spirited debate crosses over into inciting violence or fomenting insurrection is a large gray area, which is the subject of the second impeachment of 45. Civil law covers such contingencies, so abridging free speech, deplatforming, and adopting the formulation “language is violence” are highly improper responses under the liberal form of government codified in the U.S. Constitution, including the Bill of Rights, originally omitted but quickly added to articulate those rights fully.

Liberal democratic ideology arose in mercantile, substantially agrarian Western societies before scientific, industrial, and capitalist revolutions built a full head of steam, so to speak. Considering just how much America has developed since the Colonial Period, it’s no surprise society has outgrown its own founding documents. More pointedly, the intellectual commons was a much smaller environment, often restricted to a soapbox in the town square and the availability of books, periodicals, and broadsides. Today, the public square has moved online to a bewildering array of social media platforms that enable publication of one’s ideas well beyond the sound of one’s voice over a crowd or the bottleneck of a publisher’s printing press. It’s an entirely new development, and civil law has not kept pace. Whether Internet communications are regulated like the airwaves or nationalized like the U.S. military, it’s clear that the Wild West uber-democratic approach (where anyone can basically say anything) has failed. Demands for regulation (restrictions on free speech) are being taken seriously and acted upon by the private corporations that run social media platforms. During this interim phase, it’s easy for me, as a subscriber to liberal democratic values, to insist reflexively on free speech absolutism. The apparent mood of the public lies elsewhere.

Evil exists in the world. History and current events both bear this out amply. Pseudo-philosophers might argue that, like emotions and other immaterial sensations, good and evil are merely reified concepts, meaning they are human constructs with no palpable external reality. Go tell that to victims of evildoers. Human suffering can’t be anonymized, rationalized, or philosophized away quite so handily.

It was sort of refreshing, back in the day, when Google’s motto and/or corporate code of conduct was simple: “Don’t Be Evil.” It acknowledged the potential for being or becoming evil (like any of the Bigs: Big Tobacco, Big Soda, Big Oil, Big Pharma, Big Media, Big Agriculture, etc.) and presumably aspired to resist obvious temptations. That was then (from 2000 to 2018), this is now (2021 until death take us — soon enough, I fear). But like all entities possessed of absurd levels of wealth and power, Google (now reorganized as a subsidiary of Alphabet, but who actually refers to it that way?) and its Silicon Valley brethren have succumbed to temptation and become straight-up evil.

One might charitably assess this development as something unbidden, unanticipated, and unexpected, but that’s no excuse, really. I certainly don’t envy celebrity executives experiencing difficulty from having created unmanageable behemoths loosed on a public and polity unable to recognize the beast’s fangs until already clamped on their necks. As often occurs, dystopian extrapolations are explored in fiction, sometimes satirically. The dénouement of the HBO show Silicon Valley depicts tech mogul wannabes succeeding in creating an AI (or merely a sophisticated algorithm? doesn’t matter …) that would in time become far too powerful in blind execution of its inner imperative. In the show, characters recognize what they had done and kill their own project rather than allow it to destroy the world. In reality, multiple developers of computer tech platforms (and their embedded dynamics, including the wildly unhelpful albeit accurate term algorithm) lacked the foresight to anticipate awful downstream effects of their brainchildren. Yet now that those effects are manifesting recognizably, these corporations continue to operate and wreak havoc.

Silicon Valley shows an extended software-development period of bungling ineptitude punctuated by brilliant though momentary breakthroughs. Characters are smart, flawed people laughably unable to get out of the way of their own success. The pièce de résistance was yoking one so-called “learning machine” to another and initiating what would become a runaway doomsday process (either like ecological collapse, building slowly then making the biosphere uninhabitable all at once, or like the gray goo problem, progressively “processing” biomass at the molecular level until all that remains is lifeless goo). It was a final act of bumbling that demanded the characters’ principled, ethical response before the window of opportunity closed. Real Silicon Valley tech platforms are in the (ongoing) process of rending the social fabric, which is no laughing matter. The issue du jour surrounds free speech and its inverse, censorship. More broadly, real Silicon Valley succeeded in gaming human psychology for profit in at least two aspects (could be more as yet unrecognized): (1) mining behavioral data as an exploitable resource, and (2) delivering inexhaustible streams of extremely divisive content (not its own) to drive persistent engagement with its platforms. Yoked together, they operate to drive society mad, and yet, mounting evidence of this development has not produced even an inkling that maybe the damned doomsday devices ought to be shut off. As with the environment, we operate with freedom enough to destroy ourselves. Instead, politicians issue stunningly ineffectual calls for regulation or break-up of monopolies. In the meantime, ever more absurd wealth and power are concentrated in the hands of a few executives who have clearly punted and decided “let’s be evil.” No restraints on their behavioral experimentation across whole societies exist.
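As a toy illustration (mine alone, not drawn from any platform’s actual code), here is how those two aspects yoke together into a feedback loop: content gets ranked by predicted engagement, the resulting behavior is mined as training data, and the ranker is retrained toward whatever engaged, which in this caricature means ever more divisive material.

```python
# Toy sketch (illustrative assumptions throughout, not any platform's code):
# an engagement-ranked feed yoked to a model retrained on mined behavior.
import random

random.seed(42)

def user_engagement(divisiveness: float) -> float:
    # Assumption baked in: divisive content reliably engages more, plus noise.
    return 0.8 * divisiveness + 0.2 * random.random()

weight = 0.0  # the model's learned taste for divisiveness (starts neutral)

for round_num in range(5):
    candidates = [random.random() for _ in range(100)]  # divisiveness scores
    # Aspect (2): rank by predicted engagement, show the top of the feed.
    candidates.sort(key=lambda d: weight * d, reverse=True)
    shown = candidates[:10]
    # Aspect (1): mine the resulting behavior as an exploitable resource.
    engagement = [user_engagement(d) for d in shown]
    weight += 0.5 * (sum(engagement) / len(engagement))  # reward what engaged
    avg_shown = sum(shown) / len(shown)
    print(f"round {round_num}: avg divisiveness shown = {avg_shown:.2f}")
```

Run it and the average divisiveness of what gets shown climbs round over round; nothing in the loop ever asks whether that is good for anyone.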

Much more to say on this topic in additional parts to come.

Already widely reported but only just having come to my awareness is an initiative by Rolling Stone to establish a Culture Council: “an Invitation-Only Community of Influencers, Innovators, and Creatives.” The flattering terms tastemakers and thought leaders are also used. One must presume that submissions will be promotional and propaganda pieces masquerading as news articles. Selling advertising disguised as news is an old practice, but the ad usually has the notation “advertisement” somewhere on the page. Who knows whether submissions will be subject to editorial review?

To be considered for membership, candidates must sit in a senior-level position at a company generating at least $500K in annual revenue or have obtained at least $1M in total institutional funding.

Rolling Stone’s website doesn’t say so anywhere I can locate, but third-party reports indicate that members pay either a $1,500 annual fee and $500 submission fee (one-time? repeat?) or a flat $2,000 submission fee. Not certain which. Just to be abundantly clear, fees would be paid by the submitter to the magazine, reversing how published content is normally acquired (i.e., by paying staff writers and freelancers). I’d say this move by Rolling Stone is unprecedented, but of course, it’s not. However, it is a more brazen pay-to-play scheme than most and may be a harbinger of even worse developments to come.

Without describing fully how creative content (arts and news) was supported in the past, I will at least observe that prior to the rise of full-time creative professions in the 18th and 19th centuries (those able to scratch out a living on commissions and royalties), creative work was either a labor of love/dedication, typically remunerated very poorly if at all, or was undertaken through the patronage of wealthy European monarchs, aristocrats, and religious institutions (at least in the developing West). Unless I’m mistaken, self-sustaining news organizations and magazines came later. More recent developments include video news releases and crowdsourcing, the latter sometimes accomplished under the pretense of running contests. The creative commons is how many now operate (including me — I’ve refused to monetize my blog), and it is exploited ruthlessly by HuffPost (an infotainment source I ignore entirely), which (correct me if wrong) doesn’t pay for content but offers exposure as an inducement to journalists trying to develop a byline and/or audience. Podcasts, YouTube channels, and news sites also offer a variety of subscription, membership, and voluntary patronage (tipping) schemes to pay the bills (or hit it big if an outlier). Thus, business models have changed considerably over time and are in the midst of another major transformation, especially for news-gathering organizations and the music recording industry, both in marked retreat from their former positions.

Rolling Stone has always been a niche publication specializing in content that falls outside my usual scope of interest. I read Matt Taibbi’s reporting that appeared in Rolling Stone, but the magazine’s imprint (read: reputation) was not the draw. Now that Rolling Stone is openly soliciting content through paid membership in the Culture Council, well, the magazine sinks past irrelevance to active avoidance.

It’s always been difficult to separate advertising and propaganda from reliable news, and some don’t find it important to keep these categories discrete, but this new initiative is begging to be gamed by motivated PR hacks and self-promoters with sufficient cash to burn. It’s essentially Rolling Stone whoring itself out. Perhaps more worrying is that others will inevitably follow Rolling Stone’s example and sell their journalistic integrity with similar programs, effectively putting the final nails in their own coffins (via brand self-destruction). The models in this respect are the cozy, incestuous relationships between PACs, lobbying groups, think tanks, and political campaigns. One might assume that legacy publications such as Rolling Stone would have the good sense to retain as much of their valuable brand identity as possible, but the relentless force of corporate/capitalist dynamics is corrupting even the incorruptible.

Something in an online discussion brought me back to my days as a Boy Scout. (No, not that, with your nasty, nasty assumptions.) It was one of the first merit badges I earned: Citizenship in the Community (link to PDF). I can’t remember any of the content anymore (haven’t yet consulted the PDF), and indeed, looking back with the advantage of several decades of hindsight, I have a hard time imagining any of the (morality? ethics?) lessons learned back then having had much durable impact despite remembering an emerging confidence and awareness (a commonplace delusion of youth) of my position within the community. Still, I appreciate having had many Boy Scout character-building experiences, which led to simple and enduring understandings of ideals such as honor, duty, preparedness, service, forbearance, shouldering hardships, and perhaps most of all, accepting responsibility for others, particularly those younger and weaker. (I’m not claiming to be any sort of paragon of virtue. Cynicism and misanthropy may have wrecked that aspiration.) I never served in the military, but I surmise others learn similar lessons slightly later in life when more readily absorbed and not so easily forgotten. In the past decade plus, some may seek these lessons through participation in endurance sports or martial arts (if not distorted by bad instruction like in Cobra Kai), though the focus outward (i.e., toward community and mutual reliance) may not be as strong.

The subject came up in a discussion of participants in small-scale democracy, something I’ve always known is messy, unrewarding, thankless, and sometimes costly yet still necessary to be a good citizen contributing to one’s community. Many adults get their first taste of local democratic groups (read: self-governing) through parent groups like the Parent-Teacher Association (PTA). Or maybe it’s a performing arts organization, homeowners’ association, church council, social work hotline, self-help group, or cooperative. Doesn’t matter which. (Political activism and organizing might be something quite different. Hard to say.) Groups run on the good will and dedication of volunteered time and skills for the benefit of members of the community. As with any population, there are always free riders: those who contribute nothing but enjoy and/or extract benefits. In fact, if everyone were integrally involved, organizational complexity would become unmanageable. If activities of such groups seem like a piece of cake or vaguely utopian, just join one and see how different character types behave. Lotta dead wood in such organizations. Moreover, power mongers and self-aggrandizers often take over small-scale democracies and run them like private fiefdoms. Or difficult policy and finance discussions divide otherwise like-minded groups into antagonists. As I said, it’s a decidedly messy undertaking.

Members of the community outside of the executive group (typically a board of directors) also have legitimate interests. Maybe community members attend meetings to keep informed, or they weigh in online with unconstructive complaints and criticisms (or even mockery and trolling) but then refuse to contribute anything worthwhile. Indeed, boards often have difficulty recruiting new officers or participants because no one wants to take on responsibility and face potential criticism. I’ve also seen boards settle into the same few folks year after year, whose opinions and leadership grow stale and calcify.

Writ large, leadership skills learned through citizenship in the community rise to the equivalents of Boy Scout merit badges Citizenship in the Nation and Citizenship in the World (no links but searchable). Skills deployed at those strata would arguably require even greater wherewithal and wisdom, with stakes potentially being much higher. Regrettably, having just passed through an election cycle and change of leadership in the U.S., my dour assessment is that leadership has failed miserably on multiple issues. The two most significant involve how we fail to organize society for the benefit of all, namely, economic equality and resource sustainability. Once market forces came to bear on social organization and corporate entities grew too large to be rooted in community service anymore, greed and corruption destroyed high-minded ideals. More self-aggrandizers and careerists than ever (no names, fill in the blanks, they’re all famous — or infamous) rose to the tops of organizations and administrations, especially politics, news media, and the punditry. Their logical antidotes are routinely and ruthlessly disenfranchised and/or ignored. The lasting results are financial inequality run amok and unsustainable resource addictions (energy mostly) that are toxifying the environment and reducing the landscape to ruin and uninhabitability. (Perpetual war is a third institutional failure that could be halted almost immediately if moral clarity were somehow to appear.) It’s all out there, plain to see, yet continues to mount because of execrable leadership. Some argue it’s really a problem with human nature, a kind of original stain on our souls that can never be erased and so should be forgiven or at least understood (and rationalized away) within a larger context. I’m not yet ready to excuse national and world leaders. Their culpability is criminal.

I simply can’t keep up with all the reading, viewing, and listening in my queue. Waking hours are too few, and concentration dissipates long before sleep overtakes. Accordingly, it’s much easier to settle into couch-potato mode and watch some mindless drivel, such as the Netflix hit Bridgerton, binged in two sittings. (Unlike cinema critics, I’m not bothered especially by continuity errors, plot holes, clunky dialogue, weak character motivation, gaps of logic, or glossy decadence of the fictional worlds. I am bothered by the Kafka trap sprung on anyone who notices casting decisions that defy time and place — an ill-advised but now commonplace historical revisionism, like editing Mark Twain.) As a result, blog posts are less frequent than they might otherwise be as I pronounce upon American (or more broadly, Western) culture, trying vainly to absorb it as a continuously moving target. Calls to mind the phrase Après moi, le déluge, except that there is no need to wait. A deluge of entertainment, news, analysis, punditry, and trolling has buried everyone already. So rather than the more careful consideration I prefer to post, here are some hot takes.

The Irregular Aphorist. Caitlin Johnstone offers many trenchant observations in the form of aphorisms (some of which I’ve quoted before), all gathered under the subtitle Notes From The Edge Of The Narrative Matrix. The modifier irregular only means that aphorisms are a recurring but not constant feature. Her site doesn’t have a tag to that effect but probably ought to. Here’s one in particular that caught my attention:

Everything our species has tried has led us to a dying world and a society that is stark raving mad, so nobody is in any position to tell you that you are wrong.

Twin truths here are (1) the dying world and (2) societal madness, both of which I’ve been describing for some time. Glad when others recognize them, too.

Piling on. Though few still are willing to admit it, nonpharmaceutical interventions (NPIs, e.g., distancing, masks, and lockdowns) to stall or reduce the spread of the virus failed to achieve their objectives according to this study. Instead, NPIs piled on suffering no one could forestall. I read somewhere (no link) that the world is approaching half of total, cumulative deaths/infections predicted had nothing been done to impede the pandemic running its course. Adding in deaths of despair (numbers not entirely up to date), we’re using the wrong tools to fight the wrong battle. Of course, interventions opened up giant opportunities for power grabs and vulture capitalism, so the cynic in me shrugs and wonders half aloud “what did you expect, really?”

Growth of the Managerial Bureaucracy. A blog called Easily Distracted by Timothy Burke (never on my blogroll) publishes only a few times per year, but his analysis is terrific — at least when it doesn’t wind up being overlong and inconclusive. Since a student debt jubilee is back in the news (plenty of arguments pro and con), unintended consequences are anticipated in this quote:

When you set out to create elaborate tiers that segregate the deserving poor from the comfortable middle-class and the truly wealthy, you create a system that requires a massive bureaucracy to administer and a process that forces people into petitionary humiliation in order to verify their eligibility. You create byzantine cutoff points that become business opportunities for predatory rentiers.

Something similar may well be occurring with stimulus checks being issued pro rata (has anyone actually gotten one?), but at least we’re spared any petitionary humiliations. We get whatever the algorithms (byzantine cutoff points) dictate. How those funds will be gamed and attached is not yet clear. Stay alert.

No Defense of Free Speech. Alan Jacobs often recommends deleting, unsubscribing, and/or ignoring social media accounts (after his own long love-hate relationship with them), considering how they have become wholly toxic to a balanced psyche as well as principal enablers of surveillance capitalism and narrative control. However, in an article about the manorial elite, he completely loses the plot: absolutism is required in defense of free speech. It’s not sufficient to be blasé or even relieved when 45 is kicked off Twitter permanently or when multiple parties conspire to kill Parler. Establishing your own turf beyond the reach of Silicon Valley censors is a nice idea but frankly impractical. Isn’t that what whoever ran Parler (or posted there) must have thought? And besides, fencing off the digital commons these very entities created has catapulted them into the unenviable position of undemocratic, unelected wielders of monopolistic power and co-conspirators to boot. That’s what needs to be curtailed, not free speech.

The Taxonomic Apocalypse. Although drawn from fiction and thus largely hypothetical, a new book (coming late 2021) by Adam Roberts called It’s the End of the World: But What Are We Really Afraid Of? surveys doomsday stories and categorizes different versions of how it all ends. Alan Jacobs (yeah, him again — must have an advance copy of the manuscript) recommends it as “a delightful and provocative little book” but fails to grok two things: (1) these stories are rehearsals-cum-preparations for the real thing, and (2) the real thing really is bearing down on us implacably and so is no longer a mere hypothetical to contemplate and categorize for shits and grins. Despite acceptance of the eventualities that await all of us, reading Roberts’ taxonomy is not something I would expect to find delightful. Skip.

Narrative Collapse. Ran Prieur (no link) sometimes makes statements revealing an unexpected god’s-eye view:

[45] is a mean rich kid who figured out that if he does a good Archie Bunker impression, every lost soul with an authoritarian father will think he’s the messiah. We’re lucky that he cares only about himself, instead of having some crazy utopian agenda. But the power, and the agency, is with the disaffected citizens of a declining empire, tasting barbarism.

This is all about people wanting to be part of a group that’s part of a story. Lately, some of the big group-stories have been dying: sky father religion, American supremacy, the conquest of nature, the virtue of wealth-seeking. In their place, young and clumsy group-stories struggle and rise.

Collapse of certain fundamental stories that animate our thinking is at the core of The Spiral Staircase (see About Brutus at top), though it’s often couched in terms of consciousness in transition. Getting through the transition (only temporarily, see previous item in list) probably means completion of the Counter-Enlightenment historical arc, which necessarily includes further descent into barbarism.

Hail Mary for Individualism. I always take special notice when someone cites Allan Bloom. Alan Jacobs (um, yeah, he’s prolific and I’m using his ideas again — sue me) cites Bloom to argue that individualism or the sovereign self, a product of the Enlightenment, is already dead. No doubt, the thought-world described so ably by Bloom no longer exists, but individualism has not yet died out by attrition or been fully dissolved in nonduality. Many of us born before the advent of the Internet retain selfhood and authenticity not yet coopted by or incorporated into mass mind. Moreover, ongoing struggles over identity (e.g., gender, sexual orientation, and race that are often used improperly to define the self) result from an inchoate sense that individualism is eroding precipitously, not that it’s already passé. Defiant attempts to (re)establish an authentic self (contravening all logic and becoming critical theory of one sort or another) in the face of this loss may well be a last-ditch effort to save the self, but it’s failing.

Unlike turtles, humans do not have protective shells into which we can withdraw when danger presents. Nor can we lift off, fly away, and elude danger the way birds do. These days, we’re sorely beset by an invisible pandemic spread by exposure to carriers (read: other people) and so asked or forced to submit to being locked down and socially distanced. Thus, we withdraw into the protective shell of the home in cycles of varying intensity and obeisance to maintain health and safety. Yet life goes on, and with it, numerous physical requirements (ignoring psychological needs) that can’t be met virtually demand we venture out into the public sphere to gather resources, risking exposure to the scourge. Accordingly, the conduct of business has adapted to enable folks to remain in the protective shells of their vehicles, taking delivery through the car window and rarely if ever entering a brick-and-mortar establishment except in defiance or at the option of acceptable risk. In effect, we’re being driven into our cars ever more, and the vehicle is readily understood as a proxy for its inhabitant(s). Note how pictures of people in bread lines during the Great Depression have been replaced by pictures of cars lined up for miles during the pandemic to get packaged meals from charitable organizations.

Reflecting on this aspect of modern life, I realized that it’s not exactly novel. The widespread adoption of the individual vehicle in the 1940s and 50s, as distinguished from mass transit, and the construction of the interstate highway system promised (and delivered) flexibility and freedom of tremendous appeal. While the shift into cars (along with air travel) doomed now-moribund passenger rail (except intracity in the few American cities with effective rail systems), it enabled the buildout of suburbs and exurbs now recognized as urban sprawl. And like all those packages now clogging delivery systems as we shift even more heavily to online shopping during the holiday season, a loss of efficiency was inevitable. All those individual cars and boxes create congestion that cries out for solutions.

Among the solutions (really nonsolutions) were the first drive-through banks of the 1970s. Is doing one’s banking without leaving the vehicle’s protective shell really an efficiency? Or is it merely an early acknowledgement and enabling of antisocial individualism? Pneumatic tubes that permitted drive-through banking did not speed up transactions appreciably, but the novel mechanism undoubtedly reinforced the psychological attachment Americans felt with their cars. That growing attachment was already apparent in the 1950s, with two bits of Americana from that decade still resonating: the drive-in theater and the drive-in restaurant. The drive-in theater was a low-fidelity efficiency and alternative to the grand movie houses built in the 1920s and 30s seating a few thousand people in one cavernous space. (A different sort of efficiency enabling choice later transformed most cinema establishments into multiplexes able to show 8–10 titles instead of one, handily diminishing audiences of thousands to hundreds or even tens and robbing the group experience of much of its inherent power. Now that premium streaming content is delivered to screens at home and we are disallowed assembly into large audiences, we have instead become something far more inert — viewers — with fully anticipatable degradation of the entertainment experience notwithstanding the handsome technologies found within the comforts of the home.) I’ve heard that drive-ins are experiencing a renaissance of sorts in 2020, with Walmart parking lots converted into showplaces, at least temporarily, to resemble (poorly) group experience and social cohesion. The drive-in restaurant of the 1950s, with its iconic carhops (sometimes on roller skates), is a further example of enabling car culture to proliferate. Never mind that eating in the car is actually kinda sad and maybe a little disgusting as odors and refuse collect in that confined space. One might suspect that drive-ins were directed toward teenyboppers and cruisers of the 1950s exploring newfound freedom, mobility, and the illusion of privacy in their cars, parked in neat rows at drive-ins (and Lookout Points for smooch sessions) all across the country. However, my childhood memory was that it was also a family affair.

Inevitably, fast food restaurants followed the banks in the 1970s and quickly established drive-through lanes, reinforcing the degradation of the food experience into mere feeding (often on one’s lonesome) rather than dining in community. Curiously, the pandemic has made every restaurant still operating, even the upscale ones, a drive-through and forced those with and without dedicated drive-through lanes to bring back the anachronistic carhop to serve the congestion. A trip to a local burger joint in Chicago last week revealed 40+ cars in queue and a dozen or so carhops on the exterior directing traffic and making deliveries through the car window (briefly penetrating the protective shell) so that no one would have to enter the building and be exposed to virus carriers. I’ve yet to see a 2020 carhop wearing roller skates (now roller blades) or a poodle skirt.

Such arrangements are probably effective at minimizing pandemic risk and have become one of several new normals (discussion of political dysfunction deferred). Who can say how long they will persist? Still, it’s strange to observe the psychology of our response, even if only superficially and preliminarily. Car culture has been a curious phenomenon since at least the middle of the 20th century. New dynamics reinforcing our commitment to cars are surprising, perhaps, but a little unsurprising, too, considering how we made ourselves so dependent on them as the foundation of personal transportation infrastructure. As a doomer, I had rather expected that Peak Oil occurring around 2006 or so would spell the gradual (or sudden) end of happy motoring as prices at the pump, refusal to elevate standard fuel efficiency above 50 mpg, and climbing average cost of new vehicles placed individual options beyond the reach of average folks. However, I’ve been genuinely surprised by fuel costs sinking to new lows (below the cost of production, even bizarrely inverting to the point that producers paid buyers to take inventory) and continued attempts to engineer (only partially) around the limitations of Peak Oil, if not indeed Peak Energy. I continue to believe these are mirages, like the record-setting bull market of 2020 occurring in the midst of simultaneous economic, social, and health crises.

Black Friday has over the past decades become the default kickoff of annual consumer madness associated with the holiday season and its gift-giving tradition. Due to the pandemic, this year has been considerably muted in comparison to other years — at least in terms of crowds. Shopping has apparently moved online fairly aggressively, which is an entirely understandable result of everyone being locked down and socially distanced. (Lack of disposable income ought to be a factor, too, but American consumers have shown remarkable willingness to take on substantial debt when able in support of mere lifestyle.) Nevertheless, my inbox has been deluged over the past week with incessant Black Friday and Cyber Monday advertising. Predictably, retailers continue feeding the frenzy.

Uncharacteristically, perhaps, this state of affairs is not the source of outrage on my part. I recognize that we live in a consumerist, capitalist society that will persist in buying and selling activities even in the face of increasing hardship. I’m also cynical enough to expect retailers (and the manufacturers they support, even if those manufacturers are Chinese) to stoke consumer desire through advertising, promotions, and discount sales. It’s simply what they do. Why stop now? Thus far, I’ve seen no rationalizations or other arguments excusing how it’s a little ghoulish to be profiting while so many are clearly suffering and facing individual and household fiscal cliffs. Instead, we rather blandly accept that the public needs to be served no less by mass market retailers than by, say, grocery and utility services. Failure by the private sector to maintain functioning supply lines (including nonessentials, I suppose) during a crisis would look too much like the appalling mismanagement of the same crisis by local, state, and federal governments. Is it ironic that centralized bureaucracies reveal themselves as incompetent at the very same time they consolidate power? Or more cynically, isn’t it outrageous that they barely even try anymore to address the true needs of the public?

One of the questions I’ve posed unrhetorically is this: when will it finally become undeniably clear that instead of being geared to growth we should be managing contraction? I don’t know the precise timing, but the issue will be forced on us sooner or later as a result of radically diminishing return on investment (ROI) in the energy sector (compared to a century ago, say). In short, we will be pulled back down to earth from the perilous heights we scaled as resources needed to keep industrial civilization creaking along become ever more difficult to obtain. (Maybe we’ll have to start using the term unobtainium from the Avatar movies.) Physical resources are impossible to counterfeit at scale, unlike the bogus, enormous increase in the fiat money supply via debt creation. If/when hyperinflation makes us all multimillionaires because everything is grossly overvalued, the absurd paradox of being cash rich yet resource poor ought to wake up some folks.
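To put rough numbers on that diminishing return, here is a minimal sketch (my own back-of-the-envelope illustration, with assumed figures) using energy return on investment (EROI) as a stand-in: the net energy left over for society is 1 − 1/EROI, which barely moves at high ratios and then falls off a cliff.

```python
# Illustrative only: assumed EROI ratios, not measured data.
def net_energy_fraction(eroi: float) -> float:
    """Share of gross energy left for society after paying the
    energy cost of extraction: net = 1 - 1/EROI."""
    return 1.0 - 1.0 / eroi

# From legacy gushers down to marginal, hard-to-get resources.
for eroi in (100.0, 30.0, 10.0, 5.0, 2.0):
    print(f"EROI {eroi:>5.1f}:1 -> {net_energy_fraction(eroi):.0%} net energy")
```

At 100:1 the extraction cost is a rounding error; at 2:1 half of gross production goes right back into getting more, which is being pulled back down to earth in numerical miniature.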

I might have thought that the phrase divide and conquer originated in the writings of Sun Tzu or perhaps during the Colonial Period when so many Western European powers mobilized to claim their share of the New World. Not so. This link indicates that, beyond its more immediate association with Julius Caesar (Latin: divide et impera), the basic strategy is observed throughout antiquity. The article goes on to discuss narcissism, politics, and psychopathy as found in the employ of divide-and-conquer strategies, often in business competition. Knowing that our information environment is polluted with mis- and disinformation, especially online, I struggle to award much authority to some dude with a website, but that dude at least provides 24 footnotes (some of which are other Internet resources). This blanket suspicion applies to this dude (me), as well.

I also read (can’t remember where, otherwise I would provide a hyperlink — the online equivalent of a footnote) that Americans’ rather unique, ongoing, dysfunctional relationship with racism is an effective divide-and-conquer strategy deployed to keep the races (a sociological category, not a biological one) constantly preoccupied with each other rather than uniting against the true scourge: the owners and rulers (plus the military, technocrats, and managerial class that enable them). The historical illustration below shows how that hierarchy breaks down:

If the proportions were more statistically accurate, that bottom layer would be much, much broader, more like the 99% vs. the infamous 1% brought to acute awareness by the Occupy Movement. The specific distributions are probably impossible to determine, but it’s fair to say that the downtrodden masses are increasing in number as wealth inequality skews continuously and disproportionately to the benefit of the top quintile and higher. Is it really any question that those occupying the upper layers seek to stay balanced atop the confection like an ill-fated Jenga wedding cake? Or that the bottom layer is foundational?

If class warfare is the underlying structural conflict truly at work in the socioeconomic struggles plaguing the United States, race warfare is the bait to displace attention and blame for whatever befalls the masses. It’s divide and conquer, baby, and we’re falling for it like brawlers in a bar fight who don’t know why they’re fighting. (Meanwhile, someone just emptied the till.) On top, add the pandemic keeping people apart and largely unable to communicate meaningfully (read: face-to-face). As the U.S. election draws to a close, the major division among the American people is misunderstood primarily as red/blue (with associated Democratic and Republican memes, since neither party has bothered to present a coherent political platform). Other false dichotomies are at work, no doubt. So when election results are contested next week, expect to see lines drawn incorrectly between groups that are suffering equally at the hands of a different, hidden-in-plain-sight group only too happy to set off bar fights while keeping the focus off themselves. It’s a proven strategy.