I simply can’t keep up with all the reading, viewing, and listening in my queue. Waking hours are too few, and concentration dissipates long before sleep overtakes. Accordingly, it’s much easier to settle into couch-potato mode and watch some mindless drivel, such as the Netflix hit Bridgerton, binged in two sittings. (Unlike cinema critics, I’m not especially bothered by continuity errors, plot holes, clunky dialogue, weak character motivation, gaps of logic, or the glossy decadence of the fictional worlds. I am bothered by the Kafka trap sprung on anyone who notices casting decisions that defy time and place — an ill-advised but now commonplace historical revisionism akin to editing Mark Twain.) As a result, blog posts are less frequent than they might otherwise be as I pronounce upon American (or more broadly, Western) culture, trying vainly to absorb it as a continuously moving target. Calls to mind the phrase Après moi, le déluge, except that there is no need to wait. A deluge of entertainment, news, analysis, punditry, and trolling has already buried everyone. So rather than the more careful consideration I prefer to post, here are some hot takes.

The Irregular Aphorist. Caitlin Johnstone offers many trenchant observations in the form of aphorisms (some of which I’ve quoted before), all gathered under the subtitle Notes From The Edge Of The Narrative Matrix. The modifier irregular only means that aphorisms are a regular but not constant feature. Her site doesn’t have a tag to that effect but probably ought to. Here’s one in particular that caught my attention:

Everything our species has tried has led us to a dying world and a society that is stark raving mad, so nobody is in any position to tell you that you are wrong.

Twin truths here are (1) the dying world and (2) societal madness, both of which I’ve been describing for some time. Glad when others recognize them, too.

Piling on. Though few still are willing to admit it, nonpharmaceutical interventions (NPIs, e.g., distancing, masks, and lockdowns) to stall or reduce the spread of the virus failed to achieve their objectives according to this study. Instead, NPIs piled on suffering no one could forestall. I read somewhere (no link) that the world is approaching half the total, cumulative deaths/infections predicted had nothing been done to impede the pandemic running its course. Adding in deaths of despair (numbers not entirely up to date), we’re using the wrong tools to fight the wrong battle. Of course, interventions opened up giant opportunities for power grabs and vulture capitalism, so the cynic in me shrugs and wonders half aloud, “what did you expect, really?”

Growth of the Managerial Bureaucracy. A blog called Easily Distracted by Timothy Burke (never on my blogroll) publishes only a few times per year, but his analysis is terrific — at least when it doesn’t wind up being overlong and inconclusive. Since a student debt jubilee is back in the news (plenty of arguments pro and con), unintended consequences are anticipated in this quote:

When you set out to create elaborate tiers that segregate the deserving poor from the comfortable middle-class and the truly wealthy, you create a system that requires a massive bureaucracy to administer and a process that forces people into petitionary humiliation in order to verify their eligibility. You create byzantine cutoff points that become business opportunities for predatory rentiers.

Something similar may well be occurring with stimulus checks being issued pro rata (has anyone actually gotten one?), but at least we’re spared any petitionary humiliations. We get whatever the algorithms (byzantine cutoff points) dictate. How those funds will be gamed and attached is not yet clear. Stay alert.

No Defense of Free Speech. Alan Jacobs often recommends deleting, unsubscribing, and/or ignoring social media accounts (after his own long love-hate relationship with them) considering how they have become wholly toxic to a balanced psyche as well as principal enablers of surveillance capitalism and narrative control. However, in an article about the manorial elite, he’s completely lost the plot, namely, that absolutism is required in defense of free speech. It’s not sufficient to be blasé or even relieved when 45 is kicked off Twitter permanently or when multiple parties conspire to kill Parler. Establishing your own turf beyond the reach of Silicon Valley censors is a nice idea but frankly impractical. Isn’t that what whoever ran Parler (or posted there) must have thought? And besides, fencing off the digital commons these very entities created has catapulted them into undemocratic, unelected wielders of monopolistic power and co-conspirators to boot. That’s what needs to be curtailed, not free speech.

The Taxonomic Apocalypse. Although drawn from fiction and thus largely hypothetical, a new book (coming late 2021) by Adam Roberts called It’s the End of the World: But What Are We Really Afraid Of? surveys doomsday stories and categorizes different versions of how it all ends. Alan Jacobs (yeah, him again — must have an advance copy of the manuscript) recommends it as “a delightful and provocative little book” but fails to grok two things: (1) these stories are rehearsals-cum-preparations for the real thing, and (2) the real thing really is bearing down on us implacably and so is no longer a mere hypothetical to contemplate and categorize for shits and grins. Despite acceptance of the eventualities that await all of us, reading Roberts’ taxonomy is not something I would expect to find delightful. Skip.

Narrative Collapse. Ran Prieur (no link) sometimes makes statements revealing an unexpected god’s-eye view:

[45] is a mean rich kid who figured out that if he does a good Archie Bunker impression, every lost soul with an authoritarian father will think he’s the messiah. We’re lucky that he cares only about himself, instead of having some crazy utopian agenda. But the power, and the agency, is with the disaffected citizens of a declining empire, tasting barbarism.

This is all about people wanting to be part of a group that’s part of a story. Lately, some of the big group-stories have been dying: sky father religion, American supremacy, the conquest of nature, the virtue of wealth-seeking. In their place, young and clumsy group-stories struggle and rise.

Collapse of certain fundamental stories that animate our thinking is at the core of The Spiral Staircase (see About Brutus at top), though it’s often couched in terms of consciousness in transition. Getting through the transition (only temporarily, see previous item in list) probably means completion of the Counter-Enlightenment historical arc, which necessarily includes further descent into barbarism.

Hail Mary for Individualism. I always take special notice when someone cites Allan Bloom. Alan Jacobs (um, yeah, he’s prolific and I’m using his ideas again — sue me) cites Bloom to argue that individualism, or the sovereign self, a product of the Enlightenment, is already dead. No doubt, the thought-world described so ably by Bloom no longer exists, but individualism has not yet died out by attrition or been fully dissolved in nonduality. Many of us born before the advent of the Internet retain selfhood and authenticity not yet coopted by or incorporated into mass mind. Moreover, ongoing struggles over identity (e.g., gender, sexual orientation, and race, all often used improperly to define the self) result from an inchoate sense that individualism is eroding precipitously, not that it’s already passé. Defiant attempts to (re)establish an authentic self (contravening all logic and becoming critical theory of one sort or another) in the face of this loss may well be a last-ditch effort to save the self, but it’s failing.

I have a memory of John Oliver dressing down a room full of entertainment journalists (ugh …) asking him questions following his Emmy win a few years ago. The first few had failed to offer even perfunctory congratulations for his award but instead leapt straight into questions. After his demand that everyone observe basic courtesy by at least acknowledging the reason for their attention being focused on him, each dutifully offered their compliments, which Oliver accepted graciously, and a question and answer ensued. It was a worthy reminder (something I mistakenly believed superfluous when I was much younger) that we have a sophisticated set of manners developed over time to which we should all subscribe. Behaving otherwise (i.e., skipping straight to matters at hand) is boorish, clownish, rude, and unsophisticated. Thus, routine exchanges at the beginnings of most interviews intended for broadcast go something to the effect of “Thanks for appearing on the show” or “Nice to meet you” followed by “Pleased to be here” or “My pleasure.” It’s part of a formal frame, the introduction or prologue, bearing no significant content but needful for hosts and guests to acknowledge each other.

In the course of viewing many podcasts, often conducted by relative unknowns who nonetheless manage to attract someone of distinction to interview, I notice a tendency to geek out and succumb to effusive fandom. Even a little bit of that has the unfortunate effect of establishing an uneasy tension because the fan often becomes unhinged in the presence of the celebrity. Even when there is no latent threat of something going really wrong, the fanboi sometimes goes to such an extreme heaping praise and adulation on the interview subject that nothing else worthwhile occurs. Instead, one witnesses only the fanboi’s self-debasement. It makes me squirm watching someone figuratively fellating a celebrity (apologies for my coarseness, but that’s really what springs to mind), and those on the receiving end often look just as uncomfortable. There’s simply no good response to the gushing, screaming, fainting, delirious equivalent of a 15-year-old Beatles freak (from back in the day) failing to hold it together and being caught embarrassingly in flagrante delicto.

Like others, I admire some people for their extraordinary accomplishments, but I never describe myself as a fan. Rather, objects of my admiration fall uniformly into the category of heroes: people one shouldn’t scrutinize too closely lest their flaws be uncovered. Further, those times I’ve been in the presence of celebrities are usually the occasion of some discomfort precisely because celebrities’ fame invokes a false sense of intimacy (one might say oversharing), as details of their lives are in full public view. A balanced interaction is impossible because I know quite a bit about them whereas they know nothing about me, and topics gravitate toward the reasons for their celebrity. Most of us average folks feel compelled to acknowledge the films, trophies, recordings, awards, etc. that form their accomplishments, no matter how out of date. I’ve never been in the circumstance where a famous person, recognizing that I don’t recognize him or her (or don’t kowtow as expected), plays the celebrity card: “Don’t you know who I am?”

An interrelated effect is when someone has way too much money, that fortune clouding all interactions because it transforms the person into a target for those currying favor or otherwise on the make. Scammers, conmen, golddiggers, sycophants, etc. appear to extract wealth, and the dynamic breeds mutual distrust and wariness even in routine transactions. Chalk it up as another corrupting aspect of inequality run amok, this time affecting wannabes as well. In light of this, I suppose it’s understandable that rich, famous people are most comfortable among those similarly rich and famous, thus immune to envy and fandom (but not always). Everyone else is alienated. Weird sort of gilded cage to live in — not one that I admire.

I admit it: I’m a bit triggered. Storming of the U.S. Capitol Building last week, even though it was over in one day, sent a lot of us back to the drawing board, wondering how things could come to that. Not that civil unrest, attempted coups and secession, and even revolution haven’t been predicted for months. Still, the weirdness of this particular manifestation of citizen frustrations is hard to fathom. See, for instance, this blog post, which offers a reckoning not easy to face. Simply put, crowds that form into protests and physical occupations fully recognize their abandonment at the hands of oligarchs and political leaders and as a result act out their desperation and nihilism. Their question becomes “why not take over and occupy a building?” Doesn’t matter, nothing to lose anymore. It’s already all gone. Whether it’s a college administrative building, governor’s mansion, federal or state office building, or the U.S. Capitol Building, the sentiment appears to be the same: why the hell not? Doesn’t matter that there was no plan for what to do once the building was breached; doesn’t matter that it wasn’t occupied for long; doesn’t matter that property was damaged; doesn’t matter that lives were ruined and lost; doesn’t matter that no replacement government or executive was installed as a real coup or revolution would demand. Still works as an expression of outrage over the dysfunctions of society.

On the bright side, actual death and injury were quite limited compared to what might have obtained. Mayhem was largely limited to property destruction. Plus, it was a potent reminder to legislators (filmed scrambling for safety) that maybe they ought to fear backing the citizenry into corners with nowhere to turn. Conjecture that, had the racial make-up of the protesters been different, a massacre would have ensued remains just that: conjecture.


The end of every U.S. presidential administration is preceded by a spate of pardons and commutations — the equivalents of a get-out-of-jail-free card offered routinely to conspirators and collaborators with the outgoing executive and general-purpose crony capitalists. This practice, along with diplomatic immunity and supranational elevation of people (and corporations-as-people) beyond the reach of prosecution, is a deplorable workaround obviating the rule of law. Whose brilliant idea it was to offer special indulgence to miscreants is unknown to me, but it’s pretty clear that, with the right connections and/or with enough wealth, you can essentially be as bad as you wanna be with little fear of real consequence (a/k/a too big to fail a/k/a too big to jail). Similarly, politicians, whose very job it is to manage the affairs of society, are free to be incompetent and destructive in their brazen disregard for needs of the citizenry. Only modest effort (typically a lot of jawing directed to the wrong things) is necessary to enjoy the advantages of incumbency.

In this moment of year-end summaries, I could choose from among an array of insane, destructive, counter-productive, and ultimately self-defeating nominees (behaviors exhibited by elite powers that be) as the very worst, the baddest of the bad. For me, in the largest sense, that would be the abject failure of the rule of law (read: restraints), which has (so far) seen only a handful of high-office criminals prosecuted successfully (special investigations leading nowhere and failed impeachments don’t count) for their misdeeds and malfeasance. I prefer to be more specific. Given my indignation over the use of torture, that would seem an obvious choice. However, those news stories, including the ongoing torture of Julian Assange for essentially revealing truths cynics like me already suspected and now know to be accurate, have been shoved to the back burner, where they generate little heat. Instead, I choose war as the very worst, an example of the U.S. (via its leadership) being as bad as it can possibly be. The recent election cycle offered a few candidates who bucked the consensus that U.S. involvement in every unnecessary, undeclared war since WWII is justified. They were effectively shut out by the military-industrial complex. And as the incoming executive tweeted on November 24, 2020, America’s back, baby! Ready to do our worst again (read: some more, since we [the U.S. military] never stopped [making war]). A sizeable portion of the American public is aligned with this approach, too.

So rule of law has failed and we [Americans] are infested with crime and incompetence at the highest levels. Requirements, rights, and protections found in the U.S. Constitution are handily ignored. That means every administration since Truman has been full of war criminals, because torture and elective war are crimes. The insult to my sensibilities is far worse than the unaffordability of war, the failure to win or end conflicts, or the lack of righteousness in our supposed cause. It’s that we [America, as viewed from outside] are belligerent, bellicose aggressors. We [Americans] are predators. And we [Americans, but really all humans] are stuck in an adolescent concept of conduct in the world shared with animals that must kill just to eat. We [humans] make no humanitarian progress at all. But the increasing scale of our [human] destructiveness is progress if drones, robots, and other DARPA-developed weaponry impress.


I’ve reached another crossroads. Chalk it up to pandemic exhaustion at being mostly cooped up for the better part of a year. Of course, this state is on top of other sources of exhaustion (politics, doom, the grind of the news cycle) that drained my enthusiasm for things I used to do before meaningful (to me) endeavors were all cancelled and everyone was forced to search for meaning staring at surfaces (e.g., the walls, pages, and screens — especially screens for most Americans, I daresay). So as the year and decade draw to a close, I anticipate a spate of lists and summaries as we move into 2021 with the hope it won’t be worse than 2020 — a faint hope, I might add, since nothing has been resolved except perhaps (!) which listless septuagenarian gets to sit in the Oval Office. The jury is still out whether vaccines will have the intended effect.

Aside: The calendar is not a timer or odometer. So although we change the calendar to 2021, the new year is the first year of the new decade (third decade of the 21st century, obviously). We struggled with this issue at the end of the previous century/millennium, which turned when 2000 became 2001, not (as more popularly celebrated) when 1999 became 2000. This discrepancy is because calendars begin counting each month, year, etc. with 1, not 0. So the first ten counting numbers are 1–10, not 0–9, and all decades run from xx01 to xx10. However, timers and odometers begin counting at 0 and show elapsed intervals, so the first ten minutes or miles run from the start (at 0) to the end of 9, at which point the odometer in particular rolls to 10 and begins a new sequence. I realize I’m being a pointy-headed snoot about this, but it’s a relatively easy concept to understand. Innumeracy evident among the public is a microcosm for all the other easy concepts so badly misunderstood.
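
For the code-minded, here’s a minimal sketch in Python (my illustration, not anything from the aside itself) making the one-based convention concrete:

```python
def decade_of(year: int) -> tuple[int, int]:
    """Return the first and last years of the calendar decade containing `year`.

    Calendars count from 1, so decades run xx01-xx10 (e.g., 2021-2030),
    not xx00-xx09 the way a zero-based odometer would group them.
    """
    first = ((year - 1) // 10) * 10 + 1
    return first, first + 9

assert decade_of(2020) == (2011, 2020)  # 2020 closes the old decade...
assert decade_of(2021) == (2021, 2030)  # ...and 2021 opens the new one
assert decade_of(2000) == (1991, 2000)  # likewise, 2001 (not 2000) began the millennium
```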

I’ve admitted to feelings of exhaustion and defeat numerous times, and indeed, hope eludes me whilst a narrow group of things still produces enjoyment. But my blogroll is no longer one of those things. I recently wrote the following to an acquaintance of mine:

Now that collapse narratives have matured, over two decades old for some (about 14 years for me), I notice that existential threats are still too remote and contingent for most to do more than signal some vague level of awareness and/or concern before returning to normal life. A few who sank into it deeply recognized that nothing positive comes out of it and have retreated from public life, or at least ongoing tracking and reporting. Several of the sites I used to frequent for news and perspective have dried up, and my finding is that adding more awfulness to the pile doesn’t enhance my understanding anymore, so I’ve also largely stopped gathering information. I still cite collapse frequently at my doom blog, but I have other things to write about.

I’m one of those who sank into the collapse narrative rather deeply and blogged about it consistently. By now, the sole available positive outcome has manifested: the recognition (and with it, resignation) that nothing will or can be done to avert disaster. So I’m dumping the doom and inactive blogs from my blogroll. I’ll continue to blog about and bear witness to the gathering storm: the cascade failure of industrial civilization. It’s proven to be a more protracted process than expected (at least by me), but no promises that it will stall until the end of the century, circa 2100, for sea level to rise and flora and fauna to expire. Human habitat will continue to diminish decade by decade, and at some point, so will human population — already shown to be rather precariously perched on an illusory safety and security we take as business as usual. I’ll keep a couple of the respectable truth-telling blogs just to have something to which to link. I have no links to add at this point.

David Sirota, author of Back to our Future: How the 1980s Explain the World We Live in Now — Our Culture, Our Politics, Our Everything (2011), came to my attention (how else?) through a podcast. He riffed pretty entertainingly on his book, now roughly one decade old, like a rock ‘n’ roller stuck (re)playing his or her greatest hits into dotage. However, his thesis was strong and appealing enough that I picked up a copy (read: borrowed from the library) to investigate despite the datedness of the book (and my tardiness). It promised to be an easy read.

Sirota’s basic thesis is that memes and meme complexes (a/k/a memeplexes, though Sirota never uses the term meme) developed in the 80s and deployed through a combination of information and entertainment media (thus, infotainment) form the narrative background we take for granted in the early part of the 21st century. Children fed a steady diet of clichés, catchphrases, one-liners, archetypes, and story plots have now grown to adulthood and are scarcely able to peer behind the curtain to question the legitimacy or subtext of the narrative shapes and distortions imbibed during childhood like mother’s milk. The table of contents lists four parts (boldface section titles are Sirota’s; descriptive text is mine):

  • Liking Ike, Hating Woodstock. How the 50s and 60s decades were (the first?) assigned reductive demographic signifiers, handily ignoring the true diversity of experience during those decades. More specifically, the boom-boom 50s (economics, births) were recalled nostalgically in 80s TV and films while the 60s were recast as being all about those dirty, hairy hippies and their music, drugs, and sexual licentiousness, all of which had to be invalidated somehow to regain lost wholesomeness. The one-man promotional vehicle for this pleasing self-deception was Michael J. Fox, whose screen personae (TV and film) during the 80s (glorifying the 50s but openly shitting on the 60s) were instrumental in reforming attitudes about our mixed history.
  • The Jump Man Chronicles. How the Great Man Theory of History was developed through glorification of heroes, rogues, mavericks, and iconoclasts who came into their own during the 80s. That one-man vehicle was Michael Jordan, whose talents and personal magnetism were so outsized that everyone aspired to be “like Mike,” which is to say, a superhero elevated beyond mere mortal rules and thus immortalized. The effect was duplicated many times over in popular culture, with various entertainment icons and political operatives subverting thoughtful consideration of real-world problems in favor of jingoistic portrayals.
  • Why We (Continue to) Fight. How the U.S. military was rehabilitated after losing the Vietnam War, gifting us with today’s hypermilitarism and permanent wars. Two principal tropes were deployed to shape public opinion: the Legend of the Spat upon Veteran and the Hands Tied Behind Their Backs Myth. Each was trotted out reliably whenever we needed to misremember our past as fictionalized in the 80s.
  • The Huxtable Effect. How “America’s dad” helped accommodate race relations to white anxiety, primarily to sell a TV show. In contrast with various “ghetto TV” shows of the 70s that depicted urban working poor (various ethnicities), The Cosby Show presented an upscale black family who transcended race by simply ignoring the issue — a privilege of wealth and celebrity. The Obama campaign and subsequent administration copied this approach, pretending American society had become postracial despite his never truly being able to escape the modifier black because the default (no modifier needed) in America is always white. This is the most fraught part of the book, demonstrating that despite whatever instructions we get from entertainment media and pundits, we remain stuck in an unresolved, unhealed, inescapable trap.


Returning to the subject of this post, I asserted that the modern era frustrates a deep, human yearning for meaning. As a result, the Medieval Period, and to a lesser degree, life on the highroad, became narrative fixations. Had I time to investigate further, I would read C.S. Lewis’ The Discarded Image (1964), but my reading list is already overfull. Nonetheless, I found an executive summary of how Lewis describes the Medieval approach to history and education:

Medieval historians varied in that some of them were more scientific, but most historians tried to create a “picture of the past.” This “picture” was not necessarily based in fact and was meant more to entertain curiosity than to seriously inform. Educated people in medieval times, however, had a high standard for education composed of The Seven Liberal Arts of grammar, dialectic, rhetoric, arithmetic, music, geometry, and astronomy.

In the last chapter, Lewis summarizes the influence of the Medieval Model. In general, the model was widely accepted, meaning that most people of the time conformed to the same way of thinking. The model, he reiterates, satisfied imagination and curiosity, but was not necessarily accurate or factual, specifically when analyzed by modern thinkers.

Aside. Regular readers of The Spiral Staircase may also recognize how consciousness informs this blog post. Historical psychology offers a glimpse into worldviews of bygone eras, with the Medieval Period perhaps being the easiest to contemplate due to proximity. Few storytellers (cinema or literature) attempt to depict what the world was truly like in the past (best as we can know) but instead resort to an ahistorical modern gloss on how men and women thought and behaved. One notable exception may be the 1986 film The Name of the Rose, which depicts the emerging rational mind in stark conflict with the cloistered Medieval mind. Sword-and-sandal epics set in ancient Rome and Greece get things even more wrong.


Unlike turtles, humans do not have protective shells into which we can withdraw when danger presents. Nor can we lift off, fly away, and elude danger the way birds do. These days, we’re sorely beset by an invisible pandemic spread by exposure to carriers (read: other people) and so asked or forced to submit to being locked down and socially distanced. Thus, we are withdrawn into the protective shell of the home in cycles of varying intensity and obeisance to maintain health and safety. Yet life goes on, and with it, numerous physical requirements (ignoring psychological needs) that can’t be met virtually demand we venture out into the public sphere to gather resources, risking exposure to the scourge. Accordingly, the conduct of business has adapted to enable folks to remain in the protective shells of their vehicles, taking delivery through the car window and rarely if ever entering a brick-and-mortar establishment except in defiance or at the option of acceptable risk. In effect, we’re being driven into our cars ever more, and the vehicle is readily understood as a proxy for its inhabitant(s). Take note: pictures of people in bread lines during the Great Depression have been replaced by pictures of cars lined up for miles during the pandemic to get packaged meals from charitable organizations.

Reflecting on this aspect of modern life, I realized that it’s not exactly novel. The widespread adoption of the individual vehicle in the 1940s and 50s, as distinguished from mass transit, and the construction of the interstate highway system promised (and delivered) flexibility and freedom of tremendous appeal. While the shift into cars (along with air travel) doomed now-moribund passenger rail (except intracity in the few American cities with effective rail systems), it enabled the buildout of suburbs and exurbs now recognized as urban sprawl. And like all those packages now clogging delivery systems as we shift even more heavily during the holiday season to online shopping, a loss of efficiency was inevitable. All those individual cars and boxes create congestion that cries out for solutions.

Among the solutions (really nonsolutions) were the first drive-through banks of the 1970s. Is doing one’s banking without leaving the vehicle’s protective shell really an efficiency? Or is it merely an early acknowledgement and enabling of antisocial individualism? Pneumatic tubes that permitted drive-through banking did not speed up transactions appreciably, but the novel mechanism undoubtedly reinforced the psychological attachment Americans felt with their cars. That growing attachment was already apparent in the 1950s, with two bits of Americana from that decade still resonating: the drive-in theater and the drive-in restaurant. The drive-in theater was a low-fidelity efficiency and alternative to the grand movie houses built in the 1920s and 30s seating a few thousand people in one cavernous space. (A different sort of efficiency enabling choice later transformed most cinema establishments into multiplexes able to show 8–10 titles instead of one, handily diminishing audiences of thousands to hundreds or even tens and robbing the group experience of much of its inherent power. Now that premium streaming content is delivered to screens at home and we are disallowed assembly into large audiences, we have instead become something far more inert — viewers — with fully anticipatable degradation of the entertainment experience notwithstanding the handsome technologies found within the comforts of the home.) I’ve heard that drive-ins are experiencing a renaissance of sorts in 2020, with Walmart parking lots converted into showplaces, at least temporarily, to resemble (poorly) group experience and social connection. The drive-in restaurant of the 1950s, with its iconic carhops (sometimes on roller skates), is a further example of enabling car culture to proliferate. Never mind that eating in the car is actually kinda sad and maybe a little disgusting as odors and refuse collect in that confined space. One might suspect that drive-ins were directed toward teenyboppers and cruisers of the 1950s exploring newfound freedom, mobility, and the illusion of privacy in their cars, parked in neat rows at drive-ins (and Lookout Points for smooch sessions) all across the country. However, my childhood memory was that it was also a family affair.

Inevitably, fast food restaurants followed the banks in the 1970s and quickly established drive-through lanes, reinforcing the degradation of the food experience into mere feeding (often on one’s lonesome) rather than dining in community. Curiously, the pandemic has made every restaurant still operating, even the upscale ones, a drive-through and forced those with and without dedicated drive-through lanes to bring back the anachronistic carhop to serve the congestion. A trip to a local burger joint in Chicago last week revealed 40+ cars in queue and a dozen or so carhops on the exterior directing traffic and making deliveries through the car window (briefly penetrating the protective shell) so that no one would have to enter the building and expose themselves to virus carriers. I’ve yet to see a 2020 carhop wearing roller skates (now roller blades) or a poodle skirt.

Such arrangements are probably effective at minimizing pandemic risk and have become one of several new normals (discussion of political dysfunction deferred). Who can say how long they will persist? Still, it’s strange to observe the psychology of our response, even if only superficially and preliminarily. Car culture has been a curious phenomenon since at least the middle of the 20th century. New dynamics reinforcing our commitment to cars are surprising, perhaps, but a little unsurprising, too, considering how we made ourselves so dependent on them as the foundation of personal transportation infrastructure. As a doomer, I had rather expected that Peak Oil occurring around 2006 or so would spell the gradual (or sudden) end of happy motoring as prices at the pump, refusal to elevate standard fuel efficiency above 50 mpg, and climbing average cost of new vehicles placed individual options beyond the reach of average folks. However, I’ve been genuinely surprised by fuel costs sinking to new lows (below the cost of production, even bizarrely inverting to the point that producers paid buyers to take inventory) and continued attempts to engineer (only partially) around the limitations of Peak Oil, if not indeed Peak Energy. I continue to believe these are mirages, like the record-setting bull market of 2020 occurring in the midst of simultaneous economic, social, and health crises.

Black Friday has over the past decades become the default kickoff of annual consumer madness associated with the holiday season and its gift-giving tradition. Due to the pandemic, this year has been considerably muted in comparison to other years — at least in terms of crowds. Shopping has apparently moved online fairly aggressively, which is an entirely understandable result of everyone being locked down and socially distanced. (Lack of disposable income ought to be a factor, too, but American consumers have shown remarkable willingness to take on substantial debt when able in support of mere lifestyle.) Nevertheless, my inbox has been deluged over the past week with incessant Black Friday and Cyber Monday advertising. Predictably, retailers continue feeding the frenzy.

Uncharacteristically, perhaps, this state of affairs is not the source of outrage on my part. I recognize that we live in a consumerist, capitalist society that will persist in buying and selling activities even in the face of increasing hardship. I’m also cynical enough to expect retailers (and the manufacturers they support, even if those manufacturers are Chinese) to stoke consumer desire through advertising, promotions, and discount sales. It’s simply what they do. Why stop now? Thus far, I’ve seen no rationalizations or other arguments excusing how it’s a little ghoulish to be profiting while so many are clearly suffering and facing individual and household fiscal cliffs. Instead, we rather blandly accept that the public needs to be served no less by mass market retailers than by, say, grocery and utility services. Failure by the private sector to maintain functioning supply lines (including nonessentials, I suppose) during a crisis would look too much like the appalling mismanagement of the same crisis by local, state, and federal governments. Is it ironic that centralized bureaucracies reveal themselves as incompetent at the very same time they consolidate power? Or more cynically, isn’t it outrageous that they barely even try anymore to address the true needs of the public?

One of the questions I’ve posed unrhetorically is this: when will it finally become undeniably clear that instead of being geared to growth we should instead be managing contraction? I don’t know the precise timing, but the issue will be forced on us sooner or later as a result of radically diminishing return (compared to a century ago, say) on investment (ROI) in the energy sector. In short, we will be pulled back down to earth from the perilous heights we scaled as resources needed to keep industrial civilization creaking along become ever more difficult to obtain. (Maybe we’ll have to start using the term unobtainium from the Avatar movies.) Physical resources are impossible to counterfeit at scale, unlike the bogus enormous increase in the fiat money supply via debt creation. If/when hyperinflation makes us all multimillionaires because everything is grossly overvalued, the absurd paradox of being cash rich yet resource poor ought to wake up some folks.
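
The paragraph above stays qualitative, but a toy calculation (my illustration; the ratios are hypothetical round numbers, not figures from any study) shows why a falling energy return on investment bites progressively harder:

```python
def net_energy_fraction(eroi: float) -> float:
    """Fraction of gross energy left over for society after the energy
    sector pays its own energy cost of extraction (illustrative only)."""
    return 1.0 - 1.0 / eroi

# As the return ratio falls, the surplus available to everything else shrinks,
# slowly at first, then precipitously.
for eroi in (100, 20, 10, 5, 2):
    print(f"EROI {eroi:>3}:1 -> {net_energy_fraction(eroi):.0%} net to society")
```

At 100:1 the extraction overhead is invisible; at 5:1 a fifth of gross output goes just to getting the energy, which is the sense in which returns diminish radically rather than gradually.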

I’ve never before gone straight back with a redux treatment of a blog post. More typically, it takes more than a year before revisiting a given topic, sometimes several years. This time, supplemental information came immediately, though I’ve delayed writing about it. To wit, a Danish study published November 18, 2020, in the Annals of Internal Medicine indicates our face mask precautions against the Coronavirus may be ineffective:

Our results suggest that the recommendation to wear a surgical mask when outside the home among others did not reduce, at conventional levels of statistical significance, the incidence of SARS-CoV-2 infection in mask wearers in a setting where social distancing and other public health measures were in effect, mask recommendations were not among those measures, and community use of masks was uncommon. Yet, the findings were inconclusive and cannot definitively exclude a 46% reduction to a 23% increase in infection of mask wearers in such a setting. It is important to emphasize that this trial did not address the effects of masks as source control or as protection in settings where social distancing and other public health measures are not in effect.

The important phrase there is “did not reduce, at conventional levels of statistical significance,” which is followed by the caveat that the study was partial and so is inconclusive. To say a result is statistically insignificant means the observed effect is small enough that chance alone (the calculated margin of error) could plausibly account for it. A fair bit of commentary follows the published study, which I have not reviewed.
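
To make the statistics concrete, here’s a minimal sketch (my illustration, built from the interval endpoints quoted above) of why an interval spanning “no effect” reads as inconclusive:

```python
# Illustrative only: the quoted passage says the study cannot exclude anywhere
# from a 46% reduction to a 23% increase in infections among mask wearers.
# Expressed as relative risk (RR), that interval is roughly [0.54, 1.23].
rr_low, rr_high = 1 - 0.46, 1 + 0.23

# RR = 1.0 means "no effect." A confidence interval that straddles 1.0 can
# rule out neither benefit nor harm, which is what "did not reduce, at
# conventional levels of statistical significance" amounts to.
if rr_low < 1.0 < rr_high:
    print(f"CI [{rr_low:.2f}, {rr_high:.2f}] includes RR = 1.0: inconclusive")
```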

We’re largely resorting to conventional wisdom with respect to mask wearing. Most businesses and public venues (if open at all) have adopted the mask mandate out of conformity and despite wildly conflicting reports of their utility. Compared to locking down all nonessential social and economic activity, however, I remain resigned to their adoption even though I’m suspicious (as any cynic or skeptic should be) that they don’t work — at least not after the virus is running loose. There is, however, another component worth considering, namely, the need to be seen doing something, not nothing, to address the pandemic. Some rather bluntly call that virtue signalling, such as the pathologist at this link.

In the week since publication of the Danish study and the pathologist’s opinion (note the entirely misleading title), there has been a deluge of additional information, editorials, and protests (no more links, sorry) calling into question recommendations from health organizations and responses by politicians. Principled and unprincipled dissent has been underway since May 2020 and grows with each month hardship persists. Of particular note is the Supreme Court’s 5-4 decision against New York Gov. Andrew Cuomo’s mandate that religious services be restricted to no more than 10 people in red zones and no more than 25 in orange zones. Score one for the Bill of Rights being upheld even in a time of crisis.