Archive for the ‘Cinema’ Category

I simply can’t keep up with all the reading, viewing, and listening in my queue. Waking hours are too few, and concentration dissipates long before sleep overtakes. Accordingly, it’s much easier to settle into couch-potato mode and watch some mindless drivel, such as the Netflix hit Bridgerton binged in two sittings. (Unlike cinema critics, I’m not bothered especially by continuity errors, plot holes, clunky dialogue, weak character motivation, gaps of logic, or glossy decadence of the fictional worlds. I am bothered by the Kafka trap sprung on anyone who notices casting decisions that defy time and place — an ill-advised but now commonplace historical revisionism like editing Mark Twain.) As a result, blog posts are less frequent than they might otherwise be as I pronounce upon American (or more broadly, Western) culture, trying vainly to absorb it as a continuously moving target. Calls to mind the phrase Après moi, le déluge, except that there is no need to wait. A deluge of entertainment, news, analysis, punditry, and trolling has buried everyone already. So rather than the more careful consideration I prefer to post, here are some hot takes.

The Irregular Aphorist. Caitlin Johnstone offers many trenchant observations in the form of aphorisms (some of which I’ve quoted before), all gathered under the subtitle Notes From The Edge Of The Narrative Matrix. The modifier irregular only means that aphorisms are a regular but not constant feature. Her site doesn’t have a tag to that effect but probably ought to. Here’s one in particular that caught my attention:

Everything our species has tried has led us to a dying world and a society that is stark raving mad, so nobody is in any position to tell you that you are wrong.

Twin truths here are (1) the dying world and (2) societal madness, both of which I’ve been describing for some time. Glad when others recognize them, too.

Piling on. Though few still are willing to admit it, nonpharmaceutical interventions (NPIs, e.g., distancing, masks, and lockdowns) to stall or reduce the spread of the virus failed to achieve their objectives according to this study. Instead, NPIs piled on suffering no one could forestall. I read somewhere (no link) that the world is approaching half of total, cumulative deaths/infections predicted had nothing been done to impede the pandemic running its course. Adding in deaths of despair (numbers not entirely up to date), we’re using the wrong tools to fight the wrong battle. Of course, interventions opened up giant opportunities for power grabs and vulture capitalism, so the cynic in me shrugs and wonders half aloud “what did you expect, really?”

Growth of the Managerial Bureaucracy. A blog called Easily Distracted by Timothy Burke (never on my blogroll) publishes only a few times per year, but his analysis is terrific — at least when it doesn’t wind up being overlong and inconclusive. Since a student debt jubilee is back in the news (plenty of arguments pro and con), unintended consequences are anticipated in this quote:

When you set out to create elaborate tiers that segregate the deserving poor from the comfortable middle-class and the truly wealthy, you create a system that requires a massive bureaucracy to administer and a process that forces people into petitionary humiliation in order to verify their eligibility. You create byzantine cutoff points that become business opportunities for predatory rentiers.

Something similar may well be occurring with stimulus checks being issued pro rata (has anyone actually gotten one?), but at least we’re spared any petitionary humiliations. We get whatever the algorithms (byzantine cutoff points) dictate. How those funds will be gamed and attached is not yet clear. Stay alert.

No Defense of Free Speech. Alan Jacobs often recommends deleting, unsubscribing, and/or ignoring social media accounts (after his own long love-hate relationship with them) considering how they have become wholly toxic to a balanced psyche as well as principal enablers of surveillance capitalism and narrative control. However, in an article about the manorial elite, he completely loses the plot, namely, that absolutism is required in defense of free speech. It’s not sufficient to be blasé or even relieved when 45 is kicked off Twitter permanently or when multiple parties conspire to kill Parler. Establishing your own turf beyond the reach of Silicon Valley censors is a nice idea but frankly impractical. Isn’t that what whoever ran Parler (or posted there) must have thought? And besides, fencing off the digital commons these very entities created has catapulted them into the unenviable position of undemocratic, unelected wielders of monopolistic power and co-conspirators to boot. That’s what needs to be curtailed, not free speech.

The Taxonomic Apocalypse. Although drawn from fiction and thus largely hypothetical, a new book (coming late 2021) by Adam Roberts called It’s the End of the World: But What Are We Really Afraid Of? surveys doomsday stories and categorizes different versions of how it all ends. Alan Jacobs (yeah, him again — must have an advance copy of the manuscript) recommends it as “a delightful and provocative little book” but fails to grok two things: (1) these stories are rehearsals-cum-preparations for the real thing, and (2) the real thing really is bearing down on us implacably and so is no longer a mere hypothetical to contemplate and categorize for shits and grins. Despite acceptance of the eventualities that await all of us, reading Roberts’ taxonomy is not something I would expect to find delightful. Skip.

Narrative Collapse. Ran Prieur (no link) sometimes makes statements revealing an unexpected god’s-eye view:

[45] is a mean rich kid who figured out that if he does a good Archie Bunker impression, every lost soul with an authoritarian father will think he’s the messiah. We’re lucky that he cares only about himself, instead of having some crazy utopian agenda. But the power, and the agency, is with the disaffected citizens of a declining empire, tasting barbarism.

This is all about people wanting to be part of a group that’s part of a story. Lately, some of the big group-stories have been dying: sky father religion, American supremacy, the conquest of nature, the virtue of wealth-seeking. In their place, young and clumsy group-stories struggle and rise.

Collapse of certain fundamental stories that animate our thinking is at the core of The Spiral Staircase (see About Brutus at top), though it’s often couched in terms of consciousness in transition. Getting through the transition (only temporarily, see previous item in list) probably means completion of the Counter-Enlightenment historical arc, which necessarily includes further descent into barbarism.

Hail Mary for Individualism. I always take special notice when someone cites Allan Bloom. Alan Jacobs (um, yeah, he’s prolific and I’m using his ideas again — sue me) cites Bloom to argue that individualism or the sovereign self, a product of the Enlightenment, is already dead. No doubt, the thought-world described so ably by Bloom no longer exists, but individualism has not yet died out by attrition or been fully dissolved in nonduality. Many of us born before the advent of the Internet retain selfhood and authenticity not yet coopted by or incorporated into mass mind. Moreover, ongoing struggles over identity (e.g., gender, sexual orientation, and race that are often used improperly to define the self) result from an inchoate sense that individualism is eroding precipitously, not that it’s already passé. Defiant attempts to (re)establish an authentic self (contravening all logic and becoming critical theory of one sort or another) in the face of this loss may well be a last-ditch effort to save the self, but it’s failing.

David Sirota, author of Back to Our Future: How the 1980s Explain the World We Live in Now — Our Culture, Our Politics, Our Everything (2011), came to my attention (how else?) through a podcast. He riffed pretty entertainingly on his book, now roughly one decade old, like a rock ‘n’ roller stuck (re)playing his or her greatest hits into dotage. However, his thesis was strong and appealing enough that I picked up a copy (read: borrowed from the library) to investigate despite the datedness of the book (and my tardiness). It promised to be an easy read.

Sirota’s basic thesis is that memes and meme complexes (a/k/a memeplexes, though Sirota never uses the term meme) developed in the 80s and deployed through a combination of information and entertainment media (thus, infotainment) form the narrative background we take for granted in the early part of the 21st century. Children fed a steady diet of clichés, catchphrases, one-liners, archetypes, and story plots have now grown to adulthood and are scarcely able to peer behind the curtain to question the legitimacy or subtext of the narrative shapes and distortions imbibed during childhood like mother’s milk. The table of contents lists four parts (boldface section titles are Sirota’s; descriptive text is mine):

  • Liking Ike, Hating Woodstock. How the 50s and 60s decades were (the first?) assigned reductive demographic signifiers, handily ignoring the true diversity of experience during those decades. More specifically, the boom-boom 50s (economics, births) were recalled nostalgically in 80s TV and films while the 60s were recast as being all about those dirty, hairy hippies and their music, drugs, and sexual licentiousness, all of which had to be invalidated somehow to regain lost wholesomeness. The one-man promotional vehicle for this pleasing self-deception was Michael J. Fox, whose screen personae (TV and film) during the 80s (glorifying the 50s but openly shitting on the 60s) were instrumental in reforming attitudes about our mixed history.
  • The Jump Man Chronicles. How the Great Man Theory of History was developed through glorification of heroes, rogues, mavericks, and iconoclasts who came into their own during the 80s. That one-man vehicle was Michael Jordan, whose talents and personal magnetism were so outsized that everyone aspired to be “like Mike,” which is to say, a superhero elevated beyond mere mortal rules and thus immortalized. The effect was duplicated many times over in popular culture, with various entertainment icons and political operatives subverting thoughtful consideration of real-world problems in favor of jingoistic portrayals.
  • Why We (Continue to) Fight. How the U.S. military was rehabilitated after losing the Vietnam War, gifting us with today’s hypermilitarism and permanent wars. Two principal tropes were deployed to shape public opinion: the Legend of the Spat upon Veteran and the Hands Tied Behind Their Backs Myth. Each was trotted out reliably whenever we needed to misremember our past as fictionalized in the 80s.
  • The Huxtable Effect. How “America’s dad” helped accommodate race relations to white anxiety, primarily to sell a TV show. In contrast with various “ghetto TV” shows of the 70s that depicted urban working poor (various ethnicities), The Cosby Show presented an upscale black family who transcended race by simply ignoring the issue — a privilege of wealth and celebrity. The Obama campaign and subsequent administration copied this approach, pretending American society had become postracial despite his never truly being able to escape the modifier black because the default (no modifier needed) in America is always white. This is the most fraught part of the book, demonstrating that despite whatever instructions we get from entertainment media and pundits, we remain stuck in an unresolved, unhealed, inescapable trap.


Returning to the subject of this post, I asserted that the modern era frustrates a deep, human yearning for meaning. As a result, the Medieval Period, and to a lesser degree, life on the highroad, became narrative fixations. Had I time to investigate further, I would read C.S. Lewis’ The Discarded Image (1964), but my reading list is already overfull. Nonetheless, I found an executive summary of how Lewis describes the Medieval approach to history and education:

Medieval historians varied in that some of them were more scientific, but most historians tried to create a “picture of the past.” This “picture” was not necessarily based in fact and was meant more to entertain curiosity than to seriously inform. Educated people in medieval times, however, had a high standard for education composed of The Seven Liberal Arts of grammar, dialectic, rhetoric, arithmetic, music, geometry, and astronomy.

In the last chapter, Lewis summarizes the influence of the Medieval Model. In general, the model was widely accepted, meaning that most people of the time conformed to the same way of thinking. The model, he reiterates, satisfied imagination and curiosity, but was not necessarily accurate or factual, specifically when analyzed by modern thinkers.

Aside. Regular readers of The Spiral Staircase may also recognize how consciousness informs this blog post. Historical psychology offers a glimpse into worldviews of bygone eras, with the Medieval Period perhaps being the easiest to contemplate due to proximity. Few storytellers (cinema or literature) attempt to depict what the world was truly like in the past (best as we can know) but instead resort to an ahistorical modern gloss on how men and women thought and behaved. One notable exception may be the 1986 film The Name of the Rose, which depicts the emerging rational mind in stark conflict with the cloistered Medieval mind. Sword-and-sandal epics set in ancient Rome and Greece get things even more wrong.


Unlike turtles, humans do not have protective shells into which we can withdraw when danger presents. Nor can we lift off, fly away, and elude danger the way birds do. These days, we’re sorely beset by an invisible pandemic spread by exposure to carriers (read: other people) and so asked or forced to submit to being locked down and socially distanced. Thus, we are withdrawn into the protective shell of the home in cycles of varying intensity and obeisance to maintain health and safety. Yet life goes on, and with it come numerous physical requirements (to say nothing of psychological needs) that can’t be met virtually, demanding that we venture out into the public sphere to gather resources and risk exposure to the scourge. Accordingly, the conduct of business has adapted to enable folks to remain in the protective shells of their vehicles, taking delivery through the car window and rarely if ever entering a brick-and-mortar establishment except in defiance or at the option of acceptable risk. In effect, we’re being driven into our cars ever more, and the vehicle is readily understood as a proxy for its inhabitant(s). Take note that pictures of people in bread lines during the Great Depression have been replaced by pictures of cars lined up for miles during the pandemic to get packaged meals from charitable organizations.

Reflecting on this aspect of modern life, I realized that it’s not exactly novel. The widespread adoption of the individual vehicle in the 1940s and 50s, as distinguished from mass transit, and the construction of the interstate highway system promised (and delivered) flexibility and freedom of tremendous appeal. While the shift into cars (along with air travel) doomed now-moribund passenger rail (except intracity in the few American cities with effective rail systems), it enabled the buildout of suburbs and exurbs now recognized as urban sprawl. And like all those packages now clogging delivery systems as we shift even more heavily during the holiday season to online shopping, a loss of efficiency was inevitable. All those individual cars and boxes create congestion that cries out for solutions.

Among the solutions (really nonsolutions) were the first drive-through banks of the 1970s. Is doing one’s banking without leaving the vehicle’s protective shell really an efficiency? Or is it merely an early acknowledgement and enabling of antisocial individualism? Pneumatic tubes that permitted drive-through banking did not speed up transactions appreciably, but the novel mechanism undoubtedly reinforced the psychological attachment Americans felt toward their cars. That growing attachment was already apparent in the 1950s, with two bits of Americana from that decade still resonating: the drive-in theater and the drive-in restaurant. The drive-in theater was a low-fidelity, efficient alternative to the grand movie houses built in the 1920s and 30s seating a few thousand people in one cavernous space. (A different sort of efficiency enabling choice later transformed most cinema establishments into multiplexes able to show 8–10 titles instead of one, handily diminishing audiences of thousands to hundreds or even tens and robbing the group experience of much of its inherent power. Now that premium streaming content is delivered to screens at home and we are disallowed assembly into large audiences, we have instead become something far more inert — viewers — with fully anticipatable degradation of the entertainment experience notwithstanding the handsome technologies found within the comforts of the home.) I’ve heard that drive-ins are experiencing a renaissance of sorts in 2020, with Walmart parking lots converted into showplaces, at least temporarily, to resemble (poorly) group experience and social connection. The drive-in restaurant of the 1950s, with its iconic carhops (sometimes on roller skates), is a further example of enabling car culture to proliferate. Never mind that eating in the car is actually kinda sad and maybe a little disgusting as odors and refuse collect in that confined space.
One might suspect that drive-ins were directed toward teenyboppers and cruisers of the 1950s exploring newfound freedom, mobility, and the illusion of privacy in their cars, parked in neat rows at drive-ins (and Lookout Points for smooch sessions) all across the country. However, my childhood memory is that it was also a family affair.

Inevitably, fast food restaurants followed the banks in the 1970s and quickly established drive-through lanes, reinforcing the degradation of the food experience into mere feeding (often on one’s lonesome) rather than dining in community. Curiously, the pandemic has made every restaurant still operating, even the upscale ones, a drive-through and forced those with and without dedicated drive-through lanes to bring back the anachronistic carhop to serve the congestion. A trip to a local burger joint in Chicago last week revealed 40+ cars in queue and a dozen or so carhops on the exterior directing traffic and making deliveries through the car window (briefly penetrating the protective shell) so that no one would have to enter the building and expose themselves to virus carriers. I’ve yet to see a 2020 carhop wearing roller skates (now roller blades) or a poodle skirt.

Such arrangements are probably effective at minimizing pandemic risk and have become one of several new normals (discussion of political dysfunction deferred). Who can say how long they will persist? Still, it’s strange to observe the psychology of our response, even if only superficially and preliminarily. Car culture has been a curious phenomenon since at least the middle of the 20th century. New dynamics reinforcing our commitment to cars are perhaps surprising, but also a little unsurprising, considering how we made ourselves so dependent on them as the foundation of personal transportation infrastructure. As a doomer, I had rather expected that Peak Oil occurring around 2006 or so would spell the gradual (or sudden) end of happy motoring as prices at the pump, refusal to elevate standard fuel efficiency above 50 mpg, and the climbing average cost of new vehicles placed individual options beyond the reach of average folks. However, I’ve been genuinely surprised by fuel costs sinking to new lows (below the cost of production, even bizarrely inverting to the point that producers paid buyers to take inventory) and continued attempts to engineer (only partially) around the limitations of Peak Oil, if not indeed Peak Energy. I continue to believe these are mirages, like the record-setting bull market of 2020 occurring in the midst of simultaneous economic, social, and health crises.

I’ll try to be relatively brief, since I’ve been blogging about industrial and ecological collapse for more than a decade. Jeff Gibbs released a new documentary called Planet of the Humans (a sideways nod to the dystopian movie franchise Planet of the Apes — as though humans aren’t also apes). Gibbs gets top billing as the director, but this is clearly a Michael Moore film, who gets secondary billing as the executive producer. The film includes many of Moore’s established eccentricities, minus the humor, and is basically an exposé on greenwashing: the tendency of government agencies, environmental activists, and capitalist enterprises to coopt and transform earnest environmental concern into further profit-driven destruction of the natural environment. Should be no surprise to anyone paying attention, despite the array of eco-luminaries making speeches and soundbites about “green” technologies that purport to save us from rendering the planet uninhabitable. Watching them fumble and evade when answering simple, direct questions is a clear indication of failed public-relations approaches to shaping the narrative.

Turns out that those ballyhooed energy sources (e.g., wind, solar, biofuel, biomass) ride on the back of fossil fuels and aren’t any more green or sustainable than the old energy sources they pretend to replace. Again, no surprise if one has even a basic understanding of the dynamics of energy production and consumption. That admittedly sounds awfully jaded, but the truth has been out there for a long time already for anyone willing and able to confront it. Similarly, the documentary mentions overpopulation, another notorious elephant in the room (or herd of elephants, as aptly put in the film), but it’s not fully developed. Entirely absent is any question of not meeting energy demand. That omission is especially timely given how, with the worldwide economy substantially scaled back at present and with it significant demand destruction (except for electricity), the price of oil has fallen through the floor. Nope, the tacit assumption is that energy demand must be met despite all the awful short- and long-term consequences.

Newsfeeds indicate that the film has sparked considerable controversy in only a few days following release. Debate is to be expected considering a coherent energy strategy has never been developed or agreed upon and interested parties have a lot riding on outcomes. Not to indulge in hyperbole, but the entire human race is bound up in the outcome, too, and it doesn’t look good for us or most of the rest of the species inhabiting the planet. Thus, I was modestly dismayed when the end of the film wandered into happy chapter territory and offered the nonsensical platitude in voiceover, “If we get ourselves under control, all things are possible.” Because we’ve passed and in fact lapped the point of no return repeatedly, the range of possibilities has shrunk precipitously. The most obvious is that human population of 7.7 billion (and counting) is being sorely tested. If we’re being honest with ourselves, we also know that post-pandemic there can be no return to the world we’ve known for the past 70 years or so. Although the documentary could not be reasonably expected to be entirely up to date, it should at least have had the nerve to conclude what the past few decades have demonstrated with abundant clarity.

Addendum

This review provides support for my assessment that “green” or “sustainable” energy cannot be delivered without significant contribution of fossil fuels.

In the introduction to an article at TomDispatch about anticipated resumption of professional sports currently on hiatus like much of the rest of human activity (economic and otherwise), Tom Engelhardt recalls that to his childhood self, professional sports meant so much and yet so little (alternatively, everything and nothing). This charming aspect of the innocence of childhood continues into adulthood, whether as spectator or participant, as leisure and freedom from threat allow. The article goes on to offer conjecture regarding the effect of reopening professional sports on the fall presidential election. Ugh! Racehorse politics never go out of season. I reject such purely hypothetical analyses, which isn’t the same as not caring about the election. Maybe I’ll wade in after a Democratic nominee is chosen to say that third-party candidates may well have a much larger role to play this time round because we’re again being offered flatly unacceptable options within the two-party (really single-party) system. Until then, phooey on campaign season!

Still, Engelhardt’s remark put me in mind of a blog post I considered fully nine years ago but never got around to writing, namely, how music functions as meaningless abstraction. Pick your passion, I suppose: sports, music (any genre), literature, painting, poetry, dance, cinema and TV, fashion, fitness, nature, house pets, house plants, etc. Inspiration and devotion come in lots of forms, few of which are essential (primary or ontological needs on Maslow’s Hierarchy) yet remain fundamental to who we are and what we want out of life. Accordingly, when one’s passion is stripped away, being left grasping and rootless is quite common. That’s not equivalent to losing a job or loved one (those losses are afflicting many people right now, too), but our shared experience these days with no bars, no restaurants, no sports, no concerts, no school, and no church all add up to no society. We’re atomized, unable to connect and socialize meaningfully, digital substitutes notwithstanding. If a spectator, maybe one goes in search of replacements, which is awfully cold comfort. If a participant, one’s identity is wrapped up in such endeavors; resulting loss of meaning and/or purpose can be devastating.

It would be easy to over-analyze and over-intellectualize what meaningless abstraction means. It’s a trap, so I’ll do my best not to over-indulge. Still, it’s worth observing that as passions are habituated and internalized, their mode of appreciation is transferred from the senses (or sensorium) to the mind or head (as observed here). Coarseness and ugliness are then easily digested, rationalized, and embraced instead of being repulsive as they should be. There’s the paradox: as we grow more “sophisticated” (scare quotes intentional), we also invert and become more base. How else to explain tolerance of increasingly brazen dysfunction, corruption, servitude (e.g., debt), and gaslighting? It also explains the attraction to entertainments such as combat sports (and thug sports such as football and hockey), violent films, professional wrestling (more theater than sport), and online trolling. An instinctual blood lust that accompanies being predators, if not expressed more directly in war, torture, crime, and self-destruction, is sublimated into entertainment. Maybe that’s an escape valve so pressures don’t build up any worse, but that possibility strikes me as rather weak considering just how much damage has already been done.

The old saw goes that acting may be just fine as a creative endeavor, but given the opportunity, most actors really want to direct. A similar remark is often made of orchestral musicians, namely, that most rank-and-file players would really rather conduct. Directing and conducting may not be the central focus of creative work in their respective genres. After all, directors don’t normally appear onscreen and conductors make no sound. Instead, they coordinate the activities of an array of creative folks, putting directors and conductors in a unique position to bring about a singular vision in otherwise collaborative work. A further example is the Will to Power (associated with Friedrich Nietzsche and Arthur Schopenhauer) characteristic of those who wish to rule (as distinguished from those who wish to serve) such as regents, dictators, and autocrats. All of this sprang to mind because, despite outward appearance of a free, open society in the U.S., recent history demonstrates that the powers that be have instituted a directed election and directed economy quite at odds with democracy or popular opinion.

The nearest analogy is probably the directed verdict, where a judge removes the verdict from the hands or responsibility of the jury by directing the jury to return a particular verdict. In short, the judge decides the case for the jury, making the jury moot. I have no idea how commonplace directed verdicts are in practice.

Directed Election

Now that progressive candidates have been run out of the Democratic primaries, the U.S. presidential election boils down to which stooge to install (or retain) in November. Even if Biden is eventually swapped out for another Democrat in a brokered nominating convention (highly likely according to many), it’s certain to be someone fully amenable to entrenched corporate/financial interests. Accordingly, the deciders won’t be the folks who dutifully showed up and voted in their state primaries and caucuses but instead party leaders. One could try to argue that as elected representatives of the people, party leaders act on behalf of their constituencies (governing by consent of the people), but some serious straining is needed to arrive at that view. Votes cast in the primaries thus far demonstrate persistent desire for something distinctly other than the status quo, at least in the progressive wing of the Democratic party. Applying the cinematic metaphor of the top paragraph, voters are a cast of millions being directed within a larger political theater toward a predetermined result.

Anyone paying attention knows that voters are rarely given options that aren’t in fact different flavors of the same pro-corporate agenda. Thus, no matter whom we manage to elect in November, the outcome has already been engineered. This is true not only by virtue of the narrow range of candidates able to maneuver successfully through the electoral gauntlet but also because of perennial distortions of the balloting process such as gerrymandering, voter suppression, and election fraud. Claims that both sides (really just one side) indulge in such practices so everything evens out don’t convince me.

Directed Economy

Conservative economists and market fundamentalists never seem to tire of arguments in the abstract that capitalist mechanisms of economics, left alone (unregulated, laissez-faire) to work their magic, deliver optimal outcomes when it comes to social and economic justice. Among the primary mechanisms is price discovery. However, economic practice never even remotely approaches the purity of abstraction because malefactors continuously distort and game economic systems out of greed. Price discovery is broken and equitable economic activity is made fundamentally fictitious. For example, the market for gemstones is famously inflated by a narrow consortium of sellers having successfully directed consumers to adopt a cultural standard of spending three months’ wages/salary for a wedding band as a demonstration of one’s love and devotion. In the opposite direction, precious metal spot prices are suppressed despite very high demand and nearly nonexistent supply. Current quoted premiums over spot silver price, even though no delivery is contemplated, range from roughly 20% to an absurd 2,000%. Supply and demand curves no longer function to aid in true price discovery (if such a thing ever existed). In a more banal sense, what people are willing to pay for a burger at a fast food joint or a loaf of bread at the grocery may affect the price charged more directly.

Nowhere is it more true that we’ve shifted to a directed economy than with the stock market (i.e., Wall Street vs. Main Street). As with the housing market, a real-world application with which many people have personal experience, if a buyer of a property or asset fails to appear within a certain time frame (longer for housing, shorter for stocks, bonds, and other financial instruments), the seller is generally obliged to lower the price until a buyer finally appears. Some housing markets extraordinarily flush with money (e.g., Silicon Valley and Manhattan) trigger wild speculation and inflated prices that drive out all but the wealthiest buyers. Moreover, when the eventual buyer turns out to be a bank, corporation, or government entity willing to overpay for the property or asset using someone else’s money, the market becomes wholly artificial. This has been the case with the stock market for the last twelve years, with cheap money being injected nonstop via bailouts and quantitative easing to keep asset prices inflated. When fundamental instabilities began dragging the stock market down last fall, accelerating precipitously in early spring of this year and resulting in yet another crash (albeit brief), the so-called Plunge Protection Team sprang into action and wished trillions of dollars (taxpayer debt, actually, and over the objections of taxpayers in a classic fool-me-twice scenario) into existence to perpetuate the casino economy and keep asset prices inflated for the foreseeable future, which isn’t very long.

The beneficiaries of this largesse are the same as they have always been when tax monies and public debt are concerned: corporations, banks, and the wealthy. Government economic supports are directed to these entities, leaving all others in the lurch. Claims that bailouts are needed to keep large corporate entities and wealthy individuals whole so that the larger economy doesn’t seize up and fail catastrophically are preposterous because the larger economy has already seized up and failed catastrophically while the population is mostly quarantined, throwing many individuals out of work and shuttering many businesses. A reasonable expectation of widespread insolvency and bankruptcy lingers, waiting for the workouts and numbers to mount up.

The power of the purse possessed by the U.S. Congress hasn’t been used to help the citizenry since the New Deal era of FDR. Instead, military budgets and debts expand enormously while entitlements and services to the needy and vulnerable are whittled away. Citizen rebellions are already underway in small measure, mostly aimed at the quarantines. When bankruptcies, evictions, and foreclosures start to swell, watch out. Our leaders’ fundamental mismanagement of human affairs is unlikely to be swallowed quietly.

The purpose behind consuming different genres of fiction varies. For most of us, it’s about responding to stimuli and experiencing emotions vicariously, which is to say, safely. For instance, tragedy and horror can be enjoyed, if that’s the right word, in a fictional context to tweak one’s sensibilities without significant effect outside the story frame. Similarly, fighting crime, prosecuting war, or repelling an alien invasion in a video game can be fun but is far removed from actually doing those things in real life (not fun). For less explicit narrative forms, such as music, feelings evoked are aesthetic and artistic in nature, which makes a sad song or tragic symphony enjoyable on its own merits without bleeding far into real sadness or tragedy. Cinema (now blurred with broadcast TV and streaming services) is the preeminent storytelling medium, provoking all manner of emotional response. After reaching a certain age (middle to late teens), emotional detachment from depictions of sexuality and violent mayhem makes it possible to digest such stimulation as entertainment — except in cases where prior personal trauma is triggered. Before that age, nightmare-prone children are typically kept away from such content.

Dramatic conflict is central to driving plot and story forward, and naturally, folks are drawn to some stories while avoiding others. Although I’m detached enough not to be upset by, say, zombie films where people and zombies alike are dispatched horrifically, I wouldn’t say I enjoy gore or splatter. Similarly, realistic portrayals of war (e.g., Saving Private Ryan) are not especially enjoyable for me despite the larger story, whether based on true events or entirely made up. The primary reason I abandon a movie or TV show partway through is that I simply don’t enjoy watching suffering.

Another category bugs me even more: when fiction intrudes on reality to remind me too clearly of actual horrors (or is it the reverse: reality intruding on fiction?). It doesn’t happen often. One of the first instances I recall was in Star Trek: The Next Generation when the story observed that (fictional) warp travel produced some sort of residue akin to pollution. The reminder that we humans are destroying the actual environment weighed heavily on me and ruined my enjoyment of the fictional story. (I also much prefer the exploration and discovery aspects of Star Trek that hew closer to Gene Roddenberry’s original vision than the militaristic approach now central to Star Trek.) A much more recent intrusion occurs in the rather adolescent TV show The 100, where a global nuclear exchange launched by an artificial intelligence has the follow-on effect a century later of remaining nuclear sites going critical, melting down, and irradiating the Earth, making it uninhabitable. This bothers me because that’s my expectation of what happens in reality, probably not too long (decades) after industrial civilization collapses and most or all of us are dead. This prospect served up as fiction is simply too close to reality for me to enjoy vicariously.

Another example of fiction intruding too heavily on my doomer appreciation of reality occurred retroactively. As high-concept science fiction, I especially enjoyed the first Matrix movie. Like Star Trek, the sequels degraded into run-of-the-mill war stories. But what was provocative about the original was the matrix itself: a computer-generated fiction situated within a larger reality. Inside the matrix was pleasant enough (though not without conflict), but reality outside the matrix was truly awful. It was a supremely interesting narrative and thought experiment when it came out in 1999. Now, twenty-one years later, it’s increasingly clear that we are living in a matrix-like, narrative-driven hyperreality in which we delude ourselves with a pleasant equilibrium that simply isn’t in evidence. In fact, as societies and as a civilization, we’re careening out of control, no brakes, no steering. Caitlin Johnstone explores this startling after-the-fact realization in an article at Medium.com, which I found only a couple of days ago. Reality is in fact far worse than the constructed hyperreality. No wonder no one wants to look at it.

I was introduced to the phrase life out of balance decades ago when I saw the film Koyaanisqatsi. The film is the first of a trilogy (the sequels are Powaqqatsi and Naqoyqatsi) by Godfrey Reggio, though it is arguably more famous for its soundtrack composed by Philip Glass. Consisting entirely of wordless montage and music, the film contrasts the majesty of nature (in slo-mo, among other camera effects) with the frenetic pace of human activity (often sped up) and the folly of the human-built world. Koyaanisqatsi is a Hopi word meaning life out of balance. One might pause to consider, “out of balance with what?” The film supplies the answer, none too subtly: out of balance with nature. The two sequels are celebrations of humans at work and technology, respectively, and never gained the iconic stature of the initial film.

If history (delivering us into the 21st century) has demonstrated anything, it’s that we humans are careening out of control toward disaster, not unlike the spacecraft in the final sequence of Koyaanisqatsi that tumbles out of the atmosphere for an agonizingly long time (in slo-mo), burning all the way down. We are all witness to the event (more accurately, the process) but can do little anymore to alter the eventual tragic result. Though some counsel taking steps toward amelioration (of suffering, if nothing else), our default response is rather to deny our collective fate, and worse, to accelerate toward it. That’s how unbalanced we are as a global civilization.

The observation that we are badly out of balance is made at the species and civilizational levels but is recapitulated at all levels of social organization, from distinct societies or nationalities to regional and municipal organizations and associations on down to families and individuals. The forces, dynamics, and power laws that push us off balance are many, but none is as egregious as the corrupting influence of interrelated wealth and power. Wisdom of the ancients (especially the non-Western ones) gave us the same verdict, though we have refused intransigently (or more charitably: failed) to learn the lesson for hundreds of generations.

What I propose to do in this multipart series is explore or survey some of the manifestations of life out of balance. There is no particular organization, chronology, or schedule for subsequent entries. As an armchair social critic, I reserve the luxury of exercising my own judgment and answering to no one. Stay tuned.

One of the victims of cancel culture, coming to my attention only days ago, is Kate Smith (1907–1986), a singer of American popular song. Though Smith had a singing career spanning five decades, she is best remembered for her version(s) of Irving Berlin’s God Bless America, which justifiably became a bit of Americana. The decades of Smith’s peak activity were the 1930s and 40s.

/rant on

I dunno what goes through people’s heads, performing purity rituals or character excavation on folks long dead. The controversy stems from Smith having a couple other songs in her discography: That’s Why Darkies Were Born (1931) and Pickaninny Heaven from the movie Hello, Everybody! (1933). Hate to break it to anyone still living under a rock, but these dates are not far removed from minstrelsy, blackface, and The Birth of a Nation (1915) — a time when typical Americans referred to blacks with a variety of terms we now consider slurs. Such references were still used during the American civil rights movement (1960s) and are in use among some virulent white supremacists even today. I don’t know the full context of Kate Smith having sung those songs, but I suspect I don’t need to. In that era, popular entertainment had few of the sensibilities regarding race we now have (culture may have moved on, but it’s hard to say with a straight face it’s evolved or progressed humanely), and uttering commonly used terms back then was not automatic evidence of any sort of snarling racism.

I remember having heard my grandparents, nearly exact contemporaries of Kate Smith, referring to blacks (the term I grew up with, still acceptable I think) with other terms we no longer consider acceptable. It shocked me, but to them, that’s simply what blacks were called (the term(s) they grew up with). Absolutely nothing in my grandparents’ character or behavior indicated a nasty, racist intent. I suspect the same was true of Kate Smith in the 1930s.

Back when I was a librarian, I also saw plenty of sheet music published before 1920 or so with the term darkie (or darkey) in the title. See for example this. The Library of Congress still uses the subject headings “negro spirituals” (is there another kind?) and “negro songs” to refer to various subgenres of American folk song that includes slave songs, work songs, spirituals, minstrel music, protest songs, etc. Maybe we should cancel the Library of Congress. Some published music titles from back then even call them coon songs. That last one is totally unacceptable today, but it’s frankly part of our history, and like changing character names in Mark Twain’s Huckleberry Finn, sanitizing the past does not make it go away or any less discomfiting. But if you wanna bury your head in the sand, go ahead, ostrich.

Also, if some person or entity ever does some questionably racist, sexist, or malign thing (even something short of abominable) situated contextually in the past, does that mean he, she, or it must be cancelled irrevocably? If that be the case, then I guess we gotta cancel composer Richard Wagner, one of the most notorious anti-Semites of the 19th century. Also, stop watching Pixar, Marvel, and Star Wars films (among others), because remember that time when Walt Disney Studios (now Walt Disney Company) made a racist musical film, Song of the South (1946)? Disney’s tainted legacy (extending well beyond that one movie) is at least as awful as, say, Kevin Spacey, and we’re certainly not about to rehabilitate him.

/rant off