Archive for the ‘Culture’ Category

Further to this blog post, see this quote from Daniel Schwindt’s The Case Against the Modern World (2016), which will be the subject of a new book blogging project:

As Frank Herbert, the master of science fiction, once put it: “fear is the mind-killer.” And this is the precise truth, because a person acting in fear loses his capacity for judgment precisely insofar as he is affected by his fear. In fear, he does things that, in a peaceful frame of mind, he’d have found ridiculous. This is why we would expect that, if fear were to become a generalized condition in a civilization, knowledge itself would begin to deteriorate. [p. 35]

There’s a Joseph Conrad title with which I’ve always struggled, not having read the short story: The Secret Sharer (1910). The problem for me is the modifier secret. Is a secret being shared or is someone sharing in secret? Another ambiguous term came up recently at Macro-Futilism (on my blogroll): animal farm (not the novel by George Orwell). Is the animal farming or is the animal being farmed? Mention was made that ants and termites share with humans the characteristic that we farm. Apparently, several other species do as well. Omission of humans from the linked list is a frustratingly commonplace failure to observe, whether out of ignorance or stupid convention, that humans are animals, too. I also recalled ant farms from boyhood, and although I never had one (maybe because I never had one), I misunderstood the ants themselves to be doing the farming, as opposed to the keeper of the kit farming the ants.

The additional detail at Macro-Futilism that piqued my curiosity, citing John Gowdy’s book Ultrasocial: The Evolution of Human Nature and the Quest for a Sustainable Future (2021), is the contention that animals that farm organize themselves into labor hierarchies (e.g., worker/drone, soldier, and gyne/queen). Whether those hierarchies are a knowing choice (at least on the part of humans) or merely blind adaptation to the needs of agriculture is not clearly stated in the blog post or quotations, nor is the possibility that some of the other farming species on the list are exceptions to hierarchy formation. (Is there a jellyfish hierarchy?) However, lumping together humans, ants, and termites as ultrasocial agricultural species rather suggests that social and/or cultural evolution is driving their inner stratification, not foresight or planning. Put more plainly, after the discovery and adoption of agriculture, humans are little different from insects except for the obviously much higher complexity of human society compared to other animal farms.

I’ve suggested many times on this blog that humans are not really choosing the course of history (human or otherwise) as it unfolds around us, and further, that trying to drive or channel history in a chosen direction is futile. Rather, history is like a headless (thus, mindless) beast, and humans are mostly along for the ride. Gowdy’s contention regarding agricultural species supports the idea that no one is or can be in charge and that we’re all responding to survival pressure and adapting at unconscious levels. We’re not mindless, like insects, but neither are we able to choose our path in the macro-historical sense. The humanist in me — an artifact of Enlightenment liberalism, perhaps (more to say about that in forthcoming posts) — clings still to the assertion that we have agency, meaning choices to make. But those choices typically operate at a far more mundane level than human history. Perhaps political leaders and industrial tycoons have greater influence over human affairs by virtue of armies, weapons, and machinery, but my fear is that those decision-makers can really only dominate and destroy, not preserve or create in ways that allow for human flourishing.

Does this explain away scourges like inequality, exploitation, institutional failure, rank incompetence, and corruption, given that each of us responds to a radically different set of influences and available options? Impossible question to answer.

Ask parents what ambitions they harbor for their child or children and among the most patterned responses is “I just want them to be happy.” I find such an answer thoughtless and disingenuous, and the insertion of the hedge just to make happiness sound like a small ask is a red herring. To begin with, for most kids still in their first decade, happiness and playfulness are relatively effortless and natural so long as a secure, loving environment is provided. Certainly not a default setting, but it’s still quite commonplace. As the dreamy style of childhood cognition is gradually supplanted by supposedly more logical, rational, adult thinking, and as children become acquainted with iniquities of both history and contemporary life, innocence and optimism become impossible to retain. Cue the sullen teenager confronting the yawning chasm between desire and reality. Indeed, few people seem to make the transition into adulthood knowing with much clarity how to be happy in the midst of widespread travail and suffering. Instead, young adults frequently substitute self-destructive, nihilistic hedonism, something learned primarily (says me) from the posturing of movie characters and the celebrities who portray them. (Never understood the trope of criminals hanging at nightclubs, surrounded by drug addicts, nymphos, other unsavory types, and truly awful music, where they can indulge their assholery before everything inevitably goes sideways.)

Many philosophies recommend simplicity, naturalness, and independence as paths to happiness and moral rectitude. Transcendentalism was one such response to social and political complexities that spoil and/or corrupt. Yet two centuries on, the world has only gotten more and more complex, pressing on everyone demands for information processing at a volume and sophistication that do not come naturally to most and are arguably not part of our evolutionary toolkit. Multiple social issues, if one is to engage them fairly, hinge on legalistic arguments and bewildering wordplay that render them fundamentally intractable. Accordingly, many wave away all nuance and adopt pro forma attitudes. Yet the airwaves, social media, the Internet, and even dinner conversations are suffused with the worst sorts of hypercomplexity and casuistry that confound even those who traffic regularly in such rhetoric. It’s a very long way from “I just want to be happy.”


From an otherwise rambling, clumsy blog post, this portion from an extended analysis of Mad Max: Fury Road caught my attention:

Ideas that cannot be challenged, that cannot bear even the slightest scrutiny, are ideas that can’t evolve. It doesn’t matter whether they are right or wrong.

They are static, mechanical and ultimately devoid of life itself.

This is our world today in the hands of the Woke Left, a world where the destructive and vindictive feminine has been elevated to the point of unimpeachable rightness. But this isn’t any kind of healthy feminine. It’s a Furiosa-like feminine, devoid of nurturing, all implied violence, all sexuality suppressed to the point of masculinity.

Look at Furiosa and tell me it isn’t asking another vital question, “In a dying world, is there any room for fertility while clinging like moss for survival?”

In our world feminism has robbed women of their greatest attribute, the ability to gestate and nurture life itself. Hollywood has spent two generations giving us female action heroes who are ultimately nothing more than Doods with Boobs. It’s the ultimate power fantasy of Third Wave feminism.

It’s not as destructive an archetype as the sluts on Sex in the City, mind you, because at least it can be tied in some ways back to motherhood, i.e. Ripley in James Cameron’s Aliens, but it’s still damaging to the cause of the healthy feminine nonetheless.

Furiosa is what happens when gender roles are maximally out of balance.

Although disinclined to take the optimistic perspective inhabited by bright-siders, I’m nonetheless unable to live in a state of perpetual fear that would, to facile thinkers, be more fitting for a pessimist. Yet unrelenting fear is the dominant approach, with every major media outlet constantly stoking a toxic combination of fear and hatred, as though activation and ongoing conditioning of the lizard brain (i.e., the amygdala — or maybe not) in everyone were worthy of the endeavor, rather than a limited instinctual response that leaps to the fore only when an immediate threat presents itself. I can’t guess the motivations of the purveyors of constant fear or discern an endgame, but a few of the dynamics are clear enough to observe.

First thing that comes to mind is that the U.S. in the 1930s and 40s was pacifist and isolationist. Recent memory of the Great War was still keenly felt, and with the difficulties of the 1929 Crash and ensuing Great Depression still very much present, the prospect of engaging in a new, unlimited war (even over there) was not at all attractive to the citizenry. Of course, political leaders always regard (not) entering into war somewhat differently, maybe in terms of opportunity cost. Hard to say. Whether by hook or by crook (I don’t actually know whether advance knowledge of the Japanese attack on Pearl Harbor was suppressed), the U.S. was handily drawn into the war, and a variety of world-historical developments followed that promoted the U.S. (and its sprawling, unacknowledged empire) into the global hegemon, at least after the Soviet Union collapsed and before China rose from a predominantly peasant culture into a world economic power. A not-so-subtle hindsight lesson was learned, namely, that against widespread public sentiment and at great cost, the war effort could (not would) provide substantial benefits (if ill-gotten and of questionable desirability).

None of the intervening wars (never declared) or Wars for Dummies (e.g., the war on poverty, the war on crime, the war on drugs) provided similar benefits except to government agencies and careerist administrators. Nor did the war on terror following the 9/11 attacks or subsequent undeclared wars and bombings in Afghanistan, Iraq, Syria, Libya, Yemen, and elsewhere provide benefits. All were massive boondoggles with substantial destruction and loss of life. Yet after 9/11, a body of sweeping legislation was enacted without much public debate or scrutiny — “smuggled in under cover of fear,” one might say. The Patriot Act and the National Defense Authorization Act are among the most notable. The conditioned response by the citizenry to a perceived but not actual existential threat was consistent: desperate pleading to keep everyone safe from threat (even if it originates in the U.S. government) and tacit approval to roll back civil liberties (even though the citizenry is not itself the threat). The wisdom of the old Benjamin Franklin quote, born of a very different era and now rendered more nearly as a bromide, has long been lost on many Americans.

The newest omnipresent threat, literally made-to-order (at least according to some — who can really know when it comes to conspiracy), is the Covid pandemic. Nearly every talking, squawking head in government and the mainstream media (the latter now practically useless except for obvious propaganda functions) is telling everyone who still watches (video and broadcast being the dominant modes) to cower in fear of each other, reduce or refuse human contact and social function, and most of all, take the vaccine-not-really-a-vaccine followed by what is developing into an ongoing series of boosters to maintain fear and anxiety if not indeed provide medical efficacy (no good way to measure and substantiate that, anyway). The drumbeat is loud and unabated, and a large, unthinking (or spineless) portion of the citizenry, cowed and cowering, has basically joined the drum circle, spreading a social consensus that is very, well, un-American. Opinions as to other nations on similar tracks are not ventured here. Running slightly ahead of the pandemic is the mind virus of wokery and its sufferers, who demand, among other things, control over others’ thoughts and speech through threats and intimidation, censorship, and social cancellation — usually in the name of safety but without any evidence that driving independent thought underground or into hiding accomplishes anything worthwhile.

Again, motivations and endgame in all this are unclear, though concentration of the power to compel seems to be exhilarating. In effect, regular folks are being told, “stand on one leg; good boy; now bark like a dog; very good boy; now get used to it because this shit is never going to end but will surely escalate to intolerability.” It truly surprises me to see police forces around the world harassing, beating, and terrorizing citizens for failing to do as told, however arbitrary or questionable the order or the underlying justification. Waiting for the moment to dawn on rank-and-file officers that their monopoly on use of force is serving and protecting the wrong constituency. (Not holding my breath.) This is the stuff of dystopic novels, except that it’s not limited to fiction and frankly never was. The hotspot(s) shift in terms of time and place, but totalitarian mind and behavioral control never seems to fade or invalidate itself as one might expect. Covid passports to grant full participation in society (signalling compliance, not health) are an early step already adopted by some countries. The creeping fascism (more a coercive style than a form of government) I’ve warned about repeatedly over the years appears to be materializing before our very eyes. I’m afraid of what it portends, but with what remains of my intact mind, I can’t live in perpetual fear, come what may.

/rant on

Remember when the War on Christmas meme materialized a few years ago out of thin air, even though no one in particular was on the attack? Might have to rethink that one. Seems that every holiday now carries an obligation to revisit the past, recontextualize origin stories, and confess guilt over tawdry details of how the holiday came to be celebrated. Nearly everyone who wants to know knows by now that Christmas is a gross bastardization of pagan celebrations of the winter solstice, co-opted by various organized churches (not limited to Christians!) before succumbing to the awesome marketing onslaught (thanks, Coca-Cola) that makes Xmas the “most wonderful time of the year” (as the tune goes) and returns the holiday to its secular roots. Thanksgiving is now similarly ruined, no longer able to be celebrated and enjoyed innocently (like a Disney princess story reinterpreted as a white or male savior story — or even worse, a white male) but instead used as an excuse to admit America’s colonial and genocidal past and the extermination and mistreatment of native populations as white Europeans encroached ever more persistently on lands the natives held more or less as a commons. Gone are the days when one could gather among family and friends, enjoy a nice meal and good company, and give humble, sincere thanks for whatever bounty fortuna had bestowed. Now it’s history lectures and acrimony and families rent asunder along generational lines, typically initiated by those newly minted graduates of higher education and their newfangled ideas about equity, justice, and victimhood. Kids these days … get off my lawn!

One need not look far afield to find alternative histories that position received wisdom about the past in the cross-hairs just to enact purification rituals that make it/them, what, clean? accurate? whole? I dunno what the real motivation is except perhaps to force whites to self-flagellate over sins of our ancestors. Damn us to hell for having cruelly visited iniquity upon everyone in the process of installing white, patriarchal Europeanness as the dominant Western culture. I admit all of it, though I’m responsible for none of it. Moreover, history stops for no man, no culture, no status quo. White, patriarchal Europeanness is in serious demographic retreat and has arguably already lost its grip on cultural dominance. The future is female (among other things), amirite? Indeed, whether intended or not, that was the whole idea behind the American experiment: the melting pot. Purity was never the point. Mass migration out of politically, economically, and ecologically ravaged regions means that the experiment is no longer uniquely American.

Interdisciplinary approaches to history, if academic rigidity can be overcome, regularly develop new understandings to correct the historical record. Accordingly, the past is remarkably dynamic. (I’ve been especially intrigued by Graham Hancock’s work on ancient civilizations, mostly misunderstood and largely forgotten except for the megalithic projects left behind.) But the past is truly awful, with disaster piled upon catastrophe followed by calamity and cataclysm. Still waiting for the apocalypse. Peering too intently into the past is like staring at the sun: it scorches the retinas. Moreover, the entire history of history is replete with stories being told and retold, forgotten and recovered, transformed in each iteration from folklore into fable into myth into legend before finally passing entirely out of human memory systems. How many versions of Christmas are there across cultures and time? Or Thanksgiving, or Halloween, or any Hallmark® holiday that has crossed oceans and settled into foreign lands? What counts as the true spirit of any of them when their histories are so mutable?

/rant off

A quick search revealed that over 15 years of blog posts, the word macrohistory has been used only once. On reflection, macrohistory is something in which I’ve been involved for some time — mostly as a dilettante. Several book reviews and three book-blogging series (one complete, two either on hiatus or fully abandoned) concern macrohistory, and several multi-part posts of my own connect disparate dots over broader topics (if not quite history in the narrow sense). My ambition, as with macrohistory, is to tease out better (if only slightly) understandings of ourselves (since humans and human culture are obviously the most captivating thing evar). Scientists direct similar fascination to the inner workings of nonhuman systems — or at least larger systems in which humans are embedded. Thus, macrohistory can be distinguished from natural history by their respective objects of study. Relatedly, the World-Systems Theory associated with Immanuel Wallerstein and The Fourth Turning (a 1997 book by William Strauss and Neil Howe) take similarly broad perspectives and attempt to identify historical dynamics and patterns not readily apparent. Other examples undoubtedly exist.

This is all preliminary to discussing a rather depressing article from the December 2020 issue of Harper’s Magazine: Rana Dasgupta’s disquieting (ahem) essay “The Silenced Majority” (probably behind a paywall). The subtitle poses the question, “Can America still afford democracy?” This innocuous line begs the question of whether the U.S. (America and the United States of America [and its initialisms U.S. and U.S.A.] being sloppily equivalent almost everywhere, whereas useful distinctions describe the United Kingdom, Great Britain, and England) actually has or practices democracy anymore, to which many would answer flatly “nope.” The essay is an impressive exercise, short of book length, in macrohistory, though it’s limited to Western cultures, which is often the case with history told from inside the bubble. Indeed, if (as the aphorism goes) history is written/told primarily by the victors, one might expect to hear only of an ongoing series of victories and triumphs with all the setbacks, losses, and discontinuities excised like some censored (make that curated) Twitter or Facebook (er, Meta) discussion. One might also wonder how that same history reads when told from the perspective of non-Western countries, especially those in transitional regions such as Poland, Ukraine, Turkey, and Iran or those with histories long predating the rise of the West roughly 500 years ago, e.g., China, Japan, Egypt, and the lost cultures of Central America. Resentments of the Islamic world, having been eclipsed by the West, are a case in point. My grasp of world history is insufficient to entertain those perspectives. I note, however, that with globalism, the histories of all regions of the world are now intimately interconnected even while perspectives differ.

Dasgupta describes foundational Enlightenment innovations that animate Western thinking, even though the ideas are often poorly contextualized or understood. To wit:

In the seventeenth century, England was an emerging superpower. Supremacy would come from its invention of a world principle of property. This principle was developed following contact with the Americas, where it became possible to conjure vast new English properties “out of nothing”—in a way that was impracticable, for instance, in the militarized, mercantile societies of India. Such properties were created by a legal definition of ownership designed so that it could be applied only to the invaders. “As much land as a man tills, plants, improves, cultivates, and can use the product of,” John Locke wrote in 1689, “so much is his property.” When combined with other new legal categories such as “the savage” and “the state of nature,” this principle of property engendered societies such as Carolina, where Locke’s patron, the first earl of Shaftesbury, was a lord proprietor.

Obvious, isn’t it, that by imposing the notion of private property on indigenous inhabitants of North America, colonialists established ownership rights over territories where none had previously existed? Many consider that straightforward theft (again, begging the question) or at least fencing the commons. (Attempts to do the same in the open oceans and in space [orbit] will pick up as technology allows, I surmise.) In addition, extension of property ownership to human trafficking, i.e., slavery and its analogues still practiced today, has an exceptionally long history and was imported to the Americas, though the indigenous populations proved to be poor candidates for subjugation. Accordingly, others were brought to North America in a slave trade that extended across four centuries.

Dasgupta goes on:

From their pitiless opposition to the will of the people, we might imagine that British elites were dogmatic and reactionary. (Period dramas depicting stuck-up aristocrats scandalized by eccentricity and innovation flatter this version of history.) The truth is that they were open-minded radicals. They had no sentimentality about the political order, cutting the head off one king and sending another into exile. They could invent financial and legal structures (such as the Bank of England, founded in 1694) capable of releasing unprecedented market energies. Even their decision to exploit American land with African labor demonstrated their world-bending pursuit of wealth. Their mines and plantations would eventually supply the capital for the first industrial revolution. They loved fashion and technology, they believed in rationality, progress, and transparency. They were the “founding fathers” of our modern world.

And yet they presided over a political system as brutal as it was exclusive. Why? The answer is simple. They could not afford democracy, but also, crucially, they did not need it. [emphasis in original]

So much for the awe and sacred respect in which Enlightenment philosophers and the Founders are held — or used to be. Statues of these dudes (always dudes, natch) are being pulled down all the time. Moreover, association of liberal democracy with the 17th century is a fundamental mistake, though neoliberalism (another poorly defined and understood term) aims to shift backwards to a former or hybrid state of human affairs some are beginning to call digital feudalism.

The article goes on to discuss the balancing act and deals struck over the course of centuries to maintain economic and political control by the ownership class. It wasn’t until the 1930s and the postwar economic boom in the U.S. that democracy as commonly understood took root significantly. The labor movement in particular was instrumental in forcing FDR’s New Deal social programs, even though populism and socialism as political movements had been successfully beaten back. Interestingly, the hallowed American nuclear family (limited in its scope racially), an ahistorical formation that enjoyed a roughly 30-year heyday from 1945 to 1975, coincides with the rise of the American middle class and now-aged democratic institutions. They’re all connected with widely distributed wealth and prosperity. But after the oil crisis and stagflation of the middle 1970s, gains enjoyed by the middle class have steadily eroded and/or been actively beaten back (again!) so that dominant themes today are austerity imposed on the masses and inequality coughing up hundy-billionaires with increasing frequency. Estimates are that 30-40% of the American citizenry lives in poverty, bumping up against failed state territory. Inequality has returned to Gilded Age levels if not exceeded them. Dasgupta fails to cite perhaps the major underlying cause of this shift away from affordable democracy, back toward the brutal, world principle of property: falling EROI. Cheap foreign labor, productivity gains, and creation of a giant debtor society have simply not offset the disappearance of cheap energy.
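
For readers unfamiliar with the acronym, EROI (energy returned on investment) is nothing more exotic than a ratio; a minimal sketch of the standard definition (not anything Dasgupta supplies) looks like this:

$$\text{EROI} = \frac{\text{energy delivered to society}}{\text{energy expended to extract and deliver it}}$$

As that ratio falls toward 1:1, a growing share of total effort goes to energy procurement itself, leaving a shrinking surplus to fund the broad prosperity on which mid-century democracy rested. That is the sense in which cheap labor, productivity gains, and debt cannot substitute for cheap energy.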

Dasgupta’s further discussion of an emerging two-tier economy along with the Silicon Valley technocracy follows, but I’ll stop short here and encourage readers instead to investigate and think for themselves. Lots of guides and analyses help to illuminate the macrohistory, though I find the conclusions awful in their import. Dasgupta drives home the prognosis:

The neoliberal revolution aimed to restore the supremacy of capital after its twentieth-century subjugation by nation-states, and it has succeeded to an astonishing degree. As states compete and collude with gargantuan new private powers, a new political world arises. The principle of labor, which dominated the twentieth century—producing the industrious, democratic society we have come to regard, erroneously, as the norm—is once again being supplanted by a principle of property, the implications and consequences of which we know only too well from our history books.

Continuing from part 2. I’m so slow ….

If cognitive inertia (i.e., fear of change) used to manifest as technophobia, myriad examples demonstrate how technology has fundamentally penetrated the social fabric and shared mental space, essentially flipping the script to fear of missing out (FOMO) on whatever latest, greatest innovation comes down the pike (laden with fraud and deception — caveat emptor). Alongside FOMO, a new phobia has emerged: fear of technological loss, or more specifically, inability to connect to the Internet. This is true especially among the young, born and bred after the onset of the computing and digital communications era. Who knows when, why, or how loss of connectivity might occur? Maybe a Carrington Event, maybe rolling blackouts due to wildfires (such as those in California and Oregon), maybe a ransomware attack on ISPs, or maybe a totalitarian clampdown by an overweening government after martial law is declared (coming soon to a neighborhood near you!). Or maybe something simpler: infrastructure failure. For some, inability to connect digitally, electronically, is tantamount to total isolation. Being cut off from the thoughts of others and abandoned to one’s own thoughts, even in the short term, is thus roughly equivalent to the torture of solitary confinement. Forget the notion of digital detox.

/rant on

Cheerleaders for technocracy are legion, of course, while the mind boggles at how society might or necessarily will be organized differently when it all fails (as it must, if for no other reason than energy depletion). Among the bounties of the communications era is a surfeit of entertainments, movies and TV shows especially, that are essentially new stories to replace or supplant old stories. It’s no accident, however, that the new ones come wrapped up in the themes, iconography, and human psychology (is there any other kind, really?) of old ones. Basically, everything old is new again. And because new stories are delivered through hyperpalatable media — relatively cheap, on demand, and removed from face-to-face social contexts — they arguably cause as much disorientation as reorientation. See, for instance, the embedded video, which is rather long and rambling but nevertheless gets at how religious instincts manifest differently throughout the ages and are now embedded in the comic book stories and superheroes that have overtaken the entertainment landscape.

Mention is made that the secular age coincides roughly with the rise of video stores, a form of on-demand selection of content more recently made even simpler with ubiquitous streaming services. Did people really start hunkering down in their living rooms, eschewing group entertainments and civic involvements, only in the 1980s? The extreme lateness of that development in Western history is highly suspect, considering the death of god had been declared back in the middle of the 19th century. Moreover, the argument swings around to the religious instinct, a cry for meaning if you will, being blocked by organized churches and their endemic corruption and instead finding expression in so-called secular religions (oxymoron alert). Gawd, how I tire of everything that functions as psychological grounding being called a religion. Listen, pseudo-religious elements can be found in Cheerios if one twists up one’s mind sufficiently. That doesn’t make General Mills or Kellogg’s new secular-religious prophets.

Back to the main point. Like money grubbing, technophilia might quiet the desperate search for meaning temporarily, since there’s always more of both to acquire. Can’t get enough, ever. But after even partial acquisition, the soul feels strangely dissatisfied and disquieted. Empty, one might even say. So out roving into the public sphere one goes, seeking and pursuing something to fill one’s time and appetites. Curiously, many traditional solutions to this age-old problem taught the seeker to probe within as an alternative. Well screw that! In the hyper-connected 21st-century world, who has time for that measly self-isolation? More reified Cheerios!

/rant off

With each successive election cycle, I become more cynical (how is that even possible?) about the candidates and their supposed earnest mission to actually serve the public interest. The last couple cycles have produced a new meme that attempts to shift blame for poor governance to the masses: the low-information voter. Ironically, considering the fact that airwaves, magazines, books, public addresses, online venues, and even dinner conversations (such as they still exist if diners aren’t face-planted in their screens) are positively awash in political commentary and pointless debate and strategizing, there is no lack of information available. However, being buried under a deluge of information is akin to a defense attorney hiding damning discovery in an ocean of irrelevance, so I have some sympathy for voters who are thwarted in attempts to make even modestly informed decisions about political issues.

Multiply this basic relationship across many facets of ordinary life and the end result is the low-information citizen (also low-information consumer). Some parties (largely sellers of things, including ideas) possess a profusion of information, whereas useful, actionable information is hidden from the citizen/consumer by an information avalanche. For example, onerous terms of an insurance contract, debt instrument, liability waiver, or even routine license agreement are almost never read prior to signing or otherwise consenting; the acronym tl;dr (stands for “too long; didn’t read”) applies. In other situations, information is withheld entirely, such as pricing comparisons one might undertake if high-pressure sales tactics were not deployed to force potential buyers into decisions right here, right now, dammit! Or citizens are disempowered from exercising any critical judgment by erecting secrecy around a subject, national security being the utility excuse for everything the government doesn’t want people to know.

Add to this the concerted effort (plain enough to see if one bothers to look) to keep the population uneducated, without options and alternatives, scrambling just to get through the day/week/month (handily blocking information gathering), and thus trapped in a condition of low information. Such downward pressure (survival pressure, one might say when considering the burgeoning homeless population) is affecting a greater portion of the population than ever. The American Dream that energized and buoyed the lives of many generations of people (including immigrants) has morphed into the American Nightmare. Weirdly, the immigrant influx has not abated but rather intensified. However, I consider most of those folks refugees (political, economic, and ecological), not immigrants.

So those are the options available to power players, where knowledge is power: (1) withhold information, (2) if information can’t be withheld, then bury it as a proverbial needle in a haystack, and (3) render a large percentage of the public unable to process and evaluate information by keeping them undereducated. Oh, here’s another: (4) produce a mountain of mis- and disinformation that bewilders everyone. This last one is arguably the same as (2) except that the intent is propaganda or psyop. One could also argue that miseducating the public (e.g., various grievance studies blown into critical race theory now being taught throughout the educational system) is the same as undereducating. Again, intent matters. Turning someone’s head and radicalizing them with a highly specialized toolkit (mostly rhetorical) for destabilizing social relations is tantamount to making them completely deranged (if not merely bewildered).

These are elements of the ongoing epistemological crisis I’ve been observing for some time now, with the side effect of a quick descent into social madness being used to justify authoritarian (read: fascist) concentration of power and rollback of individual rights and freedoms. The trending term sensemaking also applies, referring to reality checks needed to keep oneself aligned with truth, which is not the same as consensus. Groups are forming up precisely for that purpose, centered on evidentiary rigor as well as skepticism toward the obvious disinformation issuing from government agencies and journalists who shape information according to rather transparently brazen agendas. I won’t point to any particular trusted source but instead recommend everyone do their best (no passivity!) to keep their wits about them and think for themselves. Not an easy task when the information environment is so thoroughly polluted — one might even say weaponized — that it requires special workarounds to navigate effectively.

Watched Soylent Green (1973) a few days ago for the first time since boyhood. The movie, directed by Richard Fleischer, is based on Harry Harrison’s novel Make Room! Make Room! (which I haven’t read) and oddly enough has not yet been remade. How to categorize the film within familiar genres is tricky. Science fiction? Disaster? Dystopia? Police procedural? It checks all those boxes. Chief messages, considering its early 70s origin, are pollution and overpopulation, though global warming is also mentioned less pressingly. The opening montage looks surprisingly like what Godfrey Reggio did much better with Koyaanisqatsi (1982).

Soylent Green is set in 2022 — only a few months away now but a relatively remote future in 1973 — and the Earth is badly overpopulated, environmentally degraded, overheated, and struggling to support teeming billions mostly jammed into cities. Details are sketchy, and only old people can remember a time when the biosphere remained intact; whatever disaster had occurred was already long ago. Science fiction and futuristic films are often judged improperly by how correct their prophecies turn out to be, as though enjoyment were based on fidelity to reality. Soylent Green fares well in that respect despite its clunky, dated, 70s production design. Vehicles, computer screens, phones, wardrobe, and décor are all, shall we say, quaintly vintage. But consider this: had collapse occurred in the 70s, who’s to say that cellphones, flat screens, and the Internet would ever have been developed? Maybe the U.S. (and the world) would have been stalled in the 70s much the way Cuba is stuck in the 50s (when the monumentally dumb, ongoing U.S. embargo commenced).

The film’s star is Charlton Heston, who had established himself as a handsomely bankable lead in science fiction, disaster, and dystopian films (e.g., The Omega Man and The Planet of the Apes series). Though serviceable, his portrayal is remarkably plain, revealing Heston as a poor man’s Sean Connery or John Wayne (both far more charismatic contemporaries of Heston’s even in lousy films). In Soylent Green, Heston plays Detective Robert Thorn, though he’s mostly called “Thorn” onscreen. Other characters below the age of 65 or so also go by only one name. They all grew up after real foodstuffs (the titular Soylent Green being a synthetic wafer reputedly made out of plankton — the most palatable of three colors) and creature comforts became exceedingly scarce and expensive. Oldsters are given the respect of first and last names. Thorn investigates the assassination of a high-ranking industrialist to its well-known conspiratorial conclusion (hardly a spoiler anymore) in that iconic line at the very end of the film: “Soylent Green is people!” Seems industrialists, to keep people fed, are making food of human corpses. That eventual revelation drives the investigation and the film forward, a device far tamer than today’s amped-up action thrillers where, for instance, a mere snap of the fingers can magically wipe out or restore half of the universe. Once the truth is proclaimed by Thorn (after first being whispered into a couple of ears), the movie ends rather abruptly. That’s also what makes it a police procedural set in a disastrous, dystopic, science-fiction future stuck distinctively in the past: once the crime/riddle is solved, the story and film are over with no dénouement whatsoever.

Some of the details of the film, entirely pedestrian to modern audiences, are modestly enjoyable throwbacks. For instance, today’s penchant for memes and slang renaming of commonplace things is employed in Soylent Green. The catchphrase “Tuesday is Soylent Green Day” appears but is not overdriven. A jar of strawberries costs “150D,” which I first thought might be future currency in the form of debits or demerits but is probably just short for dollars. Front end loaders used for crowd control are called “scoops.” High-end apartment building rentals come furnished with live-in girls (prostitutes or gold-diggers, really) known as Furniture Girls. OTOH, decidedly 70s-era trash trucks (design hasn’t really changed much) are not emblazoned with the corporate name or logo of the Soylent Corporation (why not?). Similarly, (1) dressing the proles in dull, gray work clothes and brimless caps, (2) having them sleep on stairways or in church refuges piled on top of each other so that characters have to step gingerly through them, (3) being so crammed together in protest when the Tuesday ration of Soylent Green runs short that they can’t avoid the scoops, (4) dripped blood clearly made of thick, oversaturated paint (at least on the DVD), and (5) a sepia haze covering daytime outdoor scenes are fairly lazy nods to world building on a low budget. None of this is particularly high-concept filmmaking, though the restraint is appreciated. The sole meme (entirely unprepared) that should have been better deployed is “going home,” a euphemism for reporting voluntarily to a processing plant (into Soylent Green, of course) at the end of one’s suffering life. Those who volunteer are shown 30 minutes of scenes, projected in a 360-degree theater that envelops the viewer, depicting the beauty and grandeur of nature before it had disappeared. This final grace offered to people (rather needlessly) serves the environmental message of the film well and could have been “driven home” a bit harder.

Like other aspects of the film’s back story, how agriculture systems collapsed is largely omitted. Perhaps such details (conjecture) are in the book. The film suggests persistent heat (no seasons), and accordingly, characters are made to look like they never stop sweating. Scientific projections of how global warming will manifest do in fact point to hothouse Earth, though seasons will still occur in temperate latitudes. Because such changes normally occur in geological time, it’s an exceedingly slow process compared to human history and activity. Expert inquiry into the subject prophesied long ago that human activity would trigger and accelerate the transition. How long it will take is still unknown, but industrial civilization is definitely on that trajectory and humans have done little since the 70s to curb self-destructive appetites or behaviors — except of course talk, which in the end is just more hot air. Moreover, dystopian science fiction has shifted over the decades away from self-recrimination to a long, seemingly endless stream of superheroes fighting crime (and sometimes aliens). Considering film is entertainment meant to be enjoyed, the self-serious messages embedded in so many 70s-era disaster films warning us of human hubris are out of fashion. Instead, superpowers and supersuits rule cinema, transforming formerly uplifting science-fiction properties such as Star Trek into hypermilitaristic stories of interstellar social collapse. Soylent Green is a grim reminder that we once knew better, even in our entertainments.