Archive for the ‘Cinema’ Category

Was surprised to learn a while back that West Side Story (1961) was being remade by none other than Steven Spielberg. Yeah, that Steven Spielberg. Among the spurious reasons (I gather) for the frankly unnecessary remake was a desire to recast with actors of the proper ethnic origin. Ugh. Sure, the original actors who portrayed Bernardo and Maria were Americans of Greek and Russo/Ukrainian descent, respectively. So what? Spielberg’s casting didn’t get much closer (Canadian and American/Colombian, respectively), though the newly cast actors certainly look like they could be Puerto Rican. Any further updating of this particular adaptation of Shakespeare’s Romeo and Juliet (one of many) for today’s woke sensibilities was also foiled considering the plot (starry-eyed, ill-fated, would-be lovers divided by rivalrous families/gangs) remained essentially unchanged and the original 1950s NYC setting was kept. In addition, the original musical score (altered — more on that below) and choreography (updated? I can’t tell) were used. It wasn’t a shot-for-shot remake, and I presume some of the dialogue was changed, but I didn’t make direct comparisons. Lastly, considering the 1961 original won numerous awards, who exactly was crying out for a remake? Unsurprisingly, the remake was also nominated for awards.

Aside: In arts and entertainment media, remakes are restricted to cinema. No one rewrites a book. Restaged theater and musicals are merely new productions. Rerecording a pop song is understood as a “cover” of the original, not a remake. The rather large discography of classical music includes many, many different versions of the same works, e.g., Beethoven symphonies. (Some suggest, “Does anyone really need yet another version of Beethoven Symphony No. 5?” That question loses legitimacy when asked about live performance.) One might argue that those, too, are remakes, except that there is rarely such a thing as a definitive original. Moreover, consider that music is a dynamic art typically practiced live, in real time. A musical recording fixes that experience, whether live in concert or in the recording studio, on a playback medium intended for repeat play. Comparison of different performances can be quite interesting and enjoyable. Further, a recording of a sporting event might be made for more convenient rebroadcast shortly afterwards and/or for archival purposes, but repeat experience (i.e., rewatching the 1985 Super Bowl vs. listening repeatedly to a favorite music album) is anathema when the outcome has already been seen. Similarly, repeat viewing of TV shows and movies is best at wide intervals, after memory of original viewing fades. Cinema, in contrast with music, has always been a fixed form. Cinema is also not understood as a recording of a live experience. Its genesis as playback differs from stage theater or musical theater. (Some critics and superfans — especially the YouTube variety — don’t wait but instead immediately go back in search of Easter eggs and continuity errors.) Finally, only a modest number of TV shows have been remade or rebooted, whereas remaking and rebooting movies is comparatively commonplace, which has been characterized as “Hollywood out of ideas.” Take note that West Side Story was first a stage musical and only later committed to film.


Let me first restate axioms developed in previous blog posts. Narrative is the essential outward form of consciousness. Cognition has many preverbal and nonverbal subtleties, but the exchange of ideas occurs predominantly through narrative, and the story of self (told to oneself) can be understood as stream of consciousness: ongoing self-narration of sensations and events. The principal characteristic of narrative, at least that which is not pure fantasy, is in-the-moment sufficiency. Snap-judgment heuristics are merely temporary placeholders until, ideally at least, thoughtful reconsideration and revision that take time and discernment can be brought to bear. Stories we tell and are told, however, often do not reflect reality well, partly because our perceptual apparatuses are flawed, partly because individuals are untrained and unskilled in critical thinking (or overtrained and distorted), and partly because stories are polluted with emotions that make clear assessments impossible (to say nothing of malefactors with agendas). Some of us struggle to remove confabulation from narrative (as best we can) whereas others embrace it because it’s emotionally gratifying.

A good example of the reality principle is recognition, as during the 1970s energy crisis, that energy supplies don’t magically appear by simply digging and drilling more of the stuff out of the ground. Those easy-to-get resources have been plundered already. The term peak oil refers to eventual decline in energy production (harvesting, really) when the easy stuff is more than half gone and undiminished (read: increasing) demand impels energy companies to go in search of more exotic supply (e.g., underwater or embedded in shale). If that reality is dissatisfying, a host of dreamt-up stories offer us deliverance from inevitable decline and reduction of lifestyle prerogatives by positing extravagant resources in renewables, hydrogen fuel cells, fusion (not to be confused with fission), or as-yet unexploited regions such as the Arctic National Wildlife Refuge. None of these represents a plausible reality (except going into heretofore protected regions and bringing ecological devastation).
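For anyone who wants the arithmetic behind “more than half gone,” the textbook Hubbert-curve idealization (a sketch, not a forecast for any particular field or fuel) makes the point precise: if cumulative extraction follows a logistic curve, the production rate peaks exactly when half of the ultimately recoverable resource has been produced and declines thereafter no matter how furiously anyone drills.

$$
Q(t) = \frac{Q_\infty}{1 + e^{-k(t - t_p)}}, \qquad
P(t) = \frac{dQ}{dt} = \frac{k}{Q_\infty}\, Q(t)\,\bigl(Q_\infty - Q(t)\bigr),
$$

where $Q_\infty$ is the ultimately recoverable resource, $k$ the growth rate, and $t_p$ the peak year. $P(t)$ is largest when $Q(t) = Q_\infty/2$, i.e., at the halfway mark, which is all the term peak oil really asserts.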

The relationship of fictional stories to reality is quite complex. For this blog post, a radically narrow description is that fiction is the imaginary space where ideas can be tried out and explored safely in preparation for implementation in reality. Science fiction (i.e., imagining interstellar space travel despite its flat impossibility in Newtonian physics) is a good example. Some believe humans can eventually accomplish what’s depicted in sci-fi, and in certain limited examples we already have. But many sci-fi stories simply don’t present a plausible reality. Taken as vicarious entertainment, they’re AOK superfine with me. But given that Western cultures (I can’t opine on cultures outside the West) have veered dangerously into rank ideation and believing their own hype, too many people believe fervently in aspirational futures that have no hope of ever instantiating. Just like giant pools of oil hidden under the Rocky Mountains (to cite something sent to me just today offering illusory relief from skyrocketing gasoline prices).

Among the many genres of narrative now on offer in fiction, there is no better example of sought-after power than the superhero story. Identifying with the technological and financial power of Iron Man and Batman or the god-powers of Thor and Wonder Woman is thrilling, perhaps, but again, these are not plausible realities. Yet these superrich, superstrong, superintelligent superheroes are everywhere in fiction, attesting to liminal awareness of lack of power and indeed frailty. Many superhero stories are couched as coming-of-age stories for girls, who with grit and determination can fight toe-to-toe with any man and dominate. (Too many BS examples to cite.) Helps, of course, if the girl has magic at her disposal. Gawd, do I tire of these stories, told as origins in suffering, acquisition of skills, and coming into one’s own with the mature ability to force one’s will on others, often in the form of straight-up killing and assassination. Judge, jury, and executioner all rolled into one but entirely acceptable vigilantism if done wearing a supersuit and claiming spurious, self-appointed moral authority.

There are better narratives that don’t conflate power with force or lack plausibility in the world we actually inhabit. In a rather complicated article by Adam Tooze entitled “John Mearsheimer and the Dark Origins of Realism” at The New Statesman, after a lengthy historical and geopolitical analysis of competing narratives, a mode of apprehending reality is described:

… adopting a realistic approach towards the world does not consist in always reaching for a well-worn toolkit of timeless verities, nor does it consist in affecting a hard-boiled attitude so as to inoculate oneself forever against liberal enthusiasm. Realism, taken seriously, entails a never-ending cognitive and emotional challenge. It involves a minute-by-minute struggle to understand a complex and constantly evolving world, in which we are ourselves immersed, a world that we can, to a degree, influence and change, but which constantly challenges our categories and the definitions of our interests. And in that struggle for realism – the never-ending task of sensibly defining interests and pursuing them as best we can – to resort to war, by any side, should be acknowledged for what it is. It should not be normalised as the logical and obvious reaction to given circumstances, but recognised as a radical and perilous act, fraught with moral consequences. Any thinker or politician too callous or shallow to face that stark reality, should be judged accordingly.

From an otherwise rambling, clumsy blog post, this portion from an extended analysis of Mad Max: Fury Road caught my attention:

Ideas that cannot be challenged, that cannot bear even the slightest scrutiny, are ideas that can’t evolve. It doesn’t matter whether they are right or wrong.

They are static, mechanical and ultimately devoid of life itself.

This is our world today in the hands of the Woke Left, a world where the destructive and vindictive feminine has been elevated to the point of unimpeachable rightness. But this isn’t any kind of healthy feminine. It’s a Furiosa-like feminine, devoid of nurturing, all implied violence, all sexuality suppressed to the point of masculinity.

Look at Furiosa and tell me it isn’t asking another vital question, “In a dying world, is there any room for fertility while clinging like moss for survival?”

In our world feminism has robbed women of their greatest attribute, the ability to gestate and nurture life itself. Hollywood has spent two generations giving us female action heroes who are ultimately nothing more than Doods with Boobs. It’s the ultimate power fantasy of Third Wave feminism.

It’s not as destructive an archetype as the sluts on Sex in the City, mind you, because at least it can be tied in some ways back to motherhood, i.e. Ripley in James Cameron’s Aliens, but it’s still damaging to the cause of the healthy feminine nonetheless.

Furiosa is what happens when gender roles are maximally out of balance.

/rant on

The ongoing epistemological crisis is getting no aid or relief from the chattering classes. Case in point: the Feb. 2021 issue of Harper’s Magazine has a special supplement devoted to “Life after Trump,” which divides recent history neatly into reality and unreality commencing from either the announcement of Trump’s candidacy, his unexpected success in the Republican primaries, his even less expected election (and inauguration), or now his removal from office following electoral defeat in Nov. 2020. Take your pick which signals the greatest deflection from history’s “proper” course before being derailed into a false trajectory. Charles Yu and Olivia Laing adopt the reality/unreality dichotomy in their contributions to the special supplement. Yu divides (as do many others) the nation into us and them: supporters of a supposed departure from reality/sanity and those whose clear perception penetrates the illusion. Laing bemoans the inability to distinguish fiction and fantasy from truth, unreality masquerading as your truth, my truth, anyone’s truth given repetition and persuasion sufficient to make it stick. Despite familiarity with these forced, unoriginal metaphors, I don’t believe them for a moment. Worse, they do more to encourage siloed thinking and congratulate the “Resistance” for being on the putative correct side of the glaringly obvious schism in the voting populace. Their arguments support a false binary, perpetuating and reinforcing a distorted and decidedly unhelpful interpretation of recent history. Much better analyses than theirs are available.

So let me state emphatically: like the universe, infinity, and oddly enough consciousness, reality is all-encompassing and unitary. Sure, different aspects can be examined separately, but the whole is nonetheless indivisible. Reality is a complete surround, not something one can opt into or out of. That doesn’t mean one’s mind can’t go elsewhere, either temporarily or permanently, but that does not create or constitute an alternate reality. It’s merely dissociation. Considering the rather extreme limitations of human perceptual apparatuses, it’s frankly inevitable that each of us occupies a unique position, an individual perspective, within a much, much (much, much …) larger reality. Add just a couple more axes to the graph below for time (from nanoseconds to eons) and physical scale (from subatomic to cosmic), and the available portion of reality anyone can grasp is clearly infinitesimally small, yet that tiny, tiny portion is utterly everything for each individual. It’s a weird kind of solipsism.

I get that Harper’s is a literary magazine and that writers/contributors take advantage of the opportunity to flex for whatever diminishing readership has the patience to actually finish their articles. Indeed, in the course of the special supplement, more than a few felicitous concepts and turns of phrase appeared. However, despite commonplace protestations, the new chief executive at the helm of the ship of state has not in fact returned the American scene to normal reality after an awful but limited interregnum.

Aside: Citizens are asked to swallow the whopper that the current president, an elder statesman, the so-called leader of the free world, is in full control of his faculties. Funny how his handlers repeatedly erupt like a murder of crows at the first suggestion that a difficult, unvetted question might be posed, inviting the poor fellow to veer even slightly off the teleprompter script. Nope. Lest yet another foot-in-mouth PR disaster occur (too many already to count), he’s whisked away, out of range of cameras and mics before any lasting damage can be done. Everyone is supposed to pretend this charade is somehow normal. On the other hand, considering how many past presidents were plainly puppets, spokespersons, or charlatans (or at least denied the opportunity to enact an agenda), one could argue that the façade is normal. “Pay no attention to the man [or men] behind the curtain. I am the great and powerful Wizard of Oz!”

With some dismay, I admit that the tiny sliver of reality to which many attend incessantly is an even smaller subset of reality, served up via small, handheld devices that fit neatly in one’s pocket. One could say theirs is a pocket reality, mostly mass media controlled by Silicon Valley platforms and their censorious algorithms. Constrained by all things digital, and despite voluminous ephemera, that reality bears little resemblance to what digital refuseniks experience without the blue glare of screens washing all the color from their faces and their own authentic thoughts out of their heads. Instead, I recommend getting outside, into the open air and under the warm glow of the yellow sun, to experience life as an embodied being, not as a mere processor of someone else’s pocket reality. That’s how we all start out as children before getting sucked into the machine.

Weirdly, only when the screen size ramps up to 30 feet tall do consumers grow skeptical and critical of storytelling. At just the moment cinema audiences are invited to suspend disbelief, the Reality Principle and logic are applied to character, dialogue, plotting, and make-believe gadgetry, which often fail to ring true. Why does fiction come under such careful scrutiny while reality skates right on by, allowing the credulous to believe whatever they’re fed?

/rant off

Watched Soylent Green (1973) a few days ago for the first time since boyhood. The movie, directed by Richard Fleischer, is based on Harry Harrison’s novel Make Room! Make Room! (which I haven’t read) and oddly enough has not yet been remade. How to categorize the film within familiar genres is tricky. Science fiction? Disaster? Dystopia? Police procedural? It checks all those boxes. Chief messages, considering its early 70s origin, are pollution and overpopulation, though global warming is also mentioned less pressingly. The opening montage looks surprisingly like what Godfrey Reggio did much better with Koyaanisqatsi (1982).

Soylent Green is set in 2022 — only a few months away now but a relatively remote future in 1973 — and the Earth is badly overpopulated, environmentally degraded, overheated, and struggling to support teeming billions mostly jammed into cities. Details are sketchy, and only old people can remember a time when the biosphere remained intact; whatever disaster had occurred was already long ago. Science fiction and futuristic films are often judged improperly by how correct prophecies turn out in reality, as though enjoyment were based on fidelity to reality. Soylent Green fares well in that respect despite its clunky, dated, 70s production design. Vehicles, computer screens, phones, wardrobe, and décor are all, shall we say, quaintly vintage. But consider this: had collapse occurred in the 70s, who’s to say that cellphones, flat screens, and the Internet would ever have been developed? Maybe the U.S. (and the world) would have been stalled in the 70s much the way Cuba is stuck in the 50s (when the monumentally dumb, ongoing U.S. embargo commenced).

The film’s star is Charlton Heston, who had established himself as a handsomely bankable lead in science fiction, disaster, and dystopian films (e.g., The Omega Man and The Planet of the Apes series). Though serviceable, his portrayal is remarkably plain, revealing Heston as a poor man’s Sean Connery or John Wayne (both far more charismatic contemporaries of Heston’s even in lousy films). In Soylent Green, Heston plays Detective Thorn, who goes by only “Thorn” onscreen. Other characters below the age of 65 or so also go by only one name. They all grew up after real foodstuffs (the titular Soylent Green being a synthetic wafer reputedly made out of plankton — the most palatable of three colors) and creature comforts became exceedingly scarce and expensive. Oldsters are given the respect of first and last names. Thorn investigates the assassination of a high-ranking industrialist to its well-known conspiratorial conclusion (hardly a spoiler anymore) in that iconic line at the very end of the film: “Soylent Green is people!” Seems industrialists, to keep people fed, are making food of human corpses. That eventual revelation drives the investigation and the film forward, a device far tamer than today’s amped-up action thrillers where, for instance, a mere snap of the fingers can magically wipe out or restore half of the universe. Once the truth is proclaimed by Thorn (after first being whispered teasingly into a couple of ears), the movie ends rather abruptly. That’s also what makes it a police procedural set in a disastrous, dystopic, science-fiction future stuck distinctively in the past: once the crime/riddle is solved, the story and film are over with no dénouement whatsoever.

Some of the details of the film, entirely pedestrian to modern audiences, are modestly enjoyable throwbacks. For instance, today’s penchant for memes and slang renaming of commonplace things is employed in Soylent Green. The catchphrase “Tuesday is Soylent Green Day” appears but is not overdriven. A jar of strawberries costs “150D,” which I first thought might be future currency in the form of debits or demerits but is probably just short for dollars. Front-end loaders used for crowd control are called “scoops.” High-end apartment building rentals come furnished with live-in girls (prostitutes or gold-diggers, really) known as Furniture Girls. OTOH, decidedly 70s-era trash trucks (design hasn’t really changed much) are not emblazoned with the corporate name or logo of the Soylent Corporation (why not?). Similarly, (1) dressing the proles in dull, gray work clothes and brimless caps, (2) having them sleep on stairways or in church refuges piled on top of each other so that characters have to step gingerly through them, (3) being so crammed together in protest when the Tuesday ration of Soylent Green runs short that they can’t avoid the scoops, (4) dripped blood clearly made of thick, oversaturated paint (at least on the DVD), and (5) a sepia haze covering daytime outdoor scenes are fairly lazy nods to world building on a low budget. None of this is particularly high-concept filmmaking, though the restraint is appreciated. The sole meme (entirely unprepared) that should have been better deployed is “going home,” a euphemism for reporting voluntarily to a processing plant (into Soylent Green, of course) at the end of one’s suffering life. Those who volunteer are shown 30 minutes of scenes, projected in a 360-degree theater that envelops the viewer, depicting the beauty and grandeur of nature before it had disappeared. This final grace offered to people (rather needlessly) serves the environmental message of the film well and could have been “driven home” a bit harder.

Like other aspects of the film’s back story, how agriculture systems collapsed is largely omitted. Perhaps such details (conjecture) are in the book. The film suggests persistent heat (no seasons), and accordingly, characters are made to look like they never stop sweating. Scientific projections of how global warming will manifest do in fact point to hothouse Earth, though seasons will still occur in temperate latitudes. Because such changes normally occur in geological time, it’s an exceedingly slow process compared to human history and activity. Expert inquiry into the subject prophesied long ago that human activity would trigger and accelerate the transition. How long it will take is still unknown, but industrial civilization is definitely on that trajectory, and humans have done little since the 70s to curb self-destructive appetites or behaviors — except of course talk, which in the end is just more hot air. Moreover, dystopian science fiction has shifted over the decades away from self-recrimination to a long, seemingly endless stream of superheroes fighting crime (and sometimes aliens). Considering film is entertainment meant to be enjoyed, the self-serious messages embedded in so many 70s-era disaster films warning us of human hubris are out of fashion. Instead, superpowers and supersuits rule cinema, transforming formerly uplifting science-fiction properties such as Star Trek into hypermilitaristic stories of interstellar social collapse. Soylent Green is a grim reminder that we once knew better, even in our entertainments.

From Ran Prieur (no link, note nested reply):


I was heavily into conspiracy theory in the 90’s. There was a great paper magazine, Kenn Thomas’s Steamshovel Press, that always had thoughtful and well-researched articles exploring anomalies in the dominant narrative.

Another magazine, Jim Martin’s Flatland, was more dark and paranoid but still really smart. A more popular magazine, Paranoia, was stupid but fun.

At some point, conspiracy culture shifted to grand narratives about absolute evil. This happened at the same time that superhero movies (along with Harry Potter and Lord of the Rings) took over Hollywood. The more epic and the more black-and-white the story, the more humans are drawn to it.

This is my half-baked theory: It used to be that ordinary people would accept whatever the TV said — or before that, the church. Only a few weirdos developed the skill of looking at a broad swath of potential facts, and drawing their own pictures.

It’s like seeing shapes in the clouds. It’s not just something you do or don’t do — it’s a skill you can develop, to see more shapes more easily. And now everyone is learning it.

Through the magic of the internet, everyone is discovering that they can make reality look like whatever they want. They feel like they’re finding truth, when really they’re veering off into madness.

SamuraiBeanDog replies: Except that the real issue with the current conspiracy crisis is that people are just replacing the old TV and church sources with social media and YouTube. The masses of conspiracy culture aren’t coming up with their own realities, they’re just believing whatever shit they’re told by conspiracy influencers.

Something that’s rarely said about influencers, and propaganda in general, is that they can’t change anyone’s mind — they have to work with what people already feel good about believing.

I simply can’t keep up with all the reading, viewing, and listening in my queue. Waking hours are too few, and concentration dissipates long before sleep overtakes. Accordingly, it’s much easier to settle into couch-potato mode and watch some mindless drivel, such as the Netflix hit Bridgerton binged in two sittings. (Unlike cinema critics, I’m not bothered especially by continuity errors, plot holes, clunky dialogue, weak character motivation, gaps of logic, or glossy decadence of the fictional worlds. I am bothered by the Kafka trap sprung on anyone who notices casting decisions that defy time and place — an ill-advised but now commonplace historical revisionism like editing Mark Twain.) As a result, blog posts are less frequent than they might perhaps be as I pronounce upon American (or more broadly, Western) culture, trying vainly to absorb it as a continuously moving target. Calls to mind the phrase Après moi, le déluge, except that there is no need to wait. A deluge of entertainment, news, analysis, punditry, and trolling has buried everyone already. So rather than the more careful consideration I prefer to post, here are some hot takes.

The Irregular Aphorist. Caitlin Johnstone offers many trenchant observations in the form of aphorisms (some of which I’ve quoted before), all gathered under the subtitle Notes From The Edge Of The Narrative Matrix. The modifier irregular only means that aphorisms are a regular but not constant feature. Her site doesn’t have a tag to that effect but probably ought to. Here’s one in particular that caught my attention:

Everything our species has tried has led us to a dying world and a society that is stark raving mad, so nobody is in any position to tell you that you are wrong.

Twin truths here are (1) the dying world and (2) societal madness, both of which I’ve been describing for some time. Glad when others recognize them, too.

Piling on. Though few still are willing to admit it, nonpharmaceutical interventions (NPIs, e.g., distancing, masks, and lockdowns) to stall or reduce the spread of the virus failed to achieve their objectives according to this study. Instead, NPIs piled on suffering no one could forestall. I read somewhere (no link) that the world is approaching half of total, cumulative deaths/infections predicted had nothing been done to impede the pandemic running its course. Adding in deaths of despair (numbers not entirely up to date), we’re using the wrong tools to fight the wrong battle. Of course, interventions opened up giant opportunities for power grabs and vulture capitalism, so the cynic in me shrugs and wonders half aloud “what did you expect, really?”

Growth of the Managerial Bureaucracy. A blog called Easily Distracted by Timothy Burke (never on my blogroll) publishes only a few times per year, but his analysis is terrific — at least when it doesn’t wind up being overlong and inconclusive. Since a student debt jubilee is back in the news (plenty of arguments pro and con), unintended consequences are anticipated in this quote:

When you set out to create elaborate tiers that segregate the deserving poor from the comfortable middle-class and the truly wealthy, you create a system that requires a massive bureaucracy to administer and a process that forces people into petitionary humiliation in order to verify their eligibility. You create byzantine cutoff points that become business opportunities for predatory rentiers.

Something similar may well be occurring with stimulus checks being issued pro rata (has anyone actually gotten one?), but at least we’re spared any petitionary humiliations. We get whatever the algorithms (byzantine cutoff points) dictate. How those funds will be gamed and attached is not yet clear. Stay alert.

No Defense of Free Speech. Alan Jacobs often recommends deleting, unsubscribing, and/or ignoring social media accounts (after his own long love-hate relationship with them) considering how they have become wholly toxic to a balanced psyche as well as principal enablers of surveillance capitalism and narrative control. However, in an article about the manorial elite, he completely loses the plot, namely, that absolutism is required in the defense of free speech. It’s not sufficient to be blasé or even relieved when 45 is kicked off Twitter permanently or when multiple parties conspire to kill Parler. Establishing your own turf beyond the reach of Silicon Valley censors is a nice idea but frankly impractical. Isn’t that what whoever ran Parler (or posted there) must have thought? And besides, fencing off the digital commons these very entities created has catapulted them into the unenviable position of undemocratic, unelected wielders of monopolistic power and co-conspirators to boot. That’s what needs to be curtailed, not free speech.

The Taxonomic Apocalypse. Although its subject is drawn from fiction and thus largely hypothetical, a new book (coming late 2021) by Adam Roberts called It’s the End of the World: But What Are We Really Afraid Of? surveys doomsday stories and categorizes different versions of how it all ends. Alan Jacobs (yeah, him again — must have an advance copy of the manuscript) recommends it as “a delightful and provocative little book” but fails to grok two things: (1) these stories are rehearsals-cum-preparations for the real thing, and (2) the real thing really is bearing down on us implacably and so is no longer a mere hypothetical to contemplate and categorize for shits and grins. Despite acceptance of the eventualities that await all of us, reading Roberts’ taxonomy is not something I would expect to find delightful. Skip.

Narrative Collapse. Ran Prieur (no link) sometimes makes statements revealing an unexpected god’s-eye view:

[45] is a mean rich kid who figured out that if he does a good Archie Bunker impression, every lost soul with an authoritarian father will think he’s the messiah. We’re lucky that he cares only about himself, instead of having some crazy utopian agenda. But the power, and the agency, is with the disaffected citizens of a declining empire, tasting barbarism.

This is all about people wanting to be part of a group that’s part of a story. Lately, some of the big group-stories have been dying: sky father religion, American supremacy, the conquest of nature, the virtue of wealth-seeking. In their place, young and clumsy group-stories struggle and rise.

Collapse of certain fundamental stories that animate our thinking is at the core of The Spiral Staircase (see About Brutus at top), though it’s often couched in terms of consciousness in transition. Getting through the transition (only temporarily, see previous item in list) probably means completion of the Counter-Enlightenment historical arc, which necessarily includes further descent into barbarism.

Hail Mary for Individualism. I always take special notice when someone cites Allan Bloom. Alan Jacobs (um, yeah, he’s prolific and I’m using his ideas again — sue me) cites Bloom to argue that individualism or the sovereign self, a product of the Enlightenment, is already dead. No doubt, the thought-world described so ably by Bloom no longer exists, but individualism has not yet died out by attrition or been fully dissolved in nonduality. Many of us born before the advent of the Internet retain selfhood and authenticity not yet coopted by or incorporated into mass mind. Moreover, ongoing struggles over identity (e.g., gender, sexual orientation, and race that are often used improperly to define the self) result from an inchoate sense that individualism is eroding precipitously, not that it’s already passé. Defiant attempts to (re)establish an authentic self (contravening all logic and becoming critical theory of one sort or another) in the face of this loss may well be a last-ditch effort to save the self, but it’s failing.

David Sirota, author of Back to Our Future: How the 1980s Explain the World We Live in Now — Our Culture, Our Politics, Our Everything (2011), came to my attention (how else?) through a podcast. He riffed pretty entertainingly on his book, now roughly one decade old, like a rock ‘n’ roller stuck (re)playing his or her greatest hits into dotage. However, his thesis was strong and appealing enough that I picked up a copy (read: borrowed from the library) to investigate despite the datedness of the book (and my tardiness). It promised to be an easy read.

Sirota’s basic thesis is that memes and meme complexes (a/k/a memeplexes, though Sirota never uses the term meme) developed in the 80s and deployed through a combination of information and entertainment media (thus, infotainment) form the narrative background we take for granted in the early part of the 21st century. Children fed a steady diet of clichés, catchphrases, one-liners, archetypes, and story plots have now grown to adulthood and are scarcely able to peer behind the curtain to question the legitimacy or subtext of the narrative shapes and distortions imbibed during childhood like mother’s milk. The table of contents lists four parts (boldface section titles are Sirota’s; descriptive text is mine):

  • Liking Ike, Hating Woodstock. How the 50s and 60s decades were (the first?) assigned reductive demographic signifiers, handily ignoring the true diversity of experience during those decades. More specifically, the boom-boom 50s (economics, births) were recalled nostalgically in 80s TV and films while the 60s were recast as being all about those dirty, hairy hippies and their music, drugs, and sexual licentiousness, all of which had to be invalidated somehow to regain lost wholesomeness. The one-man promotional vehicle for this pleasing self-deception was Michael J. Fox, whose screen personae (TV and film) during the 80s (glorifying the 50s but openly shitting on the 60s) were instrumental in reforming attitudes about our mixed history.
  • The Jump Man Chronicles. How the Great Man Theory of History was developed through glorification of heroes, rogues, mavericks, and iconoclasts who came into their own during the 80s. That one-man vehicle was Michael Jordan, whose talents and personal magnetism were so outsized that everyone aspired to be “like Mike,” which is to say, a superhero elevated beyond mere mortal rules and thus immortalized. The effect was duplicated many times over in popular culture, with various entertainment icons and political operatives subverting thoughtful consideration of real-world problems in favor of jingoistic portrayals.
  • Why We (Continue to) Fight. How the U.S. military was rehabilitated after losing the Vietnam War, gifting us with today’s hypermilitarism and permanent wars. Two principal tropes were deployed to shape public opinion: the Legend of the Spat upon Veteran and the Hands Tied Behind Their Backs Myth. Each was trotted out reliably whenever we needed to misremember our past as fictionalized in the 80s.
  • The Huxtable Effect. How “America’s dad” helped accommodate race relations to white anxiety, primarily to sell a TV show. In contrast with various “ghetto TV” shows of the 70s that depicted urban working poor (various ethnicities), The Cosby Show presented an upscale black family who transcended race by simply ignoring the issue — a privilege of wealth and celebrity. The Obama campaign and subsequent administration copied this approach, pretending American society had become postracial despite his never truly being able to escape the modifier black because the default (no modifier needed) in America is always white. This is the most fraught part of the book, demonstrating that despite whatever instructions we get from entertainment media and pundits, we remain stuck in an unresolved, unhealed, inescapable trap.


Returning to the subject of this post, I asserted that the modern era frustrates a deep, human yearning for meaning. As a result, the Medieval Period, and to a lesser degree, life on the highroad, became narrative fixations. Had I time to investigate further, I would read C.S. Lewis’ The Discarded Image (1964), but my reading list is already overfull. Nonetheless, I found an executive summary of how Lewis describes the Medieval approach to history and education:

Medieval historians varied in that some of them were more scientific, but most historians tried to create a “picture of the past.” This “picture” was not necessarily based in fact and was meant more to entertain curiosity than to seriously inform. Educated people in medieval times, however, had a high standard for education composed of The Seven Liberal Arts of grammar, dialectic, rhetoric, arithmetic, music, geometry, and astronomy.

In the last chapter, Lewis summarizes the influence of the Medieval Model. In general, the model was widely accepted, meaning that most people of the time conformed to the same way of thinking. The model, he reiterates, satisfied imagination and curiosity, but was not necessarily accurate or factual, specifically when analyzed by modern thinkers.

Aside. Regular readers of The Spiral Staircase may also recognize how consciousness informs this blog post. Historical psychology offers a glimpse into worldviews of bygone eras, with the Medieval Period perhaps being the easiest to excavate due to proximity. Few storytellers (cinema or literature) attempt to depict what the world was truly like in the past (best as we can know) but instead resort to an ahistorical modern gloss on how men and women thought and behaved. One notable exception may be the 1986 film The Name of the Rose, which depicts the emerging rational mind in stark conflict with the cloistered Medieval mind. Sword-and-sandal epics set in ancient Rome and Greece get things even more wrong.


Unlike turtles, humans do not have protective shells into which we can withdraw when danger presents itself. Nor can we lift off, fly away, and elude danger the way birds do. These days, we’re sorely beset by an invisible pandemic spread by exposure to carriers (read: other people) and so asked or forced to submit to being locked down and socially distanced. Thus, we are withdrawn into the protective shell of the home in cycles of varying intensity and obeisance to maintain health and safety. Yet life goes on, and with it, numerous physical requirements (never mind psychological needs) that can’t be met virtually demand that we venture out into the public sphere to gather resources, risking exposure to the scourge. Accordingly, the conduct of business has adapted to enable folks to remain in the protective shells of their vehicles, taking delivery through the car window and rarely if ever entering a brick-and-mortar establishment except in defiance or at the option of acceptable risk. In effect, we’re being driven into our cars ever more, and the vehicle is readily understood as a proxy for its inhabitant(s). Take note: pictures of people in bread lines during the Great Depression have been replaced by pictures of cars lined up for miles during the pandemic to get packaged meals from charitable organizations.

Reflecting on this aspect of modern life, I realized that it’s not exactly novel. The widespread adoption of the individual vehicle in the 1940s and 50s, as distinguished from mass transit, and the construction of the interstate highway system promised (and delivered) flexibility and freedom of tremendous appeal. While the shift into cars (along with air travel) doomed now-moribund passenger rail (except intracity in the few American cities with effective rail systems), it enabled the buildout of suburbs and exurbs now recognized as urban sprawl. And like all those packages now clogging delivery systems as we shift even more heavily during the holiday season to online shopping, a loss of efficiency was inevitable. All those individual cars and boxes create congestion that cries out for solutions.

Among the solutions (really nonsolutions) were the first drive-through banks of the 1970s. Is doing one’s banking without leaving the vehicle’s protective shell really an efficiency? Or is it merely an early acknowledgement and enabling of antisocial individualism? Pneumatic tubes that permitted drive-through banking did not speed up transactions appreciably, but the novel mechanism undoubtedly reinforced the psychological attachment Americans felt with their cars. That growing attachment was already apparent in the 1950s, with two bits of Americana from that decade still resonating: the drive-in theater and the drive-in restaurant. The drive-in theater was a low-fidelity efficiency and alternative to the grand movie houses built in the 1920s and 30s seating a few thousand people in one cavernous space. (A different sort of efficiency enabling choice later transformed most cinema establishments into multiplexes able to show 8–10 titles instead of one, handily diminishing audiences of thousands to hundreds or even tens and robbing the group experience of much of its inherent power. Now that premium streaming content is delivered to screens at home and we are disallowed assembly into large audiences, we have instead become something far more inert — viewers — with fully anticipatable degradation of the entertainment experience notwithstanding the handsome technologies found within the comforts of the home.) I’ve heard that drive-ins are experiencing a renaissance of sorts in 2020, with Walmart parking lots converted into showplaces, at least temporarily, to resemble (poorly) group experience and social cohesion. The drive-in restaurant of the 1950s, with its iconic carhops (sometimes on roller skates), is a further example of enabling car culture to proliferate. Never mind that eating in the car is actually kinda sad and maybe a little disgusting as odors and refuse collect in that confined space. One might suspect that drive-ins were directed toward teenyboppers and cruisers of the 1950s exploring newfound freedom, mobility, and the illusion of privacy in their cars, parked in neat rows at drive-ins (and Lookout Points for smooch sessions) all across the country. However, my childhood memory is that drive-ins were also a family affair.

Inevitably, fast food restaurants followed the banks in the 1970s and quickly established drive-through lanes, reinforcing the degradation of the food experience into mere feeding (often on one’s lonesome) rather than dining in community. Curiously, the pandemic has made every restaurant still operating, even the upscale ones, a drive-through and forced those with and without dedicated drive-through lanes to bring back the anachronistic carhop to serve the congestion. A trip to a local burger joint in Chicago last week revealed 40+ cars in queue and a dozen or so carhops on the exterior directing traffic and making deliveries through the car window (briefly penetrating the protective shell) so that no one would have to enter the building and expose oneself to virus carriers. I’ve yet to see a 2020 carhop wearing roller skates (now roller blades) or a poodle skirt.

Such arrangements are probably effective at minimizing pandemic risk and have become one of several new normals (discussion of political dysfunction deferred). Who can say how long they will persist? Still, it’s strange to observe the psychology of our response, even if only superficially and preliminarily. Car culture has been a curious phenomenon since at least the middle of the 20th century. New dynamics reinforcing our commitment to cars are surprising, perhaps, but a little unsurprising, too, considering how we made ourselves so dependent on them as the foundation of personal transportation infrastructure. As a doomer, I had rather expected that Peak Oil occurring around 2006 or so would spell the gradual (or sudden) end of happy motoring as prices at the pump, refusal to elevate standard fuel efficiency above 50 mpg, and climbing average cost of new vehicles placed individual options beyond the reach of average folks. However, I’ve been genuinely surprised by fuel costs sinking to new lows (below the cost of production, even bizarrely inverting to the point that producers paid buyers to take inventory) and continued attempts to engineer (only partially) around the limitations of Peak Oil, if not indeed Peak Energy. I continue to believe these are mirages, like the record-setting bull market of 2020 occurring in the midst of simultaneous economic, social, and health crises.