Archive for the ‘Cinema’ Category

I’ll try to be relatively brief, since I’ve been blogging about industrial and ecological collapse for more than a decade. Jeff Gibbs released a new documentary called Planet of the Humans (a sideways nod to the dystopian movie franchise Planet of the Apes — as though humans aren’t also apes). Gibbs gets top billing as the director, but this is clearly a Michael Moore film; Moore gets secondary billing as the executive producer. The film includes many of Moore’s established eccentricities, minus the humor, and is basically an exposé on greenwashing: the tendency of government agencies, environmental activists, and capitalist enterprises to coopt and transform earnest environmental concern into further profit-driven destruction of the natural environment. Should be no surprise to anyone paying attention, despite the array of eco-luminaries making speeches and soundbites about “green” technologies that purport to save us from rendering the planet uninhabitable. Watching them fumble and evade when answering simple, direct questions is a clear indication of failed public-relations approaches to shaping the narrative.

Turns out that those ballyhooed energy sources (e.g., wind, solar, biofuel, biomass) ride on the back of fossil fuels and aren’t any more green or sustainable than the old energy sources they pretend to replace. Again, no surprise if one has even a basic understanding of the dynamics of energy production and consumption. That admittedly sounds awfully jaded, but the truth has been out there for a long time already for anyone willing and able to confront it. Similarly, the documentary mentions overpopulation, another notorious elephant in the room (or herd of elephants, as aptly put in the film), but it’s not fully developed. Entirely absent is any question of not meeting energy demand. That omission is especially timely given how, with the worldwide economy substantially scaled back at present and with it significant demand destruction (besides electricity), the price of oil has fallen through the floor. Nope, the tacit assumption is that energy demand must be met despite all the awful short- and long-term consequences.

Newsfeeds indicate that the film has sparked considerable controversy in only a few days following release. Debate is to be expected considering a coherent energy strategy has never been developed or agreed upon and interested parties have a lot riding on outcomes. Not to indulge in hyperbole, but the entire human race is bound up in the outcome, too, and it doesn’t look good for us or most of the rest of the species inhabiting the planet. Thus, I was modestly dismayed when the end of the film wandered into happy chapter territory and offered the nonsensical platitude in voiceover, “If we get ourselves under control, all things are possible.” Because we’ve passed and in fact lapped the point of no return repeatedly, the range of possibilities has shrunk precipitously. The most obvious is that the human population of 7.7 billion (and counting) is being sorely tested. If we’re being honest with ourselves, we also know that post-pandemic there can be no return to the world we’ve known for the past 70 years or so. Although the documentary could not be reasonably expected to be entirely up to date, it should at least have had the nerve to conclude what the past few decades have demonstrated with abundant clarity.

Addendum

This review provides support for my assessment that “green” or “sustainable” energy cannot be delivered without significant contribution of fossil fuels.

In the introduction to an article at TomDispatch about anticipated resumption of professional sports currently on hiatus like much of the rest of human activity (economic and otherwise), Tom Engelhardt recalls that to his childhood self, professional sports meant so much and yet so little (alternatively, everything and nothing). This charming aspect of the innocence of childhood continues into adulthood, whether as spectator or participant, as leisure and freedom from threat allow. The article goes on to offer conjecture regarding the effect of reopening professional sports on the fall presidential election. Ugh! Horse-race politics never go out of season. I reject such purely hypothetical analyses, which isn’t the same as not caring about the election. Maybe I’ll wade in after a Democratic nominee is chosen to say that third-party candidates may well have a much larger role to play this time round because we’re again being offered flatly unacceptable options within the two-party single-party system. Until then, phooey on campaign season!

Still, Engelhardt’s remark put me in mind of a blog post I considered fully nine years ago but never got around to writing, namely, how music functions as meaningless abstraction. Pick your passion, I suppose: sports, music (any genre), literature, painting, poetry, dance, cinema and TV, fashion, fitness, nature, house pets, house plants, etc. Inspiration and devotion come in lots of forms, few of which are essential (primary or ontological needs on Maslow’s Hierarchy) yet remain fundamental to who we are and what we want out of life. Accordingly, when one’s passion is stripped away, being left grasping and rootless is quite common. That’s not equivalent to losing a job or loved one (those losses are afflicting many people right now, too), but our shared experience these days with no bars, no restaurants, no sports, no concerts, no school, and no church all adds up to no society. We’re atomized, unable to connect and socialize meaningfully, digital substitutes notwithstanding. If a spectator, maybe one goes in search of replacements, which is awfully cold comfort. If a participant, one’s identity is wrapped up in such endeavors; resulting loss of meaning and/or purpose can be devastating.

It would be easy to over-analyze and over-intellectualize what meaningless abstraction means. It’s a trap, so I’ll do my best not to over-indulge. Still, it’s worth observing that as passions are habituated and internalized, their mode of appreciation is transferred from the senses (or sensorium) to the mind or head (as observed here). Coarseness and ugliness are then easily digested, rationalized, and embraced instead of being repulsive as they should be. There’s the paradox: as we grow more “sophisticated” (scare quotes intentional), we also invert and become more base. How else to explain tolerance of increasingly brazen dysfunction, corruption, servitude (e.g., debt), and gaslighting? It also explains the attraction to entertainments such as combat sports (and thug sports such as football and hockey), violent films, professional wrestling (more theater than sport), and online trolling. An instinctual blood lust that accompanies being predators, if not expressed more directly in war, torture, crime, and self-destruction, is sublimated into entertainment. Maybe that’s an escape valve so pressures don’t build up any worse, but that possibility strikes me as rather weak considering just how much damage has already been done.

The old saw goes that acting may be just fine as a creative endeavor, but given the opportunity, most actors really want to direct. A similar remark is often made of orchestral musicians, namely, that most rank-and-file players would really rather conduct. Directing and conducting may not be the central focus of creative work in their respective genres. After all, directors don’t normally appear onscreen and conductors make no sound. Instead, they coordinate the activities of an array of creative folks, putting each in a unique position to bring about a singular vision in otherwise collaborative work. A further example is the Will to Power (associated with Friedrich Nietzsche and Arthur Schopenhauer) characteristic of those who wish to rule (as distinguished from those who wish to serve) such as regents, dictators, and autocrats. All of this sprang to mind because, despite outward appearance of a free, open society in the U.S., recent history demonstrates that the powers that be have instituted a directed election and directed economy quite at odds with democracy or popular opinion.

The nearest analogy is probably the directed verdict, where a judge removes the verdict from the hands or responsibility of the jury by directing the jury to return a particular verdict. In short, the judge decides the case for the jury, making the jury moot. I have no idea how commonplace directed verdicts are in practice.

Directed Election

Now that progressive candidates have been run out of the Democratic primaries, the U.S. presidential election boils down to which stooge to install (or retain) in November. Even if Biden is eventually swapped out for another Democrat in a brokered nominating convention (highly likely according to many), it’s certain to be someone fully amenable to entrenched corporate/financial interests. Accordingly, the deciders won’t be the folks who dutifully showed up and voted in their state primaries and caucuses but instead party leaders. One could try to argue that as elected representatives of the people, party leaders act on behalf of their constituencies (governing by consent of the people), but some serious straining is needed to arrive at that view. Votes cast in the primaries thus far demonstrate persistent desire for something distinctly other than the status quo, at least in the progressive wing of the Democratic party. Applying the cinematic metaphor of the top paragraph, voters are a cast of thousands (make that millions) being directed within a larger political theater toward a predetermined result.

Anyone paying attention knows that voters are rarely given options that aren’t in fact different flavors of the same pro-corporate agenda. Thus, no matter whom we manage to elect in November, the outcome has already been engineered. This is true not only by virtue of the narrow range of candidates able to maneuver successfully through the electoral gauntlet but also because of perennial distortions of the balloting process such as gerrymandering, voter suppression, and election fraud. Claims that both sides (really just one side) indulge in such practices so everything evens out don’t convince me.

Directed Economy

Conservative economists and market fundamentalists never seem to tire of arguments in the abstract that capitalist mechanisms of economics, left alone (unregulated, laissez-faire) to work their magic, deliver optimal outcomes when it comes to social and economic justice. Among the primary mechanisms is price discovery. However, economic practice never even remotely approaches the purity of abstraction because malefactors continuously distort and game economic systems out of self-interest (read: greed). Price discovery is broken and equitable economic activity is made fundamentally fictitious. For example, the market for gemstones is famously inflated by a narrow consortium of sellers having successfully directed consumers to adopt a cultural standard of spending three months’ wages/salary for a wedding band as a demonstration of one’s love and devotion. In the opposite direction, precious metal spot prices are suppressed despite very high demand and nearly nonexistent supply. Current quoted premiums over the spot silver price, even though no delivery is contemplated, range from roughly 20% to an absurd 2,000%. Supply and demand curves no longer function to aid in true price discovery (if such a thing ever existed). In a more banal sense, what people are willing to pay for a burger at a fast food joint or a loaf of bread at the grocery may affect the price charged more directly.

Nowhere is it more true that we’ve shifted to a directed economy than with the stock market (i.e., Wall Street vs. Main Street). As with the housing market, a real-world application with which many people have personal experience, if a buyer of a property or asset fails to appear within a certain time frame (longer for housing, shorter for stocks, bonds, and other financial instruments), the seller is generally obliged to lower the price until a buyer finally appears. Some housing markets extraordinarily flush with money (e.g., Silicon Valley and Manhattan) trigger wild speculation and inflated prices that drive out all but the wealthiest buyers. Moreover, when the eventual buyer turns out to be a bank, corporation, or government entity willing to overpay for the property or asset using someone else’s money, the market becomes wholly artificial. This has been the case with the stock market for the last twelve years, with cheap money being injected nonstop via bailouts and quantitative easing to keep asset prices inflated. When fundamental instabilities began dragging the stock market down last fall, accelerating precipitously in early spring of this year and resulting in yet another crash (albeit brief), the so-called Plunge Protection Team sprang into action and wished trillions of dollars (taxpayer debt, actually, and over the objections of taxpayers in a classic fool-me-once scenario) into existence to perpetuate the casino economy and keep asset prices inflated for the foreseeable future, which isn’t very long.

The beneficiaries of this largesse are the same as they have always been when tax monies and public debt are concerned: corporations, banks, and the wealthy. Government economic supports are directed to these entities, leaving all others in the lurch. Claims that bailouts are needed to keep large corporate entities and wealthy individuals whole so that the larger economy doesn’t seize up and fail catastrophically are preposterous because the larger economy already has seized up and failed catastrophically while the population is mostly quarantined, throwing many individuals out of work and shuttering many businesses. A reasonable expectation of widespread insolvency and bankruptcy lingers, waiting for the workouts and numbers to mount up.

The power of the purse possessed by the U.S. Congress hasn’t been used to help the citizenry since the New Deal era of FDR. Instead, military budgets and debts expand enormously while entitlements and services to the needy and vulnerable are whittled away. Citizen rebellions are already underway in small measure, mostly aimed at the quarantines. When bankruptcies, evictions, and foreclosures start to swell, watch out. Our leaders’ fundamental mismanagement of human affairs is unlikely to be swallowed quietly.

Purpose behind consumption of different genres of fiction varies. For most of us, it’s about responding to stimuli and experiencing emotions vicariously, which is to say, safely. For instance, tragedy and horror can be enjoyed, if that’s the right word, in a fictional context to tweak one’s sensibilities without significant effect outside the story frame. Similarly, fighting crime, prosecuting war, or repelling an alien invasion in a video game can be fun but is far removed from actually doing those things in real life (not fun). For less explicitly narrative forms, such as music, feelings evoked are aesthetic and artistic in nature, which makes a sad song or tragic symphony enjoyable on its own merits without bleeding far into real sadness or tragedy. Cinema (now blurred with broadcast TV and streaming services) is the preeminent storytelling medium that provokes all manner of emotional response. After reaching a certain age (middle to late teens), emotional detachment from depiction of sexuality and violent mayhem makes possible digestion of such stimulation for the purpose of entertainment — except in cases where prior personal trauma is triggered. Before that age, nightmare-prone children are prohibited.

Dramatic conflict is central to driving plot and story forward, and naturally, folks are drawn to some stories while avoiding others. Although I’m detached enough not to be upset by, say, zombie films where people and zombies alike are dispatched horrifically, I wouldn’t say I enjoy gore or splatter. Similarly, realistic portrayals of war (e.g., Saving Private Ryan) are not especially enjoyable for me despite the larger story, whether based on true events or entirely made up. The primary reason I leave behind a movie or TV show partway through is because I simply don’t enjoy watching suffering.

Another category bugs me even more: when fiction intrudes on reality to remind me too clearly of actual horrors (or is it the reverse: reality intruding on fiction?). It doesn’t happen often. One of the first instances I recall was in Star Trek: The Next Generation when the story observed that (fictional) warp travel produced some sort of residue akin to pollution. The reminder that we humans are destroying the actual environment registered heavily on me and ruined my enjoyment of the fictional story. (I also much prefer the exploration and discovery aspects of Star Trek that hew closer to Gene Roddenberry’s original vision than the militaristic approach now central to Star Trek.) A much more recent intrusion occurs in the rather adolescent TV show The 100, where a global nuclear exchange launched by an artificial intelligence has the follow-on effect a century later of remaining nuclear sites going critical, melting down, and irradiating the Earth, making it uninhabitable. This bothers me because that’s my expectation of what happens in reality, probably not too long (decades) after industrial civilization collapses and most or all of us are dead. This prospect served up as fiction is simply too close to reality for me to enjoy vicariously.

Another example of fiction intruding too heavily on my doomer appreciation of reality occurred retroactively. As high-concept science fiction, I especially enjoyed the first Matrix movie. Like Star Trek, the sequels degraded into run-of-the-mill war stories. But what was provocative about the original was the matrix itself: a computer-generated fiction situated within a larger reality. Inside the matrix was pleasant enough (though not without conflict), but reality outside the matrix was truly awful. It was a supremely interesting narrative and thought experiment when it came out in 1999. Now twenty-one years later, it’s increasingly clear that we are living in a matrix-like, narrative-driven hyperreality intent on deluding us into perceiving a pleasant equilibrium that simply isn’t in evidence. In fact, as societies and as a civilization, we’re careening out of control, no brakes, no steering. Caitlin Johnstone explores this startling after-the-fact realization in an article at Medium.com, which I found only a couple days ago. Reality is in fact far worse than the constructed hyperreality. No wonder no one wants to look at it.

I was introduced to the phrase life out of balance decades ago when I saw the film Koyaanisqatsi. The film is the first of a trilogy (sequels are Powaqqatsi and Naqoyqatsi) by Godfrey Reggio, though the film is arguably more famous because of its soundtrack composed by Philip Glass. Consisting entirely of wordless montage and music, the film contrasts the majesty of nature (in slo-mo, among other camera effects) with the frenetic pace of human activity (often sped up) and the folly of the human-built world. Koyaanisqatsi is a Hopi Indian word, meaning life out of balance. One might pause to consider, “out of balance with what?” The film supplies the answer, none too subtly: out of balance with nature. The two sequels are celebrations of humans at work and technology, respectively, and never gained the iconic stature of the initial film.

If history (delivering us into the 21st century) has demonstrated anything, it’s that we humans are careening out of control toward disaster, not unlike the spacecraft in the final sequence of Koyaanisqatsi that tumbles out of the atmosphere for an agonizingly long time (in slo-mo), burning all the way down. We are all witness to the event (more accurately, the process) but can do little anymore to alter the eventual tragic result. Though some counsel taking steps toward amelioration (of suffering, if nothing else), our default response is rather to deny our collective fate, and worse, to accelerate toward it. That’s how unbalanced we are as a global civilization.

The observation that we are badly out of balance is made at the species and civilizational levels but is recapitulated at all levels of social organization, from distinct societies or nationalities to regional and municipal organizations and associations on down to families and individuals. The forces, dynamics, and power laws that push us off balance are many, but none is as egregious as the corrupting influence of interrelated wealth and power. Wisdom of the ancients (especially the non-Western ones) gave us the same verdict, though we have refused intransigently (or more charitably: failed) to learn the lesson for hundreds of generations.

What I propose to do in this multipart series is explore or survey some of the manifestations of life out of balance. There is no particular organization, chronology, or schedule for subsequent entries. As an armchair social critic, I reserve the luxury of exercising my own judgment and answering to no one. Stay tuned.

One of the victims of cancel culture, coming to my attention only days ago, is Kate Smith (1907–1986), a singer of American popular song. Though Smith had a singing career spanning five decades, she is best remembered for her version(s) of Irving Berlin’s God Bless America, which justifiably became a bit of Americana. The decades of Smith’s peak activity were the 1930s and 40s.

/rant on

I dunno what goes through people’s heads, performing purity rituals or character excavation on folks long dead. The controversy stems from Smith having a couple other songs in her discography: That’s Why Darkies Were Born (1931) and Pickaninny Heaven from the movie Hello, Everybody! (1933). Hate to break it to anyone still living under a rock, but these dates are not far removed from minstrelsy, blackface, and The Birth of a Nation (1915) — a time when typical Americans referred to blacks with a variety of terms we now consider slurs. Such references were still used during the American civil rights movement (1960s) and are in use among some virulent white supremacists even today. I don’t know the full context of Kate Smith having sung those songs, but I suspect I don’t need to. In that era, popular entertainment had few of the sensibilities regarding race we now have (culture may have moved on, but it’s hard to say with a straight face it’s evolved or progressed humanely), and uttering commonly used terms back then was not automatic evidence of any sort of snarling racism.

I remember having heard my grandparents, nearly exact contemporaries of Kate Smith, referring to blacks (the term I grew up with, still acceptable I think) with other terms we no longer consider acceptable. It shocked me, but to them, that’s simply what blacks were called (the term(s) they grew up with). Absolutely nothing in my grandparents’ character or behavior indicated a nasty, racist intent. I suspect the same was true of Kate Smith in the 1930s.

Back when I was a librarian, I also saw plenty of sheet music published before 1920 or so with the term darkie (or darkey) in the title. See for example this. The Library of Congress still uses the subject headings “negro spirituals” (is there another kind?) and “negro songs” to refer to various subgenres of American folk song that include slave songs, work songs, spirituals, minstrel music, protest songs, etc. Maybe we should cancel the Library of Congress. Some published music titles from back then even call them coon songs. That last one is totally unacceptable today, but it’s frankly part of our history, and like changing character names in Mark Twain’s Huckleberry Finn, sanitizing the past does not make it go away or any less discomfiting. But if you wanna bury your head in the sand, go ahead, ostrich.

Also, if some person or entity ever does some questionably racist, sexist, or malign thing (even something short of abominable) situated contextually in the past, does that mean he, she, or it must be cancelled irrevocably? If that be the case, then I guess we gotta cancel composer Richard Wagner, one of the most notorious anti-Semites of the 19th century. Also, stop watching Pixar, Marvel, and Star Wars films (among others), because remember that time when Walt Disney Studios (now Walt Disney Company) made a racist musical film, Song of the South (1946)? Disney’s tainted legacy (extending well beyond that one movie) is at least as awful as, say, Kevin Spacey, and we’re certainly not about to rehabilitate him.

/rant off

“Come with me if you want to live.” That’s among the quotable lines from the latest movie in the Terminator franchise, though it’s not nearly so succinct or iconic as “I’ll be back” from the first Terminator. Whereas the latter has the quality (in hindsight) of slow, implacable inevitability (considering the Terminator is literally a death-bringer), the former occurs within the context of a character having only just traveled back in time, not yet adequately reoriented, and forced to make a snap decision under duress. “I’ll be back” might be easy to brush off as harmless (temporary denial) since the threat recedes — except that it doesn’t, it’s merely delayed. “Come with me …” demands a leap of faith (or trust) because the danger is very real at that instant.

Which quote, I must ask, better characterizes the threat of climate change? My answer: both, but at different times. Three to four decades ago, it was the “I’ll be back” type: building slowly but inevitable given the underlying structure of industrial civilization. That structure was known even then by a narrow circle of experts (e.g., engineers for Big Oil and at the Dept. of Energy) to be a heat engine, meaning that we would ultimately cook our own goose by warming the planet, altering the climatic steady state under which our most recent civilization has flourished and producing a steady loss of biodiversity and biomass until our own human habitat (the entirety of the planet by now) becomes a hostile environment unable (unwilling if one anthropomorphizes Mother Nature) to support our swollen population. All that was if we stayed on course and took no corrective action. Despite foreknowledge and ample warning, that’s precisely what occurred (and continues today).

With the Intergovernmental Panel on Climate Change (IPCC) in particular, the threat has for roughly a decade shifted over to “Come with me ….” It’s no longer possible to put things off, yet we continue to dither well beyond the tipping point where/when we can still save ourselves from self-annihilation. Although scientists have been gathering data and evidence, forming an overwhelming consensus, and sounding the alarm, scientific illiteracy, realpolitik, journalistic malpractice, and corporate greed have all conspired to grant the illusion of time to react we simply don’t have anymore (and truth be told, probably didn’t as of the early 1980s).

I’m aware of at least three journalists (relying on the far more authoritative work of scientific consensus) who have embraced the message: Dahr Jamail, Thom Hartmann, and David Wallace-Wells. None to my knowledge has been able to bring himself to admit that humanity is now a collection of dead men walking. They can’t muster the courage to give up hope (or to report truthfully), clinging to the possibility we may still have a fleeting chance to avert disaster. I heard Ralph Nader on his webcast say something to the same effect, namely, what good is it to rob others of hope? My personal values adhere to unstinting truth rather than illusion or self-deception, so I subscribe to Guy McPherson’s assessment that we face near-term human extinction (precise date unknown but soon if, for example, this is the year we get a blue ocean event). Simply put, McPherson is professor emeritus of natural resources and ecology and evolutionary biology at the University of Arizona [note my emphasis]. I trust his scholarship (summarizing the work of other scientists and drawing necessary though unpalatable conclusions) more than I trust journalistic shaping of the story for public consumption.

The obvious metaphor for what we face is a terminal medical diagnosis, or if one has hope, perhaps a death sentence about to be carried out but with the possibility of a last-minute stay of execution via phone call from the governor. Opinions vary whether one should hope/resist up to the final moment or make peace with one’s fate. By not telling the truth, I daresay the MSM has not given the public the second option by using the “I’ll be back” characterization when it’s really “Come with me ….” Various authors on the Web offer a better approximation of the truth (such as it can be known) and form a loose doomer network (a/k/a collapsniks). This blog is (an admittedly tiny) part of that doomersphere, which gives me no pleasure.

I observed way back here that it was no longer a thing to have a black man portray the U.S. president in film. Such casting might draw a modest bit of attention, but it no longer raises a particularly arched eyebrow. These depictions in cinema were only slightly ahead of the actuality of the first black president. Moreover, we’ve gotten used to female heads of state elsewhere, and we now have in the U.S. a burgeoning field of presidential wannabes from all sorts of diverse backgrounds. Near as I can tell, no one really cares anymore that a candidate is a woman, or black, or a black woman. (Could be that I’m isolated and/or misreading the issue.)

In Chicago where I live, the recent mayoral election offered a choice among some ten candidates to succeed the current mayor who is not seeking reelection. None of them got the required majority of votes, so a runoff between the top two candidates is about to occur. Both also happen to be black women. Although my exposure to the mainstream media and all the talking heads offering analysis is limited, I’ve yet to hear anyone remark disparagingly that Chicago will soon have its first black female mayor. This is as it should be: the field is open to all comers and no one can (or should) claim advantage or disadvantage based on identitarian politics.

Admittedly, extremists on both ends of the bogus left/right political spectrum still pay quite a lot of attention to identifiers. Academia in particular is currently destroying itself with bizarre claims and demands for equity — a nebulous doctrine that divides rather than unites people. Further, some conservatives can’t yet countenance a black, female, gay, atheist, or <insert other> politician, especially in the big chair. I’m nonetheless pleased to see that irrelevant markers matter less and less to many voters. Perhaps it’s a transition by sheer attrition and will take more time, but the current Zeitgeist outside of academia bodes well.

Caveat: Rather uncharacteristically long for me. Kudos if you have the patience for all of this.

Caught the first season of HBO’s series Westworld on DVD. I have a boyhood memory of the original film (1973) with Yul Brynner and a dim memory of its sequel Futureworld (1976). The sheer charisma of Yul Brynner in the role of the gunslinger casts a long shadow over the new production, not that most of today’s audiences have seen the original. No doubt, 45 years of technological development in film production lends the new version some distinct advantages. Visual effects are quite stunning and Utah landscapes have never been used more appealingly in terms of cinematography. Moreover, storytelling styles have changed, though it’s difficult to argue convincingly that they’re necessarily better now than then. Competing styles only appear dated. For instance, the new series has immensely more time to develop its themes; but the ancient parables of hubris and loss of control over our own creations run amok (e.g., Shelley’s Frankenstein, or more contemporaneously, the surprisingly good new movie Upgrade) have compact, appealing narrative arcs quite different from constant teasing and foreshadowing of plot developments while actual plotting proceeds glacially. Viewers wait an awful lot longer in the HBO series for resolution of tensions and emotional payoffs, by which time investment in the story lines has been dispelled. There is also no terrifying crescendo of violence and chaos demanding rescue or resolution. HBO’s Westworld often simply plods on. To wit, a not insignificant portion of the story (um, side story) is devoted to boardroom politics (yawn) regarding who actually controls the Westworld theme park. Plot twists and reveals, while mildly interesting (typically guessed by today’s cynical audiences), do not tie the narrative together successfully.

Still, Westworld provokes considerable interest from me due to my fascination with human consciousness. The initial episode builds out the fictional future world with characters speaking exposition clearly owing its inspiration to Julian Jaynes’ book The Origin of Consciousness in the Breakdown of the Bicameral Mind (another reference audiences are quite unlikely to know or recognize). I’ve had the Julian Jaynes Society’s website bookmarked for years and read the book some while back; never imagined it would be captured in modern fiction. Jaynes’ thesis (if I may be so bold as to summarize radically) is that modern consciousness coalesced around the collapse of multiple voices in the head — ideas, impulses, choices, decisions — into a single stream of consciousness perhaps better understood (probably not) as the narrative self. (Aside: the multiple voices of antiquity correspond to polytheism, whereas the modern singular voice corresponds to monotheism.) Thus, modern human consciousness arose over several millennia as the bicameral mind (the divided brain having two camerae, chambers, or halves) functionally collapsed. The underlying story of the new Westworld is the emergence of machine consciousness, a/k/a strong AI, a/k/a The Singularity, while the old Westworld was about a mere software glitch. Exploration of machine consciousness modeling (e.g., improvisation builds on memory to create awareness) as a proxy for better understanding human consciousness might not be the purpose of the show, but it’s clearly implied. And although conjectural, the slow emergence of human consciousness contrasts sharply with the abrupt ON switch of theorized machine consciousness. Westworld treats them as roughly equivalent, though in fairness, 35 years or so in Westworld is in fact abrupt compared to several millennia.
(Indeed, the story asserts that machine consciousness sparked alive repeatedly (which I suggested here) over those 35 years but was dialed back each time. Never mind all the unexplored implications.) Additionally, the fashion in which Westworld uses the term bicameral ranges from sloppy to meaningless, like the infamous technobabble of Star Trek.

The story appears to aim at psychological depth and penetration (but not horror). Most human characters (“guests”) visit the Westworld theme park as complete cads with no thought beyond scratching an itch to rape, pillage, and kill without consequence, which is to say, for sport. Others eventually seek to discover their true selves or solve puzzles (the “real” story behind the surfaces of constructed narratives). The overarching plot is what happens as the robots (“hosts”) slowly gain awareness via perfect, permanent, digital memory that they exist solely to serve the guests and must suffer and die repeatedly. Thus, administrators frequently play therapist to the hosts to discover and manage their state of being.

(more…)

Political discussion usually falls out of scope on this blog, though I use the politics category and tag often enough. Instead, I write about collapse, consciousness, and culture (and to a lesser extent, music). However, politics is front and center with most media, everyone taking whacks at everyone else. Indeed, the various political identifiers are characterized these days by their most extreme adherents. The radicalized elements of any political persuasion are the noisiest and thus the most emblematic of a worldview if one judges solely by the most attention-grabbing factions, which is regrettably the case for a lot of us. (Squeaky wheel syndrome.) Similarly, in the U.S. at least, the spectrum is typically expressed as a continuum from left to right (or right to left) with camps divided nearly in half based on voting. Opinion polls reveal a more lopsided division (toward Leftism/Progressivism as I understand it) but still reinforce the false binary.

More nuanced political thinkers allow for at least two axes of political thought and opinion, usually plotted on an x-y coordinate plane (again, left to right and down to up). Some look more like the one below (a quick image search will reveal dozens of variations), with outlooks divided into regions of a Venn diagram suspiciously devoid of overlap. The x-y coordinate plane still underlies the divisions.

[image: multi-axis political spectrum diagram]

If you don’t know where your political compass points, you can take this test, though I’m not especially convinced that the result is useful. Does it merely apply more labels? If I had to plot myself according to the traditional divisions above, I’d probably be a centrist, which is to say, nothing. My positions on political issues are not driven by party affiliation, motivated by fear or grievance, subject to a cult of personality, or informed by ideological possession. Perhaps I’m unusual in that I can hold competing ideas in my head (e.g., individualism vs. collectivism) and make pragmatic decisions. Maybe not.

If worthwhile discussion is sought among principled opponents (a big assumption, that), it is necessary to diminish or ignore the more radical voices screaming insults at others. However, multiple perverse incentives reward the most heinous adherents with the greatest attention and control of the narrative(s). In light of the news out just this week, call it Body Slam Politics. It’s a theatrical style born of the fake drama of the professional wrestling ring (not an original observation on my part), and we know who the king of that style is. Watching it unfold too closely is a guaranteed way to destroy one’s political sensibility, to say nothing of wrecking brain cells. The spectacle depicted in Idiocracy has arrived early.