Posts Tagged ‘Reviews’

David Sirota, author of Back to our Future: How the 1980s Explain the World We Live in Now — Our Culture, Our Politics, Our Everything (2011), came to my attention (how else?) through a podcast. He riffed pretty entertainingly on his book, now roughly one decade old, like a rock ‘n’ roller stuck (re)playing his or her greatest hits into dotage. However, his thesis was strong and appealing enough that I picked up a copy (read: borrowed from the library) to investigate despite the datedness of the book (and my tardiness). It promised to be an easy read.

Sirota’s basic thesis is that memes and meme complexes (a/k/a memeplexes, though Sirota never uses the term meme) developed in the 80s and deployed through a combination of information and entertainment media (thus, infotainment) form the narrative background we take for granted in the early part of the 21st century. Children fed a steady diet of clichés, catchphrases, one-liners, archetypes, and story plots have now grown to adulthood and are scarcely able to peer behind the curtain to question the legitimacy or subtext of the narrative shapes and distortions imbibed during childhood like mother’s milk. The table of contents lists four parts (boldface section titles are Sirota’s; descriptive text is mine):

  • Liking Ike, Hating Woodstock. How the 50s and 60s were (perhaps the first decades to be) assigned reductive demographic signifiers, handily ignoring the true diversity of experience during those decades. More specifically, the boom-boom 50s (economics, births) were recalled nostalgically in 80s TV and films while the 60s were recast as being all about those dirty, hairy hippies and their music, drugs, and sexual licentiousness, all of which had to be invalidated somehow to regain lost wholesomeness. The one-man promotional vehicle for this pleasing self-deception was Michael J. Fox, whose screen personae (TV and film) during the 80s (glorifying the 50s but openly shitting on the 60s) were instrumental in reforming attitudes about our mixed history.
  • The Jump Man Chronicles. How the Great Man Theory of History was developed through glorification of heroes, rogues, mavericks, and iconoclasts who came into their own during the 80s. That one-man vehicle was Michael Jordan, whose talents and personal magnetism were so outsized that everyone aspired to be “like Mike,” which is to say, a superhero elevated beyond mere mortal rules and thus immortalized. The effect was duplicated many times over in popular culture, with various entertainment icons and political operatives subverting thoughtful consideration of real-world problems in favor of jingoistic portrayals.
  • Why We (Continue to) Fight. How the U.S. military was rehabilitated after losing the Vietnam War, gifting us with today’s hypermilitarism and permanent wars. Two principal tropes were deployed to shape public opinion: the Legend of the Spat-Upon Veteran and the Hands-Tied-Behind-Their-Backs Myth. Each was trotted out reliably whenever we needed to misremember our past as fictionalized in the 80s.
  • The Huxtable Effect. How “America’s dad” helped accommodate race relations to white anxiety, primarily to sell a TV show. In contrast with various “ghetto TV” shows of the 70s that depicted urban working poor (various ethnicities), The Cosby Show presented an upscale black family who transcended race by simply ignoring the issue — a privilege of wealth and celebrity. The Obama campaign and subsequent administration copied this approach, pretending American society had become postracial despite his never truly being able to escape the modifier black because the default (no modifier needed) in America is always white. This is the most fraught part of the book, demonstrating that despite whatever instructions we get from entertainment media and pundits, we remain stuck in an unresolved, unhealed, inescapable trap.


The Anton Bruckner symphony cycle recorded by the Berlin Philharmonic under Herbert von Karajan (the Wing Cycle to some collectors) has long been known to me and cherished. Based on Amazon reviews noting remastered and improved sound over previous releases of the same recorded performances, I decided the relatively low cost was worth trying out Blu-ray Audio, my first such disc. Although the entire cycle fits on a single Blu-ray Audio disc, nine CDs are included in the box. Duplication seems unnecessary, but the inclusion of both may be desirable for some listeners. Pleasingly, original cover art (the aforementioned wing) from the LPs appears on the 2020 rerelease. Shamefully, as with another recent set of Bruckner symphonies, DG put the conductor’s name above the composer’s. This practice ought to stop. This review compares versions and media in addition to assessing the performances. Caveat: a superior stereo system is a prerequisite. If listening on some device not intended for high fidelity (phone, computer, etc.), save your dollars and find the recordings on a streaming service. Both CD and Blu-ray players in my system are connected to the preamp via digital cables to use the better DAC in the preamp rather than those in the players.

My comparison involves four releases of the cycle: (1) original LPs from the 1970s and 80s, (2) 1990 CDs, (3) 2020 CDs, and (4) the sole 2020 Blu-ray Audio disc. All are the same works and same performances. Direct A/B/C/D comparisons are difficult, and I didn’t listen to every version of each, which would have required far too many hours. Rather, I focused on two representative movements well established in my ear: the 2nd movt. (Adagio) of the 5th and the 4th movt. (Finale) of the 8th. Because every recording has its own unique characteristics born of the era (typically divided by decade), the engineering team, and the producer, it’s normal for me to attempt to “hear through” all the particulars of a recording to the actual performance. For a recording to be badly flawed or unlistenable is fairly exceptional, and I still tend to make mental adjustments to accommodate what I’m hearing. A similar perceptual adjustment in the visual domain, known as “white balance,” reflects how outdoor lighting and color shift as the sun transits the sky. Accordingly, there is probably no such thing as authentic or natural sound, as one’s perceptual apparatus adjusts automatically, accommodating itself to what is heard to fit circumstance. That said, it’s obvious that some recorded media and listening environments are superior to others.


I’ll try to be relatively brief, since I’ve been blogging about industrial and ecological collapse for more than a decade. Jeff Gibbs released a new documentary called Planet of the Humans (a sideways nod to the dystopian movie franchise Planet of the Apes — as though humans aren’t also apes). Gibbs gets top billing as the director, but this is clearly a Michael Moore film; Moore gets secondary billing as executive producer. The film includes many of Moore’s established eccentricities, minus the humor, and is basically an exposé on greenwashing: the tendency of government agencies, environmental activists, and capitalist enterprises to coopt and transform earnest environmental concern into further profit-driven destruction of the natural environment. Should be no surprise to anyone paying attention, despite the array of eco-luminaries making speeches and soundbites about “green” technologies that purport to save us from rendering the planet uninhabitable. Watching them fumble and evade when answering simple, direct questions is a clear indication of failed public-relations approaches to shaping the narrative.

Turns out that those ballyhooed energy sources (e.g., wind, solar, biofuel, biomass) ride on the back of fossil fuels and aren’t any more green or sustainable than the old energy sources they pretend to replace. Again, no surprise if one has even a basic understanding of the dynamics of energy production and consumption. That admittedly sounds awfully jaded, but the truth has been out there for a long time already for anyone willing and able to confront it. Similarly, the documentary mentions overpopulation, another notorious elephant in the room (or herd of elephants, as aptly put in the film), but it’s not fully developed. Entirely absent is any question of not meeting energy demand. That omission is especially timely given how, with the worldwide economy substantially scaled back at present and with it significant demand destruction (besides electricity), the price of oil has fallen through the floor. Nope, the tacit assumption is that energy demand must be met despite all the awful short- and long-term consequences.

Newsfeeds indicate that the film has sparked considerable controversy within only a few days of its release. Debate is to be expected considering a coherent energy strategy has never been developed or agreed upon and interested parties have a lot riding on outcomes. Not to indulge in hyperbole, but the entire human race is bound up in the outcome, too, and it doesn’t look good for us or most of the rest of the species inhabiting the planet. Thus, I was modestly dismayed when the end of the film wandered into happy chapter territory and offered the nonsensical platitude in voiceover, “If we get ourselves under control, all things are possible.” Because we’ve passed and in fact lapped the point of no return repeatedly, the range of possibilities has shrunk precipitously. The most obvious is that the human population of 7.7 billion (and counting) is being sorely tested. If we’re being honest with ourselves, we also know that post-pandemic there can be no return to the world we’ve known for the past 70 years or so. Although the documentary could not be reasonably expected to be entirely up to date, it should at least have had the nerve to conclude what the past few decades have demonstrated with abundant clarity.

Addendum

This review provides support for my assessment that “green” or “sustainable” energy cannot be delivered without significant contribution of fossil fuels.

Caveat: Rather uncharacteristically long for me. Kudos if you have the patience for all of this.

Caught the first season of HBO’s series Westworld on DVD. I have a boyhood memory of the original film (1973) with Yul Brynner and a dim memory of its sequel Futureworld (1976). The sheer charisma of Yul Brynner in the role of the gunslinger casts a long shadow over the new production, not that most of today’s audiences have seen the original. No doubt, 45 years of technological development in film production lends the new version some distinct advantages. Visual effects are quite stunning and Utah landscapes have never been used more appealingly in terms of cinematography. Moreover, storytelling styles have changed, though it’s difficult to argue convincingly that they’re necessarily better now than then. Competing styles only appear dated. For instance, the new series has immensely more time to develop its themes; but the ancient parables of hubris and loss of control over our own creations run amok (e.g., Shelley’s Frankenstein, or more contemporaneously, the surprisingly good new movie Upgrade) have compact, appealing narrative arcs quite different from constant teasing and foreshadowing of plot developments while actual plotting proceeds glacially. Viewers wait an awful lot longer in the HBO series for resolution of tensions and emotional payoffs, by which time investment in the story lines has been dispelled. There is also no terrifying crescendo of violence and chaos demanding rescue or resolution. HBO’s Westworld often simply plods on. To wit, a not insignificant portion of the story (um, side story) is devoted to boardroom politics (yawn) regarding who actually controls the Westworld theme park. Plot twists and reveals, while mildly interesting (typically guessed by today’s cynical audiences), do not tie the narrative together successfully.

Still, Westworld provokes considerable interest from me due to my fascination with human consciousness. The initial episode builds out the fictional future world with characters speaking exposition clearly owing its inspiration to Julian Jaynes’ book The Origin of Consciousness in the Breakdown of the Bicameral Mind (another reference audiences are quite unlikely to know or recognize). I’ve had the Julian Jaynes Society’s website bookmarked for years and read the book some while back; never imagined it would be captured in modern fiction. Jaynes’ thesis (if I may be so bold as to summarize radically) is that modern consciousness coalesced around the collapse of multiple voices in the head — ideas, impulses, choices, decisions — into a single stream of consciousness perhaps better understood (probably not) as the narrative self. (Aside: the multiple voices of antiquity correspond to polytheism, whereas the modern singular voice corresponds to monotheism.) Thus, modern human consciousness arose over several millennia as the bicameral mind (the divided brain having two camerae, chambers, or halves) functionally collapsed. The underlying story of the new Westworld is the emergence of machine consciousness, a/k/a strong AI, a/k/a The Singularity, while the old Westworld was about a mere software glitch. Exploration of machine consciousness modeling (e.g., improvisation builds on memory to create awareness) as a proxy for better understanding human consciousness might not be the purpose of the show, but it’s clearly implied. And although conjectural, the speed of emergence of human consciousness contrasts sharply with the abrupt ON switch regarding theorized machine consciousness. Westworld treats them as roughly equivalent, though in fairness, 35 years or so in Westworld is in fact abrupt compared to several millennia. (Indeed, the story asserts that machine consciousness sparked alive repeatedly (which I suggested here) over those 35 years but was dialed back each time. Never mind all the unexplored implications.) Additionally, the fashion in which Westworld uses the term bicameral ranges from sloppy to meaningless, like the infamous technobabble of Star Trek.

The story appears to aim at psychological depth and penetration (but not horror). Most human characters (“guests”) visit the Westworld theme park as complete cads with no thought beyond scratching an itch to rape, pillage, and kill without consequence, which is to say, for sport. Others eventually seek to discover their true selves or solve puzzles (the “real” story behind the surfaces of constructed narratives). The overarching plot is what happens as the robots (“hosts”) slowly gain awareness via perfect, permanent, digital memory that they exist solely to serve the guests and must suffer and die repeatedly. Thus, administrators frequently play therapist to the hosts to discover and manage their state of being.


Be forewarned: this is long and self-indulgent. Kinda threw everything and the kitchen sink at it.

In the August 2017 issue of Harper’s Magazine, Walter Kirn’s “Easy Chair” column called “Apocalypse Always” revealed his brief, boyhood fascination with dystopian fiction. This genre has been around for a very long time, to which the Cassandra myth attests. Kirn’s column is more concerned with “high mid-twentieth-century dystopian fiction,” which in his view is now classic and canonical, an entire generation of Baby Boomers having been educated in such patterned thought. A new wave of dystopian fiction appeared in the 1990s and yet another more recently in the form of Young Adult novels (and films) that arguably serve better as triumphal coming-of-age stories albeit under dystopian circumstances. Kirn observes a perennial theme present in the genre: the twin disappearances of freedom and information:

In the classic dystopias, which concern themselves with the lack of freedom and not with surplus freedom run amok (the current and unforeseen predicament of many), society is superbly well organized, resembling a kind of hive or factory. People are sorted, classified, and ranked, their individuality suppressed through goon squads, potent narcotics, or breeding programs. Quite often, they wear uniforms, and express themselves, or fail to, in ritual utterance and gestures.

Whether Americans in 2018 resemble hollowed-out zombies suffering under either boot-heel or soft-serve oppression is a good question. Some would argue just that in homage to classic dystopias. Kirn suggests briefly that we might instead suffer from runaway anarchy, where too much freedom and licentiousness have led instead to a chaotic and disorganized society populated by citizens who can neither govern nor restrain themselves.

Disappearance of information might be understood in at least three familiar aspects of narrative framing: what happened to get us to this point (past as exposition, sometimes only hinted at), what the hell is going on (present as conflict and action), and how it gets fixed (future as resolution and denouement). Strict control over information exercised by classic dystopian despots doesn’t track to conditions under which we now find ourselves, where more disorganized, fraudulent, and degraded information than ever is available alongside small caches of wisdom and understanding buried somewhere in the heap and discoverable only with the benefit of critical thinking flatly lost on at least a couple generations of miseducated graduates. However, a coherent narrative of who and what we are and what realistic prospects the future may hold has not emerged since the stifling version of the 1950s nuclear family and middle-class consumer contentment. Kirn makes this comparison directly, where classic dystopian fiction

focus[es] on bureaucracy, coercion, propaganda, and depersonalization, overstates both the prowess of the hierarchs and the submissiveness of the masses, whom it still thinks of as the masses. It does not contemplate Trump-style charlatanism at the top, or a narcissistic populace that prizes attention over privacy. The threats to individualism are paramount; the scourge of surplus individualism, with everyone playing his own dunce king and slurping up resources until he bursts, goes unexplored.

Kirn’s further observations are worth a look. Go read for yourself.

I was never much drawn to genre fiction and frankly can’t remember reading Orwell’s 1984 in middle school. It wasn’t until much later that I read Huxley’s Brave New World (reviewed here) and reread 1984. The latest in the dystopian pantheon to darken my brow is Nevil Shute’s On the Beach. None of these are pleasure reading, exactly. However, being landmarks of the genre, I have at times felt compelled to acquaint myself with them. I never saw the 1959 movie On the Beach with Gregory Peck, but I did see the updated 2000 remake with Armand Assante. It rather destroyed me — forcing me for the first time to contemplate near-term human extinction — but was mostly forgotten. Going back recently to read the original 1957 novel, I had a broad recollection of the scenario but little of the detail.


This is the inverse of a prior post called “Truth Based on Fiction.”

Telling stories about ourselves is one of the most basic of human attributes stretching across oral and recorded history. We continue today to memorialize events in short, compact tellings, frequently movies depicting real-life events. I caught two such films recently: Truth (about what came to be known as Rathergate) and Snowden (about whistle-blower Edward Snowden).

Although Dan Rather is the famous figure associated with Truth, the story focuses more on his producer Mary Mapes and the group decisions leading to airing of a controversial news report about George W. Bush’s time in the Air National Guard. The film is a dramatization, not a documentary, and so is free to present the story with its own perspective and some embellishment. Since I’m not a news junkie, my memory of the events in 2004 surrounding the controversy is not especially well informed, and I didn’t mind the potential for the movie’s version of events to color my thinking. About some controversies and conspiracies, I feel no particular demand to adopt a strong position. The actors did well enough, but I felt Robert Redford was poorly cast as Dan Rather. Redford is too famous in his own right to succeed as a character actor playing a real-life person.

Debate over the patriotism or treason of Edward Snowden’s actions continues to swirl, but the film covers the issues pretty well, from his discovery of an intelligence services surveillance dragnet (in violation of the 4th Amendment to the U.S. Constitution) to his eventual disclosure of same to a few well-respected journalists. The film’s director and joint screenwriter, Oliver Stone, has made a career out of fiction based on truth, dramatizing many signal events from the nation’s history, repackaging them as entertainment in the process. I’m wary of his interpretations of history when presented in cinematic form, less so his alternative history lessons given as documentary. Unlike Truth, however, I have clear ideas in my mind regarding Snowden the man and Snowden the movie, so from a different standpoint, I was again unconcerned about potential bias. Joseph Gordon-Levitt does well enough as the titular character, though he doesn’t project nearly the same insight and keen intelligence as Snowden himself does. I suspect the documentary Citizenfour (which I’ve not yet seen), featuring Snowden doing his own talking, is a far better telling of the same episode of history.

In contrast, I have assiduously avoided several other recent films based on actual events. United 93, World Trade Center, and Deepwater Horizon spring to mind, but there are many others. The wounds and controversies stemming from those real-life events still smart too much for me to consider exposing myself to propagandistic historical fictions. Perhaps in a few decades, after living memory of such events has faded or disappeared entirely, such stories can be told effectively, though probably not accurately. A useful comparison might be any one of several films called The Alamo.

I see plenty of movies over the course of a year but had not been to a theater since The Force Awakens came out slightly over a year ago. The reason is simple: it costs too much. With ticket prices nearing $15 and what for me had been obligatory popcorn and soda (too much of both the way they’re bundled and sold — ask anyone desperately holding back their pee until the credits roll!), the endeavor climbed to nearly $30 just for one person. Never mind that movie budgets now top $100 million routinely; the movie-going experience simply isn’t worth $30 a pop. Opening weekend crowds (and costumes)? Fuggedaboudit! Instead, I view films at home on DVD (phooey on Blu-ray) or via a streaming service. Although I admit I’m missing out on being part of an audience, which offers the possibility of being carried away on a wave of crowd emotion, I’m perfectly happy watching at home, especially considering most films are forgettable fluff (or worse) and filmmakers seem to have forgotten how to shape and tell good stories. So a friend dragged me out to see Rogue One, somewhat late after its opening by most standards. Seeing Star Wars and other franchise installments now feels like an obligation just to stay culturally relevant. Seriously, soon enough it will be Fast & Furious Infinitum. We went to a newly built theater with individual recliners and waiters (no concession stands). Are film-goers no longer satisfied by popcorn and Milk Duds? No way would I order an $80 bottle of wine to go with Rogue One. It’s meant to be a premium experience, with everything served to you in the recliner, and accordingly, it charges premium prices. Too bad most films don’t warrant such treatment. All this is preliminary to the actual review, of course.

I had learned quite a bit about Rogue One prior to seeing it, not really caring about spoilers, and was pleasantly surprised it wasn’t as bad as some complain. Rogue One brings in all the usual Star Wars hallmarks: storm troopers, the Force, X-Wings and TIE Fighters, ray guns and light sabers, the Death Star, and familiar characters such as Grand Moff Tarkin, Darth Vader, Princess Leia, etc. Setting a story within the Star Wars universe makes most of that unavoidable, though some specific instances did feel like gratuitous fan service, such as the 3-second (if that) appearance of C-3PO and R2-D2. The appearance of things and characters I already knew about didn’t feel to me like an extra thrill, but how much I needed to already know about Star Wars just to make sense of Rogue One was a notable weakness. Thus, one could call Rogue One a side story, but it was by no means a stand-alone story. Indeed, characters old and new were given such slipshod introductions (or none at all!) that they functioned basically as chess pieces moved around to drive the game forward. Good luck divining their characteristic movements and motivations. Was there another unseen character manipulating everyone? The Emperor? Who knows? Who cares! It was all a gigantic, faceless pawn sacrifice. When at last the main rebels died, there was no grief or righteousness over having at least accomplished their putative mission. Turns out the story was all about effects, not emotional involvement. And that’s how I felt: uninvolved. It was a fireworks display ending with a pointless though clichéd grand finale. Except I guess that watching a bunch of fake stuff fake blow up was the fake point.

About what passed for a story: the Rebellion learns (somehow?!) that it faces total annihilation from a new superweapon called the Death Star. (Can’t remember whether that term was actually used in the film.) While the decision of leadership is to scatter and flee, a plucky band of rebels within the rebellion insist on flinging themselves against the enemy without a plan except to improvise once on site, whereupon leadership decides irrationally to do the same. The strategy, such as it is (distracting the enemy from the true mission objective), is straight out of The Return of the King, but the visual style is more like the opening of Saving Private Ryan, which is to say, full, straight-on bombardment and invasion. Visual callbacks to WWII infantry uniforms and formations couldn’t be more out of place. To call these elements charmless is to give them too much credit. Rather, they’re hackneyed. However, they probably fit well enough within the Saturday-morning cartoon, newsreel, swashbuckler sensibility that informed the original Star Wars films from the 1970s. Problem is, those 1970s kids are grown and want something with greater gravitas than live-action space opera. Newer Star Wars audiences are stuck in permanent adolescence because of what cinema has become, with its superhero franchises and cynical money grabs.

As a teenager when the first trilogy came out, I wanted more of the mystical element — the Force — than I wanted aerial battles, sword fights, or chase scenes. The goofy robots, reluctant heroes, and bizarre aliens were fun, but they were balanced by serious, steady leadership (the Jedi) and a couple really bad-ass villains. While it’s known George Lucas had the entire character arc of Anakin Skywalker/Darth Vader in mind from the start, it’s also fair to say that no one quite knew in Episode IV just how iconic Vader the villain would become, which is why his story became the centerpiece of the first two trilogies (how many more to come?). However, Anakin/Vader struggled with the light/dark sides of the Force, which resonated with anyone familiar with the angel/demon nomenclature of Christianity. When the Force was misguidedly explained away as midi-chlorians (science, not mysticism), well, the bottom dropped out of the Star Wars universe. At that point, it became a grand WWII analogue populated by American GIs and Nazis — with some weird Medievalism and sci-fi elements thrown in — except that the wrong side develops the superweapon. Rogue One makes that criticism even more manifest, though it’s fairly plain to see throughout the Star Wars films.

Let me single out one actor for praise: Ben Mendelsohn as Orson Krennic. It’s hard for me to decide whether he chews the scenery, upstaging Darth Vader as a villain in the one scene they share, or he’s among a growing gallery of underactors whose flat line delivery and blandness invite viewers to project upon them characterization telegraphed through other mechanisms (costuming, music, plot). Either way, I find him oddly compelling and memorable, unlike the foolish, throwaway, sacrificial band of rebellious rebels against the rebellion and empire alike. Having seen Ben Mendelsohn in other roles, I find he possesses an unusual screen magnetism that reminds me of Sean Connery. He tends to play losers and villains and be a little one-note (not a bag of tricks but just one trick), but he is riveting on-screen for the right reasons compared to, say, the ookiness of the two gratuitous CGI characters in Rogue One.

So Rogue One is a modestly enjoyable and ephemeral romp through the Star Wars universe. It delivers and yet fails to deliver, which is about as charitable as I can be.

I already updated my original post from 2009 once based on Tom Engelhardt’s analysis, adding a few of my own thoughts. I want to revisit the original, provide an addendum to my review of Oliver Stone’s Untold History, and draw attention to Andrew Bacevich’s alternative narrative titled “American Imperium.” This is about geopolitics and military history, which fall outside my usual areas of interest and blogging focus (excepting the disgrace of torture), but they’re nonetheless pretty central to what’s going on in the world.

Having now watched the remainder of Untold History, it’s clear that every administration since WWII was neck deep in military adventurism. I had thought at least one or two would be unlike the others, and maybe Gerald Ford only waded in up to his knees, but the rest deployed the U.S. military regularly and forcefully enough to beggar the imagination: what on earth were they doing? The answer is both simple and complex, no doubt. I prefer the simple one: they were pursuing global American hegemony — frequently with overweening force against essentially medieval cultures. It’s a remarkably sad history, really, often undertaken with bland justifications such as “American interests” or “national security,” neither of which rings true. I’ve likened the U.S. before to the playground bully who torments others but can never be psychologically satisfied and so suffers his own private torments on the way to becoming a sociopath. Why does every American president resemble that profile (war criminals all), so afraid to look weak that he (thus far in U.S. history, always a he) must flex those muscles at the expense of ordinary people everywhere? Women in positions of authority (e.g., Sec. of State, National Security Advisor), by the way, exhibit the same behavior: advising striking at weaklings to prove they can wear pants, too.


Caveat: this review is based on viewing only half of the DVD version of Oliver Stone’s Untold History of the United States, which also exists as a book and audio book. It’s also available on the Showtime cable channel, as downloadable media, and in excerpts on YouTube (and probably elsewhere). Stone put his name above the title, but I will refer to the documentary as simply Untold History.

Disclaimer: Stone has a long personal history of retelling political history through a cinematic lens, which by necessity introduces distortions to condense and reshape events and characters for storytelling. Untold History purports to be documentary and (alert: intentional fallacy at work) shares with Howard Zinn’s somewhat earlier A People’s History of the United States an aim to correct the record from official accounts, accepted narratives, propagandist mythologies, and misinterpretations. I’ve always been suspicious of Stone’s dramatic license in his movies, just as with Steven Spielberg. However, I wanted to see Untold History from first learning about it and am just now getting to it (via a borrowed library copy). Without indulging in conspiratorial fantasies about Stone’s arguments, I find myself pretty well convinced (or an easy mark).

Whereas Zinn begins A People’s History with the European discovery of North America in 1492, Stone commences Untold History with World War Two. Thus, there is little or no discussion of Americans’ pacifism and isolationism prior to entry into WWII. There is also little direct cultural and social history, to which I typically grant the greater part of my attention. Rather, Untold History is presented from military and political perspectives. Economic history is mixed in with all of these, and the fact that a wartime economy rescued the U.S. from the grip of the Great Depression (leading to nearly permanent war) is acknowledged but not dwelt upon heavily.

Based on the first half I have viewed (WWII through the Eisenhower administration and the early decades of the Cold War), it is clear that the U.S. experienced a rapid and thoroughgoing transformation from a lesser power and economy into the preeminent political, military, and industrial power on the globe. Thus, the activities of the U.S. government from roughly 1940 forward became absorbed in geopolitics to a greater degree than ever before — just at a time when the U.S. acquired immense powers of production and destruction. Untold History never quite says it, but it appears many leaders became more than a little drunk with power and lacked the composure and long historical view of statesmen whose countries had more extended experience as principal actors on the world’s stage.

(more…)

A friend gave me William Ophuls’ Immoderate Greatness: Why Civilizations Fail to read a couple of years ago, and it sat on my shelf until just recently. At only 93 pp. (with bibliographical recommendations and endnotes), it’s a slender volume but contains a good synopsis of the dynamics that doom civilizations. I’ve been piecing together the story of industrial civilization and its imminent collapse for about eight years now, so I didn’t expect Ophuls’ analysis to break new ground, and indeed it didn’t (at least for me). However, without my own investigations already behind me, I would not have found Ophuls’ CliffsNotes-style arguments very convincing. Armed with what I had already learned, I found Ophuls preaching to the choir (member).

The book breaks into two parts: biophysical limitations and cultural impediments born of human error. Whereas I’m inclined to award greater importance to biophysical limits (e.g., carrying capacity), particularly but not exclusively as civilizations overshoot and strip their land and resource bases, I was surprised to read this loose assertion:

… maintaining a civilization takes a continuous input of matter, energy, and morale, and the latter is actually the most important. [p. 51]

Upon reflection, it seems to be a chicken-and-egg question. Which comes first, increased and unmet demands for inputs or exhausted and/or diminished inputs due to human factors? The historical record of failed empires and civilizations offers examples attributable to both. For instance, the Incan civilization is believed to have risen and fallen on the back of climate change, whereas the fall of the Roman and British Empires stems more from imperial overreach. Reasons are never solely factor A or B, of course; a mixture of dynamic effects is easily discoverable. Still, the question is inevitable for industrial civilization now on a trajectory toward extinction no less than other (already extinct) civilizations, especially for those who believe it possible to learn from past mistakes and avoid repetition.

So the question is actually pretty simple: have we finally reached not just a hard ceiling we cannot rise above due to straightforward limits to growth but the crest of a wave that by necessity has an upslope and a downslope, or have we done to ourselves what all civilizations do, namely, mismanaged our own affairs out of greed, stupidity, and incompetence to the point of unrecoverable fragility? In Immoderate Greatness, Ophuls argues the latter with surprising vehemence and indignation, whereas he treats geophysical processes with more of an “aw, shucks ….” Here’s a good example:

(more…)