Posts Tagged ‘Collapse’

I don’t have the patience or expertise to prepare and offer a detailed political analysis such as those I sometimes (not very often) read on other blogs. Besides, once the comments start filling up at those sites, every possible permutation is trotted out, muddying the initial or preferred interpretation with alternatives that make at least as much sense. They’re interesting brainstorming sessions, but I have to wonder what is accomplished.

My own back-of-the-envelope analysis is much simpler and probably no closer to (or farther from) being correct, what with everything being open to dispute. So the new POTUS was born in 1946, which puts the bulk of his boyhood in the 1950s, overlapping with the Eisenhower Administration. That period has lots of attributes, but the most significant (IMO), which would impact an adolescent, was the U.S. economy launching into the stratosphere, largely on the back of the manufacturing sector (e.g., automobiles, airplanes, TVs, etc.), and creating the American middle class. The interstate highway system also dates from that decade. Secondarily, there was a strong but misplaced sense of American moral leadership (one might also say authority or superiority), since we took (too much) credit for winning WWII.

However, it wasn’t great for everyone. Racism, misogyny, and other forms of bigotry were open and virulent. Still, if one was lucky enough to be a white, middle-class male, things were arguably about as good as they would get, which many remember rather fondly, whether through rose-colored glasses or otherwise. POTUS as a boy wasn’t middle class, but the culture around him supported a worldview that he embodies even now. He’s also never been an industrialist, but he is a real estate developer (some would say slumlord) and media figure, and his models are taken from the 1950s.

The decade of my boyhood was the 1970s, which spanned the Nixon, Ford, and Carter Administrations. Everyone could sense the wheels were already coming off the bus, and white male entitlement was far diminished from previous decades. The Rust Belt was already a thing. Like children from the 1950s forward, however, I spent a lot of time in front of the TV. Much of it was goofy fun such as Gilligan’s Island, The Brady Bunch, and interestingly enough, Happy Days. It was innocent stuff. What are the chances that, as a boy plopped in front of the TV, POTUS would have seen the show below (excerpted) and taken special notice, considering that the character shares his surname?

Snopes confirms that this is a real episode from the TV show Trackdown. Not nearly as innocent as the shows I watched. The coincidences that the character is a con man, promises to build a wall, and claims to be the only person who can save the town are eerie, to say the least. Could that TV show be lodged in the back of POTUS’ brain, along with so many other boyhood memories, misremembered and revised the way memory tends to do?

Some have said that the great economic expansion of the 1950s and 60s was an anomaly. A constellation of conditions configured to produce an historical effect, a Golden Era by some reckonings, that cannot be repeated. We simply cannot return to an industrial or manufacturing economy that had once (arguably) made America great. And besides, the attempt would accelerate the collapse of the ecosystem, which is already in free fall. Yet that appears to be the intention of POTUS, whose early regression to childhood is a threat to us all.

Anthropologists, pundits, armchair cultural critics (like me), and others sometimes offer an aspect or characteristic, usually singular, that separates the human species from other animals. (Note: humans are animals, not the crowning creation of god in his own image, the dogma of major religions.) Typical singular aspects include tool use (very early on, fire), language, agriculture, self-awareness (consciousness), and intelligence, the last of which includes especially the ability to conceptualize time and thus to remember and plan ahead. The most interesting candidate suggested to me is our ability to kill from a distance. Without going into a list of things we don’t think we share with other species but surprisingly do, it interests me that no other species possesses the ability to kill at a distance (someone will undoubtedly prove me wrong on this).

Two phrases spring to mind: nature is red in tooth and claw (Tennyson) and human life is nasty, brutish, and short (Hobbes). Both encapsulate what it means to have to kill to eat, which is hardly unique to animals. All sorts of plants, insects, and microorganisms embed themselves in hosts, sometimes killing the host and themselves. Symbiotic relationships also exist. The instance that interests me, though, is the act of killing in the animal kingdom that requires putting one’s own body at risk in life-or-death attack. Examples falling short of killing abound, such as intimidation to establish hierarchy, but to eat, an animal must kill its prey.

Having watched my share of historical fiction (pre-1800, say, but especially sword-and-sandal and medieval epics) on the TeeVee and at the cinema, I find that the dramatic appeal of warring armies slamming into each other never seems to get old. Fighting is hand-to-hand or sword-to-sword, which are tantamount to the same. Archers’ arrows, projectiles launched from catapults and trebuchets, thrown knives, spears, and axes, and pouring boiling oil over parapets are killing from a relatively short distance, but the action eventually ends up being very close. The warrior code in fighting cultures honors the willingness to put oneself in harm’s way, to risk one’s own body. Leaders often exhibit mutual respect and may even share some intimacy. War may not be directly about eating, since humans are not cannibals under most circumstances; rather, it’s usually about control of resources, so secondarily about eating by amassing power. Those historical dramas often depict victors celebrating by enjoying lavish feasts.

Modern examples of warfare and killing from a distance make raining down death from above a bureaucratic action undertaken with little or no personal risk. Artillery, carpet bombing from 20,000 feet, drone strikes (controlled from the comfort of some computer lab in the Utah desert), and nuclear bombs are the obvious examples. No honorable warrior code attaches to such killing. Indeed, the chain of command separates the execution of kill orders from moral responsibility — probably a necessary disconnect when large numbers of casualties (collateral damage, if one prefers the euphemism) can be expected. Only war criminals, either high on killing or banally impervious to empathy and compassion, would dispatch hundreds of thousands at a time.

If killing from a distance is in most cases about proximity or lack thereof, one further example is worth mentioning: killing across time. While most don’t really conceptualize the space-time continuum as interconnected, the prospect of choices made today manifesting in megadeath in the foreseeable future is precisely the sort of bureaucratized killing from a distance that should be recognized and forestalled. Yet despite our supposed intellectual superiority over other species, we cannot avoid waging war, real and rhetorical, to control resources and narratives that enable us to eat. Eating the future would be akin to consuming seed corn, but that metaphor is not apt. Better perhaps to say that we’re killing the host. We’re embedded in the world, as indeed is everything we know to be alive, and rely upon the profundity of the biosphere for survival. Although the frequent charge is that humanity is a parasite or has become a cancer on the world, that tired assessment, while more accurate than not, is a little on the nose. A more charitable view is that humanity, as the apex predator, has expanded its habitat to include the entire biosphere, killing to eat, and is slowly consuming and transforming it into a place uninhabitable by us, just as a yeast culture consumes its medium and grows to fill the space before dying all at once. So the irony or Pyrrhic victory is that while we may fatten ourselves (well, some of us) in the short term, we have also created conditions leading to our own doom. Compared to other species whose time on Earth lasted tens of millions of years, human life on Earth turns out to be exactly what Hobbes said: nasty, brutish, and short.

In what has become a predictable status quo, President Obama recently renewed our official state of emergency with respect to the so-called War on Terror. It’s far too late to declare a new normal; we’ve been in this holding pattern for 15 years now. The article linked above provides this useful context:

There are now 32 states of national emergency pending in the United States, with the oldest being a 1979 emergency declared by President Jimmy Carter to impose sanctions during the Iran hostage crisis. Most are used to impose economic sanctions — mostly as a formality, because Congress requires it under the International Emergency Economic Powers Act.

In his term in office, Obama has declared 13 new emergencies, continued 21 declared by his predecessors and revoked just two, which imposed sanctions on Liberia and Russia.

Pro forma renewal of multiple states of national emergency is comparable to the 55-year-old U.S. embargo against Cuba, due for reauthorization next month, though indications are that the embargo may finally be relaxed or deauthorized. Both are examples of miserably failed policy, but they confer a semblance of power on the executive branch. Everyone knows by now that no one relinquishes power willingly, so Obama, like chief executives before him, keeps on keeping on ad nauseam.

Considering Obama’s credentials as a Constitutional scholar, rare among U.S. presidents, one might expect him to weigh his options with greater circumspection and with an eye toward restoring suspended civil liberties. However, he has shown little interest in doing so (as far as I know). With the election only a couple of months away, the U.S. appears to be in a position similar to Germany in 1932 — ready and willing to elect a despot (take your pick …) and continue its slide into fascism. Can’t even imagine avoiding that outcome now.

The surprising number of ongoing emergencies points me to James Howard Kunstler and his book The Long Emergency (2005). Though I haven’t read the book (I’m a failed doomer, I suppose), my understanding is that his prediction of a looming and lingering emergency is based on two intertwined factors currently playing out in geopolitics: peak oil and global warming. (“Climate change” is now preferred over “global warming.”) Those two dire threats (and the California drought) have faded somewhat from the headlines, partially due to fatigue, replaced primarily by terrorism and economic stresses, but the dangers never went away. Melting icecaps and glaciers are probably the clearest incontrovertible indications of anthropogenic global warming, which is poised to trigger nonlinear climate change and hasten the Sixth Extinction. We don’t know when, precisely, though time is growing short. Similarly, reports on energy production and consumption are subject to considerable falsification in the public sphere, making it impossible to know just how close in time we are to a new energy crisis. That inevitability has also been the target of a disinformation campaign, but even a rudimentary understanding of scientific principles is sufficient to enable clear thinkers to penetrate the fog.

I have no plans to return to doom blogging with any vigor. One emergency stacked upon the next, ready to collapse in a cascade of woe, has defeated me, and I have zero expectation that any real, meaningful response can be formulated and executed, especially while we are distracted with terrorism and creeping fascism.

An enduring trope of science fiction is the naming of newly imagined gadgets and technologies (often called technobabble with a mixture of humor and derision), as well as the naming of segments of human and alien societies. In practice, that means renaming already familiar things to add a quasi-futuristic gleam, and it’s a challenge faced by every story that adopts an alternative or futuristic setting: describing the operating rules of the fictional world but with reference to recognizable human characteristics and institutions. A variety of recent Young Adult (YA) fiction has indulged in this naming and renaming, some of which has been made into movies, mostly dystopic in tone, e.g., the Hunger Games tetralogy, the Twilight saga, the Harry Potter series, the Maze Runner, and the Divergent trilogy. (I cite these because, as multipart series, they are stronger cultural touchstones, e.g., Star Wars, than similar once-and-done adult cinematic dystopias, e.g., Interstellar and Elysium. Star Trek is a separate case, considering how it has devolved after being rebooted from its utopian though militaristic origins into a pointless series of action thrillers set in space.) Some exposition rises to the level of lore but is mostly mere scene-setting removed slightly from our own reality. Similar naming schemes are used in cinematic universes born of comic books, especially character names, powers, and origins. Because comic book source material is extensive, almost all of it becomes lore, which is enjoyed by longtime initiates, inducted as children into the alternate universes created by the writers and illustrators, but mildly irritating to adult moviegoers like me.

History also has names for eras and events sufficiently far back in time for hindsight to provide a clear vantage point. In the U.S., we had the Colonial Era, the Revolutionary Period, the Frontier Era and Wild West, the Industrial/Mechanical Age, Modernism, and Postmodernism, to name a few but by no means all. Postmodernism is already roughly 40 years old, yet we have not named the era in which we now live. Indeed, because we’re the proverbial fish inside the fishbowl, unable to recognize the water in which we swim, the contemporary moment may have no need of naming, now or at any given time. That task awaits those who follow. We have, however, given names to the succession of generations following the Baby Boom. How well their signature characteristics fit their members is the subject of considerable debate.

As regular readers of this blog already know, I sense that we’re on the cusp of something quite remarkable, most likely a hard, discontinuous break from our recent past. Being one of the fish in the bowl, I probably possess no better understanding of our current phase of history than the next person. Still, if I had to choose one word to describe the moment, it would be dissolution. My 4-part blog post about dissolving reality is one attempt to provide an outline. A much older post called aged institutions considers the time-limited effectiveness of products of human social organization. The grand question of our time might be whether we are on the verge of breaking apart or simply transitioning into something new — will it be catastrophe or progress?

News this past week of Britain’s exit from the European Union may be only one example of break-up vs. unity, but the drive toward secession and separatism (tribal and ideological, typically based on bogus and xenophobic identity groups constantly thrown in our faces) has been gaining momentum even in the face of economic globalization (collectivism). Scotland very nearly seceded from the United Kingdom in 2014; Quebec has had multiple referenda about seceding from Canada, none yet successful; and Vermont, Texas, and California have all flirted with secession from the United States. No doubt some would argue that such examples of dissolution, actual or prospective, are actually transitional, meaning progressive. And perhaps they do in fact fulfill the need for smaller, finer, localized levels of social organization that many have argued are precisely what an era of anticipated resource scarcity demands. Whether what actually manifests is catastrophe (as I expect it will be) is, of course, something history and future historians will eventually name.

The question comes up with some regularity: “Where were you when …?” What goes in place of the ellipsis has changed with the generations. For my parents, it was the (first) Kennedy assassination (1963). For my siblings and chronological peers, it was first the Three Mile Island accident (1979) and then the Challenger disaster (1986). For almost everyone since, it’s been the September 11 attacks (2001), though a generation lacking memory of those events is now entering adulthood. These examples are admittedly taken from mainstream U.S. culture. If one lives elsewhere, it might be the Mexico City earthquake (1985), the Chernobyl disaster (1986), the Indonesian tsunami (2004), the Haitian earthquake (2010), the Fukushima disaster (2011), or any number of other candidates populating the calendar. Even within the U.S., other more local events might take on greater significance, such as Hurricane Katrina (2005) along the Gulf Coast or Hurricane Iniki (1992) in Hawaii, the latter of which gives September 11 a very different meaning for those who suffered through it.

What these events all have in common is partially a loss of innocence (particularly the man-made disasters) and a feeling of suspended animation while events are sorted out and news is processed. I remember with clarity how the TV news went into full disaster mode for the Challenger disaster, which proved to be the ridiculous template for later events, including 9/11. Most of the coverage was denial of the obvious and arrant speculation, but the event itself was captivating enough that journalists escaped the wholesale condemnation they plainly deserved. The “where were you?” question is usually answered with the moment one became aware of some signal event, such as “I was on the bus” or “I was eating dinner.” News vectors changed dramatically from 1986 to 2001, as two relatively arbitrary points of comparison within my lifetime. Being jarred out of complacency and perceiving the world suddenly as a rather dangerous place hardly expresses the weird wait-and-see response most of us experience in the wake of disaster.

Hurricanes typically come with a narrow warning, but other events strike without clear expectation — except perhaps in terms of their inevitability. That inevitability informs expectations of further earthquakes (e.g., San Andreas and New Madrid faults) and volcanic eruptions (e.g., the Yellowstone supervolcano), though the predictive margin of error can be hundreds or even thousands of years. My more immediate concern is with avoidable man-made disasters that are lined up to fall like a series of dominoes. As with natural disasters, we’re all basically sitting ducks, completely vulnerable to whatever armed mayhem may arise. But rather than enter into suspended animation in the wake of events, current political, financial, and ecological conditions find me metaphorically ducking for cover in expectation of the inevitable blow(s). Frankly, I’ve been expecting political and financial crack-ups for years, and current events demonstrate extremely heightened risk. (Everyone seems to be asking which will be worse: a Trump or Clinton presidency? No one believes either candidate can guide us successfully through the labyrinth.) A tandem event (highly likely, in my view) could easily trigger a crisis of significant magnitude, given the combination of violent hyperbole and thin operational tolerance for business as usual. I surmise that anyone who offers the line “may you live in interesting times” has a poor understanding of what’s truly in store for us. What happens with full-on industrial collapse is even harder to contemplate.

Today is the 10-year anniversary of the opening of this blog. As a result, there is a pretty sizeable backblog should anyone decide to wade in. As mentioned in my first post, I only opened this blog to get posting privileges at a group blog I admired because it functioned more like a discussion than a broadcast. The group blog died of attrition years ago, yet here I am 10 years later still writing my personal blog (which isn’t really about me).

Social media lives and dies by the numbers, and mine are deplorable. Annual traffic has ranged from about 6,800 to about 12,500 hits, much of which I’m convinced is mere background noise and bot traffic. Cumulative hits number about 90,140, and unique visitors are about 19,350, neither of which is anything to crow about for a blog of this duration. My subscriber count continues to climb pointlessly, now resting at 745. However, I judge I might have only a half dozen regular readers and perhaps half again as many commentators. I’ve never earned a cent for my effort, nor am I likely to ever put up a Patreon link or similar goad for donations. All of which only demonstrates that almost no one cares what I have to write about. C’est la vie. I don’t write for that purpose and frankly wouldn’t know what to write about if I were trying to drive numbers.

So if you have read my blog, what are some of the things you might have gathered from me? Here’s an incomplete synopsis:

  • Torture is unspeakably bad. History is full of devices, methodologies, and torturers, but we learned sometime in the course of two 20th-century world wars that nothing justifies it. Nevertheless, it continues to occur with surprising relish, and those who still torture (or want to) are criminally insane.
  • Skyscrapers are awesomely tall examples of technical brilliance, exuberance, audacity, and hubris. Better expressions of techno-utopian, look-mom-no-hands, self-defeating narcissism can scarcely be found. Yet they continue to be built at a feverish pace. The 2008 financial collapse stalled and/or doomed a few projects, but we’re back to game on.
  • Classical music, despite record budgets for performing ensembles, has lost its bid for anything resembling cultural and artistic relevance by turning itself into a museum (performing primarily works of long-dead composers) and abandoning emotional expression in favor of technical perfection, which is probably an accurate embodiment of the spirit of the times. There is arguably not a single living composer who has become a household name since Aaron Copland, who died in 1990 but was really well-known in the 1940s and 50s.
  • We’re doomed — not in any routine sense of the word having to do with individual mortality but in the sense of Near-Term (Human) Extinction (NTE). The idea is not widely accepted in the least, and the arguments are too lengthy to repeat (and unlikely to convince). However, for those few able to decipher it, the writing is on the wall.
  • American culture is a constantly moving target, difficult to define and describe, but its principal features are only getting uglier as time wears on. Resurgent racism, nationalism, and misogyny make clear that while some strides have been made, these attitudes were only driven underground for a while. Similarly, colonialism never really died but morphed into a new version (globalization) that escapes criticism from the masses, because, well, goodies.
  • Human consciousness — another moving target — is cratering (again) after 3,000–5,000 years. We have become hollow men, play actors, projecting false consciousness without core identity or meaning. This cannot be sensed or assessed easily from the first-person perspective.
  • Electronic media makes us tools. The gleaming attractions of sterile perfection and pseudo-sociability have hoodwinked most of the public into relinquishing privacy and intellectual autonomy in exchange for the equivalent of Huxley’s soma. This also cannot be sensed or assessed easily from the first-person perspective.
  • Electoral politics is a game played by the oligarchy for chumps. Although the end results are not always foreseeable (Jeb!), the narrow range of options voters are given (lesser of evils, the devil you know …) guarantees that fundamental change in our dysfunctional style of government will not occur without first burning the house down. After a long period of abstention, I voted in the last few elections, but my heart isn’t really in it.
  • Cinema’s infatuation with superheroes and bankable franchises (large overlap there) signals that, like other institutions mentioned above, it has grown aged and sclerotic. Despite large budgets and impressive receipts (the former often over $100 million and the latter now in the billions for blockbusters) and considerable technical prowess, cinema has lost its ability to be anything more than popcorn entertainment for adolescent fanboys (of all ages).

This is admittedly a pretty sour list. Positive, worthwhile manifestations of the human experience are still out there, but they tend to be private, modest, and infrequent. I still enjoy a successful meal cooked in my own kitchen. I still train for and race in triathlons. I still perform music. I still make new friends. But each of these examples is also marred by corruptions that penetrate everything we do. Perhaps it’s always been so, and as I, too, age, I become increasingly aware of inescapable distortions that can no longer be overcome with innocence, ambition, energy, and doublethink. My plan is to continue writing the blog until it feels like a burden, at which point I’ll stop. But for now, there’s too much to think and write about, albeit at my own leisurely pace.

I found a curious blog post called “Stupid Things People Do When Their Society Breaks Down” at a website called alt-market.com, which has a subheading that reads “Sovereignty • Integrity • Prosperity.” I haven’t delved far at all into the site, but it appears to offer alternative news and advice for preppers. The blog post uses the terms liberty activists and preparedness-minded, the first of which I found a little self-congratulatory. The existence of anarchist movements, including The Liberty Movement (mentioned in the comments at the site), has been known to me for some time, but my personal response to the prospect (indeed, inevitability) of collapse does not fit with theirs. Quoted below are the introductory paragraph, headings (seven stupid things referred to in the title), and truncated blurbs behind each (apologies for the long quote). My comments follow.

A frequent mistake that many people make when considering the concept of social or economic collapse is to imagine how people and groups will behave tomorrow based on how people behave today. It is, though, extremely difficult to predict human behavior in the face of terminal chaos. What we might expect, or what Hollywood fantasy might showcase for entertainment purposes, may not be what actually happens when society breaks down.

They Do Nothing. It’s sad to say, but the majority of people, regardless of the time or place in history, have a bad habit of ignoring the obvious.

They Sabotage Themselves With Paranoia. Even in the early stages of a social breakdown when infrastructure is still operational, paranoia among individuals and groups can spread like a poison.

They Become Shaky And Unreliable When The Going Gets Tough. This is why early organization is so important; it gives you time to learn the limitations and failings of the people around you before the SHTF.

They Become Hotheads And Tyrants. On the other side of the coin, there are those individuals who believe that if they can control everything and everyone in their vicinity then this will somehow mitigate the chaos of the world around them. They are people who secretly harbor fantasies of being kings during collapse.

They Become Political Extremists. Throughout most modern collapses, two politically extreme ideologies tend to bubble to the surface — communism and fascism. Both come from the same root psychosis, the psychosis of collectivism.

They Become Religious Zealots. Zealotry is essentially fanaticism to the point of complete moral ambiguity. Everyone who does not believe the way the zealot believes is the “other,” and the other is an enemy that must be annihilated.

They Abandon Their Moral Compass. Morally relative people when discovered are usually the first to be routed out or the first to die in survival situations because they cannot be trusted. No one wants to cooperate with them except perhaps other morally relative people.

Despite my basic disagreement that it’s possible to prepare effectively anymore for industrial collapse (or indeed that survival is necessarily a desirable thing in a collapse scenario), the advice seems to me pretty solid, given the caveat that it’s “extremely difficult to predict human behavior in the face of terminal chaos.” However, they’re all negative lessons. One can certainly learn from the mistakes of history and attempt to avoid repeating them. (We have a predictably poor track record of learning from historical mistakes.) It may well be hindsight bias to suppose that what looks perfectly clear in past examples can serve as a template of best-laid plans for both the process and aftermath of what may well be (by the article’s own admission) a protracted phase of social unrest and breakdown.

That said, let me argue just one thing, namely, why it may not be stupid after all (as the article opines rather flatly) to do nothing in preparation for rather foreseeable difficulties. Long answer short, it simply won’t matter. Whatever the precipitating event or process, the collapse of industrial civilization, unlike previous civilizational collapses, will be global. Moreover, it will be accompanied by ecological collapse and a global extinction event (a process, really) on par with at least five previous mass extinctions. The world will thus be wrecked for human habitation, granting survivors only the shortest additional term over those who perish at the outset. This is before one takes into account climate change (already underway but liable to become abrupt and nonlinear at any time) and the inevitable irradiation of the planet when 400+ nuclear sites go critical.

It’s not unusual for me to be accused of a convenient fatalism, of doing nothing because the alternative (doing something) is too difficult. That accusation sticks, of course; I won’t dispute it. However, my reading of trends guarantees the impossibility of stalling, much less reversing, our current trajectory and further suggests that the window of opportunity closed approximately forty years ago during the oil crisis and ecology movement of the 1970s. I would certainly throw my weight, influence, and effort (which are for all practical purposes nil) behind doing what is still possible to undo the worst instances of human corruption and despoliation. In addition, it seems to me worthwhile to map out what it would mean to meet our collective doom with grace, integrity, and yes, grim determination. That’s not doing nothing, but I’ve seen remarkably little discussion of those possible responses. What I see plenty of instead is a combination of bunker mentality and irrational desire for righteous punishment of perpetual enemies as we each cross death’s door. Both are desperate last hurrahs, final expressions of human frailty in the face of intractable and unfathomable loss. These, too, are the false promises of the last crop of presidential hopefuls, who ought to know quite well that presiding over collapse might just be the worst possible vantage point, possessed of the full power of the state yet unable to overcome the force of momentum created by our own history.

I remember that sinking feeling when the Deepwater Horizon oil well blew out in April 2010 and gushed oil into the Gulf of Mexico for 87 days at an estimated rate of 62,000 barrels per day (9,900 m³/d) until it was reportedly capped (but may not have been fully contained). That feeling was more intense than the disgust I felt at discovering the existence of the Great Pacific Garbage Patch (and subsequently others in the Atlantic and Indian Oceans). For reasons that make no particular sense, slo-mo ecological disasters in the oceans didn’t sicken me as much as high-speed despoliation of the Gulf. More recently, I’ve been at a loss, unable to process things, actually, at two new high-speed calamities: the contaminated tap water flowing from public waterworks in Flint, MI, and the methane leak from an underground well in the Porter Ranch neighborhood of Los Angeles, CA (no links provided, search for yourself). Whereas the first two examples turned my stomach at the mere knowledge, the second two are quite literally sickening people.

These examples could be part of a daily diet of stomach-churning news if I had the nerve to gather further examples. Indeed, the doomer sites I frequent at intervals (no longer daily) gather them together for me. As with the examples above, many are industrial chemical spills and contamination; others are animal and plant populations dying en masse (e.g., bee colony collapse disorder); yet others are severe weather events (e.g., the California drought) on the rise due to the onset of climate change (which has yet to go nonlinear). Miserable examples keep piling up, yet business as usual continues while it can. Death tolls are difficult to assess, but thus far they appear to be hitting nonhuman species with greater ferocity. Some characterize this as Mother Nature doing her necessary work by gradually removing the plant and animal species on which humans, as apex predator, depend. That means eventually removing us, too. I don’t care for such a romantic anthropomorphism. Rather, I observe that we humans are doing damage to the natural world and to ourselves in perhaps the slowest slo-mo disaster, the most likely endpoint being near-term extinction.

As much, then, as the alarm has been sounded adequately with respect to high-speed disasters stemming from human greed, incompetence, and frailty, I find that the even worse calamity awaiting us has yet to penetrate the popular mind. Admittedly, it’s awfully hard to get one’s head around: the extinction of the human species. Those who resign themselves to speaking the truth of inevitability are still characterized as kooks, wackos, conspiracy mongers, and worse, leaders of death cults. From my resigned side of the fence, the proper characterization appears to be the very opposite: those who actively ruin nature for profit and power are the death cult leaders, while those who prophesy doom are merely run-of-the-mill Cassandras. The ranks of the latter, BTW, seem to be growing while critical thought still exists in small, isolated oases.

A friend gave me William Ophuls’ Immoderate Greatness: Why Civilizations Fail to read a couple years ago, and it sat on my shelf until just recently. At only 93 pp. (with bibliographical recommendations and endnotes), it’s a slender volume but contains a good synopsis of the dynamics that doom civilizations. I’ve been piecing together the story of industrial civilization and its imminent collapse for about eight years now, so I didn’t expect Ophuls’ analysis to break new ground, which indeed it didn’t (at least for me). However, without my own investigations already behind me, I would not have found Ophuls’ CliffsNotes-style arguments very convincing. Armed with what I had already learned, though, I found Ophuls preaching to the choir (member).

The book breaks into two parts: biophysical limitations and cultural impediments born of human error. Whereas I’m inclined to award greater importance to biophysical limits (e.g., carrying capacity), particularly but not exclusively as civilizations overshoot and strip their land and resource bases, I was surprised to read this loose assertion:

… maintaining a civilization takes a continuous input of matter, energy, and morale, and the latter is actually the most important. [p. 51]

Upon reflection, it seems to be a chicken-and-egg question. Which comes first, increased and unmet demands for inputs or exhausted and/or diminished inputs due to human factors? The historical record of failed empires and civilizations offers examples attributable to both. For instance, the Incan civilization is believed to have risen and fallen on the back of climate change, whereas the fall of the Roman and British Empires stems more from imperial overreach. Reasons are never solely factor A or B, of course; a mixture of dynamic effects is easily discoverable. Still, the question is inevitable for industrial civilization, now on a trajectory toward extinction no less than other (already extinct) civilizations, especially for those who believe it possible to learn from past mistakes and avoid repetition.

(more…)

Continuing from part 1 and part 2, let me add one further example of how meaning is reversed under the Ironic perspective. At my abandoned group blog, Creative Destruction, which garners more traffic than The Spiral Staircase despite being woefully out of date, the post that gets the most hits argues (without irony) that, in the Star Wars universe, the Empire represents the good guys and the Jedi are the terrorists, despite the good vs. evil archetypes being almost cartoonishly drawn, with the principal villain having succumbed to the dark side only to be redeemed by his innate goodness in the 11th hour. The reverse argument undoubtedly has some merit, but it requires overthinking and outsmarting oneself to arrive at the backwards conclusion. A similar dilemma of competing perspectives is present in The Avengers, where Captain America is unconflicted in his all-American goodness and straightforward identification of villainy but is surrounded by other far-too-clever superheroes who overanalyze (snarkily so), cannot agree on strategy, and/or question motivations and each other’s double or triple agency. If I understand correctly, this plot hook is the basis for the civil war among allies in the next Avengers movie.

The Post-Ironic takes the reversal of meaning and paradoxical retention of opposites that characterizes the Ironic and expands issues from false dualisms (e.g., either you’re with us or against us) to multifaceted free-for-alls where anyone’s wild interpretation of facts, events, policy, and strategy has roughly equal footing with another’s precisely because no authority exists to satisfy everyone as to the truth of matters. The cacophony of competing viewpoints — the multiplicity of possible meanings conjured from any collection of evidence — virtually guarantees that someone out there (often someone loony) will speak as though reading your mind. Don’t trust politicians, scientists, news anchors, pundits, teachers, academics, your parents, or even the pope? No problem. Just belly up to the ideological buffet and cherry-pick from a multitude of viewpoints, few of which have much plausibility. But no matter: it’s a smorgasbord of options, and almost none of them can be discarded out of hand for being too beyond the pale. All must be tried and entertained.

One of the themes of this blog is imminent (i.e., occurring within the lifetimes of most readers) industrial collapse resulting from either financial collapse or loss of habitat for humans (or a combination of factors). Either could happen first, but my suspicion is that financial collapse will be the lit fuse leading to explosion of the population bomb. Collapse is quite literally the biggest story of our time despite its being prospective. However, opinion on the matter is loose, undisciplined, and ranges all over the map. Consensus within expert bodies such as the Intergovernmental Panel on Climate Change, assembled specifically to study climate change and report its findings, ought to put an end to controversy, yet the waters have been so muddied by competing narratives that credulous folks, if they bother paying attention at all, can’t really tell whom to believe. It doesn’t help that even well-educated folks, including many professionals, often lack the critical thinking skills with which to evaluate evidence. So instead, wishy-washy emotionalism and psychological vulnerability award hearts and minds to the most charismatic storyteller, not the truth-teller.

Perhaps the best instance of multiple meanings being simultaneously present and demanding consideration is found in the game of poker, which has become enormously popular in the past decade. To play the game effectively, one must weigh the likelihood and potential of any one of several competing narratives based on opponents’ actions. Mathematical analysis and intuition combine to recommend which scenario is most likely true and whether the risk is worth it (pot odds). If, for just one example, an opponent bets big at any point in the poker hand, several scenarios must be considered:

  • the opponent has made his hand and cannot be beaten (e.g., nut flush, full house)
  • the opponent has a dominating hand and can be beaten only if one draws to make a better hand (e.g., top pair with high kicker or two pair)
  • the opponent has not yet fully made his hand and is on a draw (open-ended straight or four cards to a flush)
  • the opponent has a partial or weak hand and is bluffing at the pot

Take note that, as with climate change, evaluation in poker is prospective. Sometimes an opponent’s betting strategy is discovered in a showdown where players must reveal their cards; but often, one player or another mucks or folds and the actual scenario is undisclosed. The truth of climate change, until the future manifests, is to some tantalizingly unknown and contingent, as though it could be influenced by belief, hope, and/or faith. To rigorous thinkers, however, the future is charted for us with about the same inevitability as the sun rising in the morning — the biggest remaining unknown being timing.
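The pot-odds weighing mentioned above reduces to simple arithmetic: compare the price of continuing (what a call costs relative to the final pot) against a rough estimate of the chance of winning. A minimal sketch of that comparison, using the common “two percent per out, per card” approximation (the numbers below are illustrative, not from the post):

```python
# Pot-odds sketch: continue only when estimated equity in the hand
# exceeds the fraction of the final pot that a call would cost.

def pot_odds(pot: float, call: float) -> float:
    """Fraction of the final pot you must pay to continue."""
    return call / (pot + call)

def draw_equity(outs: int, cards_to_come: int) -> float:
    """Rough equity: each out is worth ~2% per card still to come."""
    return min(outs * 0.02 * cards_to_come, 1.0)

def should_call(pot: float, call: float, outs: int, cards_to_come: int) -> bool:
    """True when the draw is priced profitably."""
    return draw_equity(outs, cards_to_come) >= pot_odds(pot, call)

# Four cards to a flush (9 outs), one card to come, facing a bet of 100
# into a pot that now holds 200: equity ~18% vs. a ~33% price -> fold.
print(should_call(pot=200, call=100, outs=9, cards_to_come=1))
```

The same draw with two cards to come (~36% equity) clears the same ~33% price, which is why the stage of the hand matters as much as the cards themselves.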

Habitual awareness of multiple, competing scenarios extends well beyond table games and climate change. In geopolitics, the refusal to rule out the nuclear option, even when it would be completely disproportionate to a given provocation, is reckless brinkmanship. The typical rhetoric is that, like fighting dogs, any gesture of backing down would be interpreted as a display of submission or weakness and thus invite attack. So is the provocation or the response a bluff, a strong hand, or both? Although it is difficult to judge how U.S. leadership is perceived abroad (since I’m inside the bubble), the historical record demonstrates that the U.S. never hesitates to get mixed up in military action and adopts overweening strategies to defeat essentially feudal societies (e.g., Korea and Vietnam). Never mind that those strategies have been shown to fail or that those countries represented no credible threat to the U.S. Our military escapades in the 21st century are not so divergent, with the perception of threats being raised well beyond their true proportions relative to any number of health and social scourges that routinely kill many more Americans than terrorism ever did.

Because this post is already running long, conclusions will be in an addendum. Apologies for the drawn-out posts.