Posts Tagged ‘Collapse’

As I read into Fingerprints of the Gods by Graham Hancock and learn more about antiquity, it becomes clear that weather conditions on Earth were far more hostile then (say, 15,000 years ago) than now. Looking much further back, across millions of years, scientists have plotted global average temperature and atmospheric carbon, mostly using ice cores as I understand it, yielding this graph:

[Graph: CO2 levels over time]

I’ve seen this graph before; it’s often used by climate change deniers to show a lack of correlation between carbon and temperature. That’s not what concerns me. Instead, the amazing thing is how temperature careens up and down quickly (in geological time) between two limits, 12°C and 22°C, forming steady states known as Ice Age Earth and Hot House Earth. According to the graph, we’re close to the lower limit. It’s worth noting that because of the extremely long timescale, the graph is considerably smoothed.


As a boy, my home included a coffee table book, title unknown, likely published circa 1960, about the origins of human life on Earth. (A more recent book of this type attracting lots of attention is Sapiens: A Brief History of Humankind (2015) by Yuval Harari, which I haven’t yet read.) It was heavily enough illustrated that my siblings and I consulted it mostly for the pictures, which can probably be excused since we were youngsters at the time. What became of the book escapes me. In the intervening decades, I made no particular study of the ancient world — ancient meaning beyond the reach of human memory systems. Thus, ancient could potentially refer to anthropological history in the tens of thousands of years, evolutionary history stretching across tens of millions of years, geological history over hundreds of millions of years, or cosmological time going back a few billion. For the purpose of this blog post, let’s limit ancient to no more than fifty thousand years ago.

A few months ago, updates (over the 1960 publication) to the story of human history and civilization finally reached me (can’t account for the delay of several decades) via webcasts published on YouTube featuring Joe Rogan, Randall Carlson, and Graham Hancock. Rogan hosts the conversations; Carlson and Hancock are independent researchers whose investigations converge on evidence of major catastrophes that struck the ancient world during the Younger Dryas period, erasing most but not all evidence of an antediluvian civilization. Whereas I’m a doomer, they are catastrophists. To call this subject matter fascinating is a considerable understatement. And yet, it’s neither here nor there with respect to how we conduct our day-to-day lives. Their revised history connects to religious origin stories, but such narratives have been relegated to myth and allegory for a long time already, making them more symbolic than historical.

In the tradition of Galileo, Copernicus, Newton, and Darwin, all of whom went against the scientific orthodoxy of their times but were ultimately vindicated, Carlson and Hancock appear to be rogue scientists/investigators exploring deep history and struggling against the conventional story of the beginnings of civilization around 6,000 years ago in the Middle East and Egypt. John Anthony West is another who disputes the accepted narratives and timelines. West is also openly critical of “quackademics” who refuse to consider accumulating evidence but instead collude to protect their cherished ideological and professional positions. The vast body of evidence being pieced together is impressive, and I truly appreciate their collective efforts. I’m about 50 pages into Hancock’s Fingerprints of the Gods (1995), which contains copious detail not well suited to the conversational style of a webcast. His follow-up Magicians of the Gods (2015) will have to wait. Carlson’s scholarly work is published at the website Sacred Geometry International (and elsewhere, I presume).

So I have to admit that my blog, launched in 2006 as a culture blog, turned partially into a doomer blog as that narrative gained the weight of overwhelming evidence. What Carlson and Hancock in particular present is evidence of major catastrophes that struck the ancient world and are going to repeat: a different sort of doom, so to speak. Mine is ecological, financial, cultural, and finally civilizational collapse born of exhaustion, hubris, frailty, and most importantly, poor stewardship. Theirs is periodic cataclysmic disaster including volcanic eruptions and explosions, great floods (following ice ages, I believe), meteor strikes, earthquakes, tsunamis, and the like, each capable of ending civilization all at once. Indeed, those inevitable events are scattered throughout our geological history, though at unpredictable intervals often spaced tens or hundreds of thousands of years apart. For instance, the supervolcano under Yellowstone is believed to blow roughly every 600,000 years, and we’re overdue. Further, the surface of the Moon shows a record of meteor bombardment; the Earth’s history of the same is partially hidden by the continuous transformation of its landscape, a process the Moon lacks. The number of near misses by near-Earth objects in the last few decades is rather disconcerting. Any of these disasters could strike at any time, or we could wait another 10,000 years.

Carlson and Hancock warn that we must recognize the dangers, drop our petty international squabbles, and unite as a species to prepare for the inevitable. To do otherwise would be to court disaster. However, far from dismissing the prospect of doom I’ve been blogging about, they merely add another category of things likely to kill us off. They give the impression that we should turn our attention away from sudden climate change, the Sixth Extinction, and other perils to which we have contributed heavily and worry instead about death from above (the skies) and below (the Earth’s crust). It’s impossible to say which is the most worrisome prospect. As a fatalist, I surmise that there is little we can do to forestall any of these eventualities. Our fate is already sealed in one respect or another. That foreknowledge makes life precious for me, and frankly, is another reason to put aside our petty squabbles.

I don’t have the patience or expertise to prepare and offer a detailed political analysis such as those I sometimes (not very often) read on other blogs. Besides, once the comments start filling up at those sites, every possible permutation is trotted out, muddying the initial or preferred interpretation with alternatives that make at least as much sense. They’re interesting brainstorming sessions, but I have to wonder what is accomplished.

My own back-of-the-envelope analysis is much simpler and probably no closer to (or farther from) being correct, what with everything being open to dispute. So the new POTUS was born in 1946, which puts the bulk of his boyhood in the 1950s, overlapping with the Eisenhower Administration. That period had many attributes, but the most significant (IMO), at least as it would impress an adolescent, was the U.S. economy launching into the stratosphere, largely on the back of the manufacturing sector (e.g., automobiles, airplanes, and TVs), and creating the American middle class. The interstate highway system also dates from that decade. Secondarily, there was a strong but misplaced sense of American moral leadership (one might also say authority or superiority), since we took (too much) credit for winning WWII.

However, it wasn’t great for everyone. Racism, misogyny, and other forms of bigotry were open and virulent. Still, if one was lucky enough to be a white, middle-class male, things were arguably about as good as they would get, which many remember rather fondly, whether through rose-colored glasses or otherwise. POTUS as a boy wasn’t middle class, but the culture around him supported a worldview that he embodies even now. He’s also never been an industrialist, but he is a real estate developer (some would say slumlord) and media figure, and his models are taken from the 1950s.

The decade of my boyhood was the 1970s, spanning the Nixon, Ford, and Carter Administrations. Everyone could sense the wheels were already coming off the bus, and white male entitlement was far diminished from previous decades. The Rust Belt was already a thing. Like children from the 1950s forward, however, I spent a lot of time in front of the TV. Much of it was goofy fun such as Gilligan’s Island, The Brady Bunch, and interestingly enough, Happy Days. It was innocent stuff. What are the chances that, as a boy plopped in front of the TV, POTUS would have seen the show below (excerpted) and taken special notice, considering that the character shares his surname?

Snopes confirms that this is a real episode from the TV show Trackdown, not nearly as innocent as the shows I watched. The coincidences that the character is a con man, promises to build a wall, and claims to be the only person who can save the town are eerie, to say the least. Could that TV show be lodged in the back of POTUS’ brain, along with so many other boyhood memories, misremembered and revised the way memory tends to do?

Some have said that the great economic expansion of the 1950s and 60s was an anomaly: a constellation of conditions that combined to produce a historical effect, a Golden Era by some reckonings, that cannot be repeated. We simply cannot return to an industrial or manufacturing economy that had once (arguably) made America great. And besides, the attempt would accelerate the collapse of the ecosystem, which is already in free fall. Yet that appears to be the intention of POTUS, whose regression to childhood is a threat to us all.

Anthropologists, pundits, armchair cultural critics (like me), and others sometimes offer an aspect or characteristic, usually singular, that separates the human species from other animals. (Note: humans are animals, not the crowning creation of god in his own image, the dogma of major religions.) Typical singular aspects include tool use (very early on, fire), language, agriculture, self-awareness (consciousness), and intelligence, the last including especially the ability to conceptualize time and thus remember and plan ahead. The most interesting candidate suggested to me is our ability to kill from a distance. Without going into a list of things we don’t think we share with other species but surprisingly do, it interests me that no other species possesses the ability to kill at a distance (someone will undoubtedly prove me wrong on this).

Two phrases spring to mind: nature is red in tooth and claw (Tennyson) and human life is nasty, brutish, and short (Hobbes). Both encapsulate what it means to have to kill to eat, which is hardly unique to animals. All sorts of plants, insects, and microorganisms embed themselves in hosts, sometimes killing the host and thus themselves. Symbiotic relationships also exist. The instance that interests me, though, is the act of killing in the animal kingdom that requires putting one’s own body at risk in a life-or-death attack. Examples falling short of killing abound, such as intimidation to establish hierarchy, but to eat, an animal must kill its prey.

Having watched my share of historical fiction (pre-1800, say, but especially sword-and-sandal and medieval epics) on the TeeVee and at the cinema, I find that the dramatic appeal of warring armies slamming into each other never seems to get old. Fighting is hand-to-hand or sword-to-sword, which amount to the same thing. Archers’ arrows, projectiles launched from catapults and trebuchets, thrown knives, spears, and axes, and boiling oil poured over parapets are all killing from a relatively short distance, but the action eventually ends up being very close. The warrior code in fighting cultures honors the willingness to put oneself in harm’s way, to risk one’s own body. Leaders often exhibit mutual respect and may even share some intimacy. War may not be directly about eating, since humans are not cannibals under most circumstances; rather, it’s usually about control of resources, so secondarily about eating by amassing power. Those historical dramas often depict victors celebrating by enjoying lavish feasts.

Modern examples of warfare and killing from a distance make raining down death from above a bureaucratic action undertaken with little or no personal risk. Artillery, carpet bombing from 20,000 feet, drone strikes (controlled from the comfort of some computer lab in the Utah desert), and nuclear bombs are the obvious examples. No honorable warrior code attaches to such killing. Indeed, the chain of command separates the execution of kill orders from moral responsibility — probably a necessary disconnect when large numbers of casualties (collateral damage, if one prefers the euphemism) can be expected. Only war criminals, either high on killing or banally impervious to empathy and compassion, would dispatch hundreds of thousands at a time.

If killing from a distance is in most cases about proximity or lack thereof, one further example is worth mentioning: killing across time. While most don’t really conceptualize the space-time continuum as interconnected, the prospect of choices made today manifesting in megadeath in the foreseeable future is precisely the sort of bureaucratized killing from a distance that should be recognized and forestalled. Yet despite our supposed intellectual superiority over other species, we cannot avoid waging war, real and rhetorical, to control resources and narratives that enable us to eat. Eating the future would be akin to consuming seed corn, but that metaphor is not apt. Better perhaps to say that we’re killing the host. We’re embedded in the world, as indeed is everything we know to be alive, and rely upon the profundity of the biosphere for survival. Although the frequent charge is that humanity is a parasite or has become a cancer on the world, that tired assessment, while more accurate than not, is a little on the nose. A more charitable view is that humanity, as the apex predator, has expanded its habitat to include the entire biosphere, killing to eat, and is slowly consuming and transforming it into a place uninhabitable by us, just as a yeast culture consumes its medium and grows to fill the space before dying all at once. So the irony or Pyrrhic victory is that while we may fatten ourselves (well, some of us) in the short term, we have also created conditions leading to our own doom. Compared to other species whose time on Earth lasted tens of millions of years, human life on Earth turns out to be exactly what Hobbes said: nasty, brutish, and short.

In what has become a predictable status quo, President Obama recently renewed our official state of emergency with respect to the so-called War on Terror. It’s far too late to declare a new normal; we’ve been in this holding pattern for 15 years now. The article linked above provides this useful context:

There are now 32 states of national emergency pending in the United States, with the oldest being a 1979 emergency declared by President Jimmy Carter to impose sanctions during the Iran hostage crisis. Most are used to impose economic sanctions — mostly as a formality, because Congress requires it under the International Emergency Economic Powers Act.

In his term in office, Obama has declared 13 new emergencies, continued 21 declared by his predecessors and revoked just two, which imposed sanctions on Liberia and Russia.

Pro forma renewal of multiple states of national emergency is comparable to the 55-year-old U.S. embargo against Cuba, due for reauthorization next month, though indications are that the embargo may finally be relaxed or deauthorized. Both are examples of miserably failed policy, but they confer a semblance of power on the executive branch. Everyone knows by now that no one relinquishes power willingly, so Obama, like chief executives before him, keeps on keeping on ad nauseam.

Considering Obama’s credentials as a constitutional scholar, rare among U.S. presidents, one might expect him to weigh his options with greater circumspection and with an eye toward restoring suspended civil liberties. However, he has shown little interest in doing so (as far as I know). In combination with the election only a couple months away, the U.S. appears to be in a position similar to Germany in 1932 — ready and willing to elect a despot (take your pick …) and continue its slide into fascism. Can’t even imagine avoiding that outcome now.

The surprising number of ongoing emergencies brings to mind James Howard Kunstler and his book The Long Emergency (2005). Though I haven’t read the book (I’m a failed doomer, I suppose), my understanding is that his prediction of a looming and lingering emergency is based on two intertwined factors currently playing out in geopolitics: peak oil and global warming. (“Climate change” is now preferred over “global warming.”) Those two dire threats (and the California drought) have faded somewhat from the headlines, partially due to fatigue, replaced primarily by terrorism and economic stresses, but the dangers never went away. Melting icecaps and glaciers are probably the clearest incontrovertible indications of anthropogenic global warming, which is poised to trigger nonlinear climate change and hasten the Sixth Extinction. We don’t know when, precisely, though time is growing short. Similarly, reports on energy production and consumption are subject to considerable falsification in the public sphere, making it impossible to know just how close in time we are to a new energy crisis. That inevitability has also been the target of a disinformation campaign, but even a rudimentary understanding of scientific principles is sufficient to enable clear thinkers to penetrate the fog.

I have no plans to return to doom blogging with any vigor. One emergency stacked upon the next, ready to collapse in a cascade of woe, has defeated me, and I have zero expectation that any real, meaningful response can be formulated and executed, especially while we are distracted with terrorism and creeping fascism.

An enduring trope of science fiction is the naming of newly imagined gadgets and technologies (often called technobabble with a mixture of humor and derision), as well as the naming of segments of human and alien societies. In practice, that means renaming already familiar things to add a quasi-futuristic gleam, and it’s a challenge faced by every story that adopts an alternative or futuristic setting: describing the operating rules of the fictional world but with reference to recognizable human characteristics and institutions. A variety of recent Young Adult (YA) fiction has indulged in this naming and renaming, some of which have been made into movies, mostly dystopic in tone, e.g., the Hunger Games tetralogy, the Twilight saga, the Harry Potter series, the Maze Runner, and the Divergent trilogy. (I cite these because, as multipart series, they are stronger cultural touchstones, like Star Wars, than similar once-and-done adult cinematic dystopias, e.g., Interstellar and Elysium. Star Trek is a separate case, considering how it has devolved after being rebooted from its utopian though militaristic origins into a pointless series of action thrillers set in space.) Some exposition rises to the level of lore but is mostly mere scene-setting removed slightly from our own reality. Similar naming schemes are used in cinematic universes born out of comic books, especially character names, powers, and origins. Because comic book source material is extensive, almost all of it becomes lore, which is enjoyed by longtime initiates who entered those alternate universes as children but mildly irritating to adult moviegoers like me.

History also has names for eras and events sufficiently far back in time for hindsight to provide a clear vantage point. In the U.S., we had the Colonial Era, the Revolutionary Period, the Frontier Era and Wild West, the Industrial/Mechanical Age, Modernism, and Postmodernism, to name a few but by no means all. Postmodernism is already roughly 40 years old, yet we have still not named the era in which we now live. Indeed, because we’re the proverbial fish inside the fishbowl, unable to recognize the water in which we swim, the contemporary moment may have no need of naming, now or at any given time. That task awaits those who follow. We have, however, given names to the succession of generations following the Baby Boom. How well their signature characteristics fit their members is the subject of considerable debate.

As regular readers of this blog already know, I sense that we’re on the cusp of something quite remarkable, most likely a hard, discontinuous break from our recent past. Being one of the fish in the bowl, I probably possess no better understanding of our current phase of history than the next person. Still, if I had to choose one word to describe the moment, it would be dissolution. My 4-part blog post about dissolving reality is one attempt to provide an outline. A much older post called aged institutions considers the time-limited effectiveness of products of human social organization. The grand question of our time might be whether we are on the verge of breaking apart or simply transitioning into something new — will it be catastrophe or progress?

News this past week of Britain’s exit from the European Union may be only one example of break-up vs. unity, but the drive toward secession and separatism (tribal and ideological, typically based on bogus and xenophobic identity groups constantly thrown in our faces) has been gaining momentum even in the face of economic globalization (collectivism). Scotland very nearly seceded from the United Kingdom in 2014; Quebec has held multiple referenda on seceding from Canada, none yet successful; and Vermont, Texas, and California have all flirted with secession from the United States. No doubt some would argue that such examples of dissolution, actual or prospective, are actually transitional, meaning progressive. And perhaps they do in fact fulfill the need for smaller, finer, localized levels of social organization that many have argued are precisely what an era of anticipated resource scarcity demands. Whether what actually manifests is catastrophe (as I expect) or progress is, of course, something history and future historians will eventually name.

The question comes up with some regularity: “Where were you when …?” What goes in place of the ellipsis has changed with the generations. For my parents, it was the (first) Kennedy assassination (1963). For my siblings and chronological peers, it was first the Three Mile Island accident (1979) and then the Challenger disaster (1986). For almost everyone since, it’s been the September 11 attacks (2001), though a generation lacking memory of those events is now entering adulthood. These examples are admittedly taken from mainstream U.S. culture. If one lives elsewhere, it might be the Mexico City earthquake (1985), the Chernobyl disaster (1986), the Indonesian tsunami (2004), the Haitian earthquake (2010), the Fukushima disaster (2011), or any number of other candidates populating the calendar. Even within the U.S., other more local events might take on greater significance, such as Hurricane Katrina (2005) along the Gulf Coast or Hurricane Iniki (1992) in Hawaii, the latter of which gives September 11 a very different meaning for those who suffered through it.

What these events all have in common is partially a loss of innocence (particularly in the case of the man-made disasters) and a feeling of suspended animation while events are sorted out and news is processed. I remember with clarity how the TV news went into full disaster mode for the Challenger disaster, which proved to be the ridiculous template for later events, including 9/11. Most of the coverage was denial of the obvious and arrant speculation, but the event itself was captivating enough that journalists escaped the wholesale condemnation they plainly deserved. The “where were you?” question is usually answered with the moment one became aware of some signal event, such as “I was on the bus” or “I was eating dinner.” News vectors changed dramatically from 1986 to 2001, two relatively arbitrary points of comparison within my lifetime. Being jarred out of complacency and perceiving the world suddenly as a rather dangerous place hardly expresses the weird wait-and-see response most of us experience in the wake of disaster.

Hurricanes typically come with a narrow window of warning, but other events strike without clear expectation — except perhaps in terms of their inevitability. That inevitability informs expectations of further earthquakes (e.g., along the San Andreas and New Madrid faults) and volcanic eruptions (e.g., the Yellowstone supervolcano), though the predictive margin of error can be hundreds or even thousands of years. My more immediate concern is with avoidable man-made disasters that are lined up to fall like a series of dominoes. As with natural disasters, we’re all basically sitting ducks, completely vulnerable to whatever armed mayhem may arise. But rather than enter into suspended animation in the wake of events, current political, financial, and ecological conditions find me metaphorically ducking for cover in expectation of the inevitable blow(s). Frankly, I’ve been expecting political and financial crack-ups for years, and current events demonstrate extremely heightened risk. (Everyone seems to be asking which will be worse: a Trump or Clinton presidency? No one believes either candidate can guide us successfully through the labyrinth.) A tandem event (highly likely, in my view) could easily trigger a crisis of significant magnitude, given the combination of violent hyperbole and thin operational tolerance for business as usual. I surmise that anyone who offers the line “may you live in interesting times” has a poor understanding of what’s truly in store for us. What happens with full-on industrial collapse is even harder to contemplate.

Today is the 10-year anniversary of the opening of this blog. As a result, there is a pretty sizeable backblog should anyone decide to wade in. As mentioned in my first post, I only opened this blog to get posting privileges at a group blog I admired because it functioned more like a discussion than a broadcast. The group blog died of attrition years ago, yet here I am 10 years later still writing my personal blog (which isn’t really about me).

Social media lives and dies by the numbers, and mine are deplorable. Annual traffic has ranged from about 6,800 to about 12,500 hits, much of which I’m convinced is mere background noise and bot traffic. Cumulative hits number about 90,140, and unique visitors about 19,350, neither of which is anything to crow about for a blog of this duration. My subscriber count continues to climb pointlessly, now resting at 745. However, I judge I might have only a half dozen regular readers and perhaps half again as many commentators. I’ve never earned a cent for my effort, nor am I likely ever to put up a Patreon link or similar goad for donations. All of which only demonstrates that almost no one cares what I have to write about. C’est la vie. I don’t write for that purpose and frankly wouldn’t know what to write about if I were trying to drive numbers.

So if you have read my blog, what are some of the things you might have gathered from me? Here’s an incomplete synopsis:

  • Torture is unspeakably bad. History is full of devices, methodologies, and torturers, but we learned sometime in the course of two 20th-century world wars that nothing justifies it. Nevertheless, it continues to occur with surprising relish, and those who still torture (or want to) are criminally insane.
  • Skyscrapers are awesomely tall examples of technical brilliance, exuberance, audacity, and hubris. Better expressions of techno-utopian, look-mom-no-hands, self-defeating narcissism can scarcely be found. Yet they continue to be built at a feverish pace. The 2008 financial collapse stalled and/or doomed a few projects, but we’re back to game on.
  • Classical music, despite record budgets for performing ensembles, has lost its bid for anything resembling cultural and artistic relevance by turning itself into a museum (performing primarily works of long-dead composers) and abandoning emotional expression in favor of technical perfection, which is probably an accurate embodiment of the spirit of the times. Arguably, no composer since Aaron Copland has become a household name; Copland died in 1990 but was best known in the 1940s and 50s.
  • We’re doomed — not in any routine sense of the word having to do with individual mortality but in the sense of Near-Term (Human) Extinction (NTE). The idea is not widely accepted in the least, and the arguments are too lengthy to repeat (and unlikely to convince). However, for those few able to decipher it, the writing is on the wall.
  • American culture is a constantly moving target, difficult to define and describe, but its principal features are only getting uglier as time wears on. Resurgent racism, nationalism, and misogyny make clear that while some strides have been made, these attitudes were only driven underground for a while. Similarly, colonialism never really died but morphed into a new version (globalization) that escapes criticism from the masses, because, well, goodies.
  • Human consciousness — another moving target — is cratering (again) after 3,000–5,000 years. We have become hollow men, play actors, projecting false consciousness without core identity or meaning. This cannot be sensed or assessed easily from the first-person perspective.
  • Electronic media makes us tools. The gleaming attractions of sterile perfection and pseudo-sociability have hoodwinked most of the public into relinquishing privacy and intellectual autonomy in exchange for the equivalent of Huxley’s soma. This also cannot be sensed or assessed easily from the first-person perspective.
  • Electoral politics is a game played by the oligarchy for chumps. Although the end results are not always foreseeable (Jeb!), the narrow range of options voters are given (lesser of evils, the devil you know …) guarantees that fundamental change in our dysfunctional style of government will not occur without first burning the house down. After a long period of abstention, I voted in the last few elections, but my heart isn’t really in it.
  • Cinema’s infatuation with superheroes and bankable franchises (large overlap there) signals that, like other institutions mentioned above, it has grown aged and sclerotic. Despite large budgets and impressive receipts (the former often over $100 million and the latter now in the billions for blockbusters) and considerable technical prowess, cinema has lost its ability to be anything more than popcorn entertainment for adolescent fanboys (of all ages).

This is admittedly a pretty sour list. Positive, worthwhile manifestations of the human experience are still out there, but they tend to be private, modest, and infrequent. I still enjoy a successful meal cooked in my own kitchen. I still train for and race in triathlons. I still perform music. I still make new friends. But each of these examples is also marred by corruptions that penetrate everything we do. Perhaps it’s always been so, and as I, too, age, I become increasingly aware of inescapable distortions that can no longer be overcome with innocence, ambition, energy, and doublethink. My plan is to continue writing the blog until it feels like a burden, at which point I’ll stop. But for now, there’s too much to think and write about, albeit at my own leisurely pace.

I found a curious blog post called “Stupid Things People Do When Their Society Breaks Down” at a website called alt-market.com, which has a subheading that reads “Sovereignty • Integrity • Prosperity.” I haven’t delved far at all into the site, but it appears to offer alternative news and advice for preppers. The blog post uses the terms liberty activists and preparedness-minded, the first of which I found a little self-congratulatory. The existence of anarchist movements, including The Liberty Movement (mentioned in the comments at the site), has been known to me for some time, but my personal response to the prospect (indeed, inevitability) of collapse does not fit with theirs. Quoted below are the introductory paragraph, the headings (the seven stupid things referred to in the title), and truncated blurbs behind each (apologies for the long quote). My comments follow.

A frequent mistake that many people make when considering the concept of social or economic collapse is to imagine how people and groups will behave tomorrow based on how people behave today. It is, though, extremely difficult to predict human behavior in the face of terminal chaos. What we might expect, or what Hollywood fantasy might showcase for entertainment purposes, may not be what actually happens when society breaks down.

They Do Nothing. It’s sad to say, but the majority of people, regardless of the time or place in history, have a bad habit of ignoring the obvious.

They Sabotage Themselves With Paranoia. Even in the early stages of a social breakdown when infrastructure is still operational, paranoia among individuals and groups can spread like a poison.

They Become Shaky And Unreliable When The Going Gets Tough. This is why early organization is so important; it gives you time to learn the limitations and failings of the people around you before the SHTF.

They Become Hotheads And Tyrants. On the other side of the coin, there are those individuals who believe that if they can control everything and everyone in their vicinity then this will somehow mitigate the chaos of the world around them. They are people who secretly harbor fantasies of being kings during collapse.

They Become Political Extremists. Throughout most modern collapses, two politically extreme ideologies tend to bubble to the surface — communism and fascism. Both come from the same root psychosis, the psychosis of collectivism.

They Become Religious Zealots. Zealotry is essentially fanaticism to the point of complete moral ambiguity. Everyone who does not believe the way the zealot believes is the “other,” and the other is an enemy that must be annihilated.

They Abandon Their Moral Compass. Morally relative people when discovered are usually the first to be routed out or the first to die in survival situations because they cannot be trusted. No one wants to cooperate with them except perhaps other morally relative people.

Despite my basic disagreement that it’s possible to prepare effectively anymore for industrial collapse (or indeed that survival is necessarily a desirable thing in a collapse scenario), the advice seems to me pretty solid given the caveat that it’s “extremely difficult to predict human behavior in the face of terminal chaos.” However, they’re all negative lessons. One can certainly learn from the mistakes of history and attempt to avoid repeating them. (We have a predictably poor track record of learning from historical mistakes.) It may well be hindsight bias, though, to assume that what looks perfectly clear in past examples can serve as a template for best-laid plans covering both the process and the aftermath of what may well be (by the article’s own admission) a protracted phase of social unrest and breakdown.

That said, let me argue just one thing, namely, why it may not be stupid (as the article opines rather flatly) after all to do nothing in preparation for rather foreseeable difficulties. Long answer short, it simply won’t matter. Whatever the precipitating event or process, the collapse of industrial civilization, unlike previous civilizational collapses, will be global. Moreover, it will be accompanied by ecological collapse and a global extinction event (process) on par with at least five previous mass extinctions. The world will thus be wrecked for human habitation, granting survivors only the shortest additional term over those who perish at the outset. This is before one takes into account climate change (already underway but liable to become abrupt and nonlinear at any time) and the inevitable irradiation of the planet when 400+ unattended nuclear sites melt down.

It’s not unusual for me to be accused of a convenient fatalism, of doing nothing because the alternative (doing something) is too difficult. That accusation sticks, of course; I won’t dispute it. However, my reading of trends convinces me of the impossibility of stalling, much less reversing, our current trajectory and further suggests that the window of opportunity closed approximately forty years ago, during the oil crisis and ecology movement of the 1970s. I would certainly throw my weight, influence, and effort (which are for all practical purposes nil) behind doing what is still possible to undo the worst instances of human corruption and despoliation. In addition, it seems to me worthwhile to map out what it would mean to meet our collective doom with grace, integrity, and yes, grim determination. That’s not doing nothing, but I’ve seen remarkably little discussion of those possible responses. What I see plenty of instead is a combination of bunker mentality and irrational desire for righteous punishment of perpetual enemies as we each cross death’s door. Both are desperate last hurrahs, final expressions of human frailty in the face of intractable and unfathomable loss. These, too, are the false promises of the last crop of presidential hopefuls, who ought to know quite well that presiding over collapse might just be the worst possible vantage point, possessed of the full power of the state yet unable to overcome the force of momentum created by our own history.

I remember that sinking feeling when the Deepwater Horizon oil well blew out in April 2010 and gushed oil into the Gulf of Mexico for 87 days at an estimated initial rate of 62,000 barrels per day (9,900 m³/d) until it was reportedly capped (but may not have been fully contained). That feeling was more intense than the disgust I felt at discovering the existence of the Great Pacific Garbage Patch (and subsequently others in the Atlantic and Indian Oceans). For reasons that make no particular sense, slo-mo ecological disasters in the oceans didn’t sicken me as much as high-speed despoliation of the Gulf. More recently, I’ve been at a loss (unable to process things, actually) over two new high-speed calamities: the contaminated tap water flowing from public waterworks in Flint, MI, and the methane leak from an underground well in the Porter Ranch neighborhood of Los Angeles, CA (no links provided, search for yourself). Whereas the first two examples turned my stomach at the mere knowledge, the second two are quite literally sickening people.
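
For the numerically curious, here is a minimal back-of-the-envelope sketch (in Python) of how those spill figures relate, assuming the standard conversion of roughly 0.159 m³ per 42-gallon oil barrel. Treating the initial flow rate as constant makes the total an upper bound; official estimates put the cumulative release closer to 4.9 million barrels, since the flow declined as the reservoir depleted.

    # Back-of-the-envelope check of the Deepwater Horizon figures cited above.
    # Assumes ~0.159 cubic meters per 42-US-gallon oil barrel (standard conversion).

    BARREL_M3 = 0.158987        # cubic meters per barrel
    rate_bbl_per_day = 62_000   # estimated initial flow rate, barrels per day
    days = 87                   # days the well gushed before being capped

    rate_m3_per_day = rate_bbl_per_day * BARREL_M3  # ~9,857, i.e., the ~9,900 m³/d cited
    total_bbl_upper = rate_bbl_per_day * days       # 5,394,000 barrels if the rate never declined

    print(f"Rate: {rate_m3_per_day:,.0f} m³/day")
    print(f"Upper-bound total: {total_bbl_upper:,} barrels")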

These examples could be part of a daily diet of stomach-churning news if I had the nerve to gather further examples. Indeed, the doomer sites I frequent at intervals (no longer daily) gather them together for me. As with the examples above, many are industrial chemical spills and contamination; others are animal and plant populations dying en masse (e.g., bee colony collapse disorder); yet others are severe weather events (e.g., the California drought) on the rise due to the onset of climate change (which has yet to go nonlinear). Miserable examples keep piling up, yet business as usual continues while it can. Death tolls are difficult to assess, but at present, they appear to be impacting nonhuman species with greater ferocity. Some characterize this as Mother Nature doing her necessary work by gradually removing the plant and animal species on which humans, as apex predator, depend. That means eventually removing us, too. I don’t care for such romantic anthropomorphism. Rather, I observe that we humans are doing damage to the natural world and to ourselves in perhaps the slowest slo-mo disaster of all, the most likely endpoint being near-term extinction.

As much, then, as the alarm has been sounded adequately with respect to high-speed disasters stemming from human greed, incompetence, and frailty, I find that the even worse calamity awaiting us has yet to penetrate the popular mind. Admittedly, it’s awfully hard to get one’s head around: the extinction of the human species. Those who resign themselves to speaking the truth of inevitability are still characterized as kooks, wackos, conspiracy mongers, and worse, leaders of death cults. From my resigned side of the fence, the proper characterization appears to be the very opposite: those who actively ruin nature for profit and power are the death cult leaders, while those who prophesy doom are merely run-of-the-mill Cassandras. The ranks of the latter, BTW, seem to be growing while critical thought still exists in small, isolated oases.