An article in Wired pushes the meme that coal, whilst claiming the lion’s share of responsibility for releasing CO2 into the atmosphere, can be cleaned up to continue to provide (mostly electrical) energy for everything we use. Pshaw, I say. Comments at the magazine’s website also call bullshit on the article, going as far as to baldly accuse Wired of shilling for big energy, and note that hundreds of similar comments following publication of the story have been purged. Pro-and-con debate on the subject lies beyond my technical prowess, though I have my suspicions. Most interesting to me, however, is what’s not said.

The implicit assumption is that energy demand must be met somehow. Totally and utterly outside of consideration is demand destruction, whether through pricing, metering, or simple unavailability. Sure, there’s 100+ years of coal still available to be mined (or harvested, or exploited, or <choose your euphemism>). Guess we have no choice but to go after it, right? The author does shed some hazy light on environmental and health costs from burning coal, especially in China where it’s worst, but nowhere is there a suggestion that we might stop burning so much of the stuff, which I find a serious omission. Instead, in true technophiliac fashion, an unproven innovation will rescue us from the consequences of our own behavior and deliver salvation (BAU, I suppose, including gadgety distraction if that’s your idea of fun) in the form of “clean coal,” namely, underground resequestration of CO2 released in the process of burning coal. Basically, it’s the equivalent of continuing to dig the hole we’re in by attempting to refill it with its own pollutants. And never mind the delayed effects of what’s already done.

The “clean coal” meme was risible on its face when it appeared a few years ago. Innovation notwithstanding, it continues to be primarily the work of fiction authors, marketers, and, I guess, stringers for Wired. Several coal ash spills and tonnes upon tonnes of CO2 added to the atmosphere (increasing year over year without stalling) since the meme was hatched are far more convincing to me than hopes of a technofix. Facts and figures make better arguments most of the time. I have none to offer. Instead, let me simply point to everyday sights (and smells inferred from the visuals) we confront. Here is an image from twenty years ago of the city where I live:

Here’s a more recent one:

These days have become a lot less exceptional. How far down this path do we intend to go? All the way, I surmise.

I heard the title phrase — improper use of celebrity — uttered recently in relation to celebrity feuds that fuel the paparazzi and related parasite press. It was one high-profile celebrity (is there any other kind?) admonishing another to behave himself because it is a mistake to air petty grievances publicly and thus fan media flames. That seemed to me a worthy corrective, considering how little self-restraint most people practice, especially overtly dramatic public personae who run increased risks of believing their own hype, and accordingly, entitlement to publicity, whether good or bad. We all know too much already about the childish antics of media whores who, among other things, throw tantrums with impunity compared to the general public.

/rant on

The Yale Forum on Climate Change & the Media just issued a public relations piece about a new Showtime multipart climate change documentary called Years of Living Dangerously. I call it public relations because, like all good PR, it appeals to prurient interests (look at the beautiful people doing beautiful people stuff) and instructs credulous readers far too much about what to think, lest anyone form opinions without the guidance of the infernal marketing machine. Rampant name-dropping with bullshit glamor-shots showing a few famous people (all filmmakers and actors, laughably relabeled “correspondent”) getting their green on precedes the risible assertion that celebrities function as proxies for the average person despite the average person having absolutely nothing in common with the wealth, overexposure, travel, command of attention, heaping of accolades, and enjoyment of fawning deference that characterize celebrities. Drawing focus to climate change and (one might hope) swinging discussion away from deniers (who champion controversy over truth) are cited as precisely the reasons why celebrities are perfect for this documentary. The PR piece further examines (albeit briefly) celebrity activism and provides links to studies on the social science of celebrity (gawd …) before admitting that some backlash might ensue. I guess I’m fomenting backlash.

As PR, the piece is certainly well written, despite its unabashed celebrity worship. Further, celebrities have legitimate interests in politics, culture, climate change, and collapse, just like anyone else, even though exorbitant wealth enables them to behave as supranational entities like so many stateless multinational corporations. So why not use their fame to influence people, right? Well, we’ve already been down the primrose path of celebrity spokespersons occupying positions of influence, speaking from well-crafted scripts, and selling out issues and policy like commodities. Some celebs even understand those issues, though that’s no guarantee of wise leadership. Consider Arnold Schwarzenegger’s undistinguished tenure as California governor. I have never lived in California to know first hand, but my dominant impression of Schwarzenegger’s leadership style was unapologetic political theater, with incessant catchphrases from his movies functioning as entertaining misdirection matched against his inability (or anyone else’s, for that matter) to solve intractable problems. Curiously, his name is connected as a backer to Years of Living Dangerously, with a whole section of Yale’s PR piece devoted to charges of hypocrisy over his being a loot-and-pollute industrialist once removed through partial ownership in an investment company. Indeed, such conflicts of interest and hypocrisy are flagrant among celebrities who jet around the globe to movie sets (jet-setting?) then jet off again to have themselves filmed bearing witness (in flyovers, it seems, taking a spurious god’s-eye view from above the fray) to ecological devastation. I hesitate to raise this objection because ideological purity doesn’t exist, and as demonstrated in this lengthy blog post, charges of hypocrisy are hard to make stick after even modest analysis.
But still, those who most enjoy the fruits of our passing Age of Abundance might pause to consider how it looks when they throw support behind undoing the same disastrous institutions that rewarded them so handsomely. It may not be quite the same as saying we must all now accept austerity (typically, you first! — as in Harrison Ford confronting Indonesian officials?), but near-universal austerity is inevitably where we’re headed anyway.

These are not my principal reasons for whining and ranting, however. My main reason is that by putting rich, celebrity “correspondents” at the center of the story (perhaps they put themselves there, I can’t really know), they adopt an approach similar to too-big-to-fail and too-rich-to-prosecute, except now it’s too-famous-to-ignore. The MSM, revealed as ugly-sister handmaidens to corporate and political power, has failed completely to engage the public sufficiently on climate change, but by putting pretty, loquacious celebrities on display and in charge of rude issue awakening, the documentary falls to the level of clickbait despite whatever intentions it may possess. So although nominally about climate change, it’s really about celebrities waking up to climate change. How lovely! But this is a life-and-death (mostly death at this stage) issue for all of humanity, not just entertainers. Further, what do celebrities qua celebrities bring to the discussion? Nothing, really, except the empty glamor of their fame, expert line delivery, and ability to improvise dialogue. Maybe I shouldn’t sniff at that, considering how journalists (now climbing into celebrity ranks for all the wrong reasons and too often themselves at the center of the story, both of which undermine journalistic credibility) and politicians have failed so utterly to address social issues effectively. No matter that it’s the job of journalists and government policymakers to bring to light the harrowing news that we done done ourselves in. I warn, however, that if James Cameron or any other instigator behind Years of Living Dangerously believes the project to be a game changer, he or she has seriously misunderstood the dynamics that shape public opinion. For centuries, we’ve been assiduously ignoring Cassandra-like warnings from far more authoritative scientists and blue-ribbon panels such as the IPCC. Why would that change now by mixing in celebrities?

And why on earth would earnest celebrity response to recognition of imminent disaster brought on by climate change be to put on a show (the Little Rascals response) with self-serving celebrity spin? Or for that matter, why succumb to notorious solutionism, hopefulness, and the ironically dispiriting happy chapter? The answer is that they have not yet processed the true gravity of our multiple dilemmas and reached the fully foreseeable conclusion after delayed effects are taken into account: we’re totally and irredeemably fucked. But I guess that wouldn’t sell DVDs, now would it?

/rant off

Intellectual history is sometimes studied through themes and symbols found in novels with the writers of those novels being manifest about their intent. This is the second of two blog posts exploring truth-telling in fictional narrative. The first one is here.

Although I watch exactly zero TV, I see a fair number of movies (usually at home on DVD), which fulfills my need to stay in touch with the Zeitgeist of mainstream culture. Periodically, I go to iTunes Movie Trailers to see what’s coming out. In my experience, most offerings are interchangeable genre films with themes, stories, and effects drawn from the same worn-out bag of tricks. Actors, directors, and screenwriters repeat themselves with predictable regularity, which I’ll admit doesn’t necessarily stop their films from being entertaining or making money. If I’m drawn to any particular genre, it’s science fiction, which typically presents some provocative ideas, though they are promptly sacrificed to cinematic convention.

Considering the way the world is going, it was only a matter of time before yet another film explored transhumanism, though no one ever says transhumanism, if indeed they are aware of their underlying themes or merely express themselves through an inchoate artistic sensibility. The latest (due out in mid-April) renames the phenomenon Transcendence and stars Johnny Depp as a terminally ill mad scientist whose mind is up- or downloaded into a computer only to go power-hungry and berserk. (I’ve only seen the trailer and a couple featurettes.) Maybe it’s a cautionary tale, but not before luring credulous viewers into technophilia over the wildly imaginative possibilities of minds housed in computers. Michio Kaku, a science explainer/popularizer and author of the book The Future of the Mind, also teases initiates with the ridiculous potential to, say, reduce consciousness to a collection of data points to be “preserved” on a CD-ROM. Thus, through storytelling of consciousness disembodied and gone haywire, the controversy is taught, yet the inevitability of this future is plainly assumed. The scientists in the featurettes, by the way, say we’re only about 30 years away from being able to accomplish the wonders portrayed in the film.


Captivating Fools

Posted: April 1, 2014 in Education, Idle Nonsense, Media

/rant on

I got an unexpected dose of news today — unexpected because I do my best to tune it out and avoid allowing the great simulacrum to influence me too much. It started with the infernal Captivate screen in the elevator going to work, which broadcasts 3-second pablum to the 45-second captive drones (like me) making the final, vertical portion of their commute. In a fit of pique that made it through the editorial process unscathed, the screed screen read that viewers should be wary that anything and everything read and heard today (April 1) just might be lies. Or maybe that should be “lies,” since nothing is really a lie with the right marketing and political spin or prankish motivation behind it. My immediate thought was “Why should today be any different?” Indeed, considering the idiocy emanating from myriad media organs, functioning quite literally as Orwellian Ministries of Information, I’d say most mouth-breathers have pretty much mastered doublethink without even having it forced on them. Call it soft tyranny.

The utter failure of our political leaders and their too-friendly watchdogs in the Fourth Estate together to deliver anything resembling our true condition as late-stage capitalism winds down and the overlapping Carbon and Atomic Eras gradually reduce the planet to lifelessness is the condicio sine qua non of the modern age. For close to 20 years now (by my own lousy memory), we’ve been hearing dire warnings, some from the same media and politicians, that begin “if we don’t address this looming problem now ….” May as well drop that formulation and start with “Since we won’t address problems looming now for decades ….” Report after study after projection all come to the same essential conclusion: destruction of global habitat and the extinction of species that rely upon it for survival. That includes us. Calculating the cost of losses in dollars is a commonplace but completely irrelevant trope.

So quickly after the (ahem) valuable public service announcement that April Fools might be fooled, the screed screen said that a Gallup poll found a change of consumer confidence in one direction or the other. Gallup tracks consumer confidence continuously, but really, why poll the public? Is policy being crowd-sourced now? Sure, the people have power once they can be poked, prodded, goosed, and threatened to move their asses, but the pokers, prodders, goosers, and threateners can’t always predict just where the fickle public mood will wander. Cancel that — they’re actually pretty good at it because, as a nation, we’ve been miseducated and kept in a permanent state of adolescent thralldom. J.H. Kunstler characterizes the great unwashed masses pretty consistently as “demoralized, mentally inert, drugged-up, tattoo-bedizened populace of twerking slobs” or some variation. I concur.

I managed to go a few hours without the constant ooze of the screen dripping into my brain before getting stuck staring slack-jawed at one of the local Chicago news broadcasts, in this case, WGN, billed as Chicago’s Own, with its roomful of teeth flashing their high beams indiscriminately at the cameras like an insane clown posse. As usual, the top story was another horrific South Side shooting of women and children, then, without any sort of contextualizing segue, a report on a college student being sexually assaulted in the dormitory shower. These sorts of news stories aren’t lies, really, but one wonders why they are reported the way they are, with some poor on-the-scene hack clutching a mic on a street corner and admitting that “not much is known for certain but we hear ….” Luckily, I extricated myself before getting to the human interest, sports, and weather segments that turn or churn the daily news wheels with remarkably formulaic predictability. Why bother watching?

/rant off


Cold War antipathies between the “free world” and the Communist Bloc used to be conceptualized (in short) as “us and them” (sometimes “us vs. them”), which meant the U.S. and the U.S.S.R., the last two great superpowers. Additional facets to geopolitics were added by China, North Korea, the Middle East, India and Pakistan, and Brazil (mostly members of the nuclear club), but they didn’t figure as prominently in the rhetoric as what was clearly (even then) a false dualism. Binary thinking of this sort continues today in bogus phrases such as “either you’re with us or against us” or “if you’re not part of the solution then you’re part of the problem.” In American politics, the two-party system (Republicans and Democrats) appears to be intransigent and permanent despite political parties having risen and fallen over time both here and abroad. This team mentality keeps most political thinkers and observers from examining third-party alternatives with much seriousness, the same way it forestalls bipartisanship. A little-known fact is that the government-sounding agency called the Commission on Presidential Debates is, in reality, a private corporation financed by Anheuser-Busch and other major companies and created by the Republican and Democratic parties to seize control of the presidential debates in 1987 from The League of Women Voters.

Close identification with in-groups is learned early in life as cliques form in middle school (or before?) and is reinforced as each of us progresses through life’s phases. For instance, married/committed couples have a divergent set of understandings of personal relationships from unmarried individuals seeking/searching for a significant other. Childless couples have fundamentally differing social perspectives from those raising children (parents’ outlooks tracking with their children’s development). Working class folks have fewer opportunities and prerogatives than white-collar and professional workers. The rich enjoy considerable obeisance from everyone and benefit from undeserved favors and preferential treatment that the lower and middle classes can only look upon with envy and/or resentment. Examples go on and on.


We cling to these identities with surprising faithfulness, considering how they lump everyone rather imprecisely into categories, not altogether arbitrarily constructed but crude nonetheless. Blends of attitudes and truly creative, outlying thinking don’t figure in discussions dominated by rigid fidelity to narrow rhetoric, sound bites, and talking points. Interestingly, this same us-and-them effect is at work in discussions of collapse and NTE, the players divided unevenly between those who just don’t get it (for a variety of reasons) and those who believe all indications are beyond controversy, meaning, completely obvious: we’re on a hopelessly downward trajectory. Of course, this division omits the bulk of the population for whom the issue isn’t even broached, and even for those who acknowledge the issue, there are a surprising number of positions on the continuum, such as those who get it but haven’t extrapolated far enough, those who get it but lie or deny out of one motivation or another (e.g., self-enrichment or political gain, albeit short-term), and those who don’t get it, yet are exceedingly well-versed in the evidence (so that it can be argued and spun).

All these dividing lines, rather than being a celebration of diversity, make us a fractured society along multiple faults. Perhaps it’s just my perception, but there seems to be a widening gap between those who openly admit our future must lead ineluctably to doom and techno-utopians for whom future horizons loom bright. I’ve suggested elsewhere that newcomers to the issue of collapse have a lot of catching up to do, but that naïvely assumed a common, shared understanding of our reality upon which to base incontrovertible conclusions. Let me suggest something a bit more radical: the utter failure of the masses to grasp the immensity of the collapse story already unfolding around us while a few intrepid folks call bullshit on the substitute story offered by clever politicians, pundits, and marketeers — rhetoricians all — is equivalent to the divide between a poor, illiterate, itinerant farmer (or hunter or trapper) ca. 1780 and the Founders, a tiny group of landed gentry who were exceptionally well-educated men — Renaissance men, if you will, all having deep understanding of political and Enlightenment philosophy of the day. It must have been nearly impossible for the Founders to communicate effectively with the governed.


Today, the situation is reversed: mouth-breathing populists are now governing and have seized upon the means to manipulate the masses through disinformation and misdirection. Further, popular leaders and opinion-makers refuse to hear and simply cannot understand what a wise few are telling them, namely, that the unsustainable practices of industrial civilization have reached fever pitch and will soon produce a hellscape of our own creation. Like a Revolutionary Era agriculturist or outdoorsman, today’s populists (and the large portion of the population they reflect — who elect them, in fact) may possess narrow expertise at their individual endeavors. Yet ironically, they remain over-specialized and cut off from broad intellectual traditions and are thus functionally illiterate. Similarly, the masses to whom they proselytize have at best limited command of reading and almost no critical thinking skills whatsoever. (We never even approached universal literacy, which is a gateway to erudition.) A liberal arts education is to them hollow and meaningless, they are fundamentally immune to what science instructs, and their heads are full of entertainments (e.g., superhero geekery and professional sports) and other distractions that block real knowledge and understanding gained through careful, sustained consideration of an array of sources and perspectives. Contrast them with folks who read voluminously, study trends and scientific reports, and draw conclusions from a wealth of evidence: the two groups might as well be speaking Mandarin and English for all the communication passing between them.


My sense of the term populist should not be mistaken for leaders who embody the will of the people. That’s obviously not happening. The most basic function of government is to formulate policy and allocate funds to execute those policies. The graphic below shows top policy priorities over the past five years:

Well down the list is dealing with global warming (and I’m guessing the related complex of problems). Protecting the environment fares about 10–20 points better, as though it were a separate issue. What is most important to the public, however, are those things at which our leaders are failing the worst: the economy and jobs; terrorism; and education. Every administration and Congress initially pays ample lip service to priorities with wide public support but then diverts to a different agenda. This paragraph by Joel Hirschhorn captures the sort of populism now practiced.

With the Bush-corrosion of our Constitution and collapse of the economic system after it had been exploited by the rich and corrupt, what better time for revolution? Instead, we got a president with a glib tongue, a terrific smile and a deep commitment to the two-party plutocracy and corporate state. Obama is no populist, not even close. Nor is he a genuine reformer. At best, he is a master exploiter of populism.

It’s noteworthy that Hirschhorn saw through the B.S. five years ago.

A similar disconnect between public mandate and leadership is described in this Truthout article from 2011:

According to the latest poll conducted by CBS “60 Minutes” and the magazine Vanity Fair, 61 percent of Americans want to raise taxes on the wealthy as the primary way to cut the budget. The same poll finds that the second most popular first choice for cutting the nation’s budget deficit, at 20 percent, is cutting the military budget. That is, 81 percent of us — four out of five — would cut the deficit by taxing the rich and/or slashing military spending. Only four percent of those polled favored cutting Medicare … and only three percent favored cutting Social Security … A second poll, this time by CNN, reports that 63 percent of Americans oppose the US War in Afghanistan and want it ended. Only 35 percent say they support the war (now in its ninth year).

With such a disconnect stalling meaningful discussion before it begins, no wonder that controlling rhetoric is defined instead by funding (profit), celebrity (guru glorification, including green-washing types), and false solutionism. They are precisely the wrong kinds of issues, of course. The right kinds might involve the realization that …

  1. in an interconnected world, we all succeed and fail together in this life (there is no us and them anymore),
  2. the time has long passed for solutions and (an attempt at) mitigation is the next step, and
  3. moral choices about how we act in the time remaining to us are of paramount importance once deteriorating conditions lead to widespread chaos.

Instead we get slick salesmanship to keep the economy humming (funneling capital to the top) and the masses calmed or blissed-out on gadgetry. We get not-so-behind-the-scenes preparations to cull and quarantine the population when the going gets rough. And we succumb to infighting among those who can’t achieve consensus about what’s to be done. Us and them to the bitter end.

Morris Berman came forward with an interesting bit about the New Monastic Individual (NMI) first described in his book The Twilight of American Culture. He wrote two additional books to complete his second trilogy: Dark Ages America (also the title of his blog) and Why America Failed, taking from the latter the initials WAF to denote followers, commentators, acolytes, and habitués of his blog using the term Wafer. I hesitate to quote too liberally, since Prof. Berman sometimes puts up copyright notices at the ends of his blog posts; I’ll redact this at the slightest whiff of an infringement challenge:

  1. Wafers recognize that 99% of those around them, if they are living in the United States, are basically stupid and nasty. This is not said so much as a judgment as a description: it’s simply the way things are, and these things are not going to change any time soon. Wafers know this, and they accept it.
  2. The lives of Wafers are driven by knowledge, not fear or fantasy. They are living in reality, in short, not drowning in the mass illusions of contemporary America.
  3. Wafers are serious about their lives. They are not here on this earth to waste time, to piss their lives away on other people’s agendas, as are most Americans — right up to and including the president. Their goals are truth, love, and joy, and they are dedicated to pursuing them.
  4. Finally, Wafers feel sorry for non-Wafers, and if they can, try to help them. They recognize, of course (see #1), that most cannot be helped; but if they come across someone who shows signs of potential Waferdom, of awakening to the three points mentioned above, they try to fish them out of the drink, so to speak, and set them on the path of dignity, intelligence, integrity, and self-respect. Noblesse oblige, that sort of thing.

Numbers 1–3 are well and good. I’ve been a subscriber since Twilight was published. Evidence for the negative assessments is obvious and easy to obtain. Carving out a special place for a few Wafers to congratulate themselves (no. 4), however, strikes me as pissy and ungracious. But this isn’t precisely what I want to blog about. Rather, it’s how a former intellectual model of mine has fallen into disgrace, not that he would recognize or admit it. (This is IMO worse than the irrelevance complained about at his blog.) Prof. Berman was among the first to awaken in me a real curiosity in deeper stories behind cheap façades offered by most historical accounts, which form a dissatisfying consensus reality. I don’t possess the academic wherewithal to emulate him, but I’m a critical reader and can synthesize a lot of information.


Intellectual history is sometimes studied through themes and symbols found in novels with the writers of those novels being manifest about their intent. This is the first of two blog posts exploring truth-telling in fictional narrative. This is also cross-posted at The Collapse of Industrial Civilization.

One of the many recurring themes and ideas that appear on this blog is that the essential form taken by consciousness is story or narrative. Story enables us to orient ourselves in the world and make it somewhat intelligible. It should not be overlooked that it is we who tell ourselves stories, narrating life as we go via the inner voice no less than attending to the great stories that inform culture. The Bible is one such story (or collection of stories), though its message is interpreted with a scandalously high degree of controversy. (I’m especially intrigued by Paula Hay’s thesis over at Mythodrome that the story of The Fall is really about the loss of animism, not a literal expulsion from the Garden of Eden. The Tao te Ching and the Qur’an are similar, one might even say, competing stories from other world cultures.) Story has taken on many forms throughout history, beginning with oral tradition. Setting epics in song and/or verse made them memorable, since fixed written forms came rather late in history (conceived in terms of tens of thousands of years). The appearance of books eroded oral tradition gradually, and the transition of the book into an everyday object after the invention of the printing press eventually helped undermine the authority of the Medieval Church, which housed libraries and trained clerics in the philosophical, ecclesiastical, and scientific (as it was then understood) interpretation of texts. Story continued its development in the Romantic novel and serial fiction, which attracted a mass audience. Today, however, with literacy in decline, cinema and television are the dominant forms of story.

Many categories, types, and genres of story have evolved in fiction. Considering that story arcs typically progress from calm to conflict to resolution, the nature of conflict and the roles we are asked to assume through identification with characters (often archetypal) are a subtly effective vehicle for learning and mind control. Those whose minds have been most deeply and successfully infiltrated are often the same who argue vociferously in defense of a given story, no matter the evidence, with arguments playing out in political spheres and mass media alike. In addition to lighter fare such as RomComs and coming-of-age stories, both of which define not-yet-fully-formed characters through their solidifying relationships, we get hero/antihero/superhero, war, and dystopian tales, where characters tend to be chiseled in place, mostly unchanging as action and events around them take center stage. It is significant that in such tales of conflict, antagonists typically appear from outside: political opponents, foreigners and terrorists, aliens (from space), and faceless, nameless threats such as infectious disease that one might poetically regard as destiny or fate. They threaten to invade, transform, and destroy existing society, which must be defended at all cost even though, ironically, no one believes on a moment’s contemplation it’s really worth saving. Exceptionally, the antagonist is one of us, but an aberrant, outlying example of us, such as a domestic terrorist or serial killer. And while plenty of jokes and memes float around in public that we are often our own worst enemies, becoming the monsters we aim to defeat, stories that identify our full, true threat to ourselves and the rest of creation precisely because of who we are and how we now live are relatively few.

In light of the story of industrial collapse, probably the biggest, baddest story of all time but which is only told and understood in fleeting glimpses, it occurred to me that at least two shows found in cinema and TV have gotten their basic stories mostly correct: The Matrix (predominantly the first film) and The Terminator (the TV show to a greater degree than the movie franchise). In both, a very few possess the truth: knowledge of our enslavement (actual or prospective) to machines of our own invention. Characters in the matrix may feel a sense of unease, of the projected reality being somehow off, but only a few take the notorious red pill and face reality in all its abject despair while most prefer the blue pill (or more accurately, no pill) and the blissful ignorance of illusion. Traveling back and forth between realities (one known to be quite false), the ultrachic glamor and superhero antics of the false reality are far, far more appealing than the dull, cold, grey reality without makeup, costumes, and enhanced fighting skills. Everyone behaves in the false reality with cool, almost emotionless confidence, whereas in the other reality everyone is strained to the breaking point by continuous stress at the threat of annihilation. In Terminator world, time travel enables a few to come back from the future, in the process spilling the beans about what happens after the Singularity, namely, that machines go on a rampage to kill humanity. The dominant emotion of the few initiates is again stress, which manifests as bunker mentality and constant battle readiness. Casualties are not limited to frayed nerves and strained civility, though; plenty of innocent bystanders die alongside those fighting to survive or forestall the future.

Those are only stories, reflections of our preoccupations and diversions from the truth available to witness without needing a red pill. But reality is nonetheless a bitter pill to swallow, so few of those who become aware of the choice, to square up to it or to ignore it, really want the truth. I judge that most are still blissfully unaware an option exists, though evidence and supporting stories are everywhere to be found. For those of us unable to pretend otherwise or unknow what we now know, the appearance of stress, paranoia, self-abnegation, infighting, gallows humor, and nihilism runs parallel to character traits in the Matrix and Terminator worlds. Through story, reconfigured as entertainment, we may indeed be working through some of our psychological issues. And we experience some of the same coming together and tearing apart that inevitably accompany the great events of history. But unlike the childish teaser in this CBS News story that the apocalypse has a date, the machinations of history, like death and extinction, are not strictly events but processes. The process we initiated unwittingly but then ignored is beginning its final crescendo. Stories we tell ourselves conventionally end with triumphal resolution, flatly ignoring the destruction left in their wake. I warn: do not look for triumph in the story of industrial collapse except in those tiny, anonymous moments of grace where suffering ends.

I’m way overdue with my next blog post but wanted to put up something quick. As usual, I’ve got more than a couple ideas percolating but no time to research and write them. Since the inception of this blog, I’ve gotten away from my self-imposed limit of 3-4 paragraphs and have grown prolix. Makes each post into an obstacle to be overcome. Perhaps I can rein myself in and get more done.

Some while back, my most-viewed post was Living Among Refuse. That has since been surpassed by Scheler’s Hierarchy, which now has over 4,500 hits. Curiously, most are from the Philippines. No one comments, though, so as with most of my blog posts, there is scant to nonexistent discussion and dialogue. Active commentary is found at other blogs I frequent.

In the middle of last year, I began blogging at The Collapse of Industrial Civilization (see blogroll). The traffic and commentary there are quite robust, and frankly, I can’t keep up. I gave up reading, commenting, and guest-blogging at Nature Bats Last (see blogroll) for the same reason. Both blogs chronicle the ongoing collapse with no shortage of news articles about the demise of this or that species or ecosystem. I require no further convincing that we’re digging the pit of our own despair. (I stole that phrase, as I warned I would.) Awareness and individual response are both picking up intensity, with some flailing for solutions, others mining the emotional depths for profit or self-aggrandizement, and others so plainly gobsmacked trying to get their heads around it that hardly anything makes sense or indeed matters anymore. Corporate and government response is all theater as far as I can tell. I’ve got lots of content about all this in my blog backlog (backblog?) and at Collapse.

My two book blogging projects are still underway: a final entry on The Master and His Emissary by Iain McGilchrist has yet to be written (it’s been waiting for months already), and I’m making further progress through The Decline of the West by Oswald Spengler. As time allows, I will develop new posts on each.

Risk and Reward

Posted: January 18, 2014 in Culture, Idle Nonsense

A couple of my coworkers were griping recently about Chicago weather, even before recent extreme cold brought by the polar vortex (which is expected to return). Complaints focused on the cold and wind, but driving in the snow was singled out for special attention. I prepare for weather, but it wouldn’t occur to me to complain too much about it. Their reluctance to go out and brave the elements brought to mind how in my youth we never thought twice about it; we were always outside, even on subzero days. The more snow, the better, and we would spend snow days off from school greedily and deliriously happy out on the hills from 9 a.m. until dark, not even stopping for food. I remember coming in one time soaking from sweat and snowmelt, the inside wrists of my sleeves completely caked with ice buildup I hadn’t even noticed. That boyhood sledding was on hills formed from worn-down bluffs carved long ago by the Missouri River in the northern suburbs of Kansas City. Chicago has no such hills, so winter sports here are of a fundamentally different sort, as I suppose they are in the Rockies and other mountainous climes.

Recently, I caught sight of an ad for the Hammerhead Pro X sled, offered for around $150:
It’s a far cry from the basic wood-and-steel sled of my youth, though those traditional types are still available. If the Hammerhead, made from high-tech materials, is built for speed and control (and adults), sleds of my youth were definitely more inclined toward reckless abandon. Indeed, it always seemed to me that the reward of all that climbing, cold, and effort was being enough out of control to have some fear but enough in control to master both sled and fear. We took our sleds over and off every jump and ramp and drop we could find, sometimes ending with bloody noses when the landing was too hard. I also saw my share of fools who risked a lot more than I ever did on toboggans and cafeteria trays — something they often regretted in hindsight. My one such learning experience was being goaded into riding off a ramp on top of a piece of sheet metal or metal siding someone had brought, which launched me somewhat higher into the air than I expected and with complete loss of control since there was absolutely nothing to hold onto. Luckily, no injury resulted, just a hard landing on my butt. A favorite memory is once executing a 360° powerslide on a patch of refrozen snow crust and then sledding straight out of it. I tried to repeat it several times that same day without success; it was a once-in-a-lifetime move.

The reward of extreme sports is something now left to far younger people than me. I blame Mountain Dew and the X Games for raising the risk bar higher and higher, with predictable results. I participate in endurance sports, but obstacle races, mountain biking (including downhill on ski slopes), skiing and snowboarding, and a variety of other risky endeavors are now things of the past for me. The rewards are no longer worth the risk.

I have watched, blogged about, and embedded my share of TED Talks over the years I’ve been active as a blogger. As a cottage industry for the cognoscenti, TED is an impressive, multinational undertaking fed by scores of accomplished scientists, researchers, and know-it-alls. What’s not to like? One spends about 12 minutes watching and listening to someone simplify complex issues, recommend a handful of impossible solutions, and make promises to roll up one’s sleeves and do the hard work, but then one forgets about it all the next day. Or one might watch another talk, or a series of talks, to get the brain goosed before moving on. The most succinct criticism I have heard of the phenomenon called it insight porn. So after my initial excitement with TED and its voluminous insights, I grew more and more skeptical of various speakers’ claims. This was especially true after I learned that Allan Savory admitted to having been instrumental in the wholesale slaughter of 40,000 African elephants, a massacre undertaken with the full authority of the scientific consensus of the day, which Savory is now attempting to reverse (oops, his bad).

In light of my misgivings about TED, I found it curious that a TED Talk by Benjamin Bratton has attracted lots of positive attention. His subject is TED Talks themselves, and he makes his case in a TED Talk, which is reproduced as a transcript in an article by The Guardian. There is much to admire in Bratton’s analysis but more perhaps to find objectionable. He occupies a difficult position, both behind the curtain and before it, which creates recursive conflicts. (I’m rather fond myself of busting through rhetorical frames, but I try not to stand inside them at the same time.) Further, I read the article before watching the talk in video form and found quite a difference between the two media. Freed of visual distractions and audience responses to the jokes, the transcript allowed me to penetrate much deeper. I suspect that a large part of that is not being entrained by sympathetic response.

Bratton offers several worthwhile insights, among them the cult of the solution (my term, not his), which remains hypothetical since implementation rarely goes forward. This is true especially when it comes to reconceptualizing fondly held myths and self-delusions about what science and sociology can really do for our understandings about the ways the world really works. That’s one of Bratton’s related criticisms: the nonworkability of TED Talk solutions. It’s ironic that Bratton observes so many TED Talks founder on practicality when at the same time his initial anecdote, about a scientific presentation failing to motivate its listener (the implied advice: be more like Malcolm Gladwell, engage the emotions, transform epiphany into entertainment), comes in for serious derision. There’s room for both interpretations, but some circumspection is needed here. Nothing could be more obvious than how various political and corporate entities have succeeded in motivating the public into action or inaction through propaganda, marketing campaigns, and emotional manipulation, whereas the sober, objective posture of scientific inquiry has failed utterly to get action on an extremely pressing civilizational disaster quite different from the one Bratton envisions. (Regular readers of this blog know what I’m referring to.) Bratton actually takes note of GOP dominance of messaging (what he calls bracketing reality) but fails to connect the dots.

The T-E-D in TED

Bratton also objects that the technology-education-design orientation of TED ought to be instead tech-econ-design. This reconfiguration got past my internal bullshit sensor on video, but it fares less well in transcript. For instance, he notes that TED often delivers placebo technoradicalism (no lack of clever word formations) but concludes that it fails precisely because we’re too timid to actually embrace technology with enough gusto. He doesn’t seem to get that technological advance is at best mixed and at worst disastrous (e.g., hypercomplexity, WMDs, nuclear mismanagement and accidents, overconsumption of resources leading to population overshoot leading back to overconsumption). Bratton apparently still worships the tech idol, failing to recognize that the world it has delivered serves us rather poorly. Further, neither education nor economics has turned out to be much of a panacea for our most intractable problems. I’m unsure what Bratton thinks exchanging one for the other in the context of TED will accomplish. Last, redefining design as immunization rather than innovation sounds like something worthwhile but is the same hypothetical, unrealizable sort of solution he criticizes earlier. The notion that we can engineer our way out of problems if only designers wise up and others stop gaming systems for profit or self-aggrandizement raises quite a lot of questions that go unaddressed.

In spite of his disavowal of a simple takeaway, Bratton offers standard PoMo word salad filled with specialized jargon:

… it’s not as though there is a shortage of topics for serious discussion. We need a deeper conversation about the difference between digital cosmopolitanism and cloud feudalism … I would like new maps of the world, ones not based on settler colonialism, legacy genomes and bronze age myths, but instead on something more … scalable … we need to raise the level of general understanding to the level of complexity of the systems in which we are embedded and which are embedded in us. This is not about “personal stories of inspiration”, it’s about the difficult and uncertain work of demystification and reconceptualisation: the hard stuff that really changes how we think …

Really changes how we think? For someone who writes about “entanglements of technology and culture,” Bratton harbors a serious misconception: that the brainchild of the Enlightenment can ultimately win the day and deliver salvation from ourselves. Myth and story and narrative are not mistakes to be replaced by rationalism; they are who and how we are in the world. They don’t go away by wagging the bony finger of science/tech but are only slowly forgotten, revised, and replaced by yet other myths, stories, and narratives. The depth of our entrenchment in such cultural baggage is one of the very things forestalling change. We can peer behind the curtain or see from outside the bubble sometimes, but we can’t escape them. We never could.