Archive for the ‘Industrial Collapse’ Category

From the not-really-surprising-news category comes a New Scientist report earlier this month that the entire world was irradiated by follow-on effects of the Fukushima disaster. Perhaps it’s exactly as the article states: the equivalent of one X-ray. I can’t know with certainty, nor can bupkis be done about it by the typical Earth inhabitant (or the atypical inhabitant, I might add). Also earlier this month, a tunnel collapse at the Dept. of Energy’s Hanford nuclear waste storage site in Washington State gave everyone a start regarding a possible nearby release of radiation. As with Fukushima, I judge there is little by way of trust regarding accurate news or disclosure and fuck all anyone can do about any of it.

I’m far too convinced of collapse by now to worry too much about these Tinkerbells, knowing full well that what’s to come will be worse by many orders of magnitude when the firecrackers start popping due to inaction and inevitability. Could be years or decades away still; but as with other aspects of collapse, who knows precisely when? Risky energy plant operations and nuclear waste disposal issues promise to be with us for a very long time indeed. Makes it astonishing to think that we plunged full-steam ahead without realistic (i.e., politically acceptable) plans to contain the problems before creating them. Further, nuclear power is still not economically viable without substantial government subsidy. The likelihood of abandoning this technological boondoggle seems pretty remote, though perhaps not as remote as the likelihood of ever paying the enormous expense of decommissioning all the sites currently operating.

These newsbits and events also reminded me of the despair I felt in 1986 on the heels of the Chernobyl disaster. Maybe in hindsight it’s not such a horrible thing to cede entire districts to nature for a period of several hundred years as what some have called exclusion or sacrifice zones. Absent human presence, such regions demonstrate remarkable resilience and profundity in a relatively short time. Still, it boggles the mind, doesn’t it, to think of two exclusion zones now, Chernobyl and Fukushima, where no one should go until, at the very least, several radioactive half-lives have elapsed? Interestingly, that light at the end of the tunnel, so to speak, seems to be telescoping even farther away from the date of the disaster, a somewhat predictable shifting of the goalposts. I’d conjecture that’s because contamination has not yet ceased and is actually ongoing, but again, what do I know?
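
For a rough sense of the timescales, here’s a minimal back-of-the-envelope sketch (my own illustration, not anything from the reports), assuming cesium-137, with a half-life of roughly 30 years, as the contaminant of concern:

    # Exponential decay: fraction of radioactivity remaining after t years
    from math import log2

    CS137_HALF_LIFE = 30.0  # years, approximately

    def fraction_remaining(t_years, half_life=CS137_HALF_LIFE):
        return 0.5 ** (t_years / half_life)

    print(fraction_remaining(300))               # ~0.001, i.e., a tenth of a percent left after 300 years
    print(round(CS137_HALF_LIFE * log2(100)))    # ~199 years until activity falls to 1% of the original

Which lines up, roughly, with the several-hundred-year figure for the exclusion zones, though other isotopes decay on very different clocks.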

On a lighter note, all this also put me in mind of the hardiness of various foodstuffs. God knows we consume loads of crap that can hardly be called food anymore, from shelf-stable fruit juices and bakery items (e.g., Twinkies) that never go bad to not-cheese used by Taco Bell and nearly every burger joint in existence to McDonald’s burgers and fries that refuse to spoil even when left out for months to test that very thing. It gives me considerable pause to consider that foodstuff half-lives have been radically and unnaturally extended by creating abominable Frankenfoods that beggar the imagination. For example, strawberries and tomatoes used to be known to spoil rather quickly and thus couldn’t withstand long supply lines from farm to table; nor were they available year round. Rather sensibly, people grew their own when they could. Today’s fruits and veggies still spoil, but interventions undertaken to extend their stability have frequently come at the expense of taste and nutrition. Organic and heirloom markets have sprung up to fill those niches, which suggests the true cost of growing and distributing everyday foods that will not survive a nuclear holocaust.

I pull in my share of information about current events and geopolitics despite a practiced inattention to mainstream media and its noisome nonsense. (See here for another who turned off the MSM.) I read or heard somewhere (can’t remember where) that most news outlets and indeed most other media, to drive traffic, now function as outrage engines, generating no small amount of righteousness, indignation, anger, and frustration at all the things so egregiously wrong in our neighborhoods, communities, regions, and across the world. These are all negative emotions, though legitimate responses to various scourges plaguing us currently, many of which are self-inflicted. It’s enough aggregate awfulness to draw people into the street again in principled protest, dissent, and resistance; it’s not yet enough to effect change. Alan Jacobs comments about outrage engines, noting that sharing via retweets is not the same as caring. In the Age of Irony, a decontextualized “yo, check this out!” is nearly as likely to be interpreted as support as condemnation (or mere gawking for entertainment value). Moreover, pointing, linking, and retweeting are each costless versions of virtue signaling. True virtue makes no object of publicity.

So where do I get my outrage quotient satisfied? Here is a modest linkfest, in no particular order, of sites not already on my blogroll. I don’t habituate these sites daily, but I drop in, often skimming, enough to keep abreast of themes and events of importance. (more…)

Even before I begin, you must know what the title means. It’s the proliferation of options that induces dread in the toothpaste aisle of the store. Paste or gel? Tartar control or extra whitening? Plain, mint, cinnamon, or bubble gum? The matrix of combinations is enough to reduce the typical shopper to a quivering state of high anxiety lest the wrong toothpaste be bought. Oh, how I long for the days when choices ran solely between plain Crest and Colgate. I can’t say whether the toothpaste effect originated with oral hygiene. A similarly bewildering host of choices confronts shoppers in the soft drink aisle. Foodstuffs seem especially prone to brand fragmentation. Woe betide the retailer forced to shelve all 38 Heinz products on this page. (True, some are just different packaging of the same basic item, but still.)

Purveyors of alcoholic beverages are on the bandwagon, too. I rather like the bygone cliché of the cowboy/gunslinger riding off the range, swinging into the saloon, and ordering simply “whisky.” Nowadays, even a poorly stocked bar is certain to have a dozen or so whiskies (see this big brand list, which doesn’t include sub-brands or craft distillers). Then come all the varieties of schnapps, rum, and vodka, each brand further fragmented with infusions and flavorings of every imaginable type. Some truly weird ones are found here. Who knew that these spirits were simply blank canvases awaiting the master distiller’s crazy inventiveness?

/rant on

What really gets my bile flowing on this issue, however, is the venerable Lays potato chip. Seriously, Frito-Lay, what are you thinking? You arguably perfected the potato chip, much like McDonald’s perfected the French fry. (Both are fried potato, interestingly.) Further, you have a timeless, unbeatable slogan: “betcha can’t eat just one.” The plain, salted chip, the “Classic” of the Lays brand, cannot be improved upon and is a staple comfort food. Yet you have succumbed to the toothpaste effect and gone haywire with flavorings (I won’t even countenance the Wavy, Poppables, Kettle-Cooked, Ruffles, and STAX varieties). For variety’s sake, I’d be content with a barbecue chip, maybe even salt & vinegar, but you’ve piled on past the point of ridiculousness:

  • cheddar & sour cream (a favorite of mine)
  • Chile limón
  • deli style
  • dill pickle
  • flamin’ hot
  • honey barbecue
  • limón
  • pico de gallo
  • salt & vinegar (not to my taste)
  • sour cream & onion (a good alternative)
  • sweet Southern heat barbecue
  • Southern biscuits & gravy
  • Tapatío (salsa picante)

(more…)

First, a few reminders:

  • The United States has been in an undeclared state of war for 15 years, the longest in U.S. history and long enough that young people today can say legitimately, “we’ve always been at war with Oceania.” The wars encompass the entirety of both terms of the Obama Administration.
  • The inciting events were attacks on U.S. soil carried out on September 11, 2001 (popularly, 9/11), which remain shrouded in controversy and conspiracy despite the official narrative assigning patsy blame to al-Qaida operating out of Afghanistan.
  • On the heels of the attacks, the Bush Administration commenced a propaganda campaign to sell invasion and regime change in Afghanistan and Iraq and, over widespread public protest, went ahead and launched preemptive wars, ostensibly because an existential threat existed with respect to weapons of mass destruction (WMDs) possessed by Iraq in particular.
  • The propaganda campaign has since been revealed to have been cooked up and untrue, yet it buffaloed a lot of people into believing (even to this day) that Iraq was somehow responsible for 9/11.
  • Our preemptive wars succeeded quickly in toppling governments and capturing (and executing) their leaders but immediately got bogged down securing a peace that never came.
  • Even with an embarrassing mismatch of force, periodic troop surges and drawdowns, trillions of dollars wasted prosecuting the wars, and incredible, pointless loss of life (especially on the opposing sides), our objective in the Middle East (other than the oil, stupid!) has never been clear. The prospect of final withdrawal is nowhere on the horizon.

Continuous war — declared or merely waged — has been true of the U.S. my whole life, though one would be hard pressed to argue that it truly represents an immediate threat to U.S. citizens except to those unlucky enough to be deployed in war zones. Still, the monkey-on-the-back is passed from administration to administration. One might hope, based on campaign rhetoric, that the new executive (45) might recognize continuous war as the hot potato it is and dispense with it, but the proposed federal budget, with its $52 billion increase in military spending (+10% over 2016), suggests otherwise. Meanwhile, attention has been turned away from true existential threats that have been bandied about in the public sphere for at least a decade: global warming and climate change leading to Near-Term Extinction (NTE). Proximal threats, largely imagined, have absorbed all our available attention, and depending on whom one polls, our worst fears have already been realized.

The 20th and 21st centuries (so far) have been a series of “hot” wars (as distinguished from the so-called Cold War). Indeed, there has scarcely been a time when the U.S. has not been actively engaged fighting phantoms. If the Cold War was a bloodless, ideological war to stem the nonexistent spread of communism, we have adopted and coopted the language of wartime to launch various rhetorical wars. First was LBJ’s War on Poverty, the only “war” aimed at truly helping people. Nixon got into the act with his War on Drugs, which was punitive. Reagan expanded the War on Drugs, which became the War on Crime. Clinton increased the punitive character of the War on Crime by instituting mandatory minimum sentencing, which had the side effect of establishing what some call the prison-industrial complex, inflating the incarceration rate of Americans to the point that the U.S. is now ranked second in the world behind the Seychelles (!), a ranking far, far higher than any other industrialized nation.

As if U.S. authoritarians hadn’t found enough people to punish or done enough to convince the public that threats exist on all sides, requiring constant vigilance and a massive security apparatus including military, civil police, and intelligence services comprising 16 separate agencies (that we know of), Bush coined and declared the War on Terror, aimed at punishing those foreign and domestic who dare challenge U.S. hegemony in all things. It’s not called a national security state for nuthin’, folks. I aver that the rhetorical War on Poverty has inverted and now become a War on the Poverty-Stricken. De facto debtors’ prisons have reappeared, predatory lending has become commonplace, and income inequality grows more exaggerated with every passing year, leaving behind large segments of the U.S. population as income and wealth pool in an ever-shrinking number of hands. Admittedly, the trend is global.

At some point, perhaps in the 1960s when The Establishment (or more simply, The Man) became a thing to oppose, the actual Establishment must have decided it was high time to circle the wagons and protect its privileges, essentially going to war with (against, really) the people. Now five decades on, holders of wealth and power demonstrate disdain for those outside their tiny circle, and our government can no longer be said with a straight face to be of, by, and for the people (paraphrasing the last line of Lincoln’s Gettysburg Address). Rather, the government has been hijacked and turned into something abominable. Yet the people are strangely complicit, having allowed history to creep along with social justice in marked retreat. True threats do indeed exist, though not the ones that receive the lion’s share of attention. I surmise that, as with geopolitics, the U.S. government has brought into being an enemy and a conflict that bode not well for its legitimacy. Which collapse occurs first is anyone’s guess.

As I read into Fingerprints of the Gods by Graham Hancock and learn more about antiquity, it becomes clear that weather conditions on Earth were far more hostile then (say, 15,000 years ago) than now. Looking much farther back, across millions of years, scientists have plotted global average temperature and atmospheric carbon, mostly using ice cores and other geological proxies as I understand it, yielding this graph:

[Figure: global average temperature and atmospheric CO2 levels over geologic time]

I’ve seen this graph before, which is often used by climate change deniers to show a lack of correlation between carbon and temperature. That’s not what concerns me. Instead, the amazing thing is how temperature careens up and down quickly (in geological time) between two limits, 12°C and 22°C, and forms steady states known as Ice Age Earth and Hot House Earth. According to the graph, we’re close to the lower limit. It’s worth noting that because of the extremely long timescale, the graph is considerably smoothed.
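
As an aside on that last point, heavy smoothing can hide even a violent short-term excursion entirely. Here’s a toy illustration of my own (not derived from the actual ice-core or proxy data):

    # A flat 10,000-"year" temperature record with one sharp 150-year spike
    def moving_average(series, window):
        return [sum(series[i:i + window]) / window
                for i in range(len(series) - window + 1)]

    record = [15.0] * 10000
    for year in range(5000, 5150):            # brief +5 degree excursion
        record[year] = 20.0

    smoothed = moving_average(record, 2000)   # 2,000-year averaging window
    print(round(max(smoothed) - 15.0, 3))     # 0.375 -- the spike all but vanishes

So a graph smoothed over such long timescales says little about what temperature could do within a century or two.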

(more…)

As a boy, my home included a coffee table book, title unknown, likely published circa 1960, about the origins of human life on Earth. (A more recent book of this type attracting lots of attention is Sapiens: A Brief History of Humankind (2015) by Yuval Harari, which I haven’t yet read.) It was heavily enough illustrated that my siblings and I consulted it mostly for the pictures, which can probably be excused since we were youngsters at the time. What became of the book escapes me. In the intervening decades, I made no particular study of the ancient world — ancient meaning beyond the reach of human memory systems. Thus, ancient could potentially refer to anthropological history in the tens of thousands of years, evolutionary history stretching across tens of millions of years, geological history over hundreds of millions of years, or cosmological time going back a few billion years. For the purpose of this blog post, let’s limit ancient to no more than fifty thousand years ago.

A few months ago, updates (over the 1960 publication) to the story of human history and civilization finally reached me (can’t account for the delay of several decades) via webcasts published on YouTube featuring Joe Rogan, Randall Carlson, and Graham Hancock. Rogan hosts the conversations; Carlson and Hancock are independent researchers whose investigations converge on evidence of major catastrophes that struck the ancient world during the Younger Dryas Period, erasing most but not all evidence of an antediluvian civilization. Whereas I’m a doomer, they are catastrophists. To call this subject matter fascinating is a considerable understatement. And yet, it’s neither here nor there with respect to how we conduct our day-to-day lives. Their revised history connects to religious origin stories, but such narratives have been relegated to myth and allegory for a long time already, making them more symbolic than historical.

In the tradition of Galileo, Copernicus, Newton, and Darwin, all of whom went against scientific orthodoxy of their times but were ultimately vindicated, Carlson and Hancock appear to be rogue scientists/investigators exploring deep history and struggling against the conventional story of the beginnings of civilization around 6,000 years ago in the Middle East and Egypt. John Anthony West is another who disputes the accepted narratives and timelines. West is also openly critical of “quackademics” who refuse to consider accumulating evidence but instead collude to protect their cherished ideological and professional positions. The vast body of evidence being pieced together is impressive, and I truly appreciate their collective efforts. I’m about 50 pp. into Hancock’s Fingerprints of the Gods (1995), which contains copious detail not well suited to the conversational style of a webcast. His follow-up Magicians of the Gods (2015) will have to wait. Carlson’s scholarly work is published at the website Sacred Geometry International (and elsewhere, I presume).

So I have to admit that my blog, launched in 2006 as a culture blog, turned partially into a doomer blog as that narrative gained the weight of overwhelming evidence. What Carlson and Hancock in particular present is evidence of major catastrophes that struck the ancient world and are going to repeat: a different sort of doom, so to speak. Mine is ecological, financial, cultural, and finally civilizational collapse borne out of exhaustion, hubris, frailty, and most importantly, poor stewardship. Theirs is periodic cataclysmic disaster including volcanic eruptions and explosions, great floods (following ice ages, I believe), meteor strikes, earthquakes, tsunamis, and the like, each capable of ending civilization all at once. Indeed, those inevitable events are scattered throughout our geological history, though at unpredictable intervals often spaced tens or hundreds of thousands of years apart. For instance, the supervolcano under Yellowstone is known to blow roughly every 600,000 years, and we’re overdue. Further, the surface of the Moon indicates bombardment from meteors; the Earth’s history of the same is hidden somewhat by the continuous transformation of the landscape, a process lacking on the Moon. The number of near misses by near-Earth objects in the last few decades is rather disconcerting. Any of these disasters could strike at any time, or we could wait another 10,000 years.

Carlson and Hancock warn that we must recognize the dangers, drop our petty international squabbles, and unite as a species to prepare for the inevitable. To do otherwise would be to court disaster. However, far from dismissing the prospect of doom I’ve been blogging about, they merely add another category of things likely to kill us off. They give the impression that we should turn our attention away from sudden climate change, the Sixth Extinction, and other perils to which we have contributed heavily and worry instead about death from above (the skies) and below (the Earth’s crust). It’s impossible to say which is the most worrisome prospect. As a fatalist, I surmise that there is little we can do to forestall any of these eventualities. Our fate is already sealed in one respect or another. That foreknowledge makes life precious for me, and frankly, is another reason to put aside our petty squabbles.

I don’t have the patience or expertise to prepare and offer a detailed political analysis such as those I sometimes (not very often) read on other blogs. Besides, once the comments start filling up at those sites, every possible permutation is trotted out, muddying the initial or preferred interpretation with alternatives that make at least as much sense. They’re interesting brainstorming sessions, but I have to wonder what is accomplished.

My own back-of-the-envelope analysis is much simpler and probably no closer to (or farther from) being correct, what with everything being open to dispute. So the new POTUS was born in 1946, which puts the bulk of his boyhood in the 1950s, overlapping with the Eisenhower Administration. That period has lots of attributes, but the most significant (IMO), which would impact an adolescent, was the U.S. economy launching into the stratosphere, largely on the back of the manufacturing sector (e.g., automobiles, airplanes, TVs, etc.), and creating the American middle class. The interstate highway system also dates from that decade. Secondarily, there was a strong but misplaced sense of American moral leadership (one might also say authority or superiority), since we took (too much) credit for winning WWII.

However, it wasn’t great for everyone. Racism, misogyny, and other forms of bigotry were open and virulent. Still, if one was lucky enough to be a white, middle-class male, things were arguably about as good as they would get, which many remember rather fondly, either through rose-colored glasses or otherwise. POTUS as a boy wasn’t middle class, but the culture around him supported a worldview that he embodies even now. He’s also never been an industrialist, but he is a real estate developer (some would say slumlord) and media figure, and his models are taken from the 1950s.

The decade of my boyhood was the 1970s, which were the Nixon, Ford, and Carter Administrations. Everyone could sense the wheels were already coming off the bus, and white male entitlement was far diminished from previous decades. The Rust Belt was already a thing. Like children from the 1950s forward, however, I spent a lot of time in front of the TV. Much of it was goofy fun such as Gilligan’s Island, The Brady Bunch, and interestingly enough, Happy Days. It was innocent stuff. What are the chances that, as a boy plopped in front of the TV, POTUS would have seen the show below (excerpted) and taken special notice considering that the character shares his surname?

Snopes confirms that this is a real episode from the TV show Trackdown. Not nearly as innocent as the shows I watched. The coincidences that the character is a con man, promises to build a wall, and claims to be the only person who can save the town are eerie, to say the least. Could that TV show be lodged in the back of POTUS’ brain, along with so many other boyhood memories, misremembered and revised the way memory tends to do?

Some have said that the great economic expansion of the 1950s and 60s was an anomaly: a constellation of conditions that came together to produce a historical effect, a Golden Era by some reckonings, that cannot be repeated. We simply cannot return to an industrial or manufacturing economy that had once (arguably) made America great. And besides, the attempt would accelerate the collapse of the ecosystem, which is already in free fall. Yet that appears to be the intention of POTUS, whose early regression to childhood is a threat to us all.

Anthropologists, pundits, armchair cultural critics (like me), and others sometimes offer an aspect or characteristic, usually singular, that separates the human species from other animals. (Note: humans are animals, not the crowning creation of god in his own image, the dogma of major religions.) Typical singular aspects include tool use (very early on, fire), language, agriculture, self-awareness (consciousness), and intelligence, the last including especially the ability to conceptualize time and thus remember and plan ahead. The most interesting candidate suggested to me is our ability to kill from a distance. Without going into a list of things we don’t think we share with other species but surprisingly do, it interests me that no other species possesses the ability to kill at a distance (someone will undoubtedly prove me wrong on this).

Two phrases spring to mind: nature is red in tooth and claw (Tennyson) and human life is nasty, brutish, and short (Hobbes). Both encapsulate what it means to have to kill to eat, which is hardly unique to animals. All sorts of plants, insects, and microorganisms embed themselves in hosts, sometimes killing the host and themselves. Symbiotic relationships also exist. The instance that interests me, though, is the act of killing in the animal kingdom that requires putting one’s own body at risk in life-or-death attack. Examples falling short of killing abound, such as intimidation to establish hierarchy, but to eat, an animal must kill its prey.

Having watched my share of historical fiction (pre-1800, say, but especially sword-and-sandal and medieval epics) on the TeeVee and at the cinema, the dramatic appeal of warring armies slamming into each other never seems to get old. Fighting is hand-to-hand or sword-to-sword, which amount to the same thing. Archers’ arrows, projectiles launched from catapults and trebuchets, thrown knives, spears, and axes, and boiling oil poured over parapets are killing from a relatively short distance, but the action eventually ends up being very close. The warrior code in fighting cultures honors the willingness to put oneself in harm’s way, to risk one’s own body. Leaders often exhibit mutual respect and may even share some intimacy. War may not be directly about eating, since humans are not cannibals under most circumstances; rather, it’s usually about control of resources, so secondarily about eating by amassing power. Those historical dramas often depict victors celebrating by enjoying lavish feasts.

Modern examples of warfare and killing from a distance make raining down death from above a bureaucratic action undertaken with little or no personal risk. Artillery, carpet bombing from 20,000 feet, drone strikes (controlled from the comfort of some computer lab in the Utah desert), and nuclear bombs are the obvious examples. No honorable warrior code attaches to such killing. Indeed, the chain of command separates the execution of kill orders from moral responsibility — probably a necessary disconnect when large numbers of casualties (collateral damage, if one prefers the euphemism) can be expected. Only war criminals, either high on killing or banally impervious to empathy and compassion, would dispatch hundreds of thousands at a time.

If killing from a distance is in most cases about proximity or lack thereof, one further example is worth mentioning: killing across time. While most don’t really conceptualize the space-time continuum as interconnected, the prospect of choices made today manifesting in megadeath in the foreseeable future is precisely the sort of bureaucratized killing from a distance that should be recognized and forestalled. Yet despite our supposed intellectual superiority over other species, we cannot avoid waging war, real and rhetorical, to control resources and narratives that enable us to eat. Eating the future would be akin to consuming seed corn, but that metaphor is not apt. Better perhaps to say that we’re killing the host. We’re embedded in the world, as indeed is everything we know to be alive, and rely upon the profundity of the biosphere for survival. Although the frequent charge is that humanity is a parasite or has become a cancer on the world, that tired assessment, while more accurate than not, is a little on the nose. A more charitable view is that humanity, as the apex predator, has expanded its habitat to include the entire biosphere, killing to eat, and is slowly consuming and transforming it into a place uninhabitable by us, just as a yeast culture consumes its medium and grows to fill the space before dying all at once. So the irony or Pyrrhic victory is that while we may fatten ourselves (well, some of us) in the short term, we have also created conditions leading to our own doom. Compared to other species whose time on Earth lasted tens of millions of years, human life on Earth turns out to be exactly what Hobbes said: nasty, brutish, and short.
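
The yeast analogy can be made concrete with a toy model of my own devising (a sketch of the general shape of overshoot, not a forecast): a population growing exponentially on a finite, non-renewing resource booms and then dies off.

    # Toy "yeast in a vat" model: exponential growth on a finite food supply
    population = 1.0
    resource = 1000.0
    GROWTH = 0.10   # per-step growth rate while food remains
    NEED = 0.05     # resource each unit of population consumes per step

    history = []
    for step in range(200):
        demand = population * NEED
        if resource >= demand:
            resource -= demand
            population *= 1 + GROWTH   # well fed: keep growing
        else:
            resource = 0.0
            population *= 0.5          # medium exhausted: rapid die-off
        history.append(population)

    print(round(max(history)), round(history[-1], 6))  # big boom, then near-zero

The point of the toy is only that the boom and the crash are produced by the same behavior; nothing changes except that the medium runs out.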

Back in the day, I studied jazz improvisation. Like many endeavors, it takes dedication and continuous effort to develop the ear and learn to function effectively within the constraints of the genre. Most are familiar with the simplest form: the 12-bar blues. Whether more attuned to rhythm, harmony, lyrics, or structure doesn’t much matter; all elements work together to define the blues. As a novice improviser, structure is easy to grasp and lyrics don’t factor in (I’m an instrumentalist), but harmony and rhythm, simple though they may be to understand, are formidable when one is making up a solo on the spot. That’s improvisation. In class one day, after two passes through the chord changes, the instructor asked me how I thought I had done, and I blurted out that I was just trying to fill up the time. Other students heaved a huge sigh of recognition and relief: I had put my finger on our shared anxiety. None of us were skilled enough yet to be fluent or to actually have something to say — the latter especially the mark of a skilled improviser — but were merely trying to plug the hole when our turn came.

These days, weekends feel sorta the same way. On Friday night, the next two days often feel like a yawning chasm where I plan what I know from experience will be an improvisation, filling up the available time with shifting priorities, some combination of chores, duties, obligations, and entertainments (and unavoidable bodily functions such as eating, sleeping, etc.). Often enough I go back to work with stories to tell about enviable weekend exploits, but just as often I have a nagging feeling that I’m still a novice with nothing much to say or contribute, just filling up the time with noise. And as I contemplate what years and decades may be left to me (if the world doesn’t crack up first), the question arises: what big projects would I like to accomplish before I’m done? That, too, seems an act of improvisation.

I suspect recent retirees face these dilemmas with great urgency until they relax and decide “who cares?” What is left to do, really, before one finally checks out? If careers are completed, children are raised, and most of life’s goals are accomplished, what remains besides an indulgent second childhood of light hedonism? Or more pointedly, what about one’s final years keeps them from feeling like quiet desperation or simply waiting for the Grim Reaper? What last improvisations and flourishes are worth undertaking? I have no answers to these questions. They don’t press upon me just yet with any significance, and I have suffered no midlife crisis (so far) that would spur me to address the questions head on. But I can feel them gathering in the back of my mind like a shadow — especially with the specters of American-style fascism, financial and industrial collapse, and NTE looming.

Predictions are fool’s errands. Useful ones, anyway. The future branches in so many possible directions that truly reliable predictions are banal: the sun will rise in the east, death, taxes. (NTE is arguably another term for megadeath, but I gotta reinforce that prediction to keep my doomer bona fides.) Now, only a few days prior to the general election, I find myself anxious that the presidential race is still too close to call. More than a few pundits say that Donald Trump could actually win. At the same time, a Hillary Clinton win gives me no added comfort, really. Moreover, potential squabbles over the outcome threaten to turn the streets into riot zones. I had rather expected such disruptions during or after the two nominating conventions, but they settled on their presumptive nominees without drama.

Polls are often predictive, of course, and despite their acknowledged margins of error, they typically forecast results with enough confidence that many voters don’t bother to vote, safe in the assumption that predicted results (an obvious oxymoron) make moot the need to actually cast one’s vote. (The West Coast must experience this phenomenon more egregiously than the East Coast, except perhaps for California’s rather large population and voting power. Has Hawaii ever mattered?) For that reason alone, I’d like to see a blackout on polling in the weeks leading up to an election (2–3 ought to do), including election day. This would allow us to avoid repeating the experience of the Chicago Daily Tribune publishing the headline “Dewey Defeats Truman” back in 1948.

Analysis of voting patterns and results also dissuades voters from considering anything other than a strategic vote for someone able to actually win, as opposed to supporting worthy candidates polling far enough behind that they don’t stand a chance of winning, thus reinforcing a two-party system no one really likes because it keeps delivering supremely lousy candidates. Jesse Ventura, having defied the polls and been elected to office as an independent, has been straightforward about his disdain for the notion that voting outside the two main parties is tantamount to throwing away one’s vote. A related meme is that votes for Green Party candidate Ralph Nader in 2000 effectively split the Democratic vote, handing the win (extraordinarily close and contestable though it was) to George Bush. My thinking aligns with Jesse Ventura, not with those who view votes for Ralph Nader as betrayals.

If the presidential race is still too close for comfort, Michael Moore offers a thoughtful explanation of how Trump could win:

This excerpt from Moore’s new film TrumpLand has been taken out of context by many pro-Trump ideologues. I admit the first time I saw it I was unsure whether Moore supports Trump. Additional remarks elsewhere indicate that he does not. The spooky thing is that as emotional appeals go, it’s clear that Trump connects with the people powerfully. But Moore is right about another thing: to vote for Trump is really a giant “fuck you” to the establishment, which won’t end well.