Archive for the ‘Politics’ Category

Societies sometimes employ leveling mechanisms to keep the high and mighty from getting too, well, high and mighty, or to pull them back down when they nonetheless manage to scale untenable heights. Some might insist that the U.S. breakaway from the British crown and aristocratic systems in the Revolutionary Era was intended, among other things, to establish an egalitarian society in accordance with liberal philosophy of the day. This is true to a point, since we in the U.S. don’t have hereditary aristocratic titles, but a less charitable view is that the Founders really only substituted the landed gentry, which is to say themselves, for the tyrannical British. Who scored worse on the tyranny scale is a matter of debate, especially when modern sensibilities are applied to historical practices. Although I don’t generally care for such hindsight moralizing, it’s uncontroversial that the phrase “all men are created equal” (from the U.S. Declaration of Independence) did not then apply, for instance, to slaves and women. We’re still battling to establish equality (a level playing field) among all men and women. For SJWs, the fight has become about equality of outcome (e.g., quotas), which is a perversion of the more reasonable and achievable equality of opportunity.

When and where available resources were more limited, say, in agrarian or subsistence economies, the distance or separation between top and bottom was relatively modest. In a nonresource economy, where activity is financialized and decoupled from productivity (Bitcoin, anyone?), the distance between top and bottom can grow appallingly wide. I suspect that an economist could give a better explanation of this phenomenon than I can, but my suspicion is that it has primarily to do with fiat currency (money issued without sound backing such as precious metals), expansion of credit, and creation of arcane instruments of finance, all of which give rise to an immense bureaucracy of administrative personnel to create, manage, and manipulate them.

The U.S. tax structure of the 1950s — steep taxes levied against the highest earners — was a leveling mechanism. Whether it was intentionally corrective of the excesses of the Jazz Age is beyond my knowledge. However, that progressive tax structure has been dismantled (“leveled,” one might say), shifting from progressive to regressive and now to transgressive. Regressive is where more or disproportionate tax responsibility is borne by those already struggling to satisfy their basic needs. Transgressive is outright punishment of those who fail to earn enough, as though the whip functions as a spur to success. Indeed, as I mentioned in the previous blog post, the mood of the country right now is to abandon and blame those whom financial success has eluded. Though the term debtors’ prison belongs to a bygone era, we still have such prisons, as people are imprisoned over nonviolent infractions such as parking tickets only to have heavy, additional administrative fines and fees levied on them, holding them hostage to payment. That’s victimizing the victim, pure and simple.

At the other end of the scale, the superrich ascend a hierarchy that is absurdly imbalanced since leveling mechanisms are no longer present. Of course, disdain of the nouveau riche exists, primarily because social training does not typically accompany amassing of new fortunes, allowing many of that cohort to be amazingly gauche and intransigently proud of it (names withheld). That disdain is especially the prerogative of those whose wealth is inherited, not the masses, but is not an effective leveling mechanism. If one is rich, famous, and charming enough, indulgences for bad or criminal behavior are commonplace. For instance, those convicted of major financial crime in the past decade are quite few, whereas beneficiaries (multimillionaires) of looting of the U.S. Treasury are many. One very recent exception to indulgences is the wave of people being accused of sexual misconduct, but I daresay the motivation is unrelated to that of standard leveling mechanisms. Rather, it’s moral panic resulting from strains being felt throughout society having to do with sexual orientation and identity.

When the superrich ascend into the billionaire class, they tend to behave supranationally: buying private islands or yachts outside the jurisdiction or control of nation states, becoming nominal residents of the most advantageous tax havens, and shielding themselves from the rabble. While this brand of anarchism may be attractive to some and justified to others, detaching from social hierarchies and abandoning or ignoring others in need once one’s own fortunes are secure is questionable behavior to say the least. Indeed, those of such special character are typically focal points of violence and mayhem when the lives of the masses become too intolerable. That target on one’s back can be ignored or forestalled for a long time, perhaps, but the eventuality of nasty blowback is virtually guaranteed. That’s the final leveling mechanism seen throughout history.


Brief, uncharacteristic foray into national politics. The Senate narrowly approved a tax reform bill that’s been hawked by that shiny-suit-wearing-used-car-salesman-conman-guy over the past months as simply a big, fat tax cut. From all appearances, it won’t quite work out that way. The 479-pp. bill is available here (PDF link), including last-minute handwritten amendments. I don’t know how typical that is of legislative processes, but I doubt rushing or forcing a vote in the dead of night on an unfinished bill no one has had the opportunity to review leads to good results. Moreover, what does that say to schoolchildren about finishing one’s homework before turning it in?

Considering the tax reform bill is still a work in progress, it’s difficult to know with much certainty its effects if/when signed into law. However, summaries and snapshots of tax effects on typical American households have been provided to aid in the layperson’s grasp of the bill. This one from Mic Network Inc. (a multichannel news/entertainment network with which I am unfamiliar, so I won’t vouch for its reliability) states that the bill is widely unpopular and few trust the advance marketing of the bill:

Only 16% of Americans have said they think the plan is actually going to cut their taxes, less than half the number of people polled who think that their bill is going to go up, according to a Nov. 15 poll from Quinnipiac University.

Yet it seems the Republican-led effort will be successful, despite concerns that many middle class people could actually see their taxes rise, that social programs could suffer, that small businesses could be harmed and that a hoped-for economic boom may never materialize. [links removed]

When a change in tax law goes into effect, one good question is, “who gets help and who gets hurt?” For decades now, the answer has almost always been Reverse Robin Hood: take (or steal) from the poor and give to the rich. That’s why income inequality has increased to extreme levels commencing with the Reagan administration. The economic field of play has been consciously, knowingly tilted in favor of certain groups at the expense of others. Does anyone really believe that those in power are looking out for the poor and downtrodden? Sorry, that’s not the mood of the nation right now. Rather than assisting people who need help, governments at all levels have been withdrawing support and telling people, in effect, “you’re on your own, but first pay your taxes.” I propose we call the new tax bill Reverse Cowgirl, because if anything is certain about it, it’s that lots of people are gonna get fucked.

Here’s the last interesting bit I am lifting from Anthony Giddens’s The Consequences of Modernity. Then I will be done with this particular book-blogging project. As part of Giddens’s discussion of the risk profile of modernity, he characterizes risk as either objective or perceived and further divides it into seven categories:

  1. globalization of risk (intensity)
  2. globalization of risk (frequency)
  3. environmental risk
  4. institutionalized risk
  5. knowledge gaps and uncertainty
  6. collective or shared risk
  7. limitations of expertise

Some overlap exists, and I will not distinguish them further. The first two are of primary significance today for obvious reasons. Although the specter of doomsday resulting from a nuclear exchange has been present since the 1950s, Giddens (writing in 1988) provides this snapshot of today’s issues:

The sheer number of serious risks in respect of socialised nature is quite daunting: radiation from major accidents at nuclear power-stations or from nuclear waste; chemical pollution of the seas sufficient to destroy the phytoplankton that renews much of the oxygen in the atmosphere; a “greenhouse effect” deriving from atmospheric pollutants which attack the ozone layer, melting part of the ice caps and flooding vast areas; the destruction of large areas of rain forest which are a basic source of renewable oxygen; and the exhaustion of millions of acres of topsoil as a result of widespread use of artificial fertilisers. [p. 127]

As I often point out, these dangers were known 30–40 years ago (in truth, much longer), but they have only worsened with time through political inaction and/or social inertia. After I began to investigate and better understand the issues roughly a decade ago, I came to the conclusion that the window of opportunity to address these risks and their delayed effects had already closed. In short, we’re doomed and living on borrowed time as the inevitable consequences of our actions slowly but steadily manifest in the world.

So here’s the really interesting part. The modern worldview bestows confidence borne out of expanding mastery of the built environment, where risk is managed and reduced through expert systems. Mechanical and engineering knowledge figure prominently and support a cause-and-effect mentality that has grown ubiquitous in the computing era, with its push-button inputs and outputs. However, the high modern outlook is marred by overconfidence in our competence to avoid disaster, often of our own making. Consider the abject failure of 20th-century institutions to handle geopolitical conflict without devolving into world war and multiple genocides. Or witness periodic crashes of financial markets, two major nuclear accidents, and two space shuttles and numerous rockets destroyed. Though all entail risk, high-profile failures showcase our overconfidence. Right now, engineers (software and hardware) are confident they can deliver safe self-driving vehicles yet are blithely ignoring (says me, maybe not) major ethical dilemmas regarding liability and technological unemployment. Those are apparently problems for someone else to solve.

Since the start of the Industrial Revolution, we’ve barrelled headlong into one sort of risk after another, some recognized at the time, others only apparent after the fact. Nuclear weapons are the best example, but many others exist. The one I raise frequently is the live social experiment undertaken with each new communications technology (radio, cinema, telephone, television, computer, social networks) that upsets and destabilizes social dynamics. The current ruckus fomented by the radical left (especially in the academy but now infecting other environments) regarding silencing of free speech (thus, thought policing) is arguably one concomitant.

According to Giddens, the character of modern risk contrasts with that of the premodern. The scale of risk prior to the 17th century was contained, and expectation of social continuity was strong. Risk was also transmuted through magical thinking (superstition, religion, ignorance, wishfulness) into providential fortuna or mere bad luck, which led to feelings of relative security rather than despair. Modern risk has now grown so widespread, consequential, and soul-destroying, situated at considerable remove yet leading to feelings of helplessness and hopelessness, that those not numbed by the litany of potential worries afflicting daily life (existential angst or ontological insecurity) often develop depression and other psychological compulsions and disturbances. Most of us, if aware of globalized risk, set it aside so that we can function and move forward in life. Giddens says that this conjures up anew a sense of fortuna, that our fate is no longer within our control. This

relieves the individual of the burden of engagement with an existential situation which might otherwise be chronically disturbing. Fate, a feeling that things will take their own course anyway, thus reappears at the core of a world which is supposedly taking rational control of its own affairs. Moreover, this surely exacts a price on the level of the unconscious, since it essentially presumes the repression of anxiety. The sense of dread which is the antithesis of basic trust is likely to infuse unconscious sentiments about the uncertainties faced by humanity as a whole. [p. 133]

In effect, the nature of risk has come full circle (completed a revolution, thus, revolutionized risk) from fate to confidence in expert control and back to fate. Of course, a flexibility of perspective is typical as situation demands — it’s not all or nothing — but the overarching character is clear. Giddens also provides this quote by Susan Sontag that captures what he calls the low-probability, high-consequence character of modern risk:

A permanent modern scenario: apocalypse looms — and it doesn’t occur. And still it looms … Apocalypse is now a long-running serial: not ‘Apocalypse Now,’ but ‘Apocalypse from now on.’ [p. 134]

The scandal surrounding Harvey Weinstein and all the people he harassed, bullied, assaulted, molested, and raped has provided occasion for many who had dealings with him to revisit their experiences and wonder what might have been (or not been) had things gone differently, had they acted otherwise in response to his oafish predations. I judge it’s nearly impossible for those outside the Hollywood scene to understand fully the stakes involved (and thus the distorted psychology), but on the other hand, nearly everyone has experience with power imbalances that enable some to get away with exploiting and victimizing others. And because American culture responds to tragedies like a bunch of rubberneckers, the witch hunt has likely only just begun. There’s a better than average chance that, as with icebergs, the significantly larger portion of the problem lies hidden below the surface, as yet undisclosed. Clamor won’t alter much in the end; the dynamics are too ingrained. Still, expect accusations to fly all over the industry, including victim blaming. My strong suspicion is that some folks dodged (actively or passively) becoming victims and paid a price in terms of career success, whereas others fell prey or simply went along (and then stayed largely silent until now) and got some extra consideration out of it. Either way, it undermines one’s sense of self-worth, messing with one’s head for years afterwards. Sometimes there’s no escaping awful circumstance.

Life is messy, right? We all have episodes from our past that we wish we could undo. Hindsight makes the optimal path far more clear than in the moment. Fortunately, I have no crimes among my regrets, but with certain losses, I certainly wish I had known then what I know now (a logical fallacy). Strange that the news cycle has me revisiting my own critical turning points in sympathy with others undoubtedly doing the same.

As I generalize this thought process, I can’t help but wonder as well what might have been had we not, say, (1) split the atom and immediately weaponized the technology, (2) succumbed to various Red Scares scattered around 20th- and 21st-century calendars but instead developed a progressive society worthy of the promise our institutions once embodied, (3) plunged forward out of avarice and shortsightedness by plundering the Earth, and (4) failed to reverse course once the logical conclusion to our aggregate effects on the biosphere was recognized. No utopia would have arisen had we dodged these bullets, of course, but the affairs of men would have been marginally improved, and we might even have survived the 21st century. Such thinking is purely hypothetical and invites a fatalist like me to wonder whether — given our frailty, weakness, and corruption (the human condition being akin to original sin) — we don’t already inhabit the best of all possible worlds.

Isn’t that a horrible thought? A world full of suffering and hardship, serial rapists and murderers, incompetent and venal political leaders, and torture and genocides is the best we can do? We can’t avoid our own worst tendencies? Over long spans of time, cataclysmic earthquakes, volcanic eruptions, superstorms, and meteor strikes already make life on Earth rather precarious, considering that over 99% of all species that once existed are now gone. On balance, we have some remarkable accomplishments, though often purchased with sizeable trade-offs (e.g., slave labor, patriarchal suppression). Still, into the dustbin of history is where we are headed rather sooner than later, having enjoyed only a brief moment in the sun.

I revisit my old blog posts when I see some reader activity in the WordPress backstage and was curious to recall a long quote of Iain McGilchrist summarizing arguments put forth by Anthony Giddens in his book Modernity and Self-identity (1991). Giddens had presaged recent cultural developments, namely, the radicalization of nativists, supremacists, Social Justice Warriors (SJWs), and others distorted by absorption in identity politics. So I traipsed off to the Chicago Public Library (CPL) and sought out the book to read. Regrettably, CPL didn’t have a copy, so I settled on a slightly earlier book, The Consequences of Modernity (1990), which is based on a series of lectures delivered at Stanford University in 1988.

Straight away, the introduction provides a passage that goes to the heart of matters with which I’ve been preoccupied:

Today, in the late twentieth century, it is argued by many, we stand at the opening of a new era … which is taking us beyond modernity itself. A dazzling variety of terms has been suggested to refer to this transition, a few of which refer positively to the emergence of a new type of social system (such as the “information society” or the “consumer society”) but most of which suggest rather that a preceding state of affairs is drawing to a close … Some of the debates about these matters concentrate mainly upon institutional transformations, particularly those which propose that we are moving from a system based upon the manufacture of material goods to one concerned more centrally with information. More commonly, however, those controversies are focused largely upon issues of philosophy and epistemology. This is the characteristic outlook, for example, of the author who has been primarily responsible for popularising the notion of post-modernity, Jean-François Lyotard. As he represents it, post-modernity refers to a shift away from attempts to ground epistemology and from faith in humanly engineered progress. The condition of post-modernity is distinguished by an evaporating of the “grand narrative” — the overarching “story line” by means of which we are placed in history as beings having a definite past and a predictable future. The post-modern outlook sees a plurality of heterogeneous claims to knowledge, in which science does not have a privileged place. [pp. 1–2, emphasis added]

That’s a lot to unpack all at once, but the fascinating thing is that notions now manifesting darkly in the marketplace of ideas were already in the air in the late 1980s. Significantly, this was several years still before the Internet brought the so-called Information Highway to computer users, before the cell phone and smart phone were developed, and before social media displaced traditional media (TV was only 30–40 years old but had previously transformed our information environment) as the principal way people gather news. I suspect that Giddens has more recent work that accounts for the catalyzing effect of the digital era (including mobile media) on culture, but for the moment, I’m interested in the book in hand.

Regular readers of this blog (I know of one or two) already know my armchair social criticism directed to our developing epistemological crisis (challenges to authority and expertise, psychotic knowledge, fake news, alternative facts, dissolving reality, and science denial) as well as the Transhumanist fantasy of becoming pure thought (once we evolve beyond our bodies). Until that’s accomplished with imagined technology, we increasingly live in our heads, in the abstract, disoriented and adrift on a bewildering sea of competing narratives. Moreover, I’ve stated repeatedly that highly mutable story (or narrative) underlies human cognition and consciousness, making most of us easy marks for charismatic storytellers. Giddens was there nearly 30 years ago with these same ideas, though his terms differ.

Giddens dispels the idea of post-modernity and insists that, from a sociological perspective, the current period is better described as high modernism. This reminds me of Oswald Spengler and my abandoned book blogging of The Decline of the West. It’s unimportant to me who got it more correct, but note that the term Postmodernism has been adopted widely despite its inaccuracy (at least according to Giddens). As I get further into the book, I’ll have plenty more to say.

Here’s a familiar inspirational phrase from the Bible: the truth shall set you free (John 8:32). Indeed, most of us take it as, um, well, gospel that knowledge and understanding are unqualified goods. However, the information age has turned out to be a mixed blessing. Any clear-eyed view of the way the world works and its long, tawdry history carries with it an inevitable awareness of injustice, inequity, suffering, and at the extreme end, some truly horrific episodes of groups victimizing each other. Some of the earliest bits of recorded history, as distinguished from oral history, are financial — keeping count (or keeping accounts). Today differs not so much in character as in the variety of counts being kept and the sophistication of information gathering.

The Bureau of Labor Statistics, a part of the U.S. Department of Labor, is one information clearinghouse that slices and dices available data according to a variety of demographic characteristics. The fundamental truth behind such assessments, regardless of the politics involved, is that when comparisons are made between unlike groups, say, between men and women or young and old, one should expect to find differences and indeed be rather surprised if comparisons revealed none. So the question of gender equality in the workplace, or its implied inverse, gender inequality in the workplace, is a form of begging the question, meaning that if one seeks differences, one shall most certainly find them. But those differences are not prima facie evidence of injustice in the sense of the popular meme that women are disadvantaged or otherwise discriminated against in the workplace. Indeed, the raw data can be interpreted according to any number of agendas, thus the phrase “lying with statistics,” and most of us lack the sophistication to contextualize statistics properly, which is to say, free of the emotional bias that plagues modern politics, and more specifically, identity politics.

The fellow who probably ran up against this difficulty the worst is Charles Murray in the aftermath of publication of his book The Bell Curve (1994), which deals with how intelligence manifests differently across demographic groups yet functions as the primary predictor of social outcomes. Murray is particularly well qualified to interpret data and statistics dispassionately, and in true seek-and-find fashion, differences between groups did appear. It is unclear how much his resulting prescriptions for social programs are borne out of data vs. ideology, but most of us are completely at sea wading through the issues without specialized academic training to make sense of the evidence.

More recently, another fellow caught in the crosshairs on issues of difference is James Damore, who was fired from his job at Google after writing what is being called an anti-diversity manifesto (but might be better termed an internal memo) that was leaked and then went viral. The document can be found here. I have not dug deeply into the details, but my impression is that Damore attempted a fairly academic unpacking of the issue of gender differences in the workplace as they conflicted with institutional policy only to face a hard-set ideology that is more RightThink than truth. In Damore’s case, the truth did set him free — free from employment. Even the NY Times recognizes that the Thought Police sprang into action yet again to demand that its pet illusions about society be supported rather than dispelled. These witch hunts and shaming rituals (vigilante justice carried out in the court of public opinion) are occurring with remarkable regularity.

In a day and age where so much information (too much information, as it turns out) is available to us to guide our thinking, one might hope for careful, rational analysis and critical thinking. However, trends point to the reverse: a return to tribalism, xenophobia, scapegoating, and victimization. There is also a victimization Olympics at work, with identity groups vying for imaginary medals awarded to whoever’s got it worst. I’m no Pollyanna when it comes to the notion that all men are brothers and, shucks, can’t we all just get along? That’s not our nature. But the marked indifference of the natural world to our suffering as it besets us with drought, fire, floods, earthquakes, tsunamis, hurricanes, tornadoes, and the like (and this was just the last week!) might seem like the perfect opportunity to find within ourselves a little grace and recognize our common struggles in the world rather than add to them.

Since Jordan Peterson came to prominence last fall, he’s been maligned and misunderstood. I, too, rushed to judgment before understanding him more fully by watching many of his YouTube clips (lectures, speeches, interviews, webcasts, etc.). As the months have worn on and media continue to shove Peterson in everyone’s face (with his willing participation), I’ve grown in admiration and appreciation of his two main (intertwined) concerns: free speech and cultural Marxism. Most of the minor battles I’ve fought on these topics have come to nothing as I’m simply brushed off for not “getting it,” whatever “it” is (I get that a lot for not being a conventional thinker). Simply put, I’m powerless, thus harmless and of no concern. I have to admit, though, to being surprised at the proposals Peterson puts forward in this interview, now over one month old:

Online classes are nothing especially new. Major institutions of higher learning already offer distance-learning courses, and some institutions exist entirely online, though they tend to be degree mills with less concern over student learning than with profitability and boosting student self-esteem. Peterson’s proposal is to launch an online university for the humanities, and in tandem, to reduce the number of students flowing into today’s corrupted humanities departments where they are indoctrinated into the PoMo cult of cultural Marxism (or as Peterson calls it in the interview above, neo-Marxism). Teaching course content online seems easy enough. As pointed out, the technology for it has matured. (I continue to believe face-to-face interaction is far better.) The stated ambition to overthrow the current method of teaching the humanities, though, is nothing short of revolutionary. It’s worth observing, however, that the intent appears not to be undermining higher education (which is busy destroying itself) but to save or rescue students from the emerging cult.

Being a traditionalist, I appreciate the great books approach Peterson recommends as a starting point. Of course, this approach stems from exactly the sort of dead, white, male hierarchy over which social justice warriors (SJWs) beat their breasts. No doubt: patriarchy and oppression are replete throughout human history, and we’re clearly not yet over with it. To understand and combat it, however, one must study rather than discard history or declare it invalid as a subject of study. That also requires coming to grips with some pretty hard, brutal truths about our capacity for mayhem and cruelty — past, present, and future.

I’ve warned since the start of this blog in 2006 that the future is not shaping up well for us. It may be that the struggles over identity many young people are experiencing (notably, sexual and gender dysphoria occurring at the remarkably vulnerable phase of early adulthood) are symptoms of a larger cultural transition into some other style of consciousness. Peterson clearly believes that the struggle in which he is embroiled is a fight against the return of an authoritarian style tried repeatedly in the 20th century with catastrophic results. Either way, it’s difficult to contemplate anything worthwhile emerging from brazen attempts at thought control by SJWs.

Violent events of the past week (Charlottesville, VA; Barcelona, Spain) and political responses to them have dominated the news cycle, pushing other newsworthy items (e.g., U.S.-South Korean war games and a looming debt ceiling crisis) off the front page and into the darker recesses of everyone’s minds (those paying attention, anyway). We’re absorbed instead with culture wars run amok. I’m loath to apply the term terrorism to regular periodic eruptions of violence, both domestic and foreign. That term carries with it intent, namely, the objective to create day-to-day terror in the minds of a population so as to interfere with proper functions of society. It’s unclear to me whether recent perpetrators of violence are coherent enough to formulate sophisticated motivations or plans. The dumb, obvious way of doing things — driving into crowds of people — takes little or no planning and may just as well be the result of inchoate rage boiling over in a moment of high stress and opportunity. Of course, it needn’t be all or nothing, and considering our reflexively disproportionate responses, the term terrorism and attendant destabilization is arguably accurate even without specified intent. That’s why in the wake of 9/11 some 16 years ago, the U.S. has become a security state.

It’s beyond evident that hostilities have been simmering below the not-so-calm surface. Many of those hostilities, typically borne out of economic woes but also part of a larger clash of civilizations, take the form of identifying an “other” presumably responsible for one’s difficulties and then victimizing the “other” in order to elevate oneself. Of course, the “other” isn’t truly responsible for one’s struggles, so the violent dance doesn’t actually elevate anyone, as in “supremacy”; it just wrecks both sides (though unevenly). Such warped thinking seems to be a permanent feature of human psychology and enjoys popular acceptance when the right “other” is selected and universal condemnation when the wrong one is chosen. Those doing the choosing and those being chosen haven’t changed much over the centuries. Historically, Anglo-Saxons and Teutons do the choosing, and people of color (of all types) get chosen. Jews are also chosen with dispiriting regularity, which is an ironic inversion of being the Chosen People (if you believe in such things — I don’t). However, any group can succumb to this distorted power move, which is why so much ongoing, regional, internecine conflict exists.

As I’ve been saying for years, a combination of condemnation and RightThink has simultaneously freed some people from this cycle of violence but merely driven the holdouts underground. Supremacy in its various forms (nationalism, racism, antisemitism, etc.) has never truly been expunged. RightThink itself has morphed (predictably) into intolerance, which is now veering toward radicalism. Perhaps a positive outcome of this latest resurgence of supremacist ideology is that those infected with the character distortion have been emboldened to identify themselves publicly and thus can be dealt with somehow. Civil authorities and thought leaders are not very good at dealing with hate, often shutting people out of the necessary public conversation and/or seeking to legislate hate out of existence with restrictions on free speech. But it is precisely through free expression and diplomacy that we address conflict. Violence is a failure to remain civil (duh!), and war (especially the genocidal sort) is the extreme instance. It remains to be seen if the lid can be kept on this boiling pot, but considering cascade failures lined up to occur within the foreseeable future, I’m pessimistic that we can see our way past the destructive habit of shifting blame onto others who often suffer even worse than those holding the reins of power.

Previous blogs on this topic are here and here.

Updates to the Bulletin of the Atomic Scientists resetting the metaphorical doomsday clock hands used to appear at intervals of 3–7 years. Updates have been issued in each of the last three years, though the clock hands remained in the same position from 2015 to 2016. Does that suggest heightened geopolitical instability or merely renewed paranoia resulting from the instantaneous news cycle and radicalization of society and politics? The 2017 update resets the minute hand slightly forward to 2½ minutes to midnight:

For the last two years, the minute hand of the Doomsday Clock stayed set at three minutes before the hour, the closest it had been to midnight since the early 1980s. In its two most recent annual announcements on the Clock, the Science and Security Board warned: “The probability of global catastrophe is very high, and the actions needed to reduce the risks of disaster must be taken very soon.” In 2017, we find the danger to be even greater, the need for action more urgent. It is two and a half minutes to midnight, the Clock is ticking, global danger looms. Wise public officials should act immediately, guiding humanity away from the brink. If they do not, wise citizens must step forward and lead the way …

The principal concern of the Bulletin since its creation has been atomic/nuclear war. Recent updates include climate change in the mix. Perhaps it is not necessary to remind regular readers here, but the timescales for these two threats are quite different: global thermonuclear war (a term from the 1980s when superpowers last got weird and paranoid about things) could erupt almost immediately given the right provocation (or lunacy), such as the sabre-rattling now underway between the U.S. and North Korea, whereas climate change is an event typically unfolding across geological time. The millions of years it usually takes for climate change to manifest fully and reach a new steady state (hothouse earth vs. ice age earth), however, appear to have been compressed by human inputs (anthropogenic climate change, or as Guy McPherson calls it, abrupt climate change) to only a few centuries.

Nuclear arsenals around the world are the subject of a curious article at Visual Capitalist (including several reader-friendly infographics) by Nick Routley. The estimated number of weapons in the U.S. arsenal has risen since the last time I blogged about this in 2010. I still find it impossible to fathom why more than a dozen nukes are necessary, or in my more charitable moments toward the world’s inhabitants, why any of them are necessary. Most sober analysts believe we are far safer today than, say, the 1950s and early 1960s when brinkmanship was anybody’s game. I find this difficult to judge considering the two main actors today on the geopolitical stage are both witless, unpredictable, narcissistic maniacs. Moreover, the possibility of some ideologue (religious or otherwise) getting hold of WMDs (not necessarily nukes) and creating mayhem is increasing as the democratization of production filters immense power down to lower and lower elements of society. I for one don’t feel especially safe.

According to Hal Smith of The Compulsive Explainer (see my blogroll), the tragedy of our time is, simply put, failed social engineering. Most of his blog post is quoted below:

Americans, for example, have decided to let other forces manage their nation — and not let Americans themselves manage it. At least this is what I see happening, with the election of Trump. They have handed the management of their country over to a man with a collection of wacky ideas — and they feel comfortable with this. Mismanagement is going on everywhere — and why not include the government in this?

This is typical behavior for a successful society in decline. They cannot see what made them successful, has been taken too far — and is now working against them. The sensible thing for them to do is back off for awhile, analyze their situation — and ask “What is going wrong here?” But they never do this — and a collapse ensues.

In our present case, the collapse involves a global society based on Capitalism — that cannot adapt itself to a Computer-based economy. The Software ecosystem operates differently — it is based on cooperation, not competition.

Capitalism was based on just that — Capital — money making money. And it was very successful — for those it favored. Money is still important in the Computer economy — people still have to be paid. But what they are being paid for has changed — information is now being managed, something different entirely.

Hardware is still important — but that is not where the Big Money is being made. It is now being made in Computers, and their Software.

I’m sympathetic to this view but believe that a look back through history reveals something other than missed opportunities and too-slow adaptation as we fumbled our way forward, namely, repeated catastrophic failures. Such epic fails include regional and global wars, genocides, and societal collapses that rise well above the rather bland term mismanagement. A really dour view of history, taking into account more than a few truly vicious, barbaric episodes, might regard the world as a nearly continuous stage of horrors from which we periodically take refuge, the latest such respite now drawing quickly to a close.

The breakneck speed of technological innovation and information exchange has resulted not in Fukuyama’s mistakenly exuberant “end of history” (kinda-sorta winning the Cold War but nevertheless losing the peace?) but instead in an epoch where humans are frankly left behind by follow-on effects of their own unrestrained restlessness. Further, if history is a stage of horrors, then geopolitics is theater of the absurd. News reports throughout the new U.S. presidential administration, still less than 6 months in (more precisely, 161 days or 23 weeks), tell of massive economic and geopolitical instabilities threatening to collapse the house of cards with only a slight breeze. The contortions press agents and politicized news organs go through to provide cover for tweets, lies, and inanities emanating from the disturbed mind of 45 are carnival freak show acts. Admittedly, not much has changed over the previous two administrations — alterations of degree only, not kind — except perhaps to demonstrate beyond any shadow of doubt that our elected, appointed, and purchased leaders (acknowledging many paths to power) are fundamentally incompetent to deal effectively with human affairs, much less enact social engineering projects beyond the false happiness of Facebook algorithms that hide bad news. Look no further than the egregious awfulness of both presidential candidates in the last election, coughed up like hairballs from the mouths of their respective parties. The aftermath of those institutional failures finds both major parties in shambles, further degraded from their already deplorable states prior to the election.

So how much worse can things get? Well, scary as it sounds, lots. The electrical grid is still working, water is still flowing to the taps, and supply lines continue to keep store shelves stocked with booze and brats for extravagant holiday celebrations. More importantly, we in the U.S. have (for now, unlike Europe) avoided repetition of any major terrorist attacks. But everyone with an honest ear to the ground recognizes our current condition as the proverbial calm before the storm. For instance, we’re threatened by the first ice-free Arctic in the history of mankind later this year and a giant iceberg calving off the Larsen C Ice Shelf in Antarctica within days. In addition, drought in the Dakotas will result in a failed wheat harvest. Guy McPherson in particular (there may well be others) has been predicting for years that abrupt, nonlinear climate change, once the poles warm, will end the ability to grow grain at scale, leading to worldwide famine, collapse, and near-term extinction. Seems like we’re passing the knee of the curve. Makes concerns about maladaptation and failed social engineering pale by comparison.