
Brief, uncharacteristic foray into national politics. The Senate narrowly approved a tax reform bill that’s been hawked by that shiny-suit-wearing-used-car-salesman-conman-guy over the past months as simply a big, fat tax cut. From all appearances, it won’t quite work out that way. The 479-page bill is available here (PDF link), including last-minute handwritten amendments. I don’t know how typical that is of legislative processes, but I doubt that rushing or forcing a vote in the dead of night on an unfinished bill no one has had the opportunity to review leads to good results. Moreover, what does that say to schoolchildren about finishing one’s homework before turning it in?

Considering the tax reform bill is still a work in progress, it’s difficult to know with much certainty its effects if/when signed into law. However, summaries and snapshots of tax effects on typical American households have been provided to aid the layperson’s grasp of the bill. This one from Mic Network Inc. (a multichannel news/entertainment network with which I am unfamiliar, so I won’t vouch for its reliability) states that the bill is widely unpopular and that few trust its advance marketing:

Only 16% of Americans have said they think the plan is actually going to cut their taxes, less than half the number of people polled who think that their bill is going to go up, according to a Nov. 15 poll from Quinnipiac University.

Yet it seems the Republican-led effort will be successful, despite concerns that many middle class people could actually see their taxes rise, that social programs could suffer, that small businesses could be harmed and that a hoped-for economic boom may never materialize. [links removed]

When a change in tax law goes into effect, one good question is, “who gets help and who gets hurt?” For decades now, the answer has almost always been Reverse Robin Hood: take (or steal) from the poor and give to the rich. That’s why income inequality has increased to extreme levels commencing with the Reagan administration. The economic field of play has been consciously, knowingly tilted in favor of certain groups at the expense of others. Does anyone really believe that those in power are looking out for the poor and downtrodden? Sorry, that’s not the mood of the nation right now. Rather than assisting people who need help, governments at all levels have been withdrawing support and telling people, in effect, “you’re on your own, but first pay your taxes.” I propose we call the new tax bill Reverse Cowgirl, because if anything is certain about it, it’s that lots of people are gonna get fucked.


Here’s the last interesting bit I am lifting from Anthony Giddens’s The Consequences of Modernity. Then I will be done with this particular book-blogging project. As part of Giddens’s discussion of the risk profile of modernity, he characterizes risk as either objective or perceived and further divides it into seven categories:

  1. globalization of risk (intensity)
  2. globalization of risk (frequency)
  3. environmental risk
  4. institutionalized risk
  5. knowledge gaps and uncertainty
  6. collective or shared risk
  7. limitations of expertise

Some overlap exists, and I will not distinguish them further. The first two are of primary significance today for obvious reasons. Although the specter of doomsday resulting from a nuclear exchange has been present since the 1950s, Giddens (writing in 1988) provides this snapshot of issues still very much with us today:

The sheer number of serious risks in respect of socialised nature is quite daunting: radiation from major accidents at nuclear power-stations or from nuclear waste; chemical pollution of the seas sufficient to destroy the phytoplankton that renews much of the oxygen in the atmosphere; a “greenhouse effect” deriving from atmospheric pollutants which attack the ozone layer, melting part of the ice caps and flooding vast areas; the destruction of large areas of rain forest which are a basic source of renewable oxygen; and the exhaustion of millions of acres of topsoil as a result of widespread use of artificial fertilisers. [p. 127]

As I often point out, these dangers were known 30–40 years ago (in truth, much longer), but they have only worsened with time through political inaction and/or social inertia. After I began to investigate and better understand the issues roughly a decade ago, I came to the conclusion that the window of opportunity to address these risks and their delayed effects had already closed. In short, we’re doomed and living on borrowed time as the inevitable consequences of our actions slowly but steadily manifest in the world.

So here’s the really interesting part. The modern worldview bestows confidence borne of expanding mastery over the built environment, where risk is managed and reduced through expert systems. Mechanical and engineering knowledge figure prominently and support a cause-and-effect mentality that has grown ubiquitous in the computing era, with its push-button inputs and outputs. However, the high modern outlook is marred by overconfidence in our competence to avoid disaster, often of our own making. Consider the abject failure of 20th-century institutions to handle geopolitical conflict without devolving into world war and multiple genocides. Or witness periodic crashes of financial markets, two major nuclear accidents, two space shuttles lost, and numerous rockets destroyed. Though all entail risk, these high-profile failures showcase our overconfidence. Right now, engineers (software and hardware) are confident they can deliver safe self-driving vehicles yet are blithely ignoring (says me, maybe not) major ethical dilemmas regarding liability and technological unemployment. Those are apparently problems for someone else to solve.

Since the start of the Industrial Revolution, we’ve barrelled headlong into one sort of risk after another, some recognized at the time, others only apparent after the fact. Nuclear weapons are the best example, but many others exist. The one I raise frequently is the live social experiment undertaken with each new communications technology (radio, cinema, telephone, television, computer, social networks) that upsets and destabilizes social dynamics. The current ruckus fomented by the radical left (especially in the academy but now infecting other environments) regarding silencing of free speech (thus, thought policing) is arguably one concomitant.

According to Giddens, the character of modern risk contrasts with that of the premodern. The scale of risk prior to the 17th century was contained, and the expectation of social continuity was strong. Risk was also transmuted through magical thinking (superstition, religion, ignorance, wishfulness) into providential fortuna or mere bad luck, which led to feelings of relative security rather than despair. Modern risk, by contrast, has grown so widespread, consequential, and soul-destroying, and sits at such remove from individual control, that it breeds helplessness and hopelessness; those not numbed by the litany of potential worries afflicting daily life (existential angst or ontological insecurity) often develop depression and other psychological compulsions and disturbances. Most of us, if aware of globalized risk, set it aside so that we can function and move forward in life. Giddens says that this conjures up anew a sense of fortuna, that our fate is no longer within our control. This

relieves the individual of the burden of engagement with an existential situation which might otherwise be chronically disturbing. Fate, a feeling that things will take their own course anyway, thus reappears at the core of a world which is supposedly taking rational control of its own affairs. Moreover, this surely exacts a price on the level of the unconscious, since it essentially presumes the repression of anxiety. The sense of dread which is the antithesis of basic trust is likely to infuse unconscious sentiments about the uncertainties faced by humanity as a whole. [p. 133]

In effect, the nature of risk has come full circle (completed a revolution, thus, revolutionized risk) from fate to confidence in expert control and back to fate. Of course, a flexibility of perspective is typical as the situation demands — it’s not all or nothing — but the overarching character is clear. Giddens also provides this quote from Susan Sontag that captures what he calls the low-probability, high-consequence character of modern risk:

A permanent modern scenario: apocalypse looms — and it doesn’t occur. And still it looms … Apocalypse is now a long-running serial: not ‘Apocalypse Now,’ but ‘Apocalypse from now on.’ [p. 134]

The scandal surrounding Harvey Weinstein and all the people he harassed, bullied, assaulted, molested, and raped has provided occasion for many who had dealings with him to revisit their experiences and wonder what might have been (or not been) had things gone differently, had they acted otherwise in response to his oafish predations. I judge it’s nearly impossible for those outside the Hollywood scene to understand fully the stakes involved (and thus the distorted psychology), but on the other hand, nearly everyone has experience with power imbalances that enable some to get away with exploiting and victimizing others. And because American culture responds to tragedies like a bunch of rubberneckers, the witch hunt has likely only just begun. There’s a better than average chance that, as with icebergs, the significantly larger portion of the problem lies hidden below the surface, as yet undisclosed. Clamor won’t alter much in the end; the dynamics are too ingrained. Still, expect accusations to fly all over the industry, including victim blaming. My strong suspicion is that some folks dodged (actively or passively) becoming victims and paid a price in terms of career success, whereas others fell prey or simply went along (and then stayed largely silent until now) and got some extra consideration out of it. Either way, it undermines one’s sense of self-worth, messing with one’s head for years afterwards. Sometimes there’s no escaping awful circumstance.

Life is messy, right? We all have episodes from our past that we wish we could undo. Hindsight makes the optimal path far clearer than it ever was in the moment. Fortunately, I have no crimes among my regrets, but with certain losses, I do wish I had known then what I know now (a logical fallacy). Strange that the news cycle has me revisiting my own critical turning points in sympathy with others undoubtedly doing the same.

As I generalize this thought process, I can’t help but wonder as well what might have been had we not, say, (1) split the atom and immediately weaponized the technology, (2) succumbed to the various Red Scares scattered across 20th- and 21st-century calendars rather than developing a progressive society worthy of the promise our institutions once embodied, (3) plunged forward out of avarice and shortsightedness by plundering the Earth, and (4) failed to reverse course once the logical conclusion of our aggregate effects on the biosphere was recognized. No utopia would have arisen had we dodged these bullets, of course, but the affairs of men would have been marginally improved, and we might even have survived the 21st century. Such thinking is purely hypothetical and invites a fatalist like me to wonder whether — given our frailty, weakness, and corruption (the human condition being akin to original sin) — we don’t already inhabit the best of all possible worlds.

Isn’t that a horrible thought? A world full of suffering and hardship, serial rapists and murderers, incompetent and venal political leaders, and torture and genocides is the best we can do? We can’t avoid our own worst tendencies? Over long spans of time, cataclysmic earthquakes, volcanic eruptions, superstorms, and meteor strikes already make life on Earth rather precarious, considering that over 99% of all species that once existed are now gone. On balance, we have some remarkable accomplishments, though often purchased with sizeable trade-offs (e.g., slave labor, patriarchal suppression). Still, into the dustbin of history is where we are headed rather sooner than later, having enjoyed only a brief moment in the sun.

Here’s a familiar inspirational phrase from the Bible: the truth shall set you free (John 8:32). Indeed, most of us take it as, um, well, gospel that knowledge and understanding are unqualified goods. However, the information age has turned out to be a mixed blessing. Any clear-eyed view of the way the world works and its long, tawdry history carries with it an inevitable awareness of injustice, inequity, suffering, and, at the extreme end, some truly horrific episodes of groups victimizing each other. Some of the earliest bits of recorded history, as distinguished from oral history, are financial — keeping count (or keeping accounts). Today differs not so much in character as in the variety of counts being kept and the sophistication of information gathering.

The Bureau of Labor Statistics, a part of the U.S. Department of Labor, is one information clearinghouse that slices and dices available data according to a variety of demographic characteristics. The fundamental truth behind such assessments, regardless of the politics involved, is that when comparisons are made between unlike groups, say, between men and women or young and old, one should expect to find differences and indeed be rather surprised if comparisons revealed none. So the question of gender equality in the workplace, or its implied inverse, gender inequality in the workplace, is a form of begging the question, meaning that if one seeks differences, one shall most certainly find them. But those differences are not prima facie evidence of injustice in the sense of the popular meme that women are disadvantaged or otherwise discriminated against in the workplace. Indeed, the raw data can be interpreted according to any number of agendas, thus the phrase “lying with statistics,” and most of us lack the sophistication to contextualize statistics properly, which is to say, free of the emotional bias that plagues modern politics, and more specifically, identity politics.
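
To make the point concrete, here is a minimal, hypothetical sketch of Simpson’s paradox in plain Python (the departments, groups, and counts are invented for illustration, not drawn from the BLS or any real survey), showing how the same raw counts yield opposite conclusions depending on whether they are compared within subgroups or in aggregate. It’s one small instance of “lying with statistics”: group differences, by themselves, are not prima facie evidence of anything.

    # Hypothetical illustration of Simpson's paradox: every count below is
    # invented for the sake of the example and describes nothing real.
    # department -> group -> (number hired, number who applied)
    applications = {
        "Dept X": {"Group A": (80, 100), "Group B": (9, 10)},
        "Dept Y": {"Group A": (2, 10), "Group B": (30, 100)},
    }

    def rate(hired, applied):
        return hired / applied

    # Within each department, Group B is hired at the higher rate.
    for dept, groups in applications.items():
        for group, (hired, applied) in groups.items():
            print(f"{dept} {group}: {rate(hired, applied):.0%}")

    # Aggregated across departments, Group A appears favored instead, because
    # the two groups apply in different proportions to departments with very
    # different hiring rates. Same data, opposite story.
    totals = {}
    for groups in applications.values():
        for group, (hired, applied) in groups.items():
            h, a = totals.get(group, (0, 0))
            totals[group] = (h + hired, a + applied)

    for group, (hired, applied) in totals.items():
        print(f"Overall {group}: {rate(hired, applied):.0%}")

Which reading is “true” depends entirely on context the raw numbers don’t carry, which is precisely the sophistication most of us lack.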

The fellow who probably ran up against this difficulty hardest is Charles Murray, in the aftermath of the publication of his book The Bell Curve (1994), which deals with how intelligence manifests differently across demographic groups yet functions as the primary predictor of social outcomes. Murray is particularly well qualified to interpret data and statistics dispassionately, and in true seek-and-find fashion, differences between groups did appear. It is unclear how much his resulting prescriptions for social programs are borne of data vs. ideology, but most of us are completely at sea wading through the issues without the specialized academic training needed to make sense of the evidence.

More recently, another fellow caught in the crosshairs on issues of difference is James Damore, who was fired from his job at Google after writing what is being called an anti-diversity manifesto (though it might be better termed an internal memo) that was leaked and then went viral. The document can be found here. I have not dug deeply into the details, but my impression is that Damore attempted a fairly academic unpacking of the issue of gender differences in the workplace as they conflicted with institutional policy, only to face a hard-set ideology that is more RightThink than truth. In Damore’s case, the truth did set him free — free from employment. Even the NY Times recognizes that the Thought Police sprang into action yet again to demand that their pet illusions about society be supported rather than dispelled. These witch hunts and shaming rituals (vigilante justice carried out in the court of public opinion) are occurring with remarkable regularity.

In a day and age where so much information (too much information, as it turns out) is available to us to guide our thinking, one might hope for careful, rational analysis and critical thinking. However, trends point to the reverse: a return to tribalism, xenophobia, scapegoating, and victimization. There is also a victimization Olympics at work, with identity groups vying for imaginary medals awarded to whoever’s got it worst. I’m no Pollyanna when it comes to the notion that all men are brothers and, shucks, can’t we all just get along? That’s not our nature. But the marked indifference of the natural world to our suffering as it besets us with drought, fire, floods, earthquakes, tsunamis, hurricanes, tornadoes, and the like (and this was just the last week!) might seem like the perfect opportunity to find within ourselves a little grace and recognize our common struggles in the world rather than add to them.

This is the inverse of a prior post called “Truth Based on Fiction.”

Telling stories about ourselves is one of the most basic of human attributes stretching across oral and recorded history. We continue today to memorialize events in short, compact tellings, frequently movies depicting real-life events. I caught two such films recently: Truth (about what came to be known as Rathergate) and Snowden (about whistle-blower Edward Snowden).

Although Dan Rather is the famous figure associated with Truth, the story focuses more on his producer, Mary Mapes, and the group decisions leading to the airing of a controversial news report about George W. Bush’s time in the Air National Guard. The film is a dramatization, not a documentary, and so is free to present the story with its own perspective and some embellishment. Since I’m not a news junkie, my memory of the events surrounding the 2004 controversy is not especially well informed, and I didn’t mind the potential for the movie’s version of events to color my thinking. About some controversies and conspiracies, I feel no particular demand to adopt a strong position. The actors did well enough, but I felt Robert Redford was poorly cast as Dan Rather. Redford is too famous in his own right to succeed as a character actor playing a real-life person.

Debate over the patriotism or treason of Edward Snowden’s actions continues to swirl, but the film covers the issues pretty well, from his discovery of an intelligence services surveillance dragnet (in violation of the 4th Amendment to the U.S. Constitution) to his eventual disclosure of same to a few well-respected journalists. The film’s director and co-screenwriter, Oliver Stone, has made a career out of fiction based on truth, dramatizing many signal events from the nation’s history and repackaging them as entertainment in the process. I’m wary of his interpretations of history when presented in cinematic form, less so his alternative history lessons given as documentary. Unlike with Truth, however, I have clear ideas in my mind regarding Snowden the man and Snowden the movie, so from a different standpoint, I was again unconcerned about potential bias. Joseph Gordon-Levitt does well enough as the titular character, though he doesn’t project nearly the same insight and keen intelligence as Snowden himself does. I suspect the documentary Citizenfour (which I’ve not yet seen), featuring Snowden doing his own talking, is a far better telling of the same episode of history.

In contrast, I have assiduously avoided several other recent films based on actual events. United 93, World Trade Center, and Deepwater Horizon spring to mind, but there are many others. The wounds and controversies stemming from those real-life events still smart too much for me to consider exposing myself to propagandistic historical fictions. Perhaps in a few decades, after living memory of such events has faded or disappeared entirely, such stories can be told effectively, though probably not accurately. A useful comparison might be any one of several films called The Alamo.

Violent events of the past week (Charlottesville, VA; Barcelona, Spain) and political responses to them have dominated the news cycle, pushing other newsworthy items (e.g., U.S.-South Korean war games and a looming debt ceiling crisis) off the front page and into the darker recesses of everyone’s minds (those paying attention, anyway). We’re absorbed instead with culture wars run amok. I’m loath to apply the term terrorism to regular, periodic eruptions of violence, both domestic and foreign. That term carries with it intent, namely, the objective to create day-to-day terror in the minds of a population so as to interfere with the proper functioning of society. It’s unclear to me whether recent perpetrators of violence are coherent enough to formulate sophisticated motivations or plans. The dumb, obvious way of doing things — driving into crowds of people — takes little or no planning and may just as well be the result of inchoate rage boiling over in a moment of high stress and opportunity. Of course, it needn’t be all or nothing, and considering our reflexively disproportionate responses, the term terrorism and its attendant destabilization is arguably accurate even without specified intent. That’s why, in the wake of 9/11 some 16 years ago, the U.S. has become a security state.

It’s beyond evident that hostilities have been simmering below the not-so-calm surface. Many of those hostilities, typically borne of economic woes but also part of a larger clash of civilizations, take the form of identifying an “other” presumably responsible for one’s difficulties and then victimizing the “other” in order to elevate oneself. Of course, the “other” isn’t truly responsible for one’s struggles, so the violent dance doesn’t actually elevate anyone, as in “supremacy”; it just wrecks both sides (though unevenly). Such warped thinking seems to be a permanent feature of human psychology and enjoys popular acceptance when the right “other” is selected and universal condemnation when the wrong one is chosen. Those doing the choosing and those being chosen haven’t changed much over the centuries. Historically, Anglo-Saxons and Teutons have done the choosing, and people of color (of all types) have been chosen. Jews are also chosen with dispiriting regularity, which is an ironic inversion of being the Chosen People (if you believe in such things — I don’t). However, any group can succumb to this distorted power move, which is why so much ongoing, regional, internecine conflict exists.

As I’ve been saying for years, a combination of condemnation and RightThink has simultaneously freed some people from this cycle of violence but merely driven the holdouts underground. Supremacy in its various forms (nationalism, racism, antisemitism, etc.) has never truly been expunged. RightThink itself has morphed (predictably) into intolerance, which is now veering toward radicalism. Perhaps a positive outcome of this latest resurgence of supremacist ideology is that those infected with the character distortion have been emboldened to identify themselves publicly and thus can be dealt with somehow. Civil authorities and thought leaders are not very good at dealing with hate, often shutting people out of the necessary public conversation and/or seeking to legislate hate out of existence with restrictions on free speech. But it is precisely through free expression and diplomacy that we address conflict. Violence is a failure to remain civil (duh!), and war (especially the genocidal sort) is the extreme instance. It remains to be seen if the lid can be kept on this boiling pot, but considering cascade failures lined up to occur within the foreseeable future, I’m pessimistic that we can see our way past the destructive habit of shifting blame onto others who often suffer even worse than those holding the reins of power.

Previous posts on this topic are here and here.

Updates to the Bulletin of the Atomic Scientists resetting the hands of the metaphorical Doomsday Clock used to appear at intervals of 3–7 years. Updates have been issued in each of the last three years, though the clock hands remained in the same position from 2015 to 2016. Does that suggest heightened geopolitical instability or merely renewed paranoia resulting from the instantaneous news cycle and the radicalization of society and politics? The 2017 update moves the minute hand slightly forward, to 2½ minutes to midnight:

For the last two years, the minute hand of the Doomsday Clock stayed set at three minutes before the hour, the closest it had been to midnight since the early 1980s. In its two most recent annual announcements on the Clock, the Science and Security Board warned: “The probability of global catastrophe is very high, and the actions needed to reduce the risks of disaster must be taken very soon.” In 2017, we find the danger to be even greater, the need for action more urgent. It is two and a half minutes to midnight, the Clock is ticking, global danger looms. Wise public officials should act immediately, guiding humanity away from the brink. If they do not, wise citizens must step forward and lead the way …

The principal concern of the Bulletin since its creation has been atomic/nuclear war. Recent updates include climate change in the mix. Perhaps it is not necessary to remind regular readers here, but the timescales for these two threats are quite different: global thermonuclear war (a term from the 1980s, when superpowers last got weird and paranoid about things) could erupt almost immediately given the right provocation or lunacy, such as the sabre-rattling now underway between the U.S. and North Korea, whereas climate change typically unfolds across geological time. The transition that usually takes millions of years to manifest fully and settle into a new steady state (hothouse earth vs. ice age earth), however, appears to have been compressed by human inputs (anthropogenic climate change, or as Guy McPherson calls it, abrupt climate change) into only a few centuries.

Nuclear arsenals around the world are the subject of a curious article at Visual Capitalist (including several reader-friendly infographics) by Nick Routley. The estimated number of weapons in the U.S. arsenal has risen since the last time I blogged about this in 2010. I still find it impossible to fathom why more than a dozen nukes are necessary, or in my more charitable moments toward the world’s inhabitants, why any of them are necessary. Most sober analysts believe we are far safer today than, say, the 1950s and early 1960s when brinkmanship was anybody’s game. I find this difficult to judge considering the two main actors today on the geopolitical stage are both witless, unpredictable, narcissistic maniacs. Moreover, the possibility of some ideologue (religious or otherwise) getting hold of WMDs (not necessarily nukes) and creating mayhem is increasing as the democratization of production filters immense power down to lower and lower elements of society. I for one don’t feel especially safe.

Allow me to propose a hypothetical, to conduct a thought experiment if you will.

Let’s say that the powers that be, our governmental and corporate overlords, have been fully aware and convinced of impending disaster for some time, decades even. What to do with that burdensome information? How to prepare the public or themselves? Make the truth openly public and possibly spark a global panic or bury the information, denying and obfuscating when news eventually got out? Let’s say that, early on, the decision was made to bury the information and keep plodding through a few more blissfully ignorant decades as though nothing were amiss. After all, prophecies of disaster, extrapolating simple trend lines (such as population growth), were not uncommon as early as the 18th and 19th centuries. Science had made sufficient progress by the 1970s to recognize without much controversy that problems with industrial civilization were brewing and would soon overflow, overtaking our ability to maintain control over the processes we set in motion or indeed ourselves. Thus, at the intuitive level of deep culture, we initiated the ecology movement, the predecessor of environmentalism, and experienced the (first) international oil crisis. The decision to bury the prognosis for civilization (doom!) resulted in keeping a lid on things until the information swung fully into public view in the middle 2000s (the decade, not the century), thanks to a variety of scientists not among the power elite who sounded the alarms anew. At that point, obfuscation and disinformation became the dominant strategies.

Meanwhile, to keep the lights on and the store shelves stocked, the powers that be launched a campaign of massive debt spending, stealing from a future we would never reach anyway, and even dabbled in modest terraforming to forestall the worst by spraying chemicals into the atmosphere, creating global dimming. This program, like many others, was denied and recast as conspiracy theory (chemtrails vs. contrails), enabling the public to ignore the obvious evidence of climate change and the resulting slo-mo environmental collapse. Public uprising and outrage were easily quelled with essentially the same bread and circuses in which the Classical Romans indulged as their empire was in the midst of a protracted collapse. Modern global industrial empire will not experience the same centuries-long disintegration.

Now, I’ll admit, I don’t actually believe much of this. As with most conspiracies, this hypothetical doesn’t pass the straight-face test. Nor do the powers that be demonstrate competence sufficient to pull off even routine programs, much less extravagant ones. However, elements are undoubtedly true, such as the knowledge that energy policy and resources simply won’t meet anticipated demand with global population still swelling out of control. Neither will food production. Rather than make the difficult and questionable philosophical decision to serve the public interest by hiding the truth and keeping modern civilization going until the breaking point of a hard crash, at which point few would survive (or want to), the easy decision was probably made to ignore and obfuscate the truth, do nothing to keep the worst ravages of global industry from hastening our demise, and gather to themselves all financial resources, leaving everyone else in the lurch. The two basic options are concern for everyone’s wellbeing over time vs. one’s own position in the short term.

In case the denial and obfuscation have worked on you, the reader of this doom blog, please consider (if you dare) this lengthy article at New York Magazine called “The Uninhabitable Earth” by David Wallace-Wells. Its headings are these:

  1. “Doomsday”
  2. Heat Death
  3. The End of Food
  4. Climate Plagues
  5. Unbreathable Air
  6. Perpetual War
  7. Permanent Economic Collapse
  8. Poisoned Oceans
  9. The Great Filter

No one writes this stuff just to scare the public and get attention. Rather, it’s about telling the truth and whistle-blowing. While captains of industry and kings of the realm slumber, fattened and self-satisfied upon their beds, at least some of the rest of us recognize that the future is barrelling at us with the same indifference to human wellbeing (or the natural world) that our leaders have shown.

According to Hal Smith of The Compulsive Explainer (see my blogroll), the tragedy of our time is, simply put, failed social engineering. Most of his blog post is quoted below:

Americans, for example, have decided to let other forces manage their nation — and not let Americans themselves manage it. At least this is what I see happening, with the election of Trump. They have handed the management of their country over to a man with a collection of wacky ideas — and they feel comfortable with this. Mismanagement is going on everywhere — and why not include the government in this?

This is typical behavior for a successful society in decline. They cannot see what made them successful, has been taken too far — and is now working against them. The sensible thing for them to do is back off for awhile, analyze their situation — and ask “What is going wrong here?” But they never do this — and a collapse ensues.

In our present case, the collapse involves a global society based on Capitalism — that cannot adapt itself to a Computer-based economy. The Software ecosystem operates differently — it is based on cooperation, not competition.

Capitalism was based on just that — Capital — money making money. And it was very successful — for those it favored. Money is still important in the Computer economy — people still have to be paid. But what they are being paid for has changed — information is now being managed, something different entirely.

Hardware is still important — but that is not where the Big Money is being made. It is now being made in Computers, and their Software.

I’m sympathetic to this view but believe that a look back through history reveals something other than missed opportunities and too-slow adaptation as we fumbled our way forward, namely, repeated catastrophic failures. Such epic fails include regional and global wars, genocides, and societal collapses that rise well above the rather bland term mismanagement. A really dour view of history, taking into account more than a few truly vicious, barbaric episodes, might regard the world as a nearly continuous stage of horrors from which we periodically take refuge, and the last of these phases is drawing quickly to a close.

The breakneck speed of technological innovation and information exchange has resulted not in Fukuyama’s mistakenly exuberant “end of history” (kinda-sorta winning the Cold War but nevertheless losing the peace?) but instead in an epoch where humans are frankly left behind by the follow-on effects of their own unrestrained restlessness. Further, if history is a stage of horrors, then geopolitics is theater of the absurd. News reports throughout the new U.S. presidential administration, still less than 6 months in (more precisely, 161 days or 23 weeks), tell of massive economic and geopolitical instabilities threatening to collapse the house of cards with only a slight breeze. The contortions that press agents and politicized news organs go through to provide cover for the tweets, lies, and inanities emanating from the disturbed mind of 45 are carnival freak-show acts. Admittedly, not much has changed over the previous two administrations — alterations of degree only, not kind — except perhaps to demonstrate beyond any shadow of doubt that our elected, appointed, and purchased leaders (acknowledging many paths to power) are fundamentally incompetent to deal effectively with human affairs, much less enact social engineering projects beyond the false happiness of Facebook algorithms that hide bad news. Look no further than the egregious awfulness of both presidential candidates in the last election, coughed up like hairballs from the mouths of their respective parties. The aftermath of those institutional failures finds both major parties in shambles, further degraded than their already deplorable states prior to the election.

So how much worse can things get? Well, scary as it sounds, lots. The electrical grid is still working, water is still flowing to the taps, and supply lines continue to keep store shelves stocked with booze and brats for extravagant holiday celebrations. More importantly, we in the U.S. have (for now, unlike Europe) avoided repetition of any major terrorist attacks. But everyone with an honest ear to the ground recognizes our current condition as the proverbial calm before the storm. For instance, we’re threatened by the first ice-free Arctic in the history of mankind later this year and a giant cleaving off of the Larsen C Ice Shelf in Antarctica within days. In addition, drought in the Dakotas will result in a failed wheat harvest. Guy McPherson (in particular, though he may well not be alone) has been predicting for years that abrupt, nonlinear climate change as the poles warm will end the ability to grow grain at scale, leading to worldwide famine, collapse, and near-term extinction. It seems we’re passing the knee of the curve, which makes concerns about maladaptation and failed social engineering pale by comparison.

An old Star Trek episode called “A Taste of Armageddon” depicts Capt. Kirk and crew confronting a planetary culture that has adopted purely administrative warfare with a nearby planet, where computer simulations determine the outcomes of battles and citizens/inhabitants are notified to report to disintegration chambers for their destruction to comply with those outcomes. Narrative resolution is tidied up within the roughly 1-hour span of the episode, of course, but it was and is nonetheless a thought-provoking scenario. The episode, now 50 years old, prophesies a hyper-rational approach to conflict. (I was 4 years old when it aired on broadcast television, and I don’t recall having seen it since. Goes to show how influential high-concept storytelling can be even on someone quite young.) The episode came to mind as I happened across video showing how robot soldiers are being developed to supplement and eventually replace human combatants. See, for example, this:

The robot in the video above is not overtly militarized, but there is no doubt that it could be. Why the robot takes a bipedal, humanoid form with an awkwardly high center of gravity is unclear to me beyond our obvious self-infatuation. Additional videos with two-wheeled, quadruped, and even insect-like multilegged designs having much improved movement and flexibility can be found with a simple search. Any of them can be transformed into ground-based killing machines, as suggested more manifestly in the video below highlighting various walking, rolling, flying, floating, and swimming machines developed to do our dirty work:
