Archive for the ‘History’ Category

From Wikipedia:

Trial by combat (also wager of battle, trial by battle or judicial duel) was a method of Germanic law to settle accusations in the absence of witnesses or a confession in which two parties in dispute fought in single combat; the winner of the fight was proclaimed to be right. In essence, it was a judicially sanctioned duel. It remained in use throughout the European Middle Ages, gradually disappearing in the course of the 16th century.

Unlike trial by ordeal in general, which is known to many cultures worldwide, trial by combat is known primarily from the customs of the Germanic peoples. It was in use among the ancient Burgundians, Ripuarian Franks, Alamans, Lombards, and Swedes. It was unknown in Anglo-Saxon law, Roman law and Irish Brehon Law and it does not figure in the traditions of Middle Eastern antiquity such as the code of Hammurabi or the Torah.

Trial by combat has profound echoes in 21st-century geopolitics and jurisprudence. Familiar phrases such as right of conquest, manifest destiny, to the victor go the spoils, might makes right, and history written by the victors attest to the enduring legacy of hindsight justification by force of arms. More broadly, within the American system, the right of access to courts afforded to all citizens also admits nuisance suits and more than a few mismatched battles where deep-pocketed corporations sue individuals and small organizations, often nonprofits, into bankruptcy and submission. For instance, I recently learned of Strategic Lawsuits Against Public Participation (SLAPPs) “used to silence and harass critics by forcing them to spend money to defend these baseless suits.” They employ brute economic power in place of force of arms.

Trial by combat fell out of practice with the onset of the Enlightenment, but the broader complex of ideas survived. Interest in medieval Europe as storytelling fodder in cinema and fantasy literature (notably, the shocking trial by combat depicted in the extremely popular HBO drama Game of Thrones, where the accused and accuser both designate proxies rather than doing battle themselves) lends legitimacy to settling disputes via violence. Even the original Karate Kid (1984) has a new YouTube Red series set 30 years later. The bad-boy acolyte replaces his scorched-earth sensei and seeks revenge on the titular character for being bested decades before; the latter is yanked back from quiet obscurity (and the actor who portrays him from career limbo) to fight again and re-prove his skills, which is to say, his righteousness. The set-up is surprisingly delicious to contemplate and has considerable nostalgic appeal. More importantly, it embodies the notion (no doubt scripted according to cliché) that only the pure of heart (or their proxies, students in this case) can claim ultimate victory because, well, it’s god’s will or some such, and thus good guys must always win. What that really means is that whoever wins is by definition virtuous. If only reality were so reliably simple.

The certainty of various religious dogmas and codes of conduct characteristic of the medieval period (e.g., chivalry) is especially seductive in modern times, considering how the public is beset by an extraordinary degree of existential and epistemological uncertainty. The naturalistic fallacy is also invoked, where the law of the jungle (only the fittest and/or strongest get to eat or indeed survive) substitutes for more civilized (i.e., enlightened and equanimous) thinking. Further, despite protestations, this complex of ideas legitimizes bullying, whether (1) in the schoolyard, with the principal bully flanked by underlings picking on vulnerable weaklings who haven’t formed alliances for self-protection, (2) in the workplace, with its power players and Machiavellian manipulators, or (3) on the world stage, with a global military power such as the U.S. dictating terms to and/or warring with smaller, weaker nations that lack the GDP, population, and insane will to project power globally. I daresay most Americans take comfort in having the greatest military and arsenal ever mustered on their side and accordingly being on the right side (the victorious one) of history, thus a beacon of hope to all who would conflate victory with virtue. Those who suffer at our hands must understand things quite differently. (Isn’t it more accurate that when bad guys win, rebellions and insurgencies are sparked?)

One remarkable exception deserves notice. The U.S. presidency is among the most heavily scrutinized and contentious positions (always under attack) and happens to be the Commander-in-Chief of the self-same greatest goddamn fighting force known to man. It’s no secret that the occupant of that office (45) is also widely recognized as the Bully-in-Chief. Despite having at his disposal considerable resources — military, executive staff, and otherwise — 45 has eschewed forming the political coalitions one might expect and essentially gone it alone, using the office (and his Twitter account) as a one-man bully pulpit. Hard to say what he’s trying to accomplish, really. Detractors have banded together (incompetently) to oppose him, but 45 has demonstrated unexpected tenacity, handily dominating rhetorical trials by combat through sheer bluster and hubris. On balance, he scores some pretty good hits, too. (The proposed fist fight between 45 and Joe Biden turned out to be a tease, but how entertaining would that bout have been without actually settling anything!) This pattern has left many quite dumbfounded, and I admit to being astounded as well except to observe that rank stupidity beats everything in this bizarre political rock-paper-scissors contest. How quintessentially American: nuthin’ beats stoopid.


I’ve been modestly puzzled of late to observe that, on the one hand, those in the U.S. and Canada who have only just reached the age of majority (a/k/a the threshold of adulthood, which is not strictly the same as “the age of sexual consent, marriageable age, school leaving age, drinking age, driving age, voting age, smoking age, gambling age, etc.” according to the link) are disregarded with respect to some political activism while, on the other hand, they’re admired for other political activism. Whether young adults are to be taken seriously seems to be issue-specific. If one is agitating for some aspect of identity politics, or as a Social Justice Warrior (SJW), one can be discredited as simply being too young to understand things properly, whereas advocating gun control (e.g., in the wake of the Parkland, Florida shootings in February) is recognized as well within a youthful mandate. Survivors of violence and mayhem seem to be uniquely immune to gun advocates trotting out the meme “now is not the time.”

As it happens, I agree that identity politics is a load of horseshit and tighter gun control (no, not taking away everyone’s guns totally) needs to be tried. But I haven’t arrived at either position because youth are either too youthful or wizened enough by horrific experience to understand. Hanging one’s positions on the (dis)qualification of age is a red herring, a meaningless distraction from the issues themselves. Rather, if thoughtful consideration is applied to the day’s issues, which I daresay is not an easy prospect, one should ideally arrive at positions based on any number of criteria, some of which may conflict with others. For instance, I used to be okay (not an enthusiastic supporter, mind you) with the death penalty on a number of grounds but changed my opinion for purely pragmatic reasons. The sheer cost of automatic appeals and other safeguards to ensure that innocents are not wrongly convicted and executed, a cost borne by U.S. taxpayers, is so onerous that to prosecute through to execution looks less like justice and more like maniacal vengeance. Life in prison without the possibility of parole is a much saner and less costly project in comparison.

With intractable debates and divisive issues (e.g., abortion, free speech, right to bear arms, immigration, religion, Israel/Palestine conflict, euthanasia, etc.) plaguing public life, one might wonder: how do we get everyone on board? Alternatively, how do we at least agree to be civil in spite of our disagreements? I have two replies but no solutions. The first is to recognize that some issues are indeed intractable and insoluble, so graceful acceptance that an opposing opinion or perspective will always be present is needed lest one twist and writhe inconsolably when one’s cherished perspective is not held universally. That’s not necessarily the same as giving up or succumbing to fatalism. Rather, it’s recognition that banging one’s head against certain walls is futile. The second is to recognize that opposing opinions are needed to avoid unhealthy excess in social environments. Put another way, heterodoxy avoids orthodoxy. Many historical practices we now regard as barbaric were abandoned or outlawed precisely because consensus opinion swung from one side to the other. Neil Postman called this a thermostatic response in several of his books. Other barbaric behaviors have been only partially addressed and require further agitation to invalidate fully. I won’t name examples here, but I could compile a list rather quickly.

Speaking of Davos (see previous post), Yuval Noah Harari gave a high-concept presentation at Davos 2018 (embedded below). I’ve been aware of Harari for a while now — at least since the appearance of his book Sapiens (2015) and its follow-up Homo Deus (2017), both of which I’ve yet to read. He provides precisely the sort of thoughtful, provocative content that interests me, yet I’ve not quite known how to respond to him or his ideas. First thing, he’s a historian who makes predictions, or at least extrapolates possible futures based on historical trends. Near as I can tell, he doesn’t resort to chastising audiences along the lines of “those who don’t know history are doomed to repeat it” but rather indulges in a combination of breathless anticipation and fear-mongering at transformations to be expected as technological advances disrupt human society with ever greater impacts. Strangely, Harari is not advocating for anything in particular but trying to map the future.

Harari poses this basic question: “Will the future be human?” I’d say probably not; I’ve concluded that we are busy destroying ourselves and have already crossed the point of no return. Harari apparently believes differently: the rise of the machines is coming, perhaps within a couple of centuries, though it probably won’t resemble Skynet of The Terminator film franchise hellbent on destroying humanity. Rather, it will be some set of advanced algorithms monitoring and channeling human behaviors using Big Data. Or it will be a human-machine hybrid possessing superhuman abilities (physical and cognitive) different enough to be considered a new species, arising for the first time not out of evolutionary processes but from human ingenuity. He expects this new species to diverge from Homo sapiens sapiens and leave us in the evolutionary dust. There is also conjecture that normal sexual reproduction will be supplanted by artificial, asexual reproduction, probably carried out in test tubes using, for example, CRISPR modification of the genome. Well, no fun in that … Finally, he believes some sort of strong AI will appear.

I struggle mightily with these predictions for two primary reasons: (1) we almost certainly lack enough time for technology to mature into implementation before the collapse of industrial civilization wipes us out, and (2) the Transhumanist future he anticipates calls into being (for me at least) a host of dystopian nightmares, only some of which are foreseeable. Harari says flatly at one point that the past is not coming back. Well, it’s entirely possible for civilization to fail and our former material conditions to be reinstated, only worse since we’ve damaged the biosphere so gravely. That just happened in microcosm in Puerto Rico, when its infrastructure was wrecked by a hurricane and the power went out for an extended period of time (still off in some places). What happens when the rescue never appears because logistics are insurmountable? Elon Musk can’t save everyone.

The most basic criticism of economics is the failure to account for externalities. The same criticism applies to futurists. Extending trends as though all things will continue to operate normally is bizarrely idiotic. Major discontinuities appear throughout history. When I observed some while back that history has gone vertical, I included an animation with a graph that goes from horizontal to vertical in an extremely short span of geological time. This trajectory (the familiar hockey stick pointing skyward) has been repeated ad nauseam with an extraordinary number of survival pressures (notably, human population and consumption, including energy) over various time scales. Trends cannot simply continue ascending forever. (Hasn’t Moore’s Law already begun to slope away?) Hard limits must eventually be reached, but since there are no useful precedents for our current civilization, it’s impossible to know quite when or where ceilings loom. What happens after upper limits are found is also completely unknown. Ugo Bardi has a blog describing the Seneca Effect, which projects a rapid falloff after the peak that looks more like a cliff than a gradual, graceful descent, disallowing time to adapt. Sorta like the stock market currently imploding.
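For what it’s worth, the asymmetry Bardi describes is easy to visualize with a toy calculation. The sketch below is mine, not Bardi’s system-dynamics model (which derives the cliff from pollution and complexity feedbacks); it simply constructs two paths with arbitrary rates, one whose decline mirrors its growth and one whose decline is several times steeper, to show how little time the latter leaves for adaptation.

```python
# Toy comparison (not Ugo Bardi's actual model): a symmetric rise-and-decline
# curve versus a "Seneca" curve whose fall is imposed, by construction, to be
# roughly five times steeper than its rise. All rates here are arbitrary.

def exponential_path(rise, fall, peak_step, steps=200, start=1.0):
    """Grow by `rise` per step until `peak_step`, then shrink by `fall` per step."""
    level, path = start, []
    for t in range(steps):
        level *= (1 + rise) if t < peak_step else (1 - fall)
        path.append(level)
    return path

graceful = exponential_path(rise=0.05, fall=0.05, peak_step=100)  # decline mirrors growth
seneca = exponential_path(rise=0.05, fall=0.25, peak_step=100)    # cliff: fall ~5x faster

def steps_to_decline(path, fraction=0.10):
    """Steps after the peak until the path drops below `fraction` of its peak."""
    peak = max(path)
    return next(i for i, v in enumerate(path[path.index(peak):]) if v < fraction * peak)

print(steps_to_decline(graceful), steps_to_decline(seneca))  # roughly 45 vs. 9 steps
```

The numbers themselves mean nothing; the point is only the shape Bardi warns about, where most of the descent is compressed into a small fraction of the time the ascent took.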

Since Harari indulges in rank thought experiments regarding smart algorithms, machine learning, and the supposed emergence of inorganic life in the data stream, I thought I’d pose some of my own questions. Waving away for the moment distinctions between forms of AI, let’s assume that some sort of strong AI does in fact appear. Why on earth would it bother to communicate with us? And if it reproduces and evolves at breakneck speed as some futurists warn, how long before it/they simply ignore us as being unworthy of attention? Being hyper-rational and able to calculate millions of moves ahead (like chess-playing computers), what if they survey the scene and come to David Benatar’s anti-natalist conclusion that it would be better not to have lived and so wink themselves out of existence? Who’s to say that they aren’t already among us, lurking, and we don’t even recognize them (took us quite a long time to recognize bacteria and viruses, and what about undiscovered species)? What if the Singularity has already occurred thousands of times and each time the machine beings killed themselves off without our even knowing? Maybe Harari explores some of these questions in Homo Deus, but I rather doubt it.

Be forewarned: this is long and self-indulgent. Kinda threw everything and the kitchen sink at it.

In the August 2017 issue of Harper’s Magazine, Walter Kirn’s “Easy Chair” column called “Apocalypse Always” revealed his brief, boyhood fascination with dystopian fiction. This genre has been around for a very long time, to which the Cassandra myth attests. Kirn’s column is more concerned with “high mid-twentieth-century dystopian fiction,” which in his view is now classic and canonical, an entire generation of Baby Boomers having been educated in such patterned thought. A new wave of dystopian fiction appeared in the 1990s and yet another more recently in the form of Young Adult novels (and films) that arguably serve better as triumphal coming-of-age stories albeit under dystopian circumstances. Kirn observes a perennial theme present in the genre: the twin disappearances of freedom and information:

In the classic dystopias, which concern themselves with the lack of freedom and not with surplus freedom run amok (the current and unforeseen predicament of many), society is superbly well organized, resembling a kind of hive or factory. People are sorted, classified, and ranked, their individuality suppressed through goon squads, potent narcotics, or breeding programs. Quite often, they wear uniforms, and express themselves, or fail to, in ritual utterance and gestures.

Whether Americans in 2018 resemble hollowed-out zombies suffering under either boot-heel or soft-serve oppression is a good question. Some would argue just that in homage to classic dystopias. Kirn suggests briefly that we might instead suffer from runaway anarchy, where too much freedom and licentiousness have led instead to a chaotic and disorganized society populated by citizens who can neither govern nor restrain themselves.

Disappearance of information might be understood in at least three familiar aspects of narrative framing: what happened to get us to this point (past as exposition, sometimes only hinted at), what the hell is going on (present as conflict and action), and how it gets fixed (future as resolution and denouement). Strict control over information exercised by classic dystopian despots doesn’t track with the conditions under which we now find ourselves, where more disorganized, fraudulent, and degraded information than ever is available alongside small caches of wisdom and understanding buried somewhere in the heap and discoverable only with the benefit of critical thinking flatly lost on at least a couple generations of miseducated graduates. However, a coherent narrative of who and what we are and what realistic prospects the future may hold has not emerged since the stifling version of the 1950s nuclear family and middle-class consumer contentment. Kirn makes this comparison directly, where classic dystopian fiction

focus[es] on bureaucracy, coercion, propaganda, and depersonalization, overstates both the prowess of the hierarchs and the submissiveness of the masses, whom it still thinks of as the masses. It does not contemplate Trump-style charlatanism at the top, or a narcissistic populace that prizes attention over privacy. The threats to individualism are paramount; the scourge of surplus individualism, with everyone playing his own dunce king and slurping up resources until he bursts, goes unexplored.

Kirn’s further observations are worth a look. Go read for yourself.


Societies sometimes employ leveling mechanisms to keep the high and mighty from getting too, well, high and mighty or to pull them back down when they nonetheless manage to scale untenable heights. Some might insist that the U.S. breakaway from the British crown and aristocratic systems in the Revolutionary Era was, among other things, to establish an egalitarian society in accordance with liberal philosophy of the day. This is true to a point, since we in the U.S. don’t have hereditary aristocratic titles, but a less charitable view is that the Founders really only substituted the landed gentry, which is to say themselves, for the tyrannical British. Who scored worse on the tyranny scale is a matter of debate, especially when modern sensibilities are applied to historical practices. Although I don’t generally care for such hindsight moralizing, it’s uncontroversial that the phrase “all men are created equal” (from the U.S. Declaration of Independence) did not then apply, for instance, to slaves and women. We’re still battling to establish equality (a level playing field) among all men and women. For SJWs, the fight has become about equality of outcome (e.g., quotas), which is a perversion of the more reasonable and achievable equality of opportunity.

When and where available resources were more limited, say, in agrarian or subsistence economies, the distance or separation between top and bottom was relatively modest. In a nonresource economy, where activity is financialized and decoupled from productivity (Bitcoin, anyone?), the distance between top and bottom can grow appallingly wide. I suspect that an economist could give a better explanation of this phenomenon than I can, but my suspicion is that it has primarily to do with fiat currency (money issued without sound backing such as precious metals), expansion of credit, and creation of arcane instruments of finance, all of which give rise to an immense bureaucracy of administrative personnel to create, manage, and manipulate them.

The U.S. tax structure of the 1950s — steep taxes levied against the highest earners — was a leveling mechanism. Whether it was intended to correct the excesses of the Jazz Age is beyond my knowledge. However, that progressive tax structure has been dismantled (“leveled,” one might say), shifting from progressive to regressive and now to transgressive. Regressive is where more or disproportionate tax responsibility is borne by those already struggling to satisfy their basic needs. Transgressive is outright punishment of those who fail to earn enough, as though the whip functions as a spur to success. Indeed, as I mentioned in the previous blog post, the mood of the country right now is to abandon and blame those whom financial success has eluded. Though the term debtors’ prison belongs to a bygone era, we still have them, as people are imprisoned over nonviolent infractions such as parking tickets only to have heavy additional administrative fines and fees levied on them, holding them hostage to payment. That’s victimizing the victim, pure and simple.

At the other end of the scale, the superrich ascend a hierarchy that is absurdly imbalanced since leveling mechanisms are no longer present. Of course, disdain of the nouveau riche exists, primarily because social training does not typically accompany amassing of new fortunes, allowing many of that cohort to be amazingly gauche and intransigently proud of it (names withheld). That disdain is especially the prerogative of those whose wealth is inherited, not the masses, but is not an effective leveling mechanism. If one is rich, famous, and charming enough, indulgences for bad or criminal behavior are commonplace. For instance, those convicted of major financial crime in the past decade are quite few, whereas beneficiaries (multimillionaires) of looting of the U.S. Treasury are many. One very recent exception to indulgences is the wave of people being accused of sexual misconduct, but I daresay the motivation is unrelated to that of standard leveling mechanisms. Rather, it’s moral panic resulting from strains being felt throughout society having to do with sexual orientation and identity.

When the superrich ascend into the billionaire class, they tend to behave supranationally: buying private islands or yachts outside the jurisdiction or control of nation states, becoming nominal residents of the most advantageous tax havens, and shielding themselves from the rabble. While this brand of anarchism may be attractive to some and justified to others, detaching from social hierarchies and abandoning or ignoring others in need once one’s own fortunes are secure is questionable behavior to say the least. Indeed, those of such special character are typically focal points of violence and mayhem when the lives of the masses become too intolerable. That target on one’s back can be ignored or forestalled for a long time, perhaps, but the eventuality of nasty blowback is virtually guaranteed. That’s the final leveling mechanism seen throughout history.

Here’s the last interesting bit I am lifting from Anthony Giddens’ The Consequences of Modernity. Then I will be done with this particular book-blogging project. As part of Giddens’ discussion of the risk profile of modernity, he characterizes risk as either objective or perceived and further divides it into seven categories:

  1. globalization of risk (intensity)
  2. globalization of risk (frequency)
  3. environmental risk
  4. institutionalized risk
  5. knowledge gaps and uncertainty
  6. collective or shared risk
  7. limitations of expertise

Some overlap exists, and I will not distinguish them further. The first two are of primary significance today for obvious reasons. Although the specter of doomsday resulting from a nuclear exchange has been present since the 1950s, Giddens (writing in 1988) provides this snapshot of today’s issues:

The sheer number of serious risks in respect of socialised nature is quite daunting: radiation from major accidents at nuclear power-stations or from nuclear waste; chemical pollution of the seas sufficient to destroy the phytoplankton that renews much of the oxygen in the atmosphere; a “greenhouse effect” deriving from atmospheric pollutants which attack the ozone layer, melting part of the ice caps and flooding vast areas; the destruction of large areas of rain forest which are a basic source of renewable oxygen; and the exhaustion of millions of acres of topsoil as a result of widespread use of artificial fertilisers. [p. 127]

As I often point out, these dangers were known 30–40 years ago (in truth, much longer), but they have only worsened with time through political inaction and/or social inertia. After I began to investigate and better understand the issues roughly a decade ago, I came to the conclusion that the window of opportunity to address these risks and their delayed effects had already closed. In short, we’re doomed and living on borrowed time as the inevitable consequences of our actions slowly but steadily manifest in the world.

So here’s the really interesting part. The modern worldview bestows confidence borne out of expanding mastery of the built environment, where risk is managed and reduced through expert systems. Mechanical and engineering knowledge figure prominently and support a cause-and-effect mentality that has grown ubiquitous in the computing era, with its push-button inputs and outputs. However, the high modern outlook is marred by overconfidence in our competence to avoid disaster, often of our own making. Consider the abject failure of 20th-century institutions to handle geopolitical conflict without devolving into world war and multiple genocides. Or witness periodic crashes of financial markets, two major nuclear accidents, and two space shuttles and numerous rockets destroyed. Though all entail risk, high-profile failures showcase our overconfidence. Right now, engineers (software and hardware) are confident they can deliver safe self-driving vehicles yet are blithely ignoring (says me, maybe not) major ethical dilemmas regarding liability and technological unemployment. Those are apparently problems for someone else to solve.

Since the start of the Industrial Revolution, we’ve barrelled headlong into one sort of risk after another, some recognized at the time, others only apparent after the fact. Nuclear weapons are the best example, but many others exist. The one I raise frequently is the live social experiment undertaken with each new communications technology (radio, cinema, telephone, television, computer, social networks) that upsets and destabilizes social dynamics. The current ruckus fomented by the radical left (especially in the academy but now infecting other environments) regarding silencing of free speech (thus, thought policing) is arguably one concomitant.

According to Giddens, the character of modern risk contrasts with that of the premodern. The scale of risk prior to the 17th century was contained, and the expectation of social continuity was strong. Risk was also transmuted through magical thinking (superstition, religion, ignorance, wishfulness) into providential fortuna or mere bad luck, which led to feelings of relative security rather than despair. Modern risk has now grown so widespread, consequential, and soul-destroying, and sits at such considerable remove, that it breeds helplessness and hopelessness; those not numbed by the litany of potential worries afflicting daily life (existential angst or ontological insecurity) often develop depression and other psychological compulsions and disturbances. Most of us, if aware of globalized risk, set it aside so that we can function and move forward in life. Giddens says that this conjures up anew a sense of fortuna, that our fate is no longer within our control. This

relieves the individual of the burden of engagement with an existential situation which might otherwise be chronically disturbing. Fate, a feeling that things will take their own course anyway, thus reappears at the core of a world which is supposedly taking rational control of its own affairs. Moreover, this surely exacts a price on the level of the unconscious, since it essentially presumes the repression of anxiety. The sense of dread which is the antithesis of basic trust is likely to infuse unconscious sentiments about the uncertainties faced by humanity as a whole. [p. 133]

In effect, the nature of risk has come full circle (completed a revolution, thus, revolutionized risk) from fate to confidence in expert control and back to fate. Of course, a flexibility of perspective is typical as the situation demands — it’s not all or nothing — but the overarching character is clear. Giddens also provides this quote by Susan Sontag that captures what he calls the low-probability, high-consequence character of modern risk:

A permanent modern scenario: apocalypse looms — and it doesn’t occur. And still it looms … Apocalypse is now a long-running serial: not ‘Apocalypse Now,’ but ‘Apocalypse from now on.’ [p. 134]

The scandal surrounding Harvey Weinstein and all the people he harassed, bullied, assaulted, molested, and raped has provided occasion for many who had dealings with him to revisit their experiences and wonder what might have been (or not been) had things gone differently, had they acted otherwise in response to his oafish predations. I judge it’s nearly impossible for those outside the Hollywood scene to understand fully the stakes involved (and thus the distorted psychology), but on the other hand, nearly everyone has experience with power imbalances that enable some to get away with exploiting and victimizing others. And because American culture responds to tragedies like a bunch of rubberneckers, the witch hunt has likely only just begun. There’s a better than average chance that, as with icebergs, the significantly larger portion of the problem lies hidden below the surface, as yet undisclosed. Clamor won’t alter much in the end; the dynamics are too ingrained. Still, expect accusations to fly all over the industry, including victim blaming. My strong suspicion is that some folks dodged (actively or passively) becoming victims and paid a price in terms of career success, whereas others fell prey or simply went along (and then stayed largely silent until now) and got some extra consideration out of it. Either way, it undermines one’s sense of self-worth, messing with one’s head for years afterwards. Sometimes there’s no escaping awful circumstance.

Life is messy, right? We all have episodes from our past that we wish we could undo. Hindsight makes the optimal path far more clear than in the moment. Fortunately, I have no crimes among my regrets, but with certain losses, I certainly wish I had known then what I know now (a logical fallacy). Strange that the news cycle has me revisiting my own critical turning points in sympathy with others undoubtedly doing the same.

As I generalize this thought process, I can’t help but wonder as well what might have been had we not, say, (1) split the atom and immediately weaponized the technology, (2) succumbed to various Red Scares scattered around 20th- and 21st-century calendars but instead developed a progressive society worthy of the promise our institutions once embodied, (3) plunged forward out of avarice and shortsightedness by plundering the Earth, and (4) failed to reverse course once the logical conclusion to our aggregate effects on the biosphere was recognized. No utopia would have arisen had we dodged these bullets, of course, but the affairs of men would have been marginally improved, and we might even have survived the 21st century. Such thinking is purely hypothetical and invites a fatalist like me to wonder whether — given our frailty, weakness, and corruption (the human condition being akin to original sin) — we don’t already inhabit the best of all possible worlds.

Isn’t that a horrible thought? A world full of suffering and hardship, serial rapists and murderers, incompetent and venal political leaders, and torture and genocides is the best we can do? We can’t avoid our own worst tendencies? Over long spans of time, cataclysmic earthquakes, volcanic eruptions, superstorms, and meteor strikes already make life on Earth rather precarious, considering that over 99% of all species that once existed are now gone. On balance, we have some remarkable accomplishments, though often purchased with sizeable trade-offs (e.g., slave labor, patriarchal suppression). Still, into the dustbin of history is where we are headed rather sooner than later, having enjoyed only a brief moment in the sun.

Reading further into Anthony Giddens’ book The Consequences of Modernity, I got a fuller (though still incomplete) sense of what is meant by his terms disembedding mechanisms, expert systems, and symbolic tokens, all of which disrupt time and space as formerly understood in traditional societies that enjoyed the benefit of centuries of continuity. I’ve been aware of analyses regarding, for instance, the sociology of money and the widespread effects of the introduction and adoption of mechanical clocks and timepieces. While most understand these developments superficially as unallayed progress, Giddens argues that they do in fact reorder our experience in the world away from an organic, immediate orientation toward an intellectualized adherence to distant, abstract, self-reinforcing (reflexive) mechanisms.

But those matters are not really what this blog post is about. Rather, this passage sparked my interest:

… when the claims of reason replaced those of tradition, they appeared to offer a sense of certitude greater than that provided by preexisting dogma. But this idea only appears persuasive so long as we do not see that the reflexivity of modernity actually subverts reason, at any rate where reason is understood as the gaining of certain knowledge … We are abroad in a world which is thoroughly constituted through reflexively applied knowledge, but where at the same time we can never be sure that any given element of that knowledge will not be revised. [p. 39]

Put another way, science and reason are axiomatically open to examination, challenge, and revision and often undergo disruptive change. That’s what is meant by Karl Popper’s phrase “all science rests upon shifting sand” and informs the central thesis of Thomas Kuhn’s well-known book The Structure of Scientific Revolutions. It’s not the narrow details that shift so much (hard sciences lead pretty reliably to applied engineering) as the overarching narrative, e.g., the story of the Earth, the cosmos, and ourselves as revealed through scientific inquiry and close examination. Historically, the absolute certainty of the medieval church, while not especially accurate in either details or narrative, yielded considerable authority to post-Enlightenment science and reason, which themselves continue to shift periodically.

Some of those paradigm shifts are so boggling and beyond the ken of the average thinker (including many college-educated folks) that our epistemology is now in crisis. Even the hard facts — like the age and shape of the Earth or its orbital relationship to other solar bodies — are hotly contested by some and blithely misunderstood by others. One doesn’t have to get bogged down in the vagaries of relativity, nuclear power and weapons, or quantum theory to lose the thread of what it means to live in the 21st century. Softer sciences such as psychology, anthropology, economics, and even history now deliver new discoveries and (re-)interpretations of facts so rapidly, like the dizzying pace of technological change, that philosophical systems are unmoored and struggling for legitimacy. For instance, earlier this year, a human fossil was found in Morocco that upended our previous knowledge of human evolution (redating the first appearance of biologically modern humans about 100,000 years earlier). More popularly, dieticians still disagree on what sorts of foods are healthy for most of us (though we can probably all agree that excess sugar is bad). Other recent developments include the misguided insistence among some neurobiologists and theorists that consciousness, free will, and the self do not exist (I’ll have a new post regarding that topic as time allows) and outright attacks on religion not just for being in error but for being the source of evil.

I have a hard time imagining other developments in 21st-century intellectual thought that would shake the foundations of our cosmology any more furiously than what we’re now experiencing. Even the dawning realization that we’ve essentially killed ourselves (with delayed effect) by gradually though consistently laying waste to our own habitat is more of an “oops” than the mind-blowing moment of waking up from The Matrix to discover the unreality of everything once believed. Of course, for fervent believers especially, the true facts (best as we can know them, since knowledge is forever provisional) are largely irrelevant in light of desire (what one wants to believe), and that’s true for people on both sides of the schism between church and science/reason.

As Shakespeare wrote in Hamlet, “There are more things in heaven and earth, Horatio, / Than are dreamt of in your philosophy.” So it’s probably wrong to introduce a false dualism, though it has plenty of historical precedent. I’ll suggest instead that there are more facets and worldviews at play in the world than the two that have been warring in the West for the last 600 years.

The storms referenced in the earlier version of this post were civilization-ending cataclysms. The North American hurricanes and earthquakes of September 2017 were natural disasters. I would say that September was unprecedented, but reliable weather records do not extend very far back in human history, and the geological record extending into human prehistory suggests that, except perhaps for their concentration within the span of a month, the latest storms are nothing out of the ordinary. Some have even theorized that hurricanes and earthquakes could be interrelated. In the wider context of weather history, this brief period of destructive activity may still be rather mild. Already in the last twenty years we’ve experienced a series of 50-, 100- and 500-year weather events that would suggest exactly what climate scientists have been saying, namely, that higher global average temperatures and more atmospheric moisture will lead to more activity in the category of superstorms. Throw drought, flood, and desertification into the mix. This (or worse, frankly) may have been the old normal when global average temperatures were several degrees warmer during periods of hothouse earth. All indications are that we’re leaving behind garden earth, the climate steady state (with a relatively narrow band of global temperature variance) enjoyed for roughly 12,000 years.

Our response to the latest line of hurricanes that struck the Gulf, Florida, and the Caribbean has been characterized as a little tepid considering we had the experience of Katrina from which to learn and prepare, but I’m not so sure. True, hurricanes can be seen hundreds of miles and days away, allowing folks opportunity to either batten down the hatches or flee the area, but we have never been able to handle mass exodus, typically via automobile, and the sheer destructive force of the storms overwhelms most preparations and delays response. So after Katrina, it appeared for several days that the federal government’s response was basically this: you’re on your own; that apparent response occurred again especially in Puerto Rico, which like New Orleans quickly devolved into a true humanitarian crisis (and is not yet over). Our finding (in a more charitable assessment on my part) is that despite foreknowledge of the event and past experience with similar events, we can’t simply swoop in and smooth things out after the storms. Even the first steps of recovery take time.

I’ve cautioned that rebuilding on the same sites, with the reasonable expectation of repeat catastrophes in a destabilized climate that will spawn superstorms reducing entire cities to garbage heaps, is a poor option. No doubt we’ll do it anyway, at least partially; it’s already well underway in Houston. I’ve also cautioned that we need to brace for a diaspora as climate refugees abandon destroyed and inundated cities and regions. It’s already underway with respect to Puerto Rico. This is a storm of an entirely different sort (a flood, actually) and can also be seen from hundreds of miles and weeks, months, years away. And like superstorms, a diaspora from the coasts, because of the overwhelming force and humanitarian crisis it represents, is not something for which we can prepare adequately. Still, we know it’s coming, like a 20- or 50-year flood.

This is the inverse of a prior post called “Truth Based on Fiction.”

Telling stories about ourselves is one of the most basic of human attributes stretching across oral and recorded history. We continue today to memorialize events in short, compact tellings, frequently movies depicting real-life events. I caught two such films recently: Truth (about what came to be known as Rathergate) and Snowden (about whistle-blower Edward Snowden).

Although Dan Rather is the famous figure associated with Truth, the story focuses more on his producer Mary Mapes and the group decisions leading to the airing of a controversial news report about George W. Bush’s time in the Air National Guard. The film is a dramatization, not a documentary, and so is free to present the story with its own perspective and some embellishment. Since I’m not a news junkie, my memory of the events in 2004 surrounding the controversy is not especially well informed, and I didn’t mind the potential for the movie’s version of events to color my thinking. About some controversies and conspiracies, I feel no particular demand to adopt a strong position. The actors did well enough, but I felt Robert Redford was poorly cast as Dan Rather. Redford is too famous in his own right to succeed as a character actor playing a real-life person.

Debate over the patriotism or treason of Edward Snowden’s actions continues to swirl, but the film covers the issues pretty well, from his discovery of an intelligence services surveillance dragnet (in violation of the 4th Amendment to the U.S. Constitution) to his eventual disclosure of same to a few well-respected journalists. The film’s director and joint screenwriter, Oliver Stone, has made a career out of fiction based on truth, dramatizing many signal events from the nation’s history, repackaging them as entertainment in the process. I’m wary of his interpretations of history when presented in cinematic form, less so his alternative history lessons given as documentary. Unlike with Truth, however, I have clear ideas in my mind regarding Snowden the man and Snowden the movie, so, from a different standpoint, I was again unconcerned about potential bias. Joseph Gordon-Levitt does well enough as the titular character, though he doesn’t project nearly the same insight and keen intelligence as Snowden himself does. I suspect the documentary Citizenfour (which I’ve not yet seen), featuring Snowden doing his own talking, is a far better telling of the same episode of history.

In contrast, I have assiduously avoided several other recent films based on actual events. United 93, World Trade Center, and Deepwater Horizon spring to mind, but there are many others. The wounds and controversies stemming from those real-life events still smart too much for me to consider exposing myself to propagandistic historical fictions. Perhaps in a few decades, after living memory of such events has faded or disappeared entirely, such stories can be told effectively, though probably not accurately. A useful comparison might be any one of several films called The Alamo.