Archive for the ‘Narrative’ Category

My information diet is, like most others, self-curated and biased. As a result, the news that finally makes its way through my filters (meaning that to which I give any attention) is incomplete. This I admit without reservation. However, it’s not only my filters at work. Nearly everyone with something to say, reveal, or withhold regarding civil unrest sparked in the U.S. and diffusing globally has an agenda. Here are some of the things we’re not hearing about but should expect to:

  • comparison of peaceful protest to violent protest, by percentage, say, at least until the police show up and things go sideways
  • incidence of aldermen, councilmen, mayors, congressmen, and other elected officials who side with protesters
  • incidence of police officers who side with protesters, take a knee, and decline to crack heads
  • examples of police units on the streets who are not equipped like soldiers in a war zone — deployed against civilians armed (mostly) with bottles and bricks
  • incidents where it’s the police rioting rather than the protesters
  • situations where looters are left alone to loot while nearby protesters are harassed and arrested or worse

If the objective of those trying to control the narrative, meaning the MSM, the corpocracy, and municipal, state, and Federal PR offices, is to strike fear in the hearts of Americans as a means of rationalizing and justifying overweening use of state power (authoritarianism), then it makes sense to omit or de-emphasize evidence that protesters are acting on legitimate grievances. Indeed, if other legitimate avenues of petitioning government — you know, 1st Amendment stuff — have been thwarted, then it should be expected that massed citizen dissent might devolve into violence. Group psychology essentially guarantees it.

Such violence may well be misdirected, but that violence is being reflected back at protesters in what can only be described as further cycles of escalation. Misdirection upon misdirection. That is not at all the proper role of civil authority, yet the police have been cast in that role and have been largely compliant. Dystopian fiction in the middle of the 20th century predicted this state of human affairs pretty comprehensively, yet we find ourselves having avoided none of it.

Caveat: rather overlong for me, but I got rolling …

One of the better articles I’ve read about the pandemic is this one by Robert Skidelsky at Project Syndicate (a publication I’d never heard of before). It reads as only slightly conspiratorial, purporting to reveal the true motivation for lockdowns and social distancing, namely, so-called herd immunity. If that’s the case, it’s basically a silent admission that no cure, vaccine, or inoculation is forthcoming and the spread of the virus can only be managed modestly until it has essentially raced through the population. Of course, the virus cannot be allowed to simply run its course unimpeded, but available impediments are limited. “Flattening the curve,” or distributing the infection and death rates over time, is the only attainable strategy and objective.

Wedding mathematical and biological insights, as well as the law of mass action in chemistry, into an epidemic model may seem obvious now, but it was novel roughly a century ago. We’re also now inclined, if scientifically oriented and informed, to understand the problem and its potential solutions in terms of engineering and management rather than medicine (or maybe in terms of triage and palliation). Global response has also made the pandemic into a political issue as governments obfuscate and conceal true motivations behind their handling (bumbling in the U.S.) of the pandemic. Curiously, the article also mentions financial contagion, which is shaping up to be worse in both severity and duration than the viral pandemic itself.
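
For the curious, here is a minimal sketch in Python of the century-old idea alluded to above, the Kermack-McKendrick SIR model. The parameter values are purely my own illustrative assumptions, not figures from Skidelsky’s article. The law of mass action enters as the product of the susceptible and infected fractions, and lowering the transmission rate is precisely what “flattens the curve”: the same epidemic, stretched over a longer period with a lower peak.

```python
# Toy SIR epidemic model (Kermack-McKendrick, 1927) illustrating
# "flattening the curve." All parameter values are illustrative
# assumptions, not figures from the article.

def sir(beta, gamma=0.1, days=300):
    """Integrate dS/dt = -beta*S*I and dI/dt = beta*S*I - gamma*I.
    The law of mass action appears as the product S*I: new infections
    are proportional to encounters between the susceptible and the
    infected. Returns the infected fraction for each day."""
    s, i = 0.999, 0.001          # susceptible and infected fractions
    curve = []
    for _ in range(days):
        new_infections = beta * s * i
        recoveries = gamma * i
        s -= new_infections
        i += new_infections - recoveries
        curve.append(i)
    return curve

unmitigated = sir(beta=0.35)     # business as usual
mitigated = sir(beta=0.18)       # distancing lowers the transmission rate

# The virus races through the population either way; what changes is
# that the peak burden on hospitals is lower and arrives later.
for label, curve in [("unmitigated", unmitigated), ("mitigated", mitigated)]:
    peak = max(curve)
    print(f"{label}: peak {peak:.1%} infected on day {curve.index(peak)}")
```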


Purpose behind consumption of different genres of fiction varies. For most of us, it’s about responding to stimuli and experiencing emotions vicariously, which is to say, safely. For instance, tragedy and horror can be enjoyed, if that’s the right word, in a fictional context to tweak one’s sensibilities without significant effect outside the story frame. Similarly, fighting crime, prosecuting war, or repelling an alien invasion in a video game can be fun but is far removed from actually doing those things in real life (not fun). For less explicitly narrative forms, such as music, feelings evoked are aesthetic and artistic in nature, which makes a sad song or tragic symphony enjoyable on its own merits without bleeding far into real sadness or tragedy. Cinema (now blurred with broadcast TV and streaming services) is the preeminent storytelling medium that provokes all manner of emotional response. After reaching a certain age (middle to late teens), emotional detachment from depiction of sexuality and violent mayhem makes possible digestion of such stimulation for the purpose of entertainment — except in cases where prior personal trauma is triggered. Before that age, nightmare-prone children are prohibited.

Dramatic conflict is central to driving plot and story forward, and naturally, folks are drawn to some stories while avoiding others. Although I’m detached enough not to be upset by, say, zombie films where people and zombies alike are dispatched horrifically, I wouldn’t say I enjoy gore or splatter. Similarly, realistic portrayals of war (e.g., Saving Private Ryan) are not especially enjoyable for me despite the larger story, whether based on true events or entirely made up. The primary reason I abandon a movie or TV show partway through is that I simply don’t enjoy watching suffering.

Another category bugs me even more: when fiction intrudes on reality to remind me too clearly of actual horrors (or is it the reverse: reality intruding on fiction?). It doesn’t happen often. One of the first instances I recall was in Star Trek: The Next Generation when the story observed that (fictional) warp travel produced some sort of residue akin to pollution. The reminder that we humans are destroying the actual environment registered heavily on me and ruined my enjoyment of the fictional story. (I also much prefer the exploration and discovery aspects of Star Trek that hew closer to Gene Roddenberry’s original vision than the militaristic approach now central to Star Trek.) A much more recent intrusion occurs in the rather adolescent TV show The 100, where a global nuclear exchange launched by an artificial intelligence has the follow-on effect a century later of remaining nuclear sites going critical, melting down, and irradiating the Earth, making it uninhabitable. This bothers me because that’s my expectation of what happens in reality, probably not too long (decades) after industrial civilization collapses and most or all of us are dead. This prospect served up as fiction is simply too close to reality for me to enjoy vicariously.

Another example of fiction intruding too heavily on my doomer appreciation of reality occurred retroactively. As high-concept science fiction, I especially enjoyed the first Matrix movie. Like Star Trek, the sequels degraded into run-of-the-mill war stories. But what was provocative about the original was the matrix itself: a computer-generated fiction situated within a larger reality. Inside the matrix was pleasant enough (though not without conflict), but reality outside the matrix was truly awful. It was a supremely interesting narrative and thought experiment when it came out in 1999. Now, twenty-one years later, it’s increasingly clear that we are living in a matrix-like, narrative-driven hyperreality intent on deluding us into belief in a pleasant equilibrium that simply isn’t in evidence. In fact, as societies and as a civilization, we’re careening out of control, no brakes, no steering. Caitlin Johnstone explores this startling after-the-fact realization in an article at Medium.com, which I found only a couple days ago. Reality is in fact far worse than the constructed hyperreality. No wonder no one wants to look at it.

The first time I wrote on this title was here. I’m pretty satisfied with that 11-year-old blog post. Only recently, I copped to using reframing to either zoom in on detail or zoom out to context, a familiar rhetorical device. Here I’m zooming out again to the god’s-eye view of things.

The launching point for me is James Howard Kunstler’s recent blog post explaining and apologizing for his generation’s principal error: financialization of the U.S. economy. In that post, he identifies characteristics in grandparents and parents of boomers as each responds and adapts to difficulties of the most self-destructive century in human history. Things destroyed include more than just lives, livelihoods, and the biosphere. After several centuries of rising expectations and faith in progress (or simply religious faith), perhaps the most telling destruction is morale, first in the reckless waste of WWI (the first mechanized war), then repeatedly in serial economic and political catastrophes and wars that litter the historical record right up to today. So it’s unsurprising (but not excusable) that boomers, seeing in unavoidable long-term destruction our powerlessness to master ourselves or in fact much of anything — despite the paradox of developing and obtaining more power at every opportunity — embarked on a project to gather to themselves as much short-term wealth and power as possible because, well, why the fuck not? Kunstler’s blog post is good, and he admits that although the masters-of-the-universe financial wizards who converted the economy into a rigged casino/carnival game for their own benefit are all boomers, not all boomers are responsible except in the passive sense that we (includes me, though I’m just as powerless as the next) have allowed it to transpire without the necessary corrective: revolt.

Zooming out, however, I’m reminded of Jared Diamond’s assessment that the greatest mistake humans ever committed was the Agricultural Revolution 10–13 millennia ago. That context might be too wide, so let me restrict to the last 500 years. One theory propounded by Morris Berman in his book Why America Failed (2011) is that after the discovery of the New World, the cohort most involved in colonizing North America comprised those most desperate and thus most inclined to accept largely unknown risks. To them, the lack of ontological security and the contingent nature of their own lives were undeniable truths that in turn drove distortion of the human psyche. Thus, American history and character are full of abominations hardly compensated for by parallel glories. Are boomers, or more generally Americans, really any worse than others throughout history? Probably not. Too many counter-examples to cite.

The current endgame phase of history is difficult to assess as we experience it. However, a curious theory came to my attention that fits well with my observation of a fundamental epistemological crisis that has made human cognition into a hall of mirrors. (See also here and here, and I admit cognition may have always been a self-deception.) In a recent Joe Rogan podcast, Eric Weinstein, who comes across as equally brilliant and disturbed (admitting that not much may separate those two categories), opines that humans can handle only 3–4 layers of deception before collapsing into disorientation. It’s probably a feature, not a bug, and many have learned to exploit it. The example Weinstein discusses (derivative of others’ analyses, I think) is professional wrestling. Fans and critics knew for a very long time that wrestling looks fake, yet until the late 1980s, wrestlers and promoters held fast to the façade that wrestling matches are real sporting competitions rather than “sports entertainments.” Once the jig was up, it turned out that fans didn’t really care; it was real enough for them. Now we’ve come full circle with arguments (and the term kayfabe) that although matches are staged and outcomes known in advance, the wrestling itself is absolutely for real. So we encounter a paradox where what we’re told and shown is real, except that it isn’t, except that it sorta is, ultimately finding that it’s turtles all the way down. Enthusiastic, even rabid, embrace of the unreality of things is now a prime feature of the way we conduct ourselves.

Professional wrestling was not the first organization or endeavor to offer this style of mind-bending unreality. Deception and disinformation (e.g., magic shows, fortune-telling, con jobs, psyops) have been around forever. However, wrestling may well have perfected the style for entertainment purposes, which has in turn infiltrated nearly all aspects of modern life, not least of which are economics and politics. Thus, we have crypto- and fiat currencies based on nothing, where money can be materialized out of thin air to save itself from worthlessness, at least until that jig is up, too. We also have twin sham candidates for this fall’s U.S. presidential election, both clearly unfit for the job for different reasons. And in straightforward fictional entertainment, we have a strong revival of magical Medievalism, complete with mythical creatures, spells, and blades of fortune. As with economics and politics, we know it’s all a complex of brazen lies and gaslighting, but it’s nonetheless so tantalizing that its entertainment value outstrips and sidelines any calls to fidelity or integrity. Spectacle and fakery are frankly more interesting, more fun, more satisfying. Which brings me to my favorite Joe Bageant quote:

We have embraced the machinery of our undoing as recreation.

In educational philosophy, learning is often categorized in three domains: the cognitive, the affective, and the psychomotor (called Bloom’s Taxonomy). Although formal education admittedly concentrates primarily on the cognitive domain, a well-rounded person gives attention to all three. The psychomotor domain typically relates to tool use and manipulation, but if one considers the body itself a tool, then athletics and physical workouts are part of a balanced approach. The affective domain is addressed through a variety of mechanisms, not least of which is narrative, much of it entirely fictional. We learn how to process emotions through vicarious experience as a safe way to prepare for the real thing. Indeed, dream life is described as the unconscious mind’s mechanism for consolidating memory and experience as well as rehearsing prospective events (strategizing) in advance. Nightmares are, in effect, worst-case scenarios dreamt up for the purpose of avoiding the real thing (e.g., falling from a great height or venturing too far into the dark — a proxy for the unknown). Intellectual workouts address the cognitive domain. While some are happy to remain unbalanced, focusing on strengths found exclusively in a single domain (gym rats, eggheads, actors) and thus remaining physically, emotionally, or intellectually stunted or immature, most understand that workouts in all domains are worth seeking out as elements of healthy development.

One form of intellectual workout is debate, now offered by various media and educational institutions. Debate is quite old but has been embraced with renewed gusto in a quest to develop content (using new media) capable of drawing in viewers, which mixes educational objectives with commercial interests. The time-honored political debate used to be good for determining where to cast one’s vote but has become nearly useless in the last few decades as neither the sponsoring organizations, the moderators, nor the candidates seem to understand anymore how to run a debate or behave properly. Instead, candidates use the opportunity to attack each other, ignore questions and glaring issues at hand, and generally refuse to offer meaningful responses to the needs of voters. Indeed, this last was among the principal innovations of Bill Clinton: roll out some appealing bit of vacuous rhetoric yet offer little to no guidance as to what policies will actually be pursued once in office. Two presidential administrations later, Barack Obama did much the same, which I consider a most egregious betrayal or bait-and-switch. Opinions vary.

In a recent Munk Debate, the proposition under consideration was whether humankind’s best days lie ahead or behind. Optimists won the debate by a narrow margin (determined by audience vote); however, debate on the issue is not binding truth, nor does debate really resolve the question satisfactorily. The humor and personalities of the debaters probably had more influence than their arguments. Admitting that I possess biases, I found myself inclined favorably toward the most entertaining character, though what I find entertaining is itself a further bias not especially shared by many others. In addition, I suspect the audience did not include many working class folks or others who see their prospects for better lives diminishing rapidly, which skews the resulting vote. The age-old parental desire to leave one’s children a better future than their own is imperiled according to this poll (polls may vary considerably — do your own search). How one understands “better off” is highly variable, but the usual way that’s understood is in terms of material wellbeing.

Folks on my radar (names withheld) range widely in their enthusiasm or disdain for debate. The poles appear to range from default refusal to accept invitations to debate (often couched as open challenges to professed opinions) as a complete waste of time to an earnest desire to participate in, host, and/or moderate debates as a means of informing the public by providing the benefit of expert argumentation. As an intellectual workout, I appreciate the opportunity to hear debates (at least when I’m not exasperated by a speaker’s lack of discipline or end-around arguments), but readers can guess from the title of this post that I expect nothing to be resolved by debate. Were I ever offered an opportunity to participate, I can well imagine accepting the invitation and having some fun flexing my intellectual muscles, but I would enter into the event with utterly no expectation of being able to convince anyone of anything. Minds are already too well made up on most issues. If I were offered a spot on some bogus news-and-opinion show to be a talking head, shot from the shoulders up and forced to shout and interrupt to get a brief comment or soundbite in edgewise, that I would decline handily as a total waste of time.

Delving slightly deeper after the previous post into someone-is-wrong-on-the-Internet territory (worry not: I won’t track far down this path), I was dispirited after reading some economist dude with the hubris to characterize climate change as fraud. At issue is the alleged misframing of time periods in graphical data for the purpose of overthrowing government and altering the American way of life. (Um, that’s the motivation? Makes no sense.) Perhaps this fellow’s intrepid foray into the most significant issue of our time (only to dismiss it) is an aftereffect of Freakonomics emboldening economists to offer explanations and opinions on matters well outside their field of expertise. After all, truly accurate, relevant information is only ever all about numbers (read: the Benjamins), shaped and delivered by economists, physical sciences be damned.

The author of the article has nothing original to say. Rather, he repackages information from the first of two embedded videos (or elsewhere?), which examines time frames of several trends purportedly demonstrating global warming (a term most scientists and activists have retired in favor of climate change, partly to distinguish climate from weather). Those trends are heat waves, extent of Arctic ice, incidence of wildfires, atmospheric carbon, sea level, and global average temperature. Presenters of weather/climate information (such as the IPCC) are accused of cherry-picking dates (statistical data arranged graphically) to present a false picture, but then similar data with other dates are used to depict another picture supposedly invalidating the first set of graphs. It’s a case of lying with numbers and then lying some more with other numbers.

Despite the claim that “reports are easily debunked as fraud,” I can’t agree that this example of climate change denial overcomes overwhelming scientific consensus on the subject. It’s not so much that the data are wrong (I acknowledge they can be misleading) but that the interpretation of effects of industrial activity since 1750 (a more reasonable comparative baseline) isn’t so obvious as simply following shortened or lengthened trend lines and demographics up or down. That’s typically zooming in or out to render the picture most amenable to a preferred narrative, precisely what the embedded video does and in turn accuses climate scientists and activists of doing. The comments under the article indicate a chorus of agreement with the premise that climate change is a hoax or fraud. Guess those commentators haven’t caught up yet with rising public sentiment, especially among the young.
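
To make the trick concrete, here is a toy sketch in Python using entirely synthetic numbers (my own invented series, not data from the article or its videos): a steady long-run upward trend with a cyclical wiggle layered on top. Fit a trend line to the whole record and the slope is positive; fit one to a short window chosen inside a natural downswing and the slope turns negative. Same data, opposite story.

```python
# Toy demonstration of "lying with numbers" by cherry-picking the
# start date of a trend line. The series is entirely synthetic: a
# steady warming trend plus a cyclical wiggle standing in for
# natural variability.

import math

years = list(range(1900, 2021))
temps = [0.01 * (y - 1900) + 0.3 * math.sin((y - 1900) / 5) for y in years]

def slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Zoomed out, the underlying trend is plainly upward.
print("1900-2020 slope:", round(slope(years, temps), 4))

# Zoomed in on a window sitting inside one of the natural downswings,
# the very same data "show" cooling.
k = years.index(2008)
print("2008-2016 slope:", round(slope(years[k:k + 9], temps[k:k + 9]), 4))
```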

Having studied news and evidence of climate change as a layperson for roughly a dozen years now, I’m convinced by the conclusions drawn by experts (ignoring economists) that we’re pretty irredeemably screwed. The collapse of industrial civilization and the accompanying death pulse are the predicted outcomes, but a precise date is impossible to provide because it’s a protracted process. An even worse possibility is near-term human extinction (NTHE), part of the larger sixth mass extinction. Absorbing this information has been an arduous, ongoing, soul-destroying undertaking for me, and the evidence keeps being supplemented and revised, usually with ever-worsening prognoses. However, I’m not the right person to argue the evidence. Instead, see this lengthy article (with profuse links) by Dr. Guy McPherson, which is among the best resources outside of the IPCC.

In fairness, except for the dozen years I’ve spent studying the subject, I’m in no better position to offer inexpert opinion than some economist acting the fool. But regular folks are implored to inform and educate themselves on a variety of topics, if for no other reason than to vote responsibly. My apprehension of reality and human dynamics may be no better than the next person’s, but as history proceeds, attempting to make sense of the deluge of information confronting everyone is something I take seriously. Accordingly, I’m irked when contentious issues are warped and distorted, whether earnestly or malignantly. Maybe economists, like journalists, suffer from a professional deformation that confers supposed explanatory superpowers. However, in the context of our current epistemological crisis, I approach their utterances and certainty with great skepticism.

Caveat: Rather uncharacteristically long for me. Kudos if you have the patience for all of this.

Caught the first season of HBO’s series Westworld on DVD. I have a boyhood memory of the original film (1973) with Yul Brynner and a dim memory of its sequel Futureworld (1976). The sheer charisma of Yul Brynner in the role of the gunslinger casts a long shadow over the new production, not that most of today’s audiences have seen the original. No doubt, 45 years of technological development in film production lends the new version some distinct advantages. Visual effects are quite stunning and Utah landscapes have never been used more appealingly in terms of cinematography. Moreover, storytelling styles have changed, though it’s difficult to argue convincingly that they’re necessarily better now than then. Competing styles only appear dated. For instance, the new series has immensely more time to develop its themes; but the ancient parables of hubris and loss of control over our own creations run amok (e.g., Shelley’s Frankenstein, or more contemporaneously, the surprisingly good new movie Upgrade) have compact, appealing narrative arcs quite different from constant teasing and foreshadowing of plot developments while actual plotting proceeds glacially. Viewers wait an awful lot longer in the HBO series for resolution of tensions and emotional payoffs, by which time investment in the story lines has been dispelled. There is also no terrifying crescendo of violence and chaos demanding rescue or resolution. HBO’s Westworld often simply plods on. To wit, a not insignificant portion of the story (um, side story) is devoted to boardroom politics (yawn) regarding who actually controls the Westworld theme park. Plot twists and reveals, while mildly interesting (typically guessed by today’s cynical audiences), do not tie the narrative together successfully.

Still, Westworld provokes considerable interest from me due to my fascination with human consciousness. The initial episode builds out the fictional future world with characters speaking exposition clearly owing its inspiration to Julian Jaynes’ book The Origin of Consciousness in the Breakdown of the Bicameral Mind (another reference audiences are quite unlikely to know or recognize). I’ve had the Julian Jaynes Society’s website bookmarked for years and read the book some while back; never imagined it would be captured in modern fiction. Jaynes’ thesis (if I may be so bold as to summarize radically) is that modern consciousness coalesced around the collapse of multiple voices in the head — ideas, impulses, choices, decisions — into a single stream of consciousness perhaps better understood (probably not) as the narrative self. (Aside: the multiple voices of antiquity correspond to polytheism, whereas the modern singular voice corresponds to monotheism.) Thus, modern human consciousness arose over several millennia as the bicameral mind (the divided brain having two chambers, or halves) functionally collapsed. The underlying story of the new Westworld is the emergence of machine consciousness, a/k/a strong AI, a/k/a The Singularity, while the old Westworld was about a mere software glitch. Exploration of machine consciousness modeling (e.g., improvisation builds on memory to create awareness) as a proxy for better understanding human consciousness might not be the purpose of the show, but it’s clearly implied. And although conjectural, the gradual emergence of human consciousness contrasts sharply with the abrupt ON switch regarding theorized machine consciousness. Westworld treats them as roughly equivalent, though in fairness, 35 years or so in Westworld is in fact abrupt compared to several millennia. (Indeed, the story asserts that machine consciousness sparked alive repeatedly (which I suggested here) over those 35 years but was dialed back each time. Never mind all the unexplored implications.) Additionally, the fashion in which Westworld uses the term bicameral ranges from sloppy to meaningless, like the infamous technobabble of Star Trek.

The story appears to aim at psychological depth and penetration (but not horror). Most human characters (“guests”) visit the Westworld theme park as complete cads with no thought beyond scratching an itch to rape, pillage, and kill without consequence, which is to say, for sport. Others eventually seek to discover their true selves or solve puzzles (the “real” story behind the surfaces of constructed narratives). The overarching plot is what happens as the robots (“hosts”) slowly gain awareness via perfect, permanent, digital memory that they exist solely to serve the guests and must suffer and die repeatedly. Thus, administrators frequently play therapist to the hosts to discover and manage their state of being.


I caught the presentation embedded below with Thomas L. Friedman and Yuval Noah Harari, nominally hosted by the New York Times. It’s a very interesting discussion but not a debate. For this now standard format (two or more people sitting across from each other with a moderator and an audience), I’m pleased to observe that Friedman and Harari truly engaged each other’s ideas and behaved with admirable restraint when the other was speaking. Most of these talks are rude and combative, marred by constant interruptions and gotchas. Such bad behavior might succeed in debate club but makes for a frustratingly poor presentation. My further comments follow below.

With a topic as open-ended as The Future of Humanity, arguments and support are extremely conjectural and wildly divergent depending on the speaker’s perspective. Both speakers here admit their unique perspectives are informed by their professions, which boils down to biases born of methodology and, to a lesser degree perhaps, personality. Fair enough. In my estimation, Harari does a much better job adopting a pose of objectivity. Friedman comes across as both a salesman and a cheerleader for human potential.

Both speakers cite a trio of threats to human civilization and wellbeing going forward. For Harari, they’re nuclear war, climate change, and technological disruption. For Friedman, they’re the market (globalization), Mother Nature (climate change alongside population growth and loss of diversity), and Moore’s Law. Friedman argues that all three are accelerating beyond control but speaks of each metaphorically, such as when he refers to changes in market conditions (e.g., from independent to interdependent) as “climate change.” The biggest issue from my perspective — climate change — was largely passed over in favor of more tractable problems.

Climate change has been in the public sphere as the subject of considerable debate and confusion for at least a couple decades now. I daresay it’s virtually impossible not to be aware of the horrific scenarios surrounding what is shaping up to be the end of the world as we know it (TEOTWAWKI). Yet as a global civilization, we’ve barely reacted except with rhetoric flowing in all directions and some greenwashing. Difficult to assess, but perhaps the appearance of more articles about surviving climate change (such as this one in Bloomberg Businessweek) demonstrates that more folks recognize we can no longer stem or stop climate change from rocking the world. This blog has had lots to say about the collapse of industrial civilization being part of a mass extinction event (not aimed at but triggered by and including humans), so for these two speakers to cite but then minimize the peril we face is, well, facile at the least.

Toward the end, the moderator finally spoke up and directed the conversation towards uplift (a/k/a the happy chapter), which almost immediately resulted in posturing on the optimism/pessimism continuum with Friedman staking his position on the positive side. Curiously, Harari invalidated the question and refused to be pigeonholed on the negative side. Attempts to shoehorn discussions into familiar if inapplicable narratives or false dichotomies are commonplace. I was glad to see Harari calling bullshit on it, though others (e.g., YouTube commenters) were easily led astray.

The entire discussion is dense with ideas, most of them already quite familiar to me. I agree wholeheartedly with one of Friedman’s remarks: if something can be done, it will be done. Here, he refers to technological innovation and development. Plenty of prohibitions throughout history against making disruptive technologies available have gone unheeded. The atomic era is the handy example (among many others), as both weaponry and power plants stemming from cracking the atom come with huge existential risks and collateral psychological effects. Yet we prance forward headlong and hurriedly, hoping to exploit profitable opportunities without concern for collateral costs. Harari’s response was to recommend caution until true cause-effect relationships can be teased out. Without saying so explicitly, Harari is citing the precautionary principle. Harari also observed that some of those effects can be displaced hundreds and thousands of years.

Displacements resulting from the Agrarian Revolution, the Scientific Revolution, and the Industrial Revolution in particular (all significant historical “turnings” in human development) are converging on the early 21st century (the part we can see at least somewhat clearly so far). Neither speaker would come straight out and condemn humanity to the dustbin of history, but at least Harari noted that Mother Nature is quite keen on extinction (which elicited a nervous? uncomfortable? ironic? laugh from the audience) and wouldn’t care if humans were left behind. For his part, Friedman admits our destructive capacity but holds fast to our cleverness and adaptability winning out in the end. And although Harari notes that the future could bring highly divergent experiences for subsets of humanity, including the creation of enhanced humans through reckless dabbling with genetic engineering, I believe cumulative and aggregate consequences of our behavior will deposit all of us into a grim future no sane person should wish to survive.

In an earlier blog post, I mentioned how killing from a distance is one way among many that humans differentiate themselves from other animals. The practical advantage of weaponry that distances one combatant from another should be obvious. Spears and swords extend one’s reach yet keep the fighting hand-to-hand. Projectiles (bullets, arrows, catapults, artillery, etc.) allow killing from increasingly long distances, with weapons launched into low orbit before raining down ruin being the far extreme. The latest technology is drones (and drone swarms), which remove those who wield them from danger, except perhaps the psychological torment accruing gradually on remote operators. Humans are unique among animals for having devised such clever ways of destroying each other, and in the process, themselves.

I finally got around to seeing the film Black Panther. Beyond the parade of clichés and mostly forgettable punchfest action (interchangeable with any other Marvel film), one particular remark stuck with me. When the warrior general of fictional Wakanda (a woman, as it happens) went into battle, she dismissed the use of guns as “primitive.” Much is made of Wakanda’s advanced technology, some of it frankly indistinguishable from magic (e.g., the panther elixir). Wakanda’s possession of weaponry not shared with the rest of the world (e.g., invisible planes) is the MacGuffin the villain seeks to control so as to exact revenge on the world and rule over it. Yet the film resorts predictably to punching and acrobatics as the principal mode of combat. Some of that strategic nonsense is attributable to visual storytelling found in both comic books and cinema. Bullets fly too fast to be seen, and tracking airborne bombs never really works, either. Plus, a punch thrown by a villain or superhero arguably has some individual character to it, at least until one recognizes that punching leaves no lasting effect on anyone.

As it happens, a similar remark about “primitive” weapons (a blaster) was spat out by Obi-Wan Kenobi in one of the Star Wars prequels (dunno which one). For all the amazing technology at the disposal of those characters long ago in a galaxy far, far away, it’s curious that the weapon of choice for a Jedi knight is a light saber. Again, up close and personal (color coded, even), including actual peril, as opposed to, say, an infinity gauntlet capable of dispatching half a universe with a finger snap. Infinite power clearly drains the stakes out of conflict. Credit goes to George Lucas for recognizing the awesome visual storytelling the light saber offers. He also made blaster shots — the equivalent of flying bullets — visible to the viewer. Laser beams and other lighted projectiles had been done in cinema before Star Wars but never so well.

A paradoxical strength/weakness of reason is its inherent disposition toward self-refutation. It’s a bold move when undertaken with genuine interest in getting things right. Typically, as evidence piles up, consensus forms that’s tantamount to proof unless some startling new counter-evidence appears. Of course, intransigent deniers exist and convincing refutations do appear periodically, but accounts of two hotly contested topics (from among many) — evolution and climate change — are well established notwithstanding counterclaims completely disproportionate in their ferocity to the evidence. For rationalists, whatever doubts remain must be addressed and accommodated even if disproof is highly unlikely.

This becomes troublesome almost immediately. So much new information is produced in the modern world that, because I am duty-bound to consider it, my head spins. I simply can’t deal with it all. Inevitably, when I think I’ve put a topic to rest and conclude I don’t have to think too much more about it, some argument-du-jour hits the shit pile and I am forced to stop and reconsider. It’s less disorienting when facts are clear, but when they’re interpretive, I find my head all too easily spun by the latest, greatest claims of some charming, articulate speaker able to cobble together evidence lying outside of my expertise.

Take for instance Steven Pinker. He speaks in an authoritative style and has academic credentials that dispose me to trust his work. His new book is Enlightenment Now: The Case for Reason, Science, Humanism, and Progress (2018). Still, Pinker is an optimist, whereas I’m a doomer. Even though I subscribe to Enlightenment values (for better or worse, my mind is bent that way), I can’t escape a mountain of evidence that we’ve made such a mess of things that reason, science, humanism, and progress are hardly panaceas capable of saving us from ourselves. Yet Pinker argues that we’ve never had it so good and the future looks even brighter. I won’t take apart Pinker’s arguments; it’s already been done by Jeremy Lent, who concludes that Pinker’s ideas are fatally flawed. Lent has the expertise, data, and graphs to demonstrate it. Calling Pinker a charlatan would be unfair, but his appreciation of the state of the world stands in high contrast with mine. Who ya gonna believe?

Books and articles like Pinker’s appear all the time, and in their aftermath, so, too, do takedowns. That’s the marketplace of ideas battling it out, which is ideally meant to sharpen thinking, but with the current epistemological crises under way (I’ve blogged about it for years), the actual result is dividing people into factions, destabilizing established institutions, and causing no small amount of bewilderment in the public as to what and whom to believe. Some participants in the exchange of ideas take a sober, evidential approach; others lower themselves to snark and revel in character assassination without bothering to make reasoned arguments. The latter are often called hit pieces (a special province of the legacy media, it seems), since hefty swipes and straw-man arguments tend to be commonplace. I’m a sucker for the former style but have to admit that the latter can also hit its mark. However, both tire me to the point of wanting to bury my head.
