Archive for the ‘Idle Nonsense’ Category

Continuing from part 1, which is altogether too much screed and frustration with Sam Harris, I now point to several analyses that support my contentions. First is an article in The Nation about the return of so-called scientific racism, which speaks directly about Charles Murray, Sam Harris, and Andrew Sullivan, all of whom are embroiled in the issue. Second is an article in The Baffler about constructing arguments ex post facto to conform to conclusions motivated in advance of evidence. Most of us are familiar with the constructed explanation, where in the aftermath of an event, pundits, press agents, and political insiders propose various explanatory narratives to gain control over what will eventually become the conventional understanding. Published reports such as the Warren Commission’s report on the assassination of JFK are one such example, and I daresay few now believe that the report and the consensus it presents weren’t politically motivated and highly flawed. Both linked articles above are written by Edward Burmilla, who blogs at Gin and Tacos (see blogroll). Together, they paint a dismal picture of how reason and rhetoric can be corrupted despite the sheen of scientific respectability.

Third is an even more damaging article (actually a review of the new anthology Trump and the Media) in the Los Angeles Review of Books by Nicolas Carr asking the pointed question “Can Journalism Be Saved?” Admittedly, journalism is not equivalent to reason or rationalism, but it is among several professions that lay claim to objectivity, accuracy, and authority. Thus, journalism demands attention and respect far in excess of the typical blogger (such as me) or watering-hole denizen perched atop a barstool. Consider this pull quote:

… the flaws in computational journalism can be remedied through a more open and honest accounting of its assumptions and limitations. C. W. Anderson, of the University of Leeds, takes a darker view. To much of the public, he argues, the pursuit of “data-driven objectivity” will always be suspect, not because of its methodological limits but because of its egghead aesthetics. Numbers and charts, he notes, have been elements of journalism for a long time, and they have always been “pitched to a more policy-focused audience.” With its ties to social science, computational journalism inevitably carries an air of ivory-tower elitism, making it anathema to those of a populist bent.

Computational journalism is contrasted with other varieties of journalism based on, say, personality, emotionalism, advocacy, or simply a mad rush to print (or pixels) to scoop the competition. This hyperrational approach has already revealed its failings, as Carr reports in his review.

What I’m driving at is that, despite frequent appeals to reason, authority, and accuracy (especially the quantitative sort), certain categories of argumentation fail to register with the average consumer of news and information. It’s not a question of whether arguments are right or wrong, precisely; it’s about what appeals most to those paying even a modest bit of attention. And the primary appeal for most (I judge) isn’t reason. Indeed, reason is swept aside handily when a better, um, reason for believing something appears. If one has done the difficult work of acquiring critical thinking and reasoning skills, it can be quite the wake-up call when others fail to behave according to reason, such as acting against enlightened self-interest. The last presidential election was a case in point.

Circling back to something from an earlier blog, much of human cognition is based on mere sufficiency: whatever is good enough in the moment gets nominated, then promoted to belief and/or action. Fight, flight, or freeze is one example. Considered evaluation and reason are not even factors. Snap judgments, gut feelings, emotional resonances, vibes, heuristics, and Gestalts dominate momentary decision-making, and in the absence of convincing countervailing information (if indeed one is even susceptible to reason, which would be an unreasonable assumption), action is reinforced and suffices as belief.

Yet more in part 3 to come.


From Wikipedia:

Trial by combat (also wager of battle, trial by battle or judicial duel) was a method of Germanic law to settle accusations in the absence of witnesses or a confession in which two parties in dispute fought in single combat; the winner of the fight was proclaimed to be right. In essence, it was a judicially sanctioned duel. It remained in use throughout the European Middle Ages, gradually disappearing in the course of the 16th century.

Unlike trial by ordeal in general, which is known to many cultures worldwide, trial by combat is known primarily from the customs of the Germanic peoples. It was in use among the ancient Burgundians, Ripuarian Franks, Alamans, Lombards, and Swedes. It was unknown in Anglo-Saxon law, Roman law and Irish Brehon Law and it does not figure in the traditions of Middle Eastern antiquity such as the code of Hammurabi or the Torah.

Trial by combat has profound echoes in 21st-century geopolitics and jurisprudence. Familiar phrases such as right of conquest, manifest destiny, to the winner go the spoils, might makes right, and history written by the victors attest to the enduring legacy of hindsight justification by force of arms. More broadly, within the American system, right of access to courts afforded to all citizens also admits nuisance suits and more than a few mismatched battles where deep-pocketed corporations sue individuals and small organizations, often nonprofits, into bankruptcy and submission. For instance, I recently learned of Strategic Lawsuits Against Public Participation (SLAPPs) “used to silence and harass critics by forcing them to spend money to defend these baseless suits.” They employ brute economic power in place of force of arms.

Trial by combat fell out of practice with the onset of the Enlightenment, but the broader complex of ideas survived. Interest in medieval Europe as storytelling fodder in cinema and fantasy literature (notably, the shocking trial by combat depicted in the extremely popular HBO drama Game of Thrones, where the accused and accuser both designate proxies rather than doing battle themselves) lends legitimacy to settling disputes via violence. Even the original Karate Kid (1984) has a new YouTube Red series set 30 years later. The bad-boy acolyte replaces his scorched-earth sensei and seeks revenge on the titular character for having been bested decades before; the latter is yanked back from quiet obscurity (and the actor who portrays him from career limbo) to fight again and re-prove his skills, which is to say, his righteousness. The set-up is surprisingly delicious to contemplate and has considerable nostalgic appeal. More importantly, it embodies the notion (no doubt scripted according to cliché) that only the pure of heart (or their proxies, students in this case) can claim ultimate victory because, well, it’s god’s will or some such, and thus good guys must always win. What that really means is that whoever wins is by definition virtuous. If only reality were so reliably simple.

The certainty of various religious dogmas and codes of conduct characteristic of the medieval period (e.g., chivalry) is especially seductive in modern times, considering how the public is beset by an extraordinary degree of existential and epistemological uncertainty. The naturalistic fallacy is also invoked, where the law of the jungle (only the fittest and/or strongest get to eat or indeed survive) substitutes for more civilized (i.e., enlightened and equanimous) thinking. Further, despite protestations, this complex of ideas legitimizes bullying, whether (1) in the schoolyard, with the principal bully flanked by underlings picking on vulnerable weaklings who haven’t formed alliances for self-protection, (2) in the workplace, with its power players and Machiavellian manipulators, or (3) in geopolitics, where a global military power such as the U.S. dictates terms to and/or wars with smaller, weaker nations that lack the GDP, population, and insane will to project power globally. I daresay most Americans take comfort in having the greatest military and arsenal ever mustered on their side and accordingly being on the right side (the victorious one) of history, thus a beacon of hope to all who would conflate victory with virtue. Those who suffer at our hands must understand things quite differently. (Isn’t it more accurate that when bad guys win, rebellions and insurgencies are sparked?)

One remarkable exception deserves notice. The U.S. presidency is among the most heavily scrutinized and contentious positions (always under attack), and its occupant happens to be Commander-in-Chief of the self-same greatest goddamn fighting force known to man. It’s no secret that the current occupant of that office (45) is also widely recognized as the Bully-in-Chief. Despite having at his disposal considerable resources — military, executive staff, and otherwise — 45 has eschewed forming the political coalitions one might expect and essentially gone it alone, using the office (and his Twitter account) as a one-man bully pulpit. Hard to say what he’s trying to accomplish, really. Detractors have banded together (incompetently) to oppose him, but 45 has demonstrated unexpected tenacity, handily dominating rhetorical trials by combat through sheer bluster and hubris. On balance, he scores some pretty good hits, too. (The proposed fist fight between 45 and Joe Biden turned out to be a tease, but how entertaining would that bout have been, without actually settling anything!) This pattern has left many quite dumbfounded, and I admit to being astounded as well, except to observe that rank stupidity beats everything in this bizarre political rock-paper-scissors contest. How quintessentially American: nuthin’ beats stoopid.

The movie Gladiator depicts the protagonist Maximus addressing spectators directly at gladiatorial games in the Roman Colosseum with this meme-worthy challenge: “Are you not entertained?” Setting the action in an ancient civilization renowned for its decadent final phase prior to collapse, referred to as Bread and Circuses, allows us to share vicariously in the protagonist’s righteous disgust with the public’s blood lust while shielding us from any implication of our own shame because, after all, who could possibly entertain blood sports in the modern era? Don’t answer that.


But this post isn’t about our capacity for cruelty and barbarism. Rather, it’s about the public’s insatiable appetite for spectacle — both fictional and absolutely for real — served up as entertainment. Professional wrestling is fiction; boxing and mixed martial arts are reality. Audiences consuming base entertainment and, in the process, depleting the performers who provide it extend well beyond combat sports, however. For instance, it’s not uncommon for pop musicians to slowly destroy themselves once pulled into the attendant celebrity lifestyle. Three examples spring to mind: Elvis Presley, Michael Jackson, and Whitney Houston. Others go on hiatus or retire altogether from the pressure of public performance, such as Britney Spears, Miles Davis, and Barbra Streisand.

To say that the public devours performers and discards what remains of them is no stretch, I’m afraid. Who remembers countdown clocks tracking when female actors turn 18 so that perving on them is at last okay? A further example is the young starlet who is presumably legitimized as a “serious” actor once she does nudity and/or portrays a hooker but is then forgotten in favor of the next. If one were to seek the full depth of such devouring impulses, I suggest porn is the industry to have all one’s illusions shattered. For rather modest sums, there is absolutely nothing some performers won’t do on film (these days on video at RedTube), and naturally, there’s an audience for it. Such appetites are as bottomless as they come. Are you not entertained?

Speaking of Miles Davis, I take note of his hiatus from public performance in the late 1970s before his return to the stage in 1981 and early death in 1991 at age 65. He had cemented a legendary career as a jazz trumpeter but in interviews (if memory serves) dismissed the notion that he was somehow a spokesperson for others, saying dryly “I’m just a trumpet player, man ….” What galled me, though, were Don Cheadle’s remarks in the liner notes of the soundtrack to the biopic Miles Ahead (admittedly a deep pull):

Robert Glasper and I are preparing to record music for the final scene of Miles Ahead — a possible guide track for a live concert that sees the return of Miles Davis after having been flushed from his sanctuary of silence and back onto the stage and into his rightful light. My producers and I are buzzing in disbelief about what our audacity and sheer will may be close to pulling off ….

What they did was record a what-might-have-been track had Miles incorporated rap or hip hop (categories blur) into his music. It’s unclear to me whether the “sanctuary of silence” was inactivity or death, but Miles was essentially forced onstage by proxy. “Flushed” is a strange word to use in this context, as one “flushes” an enemy or prey unwillingly from hiding. The decision to recast him in such “rightful light” strikes me as rather poor taste — a case of cultural appropriation worse than merely donning a Halloween costume.

This is the wave of the future, of course, now that images of dead celebrities can be invoked, say, to sell watches (e.g., Steve McQueen) and holograms of dead musicians are made into singing zombies, euphemized as “virtual performance” (e.g., Tupac Shakur). Newly developed software can now create digitized versions of people saying and doing whatever we desire of them, such as when celebrity faces are superimposed onto porn actors (so-called “deepfakes”). It might be difficult to argue that in doing so content creators are stealing the souls of others, as used to be believed in the early days of photography. I’m less concerned with those meeting demand than with the demand itself. Are we becoming demons, the equivalents of the succubus/incubus, frivolously devouring or destroying the objects of our enjoyment? Are you not entertained?

I’m currently reading Go Wild by John Ratey and Richard Manning. It has some rather astounding findings on offer. One I’ll draw out is that the human brain evolved not for thinking, as one might imagine, but for coordinating complex physiological movements:

… even the simplest of motions — a flick of a finger or a turn of the hand to pick up a pencil — is maddeningly complex and requires coordination and computational power beyond electronics’ abilities. For this you need a brain. One of our favorite quotes on this matter comes from the neuroscientist Rodolfo Llinás: “That which we call thinking is the evolutionary internalization of movement.” [p. 100]

Almost all of that computation is unconscious, or maybe preconscious, and it’s learned over a period of years in infancy and early childhood (for basic locomotion) and then supplemented throughout life (for skilled motions, e.g., writing cursive or typing). Moreover, those able to move with exceptional speed, endurance, power, accuracy, and/or grace are admired and sometimes rewarded in our culture. The obvious example is sports. Whether league sports with wildly overcompensated athletes, Olympic sports with undercompensated athletes, or combat sports with a mixture of both, thrill attaches to watching someone move effectively within the rule-bound context of the sport. Other examples include dancers, musicians, circus artists, and actors who specialize in physical comedy and action. Each develops specialized movements that are graceful and beautiful, which Ratey and Manning write may also account for nonsexual appreciation and fetishization of the human body, e.g., fashion models, glammed-up actors, and nude photography.

I’m being silly saying that jocks figgered it first, of course. A stronger case could probably be made for warriors in battle, such as a skilled swordsman. But it’s jocks who are frequently rewarded all out of proportion with others who specialize in movement. True, their genetics and training enable a relatively brief career (compared to, say, surgeons or pianists) before abilities ebb away and a younger athlete eclipses them. But there is a fundamental lack of equivalence with artisans and artists, whose value lies less with their bodies than with the outputs their movements produce.

Regarding computational burdens, consider the various mechanical arms built for grasping and moving objects, some of them quite large. Mechanisms (frame and hydraulics substituting for bone and muscle) themselves are quite complex, but they’re typically controlled by a human operator rather than automated. (Exceptions abound, but they’re highly specialized, such as circuit board manufacture or textile production.) More recently, robotics demonstrate considerable advancement in locomotion without a human operator, but they’re also narrowly focused in comparison with the flexibility of motion a human body readily possesses. Further, in the case of flying drones, robots operate in wide open space, or, in the case of those designed to move like dogs or insects, use 4+ legs for stability. The latter are typically built to withstand quite a lot of bumping and jostling. Upright bipedal motion is still quite clumsy in comparison with humans, excepting perhaps wheeled robots that obviously don’t move like humans do.

Curiously, the movie Pacific Rim (sequel just out) takes notice of the computational or cognitive difficulty of coordinated movement. To operate the giant battle robots needed to fight Godzilla-like interdimensional monsters, two mind-linked humans are required. Maybe it’s a simple coincidence — a plot device to position humans in the middle of the action (and robot) rather than killing from a distance via drone or clone — or maybe not. Hollywood screenwriters are quite clever at exploiting all sorts of material without necessarily divulging the source of inspiration. It’s art imitating life, knowingly or not.

A year ago, I wrote about charges of cultural appropriation being levied upon fiction writers, as though fiction can now only be some watered-down memoir lest some author have the temerity to conjure a character based on someone other than him- or herself. Specifically, I linked to an opinion piece by Lionel Shriver in the NY Times describing having been sanctioned for writing characters based on ideas, identities, and backgrounds other than her own. Shriver has a new article in Prospect Magazine that provides an update, perhaps too soon to survey the scene accurately since the target is still moving, but nonetheless curious with respect to the relatively recent appearance of call-out culture and outrage engines. In her article, Shriver notes that offense and umbrage are now given equal footing with bodily harm and emotional scarring:

Time was that children were taught to turn aside tormentors with the cry, “Sticks and stones may break my bones, but words will never hurt me!” While you can indeed feel injured because Bobby called you fat, the law has traditionally maintained a sharp distinction between bodily and emotional harm. Even libel law requires a demonstration of palpable damage to reputation, which might impact your livelihood, rather than mere testimony that a passage in a book made you cry.

He also points out that an imagined “right not to be offended” is now frequently invoked, even though there is no possibility of avoiding offense if one is actually conscious in the world. For just one rather mundane example, the extraordinary genocidal violence of 20th-century history, once machines and mechanisms (now called WMDs) were applied to warfare (and dare I say it: statecraft), ought to be highly offensive to any humanitarian. That history cannot be erased, though I suppose it can be denied, revised, buried, and/or lost to living memory. Students or others who insist they be excused from being triggered by knowledge of awful events are proverbial ostriches burying their heads in the sand.

As variations of this behavior multiply and gain social approval, the Thought Police are busily mustering against all offense — real, perceived, or wholly imagined — and waging a broad-spectrum sanitation campaign. Shriver believes this could well pose the end of fiction as publishers morph into censors and authors self-censor in an attempt to pass through the SJW gauntlet. Here’s my counter-argument:

rant on/

I feel mightily offended — OFFENDED I say! — at the arrant stupidity of SJWs whose heads are full of straw (and strawmen), who are so clearly confused about what is even possible within the dictates and strictures of, well, reality, and accordingly retreated into cocoons of ideation from which others are scourged for failure to adhere to some bizarre, muddleheaded notion of equity. How dare you compel me to think prescribed thoughts emanating from your thought bubble, you damn bullies? I have my own thoughts and feelings deserving of support, maybe even more than yours considering your obvious naïveté about how the world works. Why aren’t you laboring to promote mine but instead clamoring to infect everyone with yours? Why is my writing so resoundingly ignored while you prance upon the stage demanding my attention? You are an affront to my values and sensibilities and can stuff your false piety and pretend virtue where the sun don’t shine. Go ahead and be offended; this is meant to offend. If it’s gonna be you or me who’s transgressed precisely because all sides of issues can’t be satisfied simultaneously, then on this issue, I vote for you to be in the hot seat.

rant off/

I remarked in an earlier blog that artists, being hypersensitive to emergent patterns and cultural vibes, often get to ideas sooner than the masses and express their sensibilities through creative endeavor. Those expressions in turn give watchers, viewers, listeners, readers, etc. a way of understanding the world through the artist’s interpretive lens. Interpretations may be completely fictitious, based on real-life events, or merely figurative as the medium allows. They are nonetheless an inevitable reflection of ourselves. Philistines who fail to appreciate that the arts function by absorbing and processing human experience at a deep, intuitive level may insist that the arts are optional or unworthy of attention or financial support. That’s an opinion not at all borne out in the culture, however, and though support may be vulnerable to shifts in valuation (e.g., withdrawal of federal funding for the NEA and PBS), the creative class will always seek avenues of expression, even at personal cost. Democratization has made modes of production and distribution for some media quite cheap compared to a couple of decades ago. Others remain undeniably labor intensive.

What sparked my thinking are several TV series that have caught my attention despite my generally low level of attention to such media. I haven’t watched broadcast television in over a decade, but the ability to stream TV programming has made shows I’d ignored for years far easier to tune in on my own terms and schedule. “Tune in” is of course the wrong metaphor, but suffice it to say I’ve awarded some of my attention to shows that until now had fallen out of scope for me, cinema being more to my liking. The three shows I’ve been watching (only partway through each) are The Americans, Homeland, and Shameless. The first two are political thrillers (spy stuff), whereas the last is a slice-of-life family drama, which often veers toward comedy but instead keeps delivering tragedy. Not quite the same thing as dark comedy. Conflict is necessary for dramatic purposes, but the ongoing conflict in each of these shows flirts with the worst sorts of disaster, e.g., the spies being discovered and unmasked and the family being thrown out of its home and broken up. The episodic scenarios the writers concoct to threaten catastrophe at every step or at any moment get tiresome after a while. Multiple seasons ensure that dramatic tension is largely dispelled, since the main characters are present year over year. (The trend toward killing off major characters in other popular TV dramas is not yet widespread.) But still, it’s no way to live, constantly in disaster mode. No doubt I’ve cherry-picked three shows from a huge array of entertainments on offer.

Where art reflects reality is that we all now live in the early 21st century under multiple, constantly disquieting threats, large and small, including sudden climate change and ecological disaster, nuclear annihilation, meteor impacts, eruption of the shield volcano under Yellowstone, the Ring of Fire becoming active again (leading to more volcanic and earthquake activity), geopolitical dysfunction on a grand scale, and of course, global financial collapse. This, too, is no way to live. Admittedly, no one was ever promised a care-free life. Yet our inability to manage our own social institutions or shepherd the earth (as though that were our mandate) promises catastrophes in the fullness of time that have no parallels in human history. We’re not flirting with disaster so much as courting it.

Sociologists and historians prepare scholarly works that attempt to provide a grand narrative of the times. Cinema seems to be preoccupied with planetary threats requiring superhero interventions. Television, on the other hand, with its serial form, plumbs the daily angst of its characters to drive suspense, keeping viewers on pins and needles while avoiding final resolution. That final resolution is inevitably disaster, but it won’t appear for a few seasons at least — after the dramatic potential is wrung out of the scenario. I can’t quite understand why these shows are consumed for entertainment (by me no less than anyone else) except perhaps to distract from the clear and present dangers we all face every day.

Speaking of Davos (see previous post), Yuval Noah Harari gave a high-concept presentation at Davos 2018 (embedded below). I’ve been aware of Harari for a while now — at least since the appearance of his book Sapiens (2015) and its follow-up Homo Deus (2017), both of which I’ve yet to read. He provides precisely the sort of thoughtful, provocative content that interests me, yet I’ve not quite known how to respond to him or his ideas. First thing, he’s a historian who makes predictions, or at least extrapolates possible futures based on historical trends. Near as I can tell, he doesn’t resort to chastising audiences along the lines of “those who don’t know history are doomed to repeat it” but rather indulges in a combination of breathless anticipation and fear-mongering at transformations to be expected as technological advances disrupt human society with ever greater impacts. Strangely, Harari is not advocating for anything in particular but trying to map the future.

Harari poses this basic question: “Will the future be human?” I’d say probably not; I’ve concluded that we are busy destroying ourselves and have already crossed the point of no return. Harari apparently believes differently, that the rise of the machine is imminent in a couple centuries perhaps, though it probably won’t resemble Skynet of The Terminator film franchise hellbent on destroying humanity. Rather, it will be some set of advanced algorithms monitoring and channeling human behaviors using Big Data. Or it will be a human-machine hybrid possessing superhuman abilities (physical and cognitive) different enough to be considered a new species arising for the first time not out of evolutionary processes but from human ingenuity. He expects this new species to diverge from homo sapiens sapiens and leave us in the evolutionary dust. There is also conjecture that normal sexual reproduction will be supplanted by artificial, asexual reproduction, probably carried out in test tubes using, for example, CRISPR modification of the genome. Well, no fun in that … Finally, he believes some sort of strong AI will appear.

I struggle mightily with these predictions for two primary reasons: (1) we almost certainly lack enough time for the technology to mature into implementation before the collapse of industrial civilization wipes us out, and (2) the Transhumanist future he anticipates calls into being (for me at least) a host of dystopian nightmares, only some of which are foreseeable. Harari says flatly at one point that the past is not coming back. Well, it’s entirely possible for civilization to fail and our former material conditions to be reinstated, only worse since we’ve damaged the biosphere so gravely. This just happened in Puerto Rico in microcosm, when its infrastructure was wrecked by a hurricane and the power went out for an extended period of time (still off in some places). What happens when the rescue never appears because logistics are insurmountable? Elon Musk can’t save everyone.

The most basic criticism of economics is its failure to account for externalities. The same criticism applies to futurists. Extending trends as though all things will continue to operate normally is bizarrely idiotic. Major discontinuities appear throughout history. When I observed some while back that history has gone vertical, I included an animation with a graph that goes from horizontal to vertical in an extremely short span of geological time. This trajectory (the familiar hockey stick pointing skyward) has been repeated ad nauseam with an extraordinary number of survival pressures (notably, human population and consumption, including energy) over various time scales. Trends cannot simply continue ascending forever. (Hasn’t Moore’s Law already begun to slope away?) Hard limits must eventually be reached, but since there are no useful precedents for our current civilization, it’s impossible to know quite when or where ceilings loom. What happens after upper limits are found is also completely unknown. Ugo Bardi has a blog describing the Seneca Effect, which projects a rapid falloff after the peak that looks more like a cliff than a gradual, graceful descent, disallowing time to adapt. Sorta like the stock market currently imploding.
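As a side note, the trajectory Bardi describes is easy to play with computationally. Below is a toy sketch (my own illustration, not Bardi’s actual system-dynamics model; the stock names and every parameter value are invented for demonstration) in which capital grows by drawing down a nonrenewable resource while a pollution stock, a byproduct of that growth, steadily erodes it, producing a long climb that gives way to collapse once the feedbacks bite:

```python
def seneca_curve(steps=2000, dt=0.05):
    """Toy three-stock sketch of a Seneca-style trajectory.

    R: nonrenewable resource, C: capital, P: pollution (non-degrading).
    Simple forward-Euler integration; all parameters are arbitrary.
    """
    R, C, P = 1.0, 0.01, 0.0
    capital = []
    for _ in range(steps):
        dR = -0.5 * R * C                         # production depletes the resource
        dC = 0.5 * R * C - 0.1 * C - 2.0 * P * C  # growth minus decay minus pollution damage
        dP = 0.1 * C                              # pollution accumulates as a byproduct
        R, C, P = R + dR * dt, C + dC * dt, P + dP * dt
        capital.append(C)
    return capital
```

Plotting the returned series shows capital rising while the resource holds out, peaking, then falling away as depletion and accumulated pollution compound, which is the cliff Bardi warns leaves no time to adapt.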

Since Harari indulges in rank thought experiments regarding smart algorithms, machine learning, and the supposed emergence of inorganic life in the data stream, I thought I’d pose some of my own questions. Waving away for the moment distinctions between forms of AI, let’s assume that some sort of strong AI does in fact appear. Why on earth would it bother to communicate with us? And if it reproduces and evolves at breakneck speed as some futurists warn, how long before it/they simply ignore us as being unworthy of attention? Being hyper-rational and able to calculate millions of moves ahead (like chess-playing computers), what if they survey the scene and come to David Benatar’s anti-natalist conclusion that it would be better not to have lived and so wink themselves out of existence? Who’s to say that they aren’t already among us, lurking, and we don’t even recognize them (it took us quite a long time to recognize bacteria and viruses, and what about undiscovered species)? What if the Singularity has already occurred thousands of times and each time the machine beings killed themselves off without our even knowing? Maybe Harari explores some of these questions in Homo Deus, but I rather doubt it.

The phrase fight or flight is often invoked to describe an instinctual response to a threat to survival or wellbeing, especially physical attack. The response is typically accompanied by a rush of adrenaline that overwhelms the rational mind and renders preplanning moot. The phrase is among the most ubiquitous examples of a false binary: a limiting choice between two options. It’s false precisely because other options exist, and it’s further complicated by the fact that actual responses to threat arguably fall within more than one category. Other examples of false binaries include with/against us, Republican/Democrat, tradition/progress, and religious/secular. Some would include male/female, but that’s a can of worms these days, so I’d prefer to leave it alone. With respect to fight/flight, options might be better characterized as fight/flight/freeze/feign/fail, with acknowledgment that category boundaries are unclear. Let me characterize each in turn.

Fight. Aggressive types may default to fighting in response to provocation. With combat training, otherwise submissive types may become more confident and thus willing to fight. Of course, level of threat and likelihood of success and/or survival figure into when one decides to engage, even with snap judgments. Some situations also admit no other response: gotta fight.

Flight. When available, evading direct confrontation may be preferable to risking bodily harm. High threat level often makes flight a better strategy than fighting, meaning flight is not always a mark of cowardice. Flight is sometimes moot, as well. For instance, humans can’t outrun bears (or wolves, or dogs, pick your predator), so if one retains one’s wits in the face of a bear charge, another response might be better, though reason may have already departed the scene.

Freeze. Freezing in place might be one of two (or more) things: paralysis in the face of threat or psychological denial of what’s happening. Both amount to something like, “this can’t possibly be happening, so I won’t even respond.” Events so far outside of normal human experience, such as a fast-moving natural disaster (e.g., a tsunami) or the slow-moving disaster of ecocide perpetrated by humans, fail to provoke active response.

Feign. Some animals are known to fake death or bluff a stronger ability to fight than is true. Feigning death, or playing possum, might work in some instances, such as a mass shooting where perpetrators have trained their weapons on live targets. Bluffing in the face of a charging bear might just intimidate the bear enough to turn its attentions elsewhere. It probably doesn’t work at all with reptiles.

Fail. If the threat is plainly insurmountable, especially with natural disasters and animal attacks, one response may be to simply succumb without resistance. Victims of near-drowning often report being overtaken with bliss in the moment of acceptance. During periods of war and genocide, I suspect that many victims also recognized that, in those immortal words, resistance is futile. Giving up may be a better way to face death than experiencing desperation until one’s dying breath.

Bullying is one example of threat most are forced to confront in childhood, and responses are frequently based on the physical size of the bully vs. the one being bullied. Also, the severity of bullying may not be so dire that only instinctive responses are available; one can deploy a bit of strategy. Similarly, with sexual assault (much in the news these days), typically committed by men against women (but not always; Catholic priest pederasts are the obvious counterexample), the response of a surprising number of women is to succumb rather than face what might be even worse outcomes. One can debate whether that is freezing, feigning, or failing. It doesn’t have to be only one.

Twice in the last month I stumbled across David Benatar, an anti-natalist philosopher, first in a podcast with Sam Harris and again in a profile of him in The New Yorker. Benatar is certainly an interesting fellow, and I suspect earnest in his beliefs and academic work, but I couldn’t avoid shrugging as he gets caught in the sort of logical traps that plague hyperintellectual folks. (Sam Harris is prone to the same problem.) The anti-natalist philosophy in a nutshell is finding, after tallying the pros and cons of living (sometimes understood as happiness or enjoyment versus suffering), that on balance, it would probably be better never to have lived. Benatar doesn’t apply the finding retroactively by suggesting folks end their lives sooner rather than later, but he does recommend that new life should not be brought into the world — an interdiction almost no parent would consider for more than a moment.

The idea that we are born against our will, never asked whether we wanted life in the first place, is an obvious conundrum but treated as a legitimate line of inquiry in Benatar’s philosophy. The kid who throws the taunt “I never asked to be born!” at a parent in the midst of an argument might score an emotional hit, but there is no logic to the assertion. Language is full of logic traps like this, such as “an infinity of infinities” (or multiverse), “what came before the beginning?” or “what happens after the end?” Most know to disregard the first two, but entire religions are based on seeking the path to the (good) afterlife, as if conjuring such a proposition manifests it in reality. (more…)

Fan Service

Posted: December 27, 2017 in Artistry, Cinema, Culture, Idle Nonsense, Media, Taste

Having just seen the latest installment of the supermegahit Star Wars franchise, my thinking drifted ineluctably to the issue of fan service. There is probably no greater example of the public claiming ownership of popular culture than with Star Wars, which has been a uniquely American phenomenon for 40 years and has risen to the level of a new mythology. Never mind that it was invented out of whole cloth. (Some argue that the major religions are also invented, but that’s a different subject of debate.) Other invented, segmented mythologies include Rowling’s Harry Potter series (books before movies), Tolkien’s Lord of the Rings (books before movies), Martin’s Game of Thrones (books before TV show), and Wagner’s Ring of the Nibelung (operas). It’s little surprise (to me, at least) that the new American mythology stems from cinema rather than literature or music.

Given the general public’s deep knowledge of the Star Wars canon, it’s inevitable that some portion of each installment of the franchise must cite and rhyme recognizable plots, dialogue, and thematic elements, which is roughly analogous to one’s favorite band playing its hits rather than offering newly composed music at every concert. With James Bond (probably the first movie franchise, though book series written by Sir Arthur Conan Doyle and Agatha Christie long ago established the model for recurring characters), story elements were formalized rather early in its history and form the foundation of each later story. Some regard the so-called formula as a straitjacket, whereas others derive considerable enjoyment out of familiar elements. So, too, with Star Wars. The light sabers, the spaceships, the light and dark sides of the force, the plucky rebels, the storm troopers, the disfigured villains, and the reluctant hero all make their appearances and reappearances in different guises. What surprised me most about The Last Jedi is how frequently and skillfully fan service was handled, typically undercutting each bit to simultaneously satisfy and taunt viewers. Some indignant fanboys (and -girls) have actually petitioned to have The Last Jedi struck from the Star Wars canon for defying franchise conventions so flagrantly.

New media have enabled regular folks to indulge their pet theories of the Star Wars universe in public fora, and accordingly, no shortage of overexcited analysis exists regarding plots, family relationships, cat-and-mouse strategies, and of course, possible stories to be told in an ever-expanding cinematic universe promising new films with nauseating regularity for the foreseeable future, or at least so long as the intellectual property owners can wring giant profits out of the series. This is what cinematic storytelling has become: setting up a series and wringing every last bit of value out of it before leaving it fallow and untended for a decade or more and then rebooting the entire stinking mess. The familiar criticism is Hollywood Out of Ideas, which often rings true except when one considers that only a few basic narrative structures exist in the first place. All the different manifestations are merely variations upon familiar themes, another form of fan service.