Archive for the ‘Cinema’ Category

Caveat: Rather uncharacteristically long for me. Kudos if you have the patience for all of this.

Caught the first season of HBO’s series Westworld on DVD. I have a boyhood memory of the original film (1973) with Yul Brynner and a dim memory of its sequel Futureworld (1976). The sheer charisma of Yul Brynner in the role of the gunslinger casts a long shadow over the new production, not that most of today’s audiences have seen the original. No doubt, 45 years of technological development in film production lends the new version some distinct advantages. Visual effects are quite stunning and Utah landscapes have never been used more appealingly in terms of cinematography. Moreover, storytelling styles have changed, though it’s difficult to argue convincingly that they’re necessarily better now than then. Competing styles only appear dated. For instance, the new series has immensely more time to develop its themes; but the ancient parables of hubris and loss of control over our own creations run amok (e.g., Shelley’s Frankenstein, or more contemporaneously, the surprisingly good new movie Upgrade) have compact, appealing narrative arcs quite different from constant teasing and foreshadowing of plot developments while actual plotting proceeds glacially. Viewers wait an awful lot longer in the HBO series for resolution of tensions and emotional payoffs, by which time investment in the story lines has been dispelled. There is also no terrifying crescendo of violence and chaos demanding rescue or resolution. HBO’s Westworld often simply plods on. To wit, a not insignificant portion of the story (um, side story) is devoted to boardroom politics (yawn) regarding who actually controls the Westworld theme park. Plot twists and reveals, while mildly interesting (typically guessed by today’s cynical audiences), do not tie the narrative together successfully.

Still, Westworld provokes considerable interest from me due to my fascination with human consciousness. The initial episode builds out the fictional future world with characters speaking exposition clearly owing its inspiration to Julian Jaynes’s book The Origin of Consciousness in the Breakdown of the Bicameral Mind (another reference audiences are quite unlikely to know or recognize). I’ve had the Julian Jaynes Society’s website bookmarked for years and read the book some while back; never imagined it would be captured in modern fiction. Jaynes’s thesis (if I may be so bold as to summarize radically) is that modern consciousness coalesced around the collapse of multiple voices in the head — ideas, impulses, choices, decisions — into a single stream of consciousness perhaps better understood (probably not) as the narrative self. (Aside: the multiple voices of antiquity correspond to polytheism, whereas the modern singular voice corresponds to monotheism.) Thus, modern human consciousness arose over several millennia as the bicameral mind (the divided brain having two camerae, or chambers) functionally collapsed. The underlying story of the new Westworld is the emergence of machine consciousness, a/k/a strong AI, a/k/a The Singularity, while the old Westworld was about a mere software glitch. Exploration of machine consciousness modeling (e.g., improvisation builds on memory to create awareness) as a proxy for better understanding human consciousness might not be the purpose of the show, but it’s clearly implied. And although conjectural, the slow emergence of human consciousness contrasts sharply with the abrupt ON switch of theorized machine consciousness. Westworld treats them as roughly equivalent, though in fairness, 35 years or so in Westworld is in fact abrupt compared to several millennia. (Indeed, the story asserts that machine consciousness sparked alive repeatedly (which I suggested here) over those 35 years but was dialed back each time. Never mind all the unexplored implications.) Additionally, the fashion in which Westworld uses the term bicameral ranges from sloppy to meaningless, like the infamous technobabble of Star Trek.



Political discussion usually falls out of scope on this blog, though I use the politics category and tag often enough. Instead, I write about collapse, consciousness, and culture (and to a lesser extent, music). However, politics is front and center in most media, everyone taking whacks at everyone else. Indeed, the various political identifiers are characterized these days by their most extreme adherents. The radicalized elements of any political persuasion are the noisiest and thus the most emblematic of a worldview if one judges solely by the most attention-grabbing factions, which is regrettably the case for a lot of us. (Squeaky wheel syndrome.) Similarly, in the U.S. at least, the spectrum is typically expressed as a continuum from left to right (or right to left) with camps divided nearly in half based on voting. Opinion polls reveal a more lopsided division (toward Leftism/Progressivism as I understand it) but still reinforce the false binary.

More nuanced political thinkers allow for at least two axes of political thought and opinion, usually plotted on an x-y coordinate plane (again, left to right and down to up). Some look more like the one below (a quick image search will reveal dozens of variations), with outlooks divided into regions of a Venn diagram suspiciously devoid of overlap. The x-y coordinate plane still underlies the divisions.

[Image: multi-axis political spectrum diagram, with outlooks divided into Venn-style regions]

If you don’t know where your political compass points, you can take this test, though I’m not especially convinced that the result is useful. Does it merely apply more labels? If I had to plot myself according to the traditional divisions above, I’d probably be a centrist, which is to say, nothing. My positions on political issues are not driven by party affiliation, motivated by fear or grievance, subject to a cult of personality, or informed by ideological possession. Perhaps I’m unusual in that I can hold competing ideas in my head (e.g., individualism vs. collectivism) and make pragmatic decisions. Maybe not.

If worthwhile discussion is sought among principled opponents (a big assumption, that), it is necessary to diminish or ignore the more radical voices screaming insults at others. However, multiple perverse incentives reward the most heinous adherents with the greatest attention and control of the narrative(s). In light of the news out just this week, call it Body Slam Politics. It’s a theatrical style born of the fake drama of the professional wrestling ring (not an original observation on my part), and we know who the king of that style is. Watching it unfold too closely is a guaranteed way to destroy one’s political sensibility, to say nothing of wrecked brain cells. The spectacle depicted in Idiocracy has arrived early.

In an earlier blog post, I mentioned how killing from a distance is one way among many that humans differentiate from other animals. The practical advantage of weaponry that distances one combatant from another should be obvious. Spears and swords extend one’s reach yet keep the fighting hand-to-hand. Projectiles (bullets, arrows, catapults, artillery, etc.) allow killing from increasingly long distances, with weapons launched into low orbit before raining down ruin being the far extreme. The latest technology is drones (and drone swarms), which remove those who wield them from danger except perhaps for the psychological torment that accrues gradually to remote operators. Humans are unique among animals for having devised such clever ways of destroying each other, and in the process, themselves.

I finally got around to seeing the film Black Panther. Beyond the parade of clichés and mostly forgettable punchfest action (interchangeable with any other Marvel film), one particular remark stuck with me. When the warrior general of fictional Wakanda (a woman, as it happens) went into battle, she dismissed the use of guns as “primitive.” Much is made of Wakanda’s advanced technology, some of it frankly indistinguishable from magic (e.g., the panther elixir). Wakanda’s possession of weaponry not shared with the rest of the world (e.g., invisible planes) is the MacGuffin the villain seeks to control so as to exact revenge on the world and rule over it. Yet the film resorts predictably to punching and acrobatics as the principal mode of combat. Some of that strategic nonsense is attributable to visual storytelling found in both comic books and cinema. Bullets fly too fast to be seen and tracking airborne bombs never really works, either. Plus, a punch thrown by a villain or superhero arguably has some individual character to it, at least until one recognizes that punching leaves no lasting effect on anyone.

As it happens, a similar remark about “primitive” weapons (a blaster) was spat out by Obi-Wan Kenobi in one of the Star Wars prequels (dunno which one). For all the amazing technology at the disposal of those characters long ago in a galaxy far, far away, it’s curious that the weapon of choice for a Jedi knight is a light saber. Again, up close and personal (color coded, even), including actual peril, as opposed to, say, an infinity gauntlet capable of dispatching half a universe with a finger snap. Infinite power clearly drains the stakes out of conflict. Credit goes to George Lucas for recognizing the awesome visual storytelling the light saber offers. He also made blaster shots — the equivalent of flying bullets — visible to the viewer. Laser beams and other lighted projectiles had been done in cinema before Star Wars but never so well.

 

A paradoxical strength/weakness of reason is its inherent disposition toward self-refutation. It’s a bold move when undertaken with genuine interest in getting things right. Typically, as evidence piles up, consensus forms that’s tantamount to proof unless some startling new counter-evidence appears. Of course, intransigent deniers exist and convincing refutations do appear periodically, but accounts of two hotly contested topics (from among many) — evolution and climate change — are well established notwithstanding counterclaims completely disproportionate in their ferocity to the evidence. For rationalists, whatever doubts remain must be addressed and accommodated even if disproof is highly unlikely.

This becomes troublesome almost immediately. So much new information is produced in the modern world that, because I am duty-bound to consider it, my head spins. I simply can’t deal with it all. Inevitably, when I think I’ve put a topic to rest and conclude I don’t have to think too much more about it, some argument-du-jour hits the shit pile and I am forced to stop and reconsider. It’s less disorienting when facts are clear, but when matters are interpretive, I find my head all too easily spun by the latest, greatest claims of some charming, articulate speaker able to cobble together evidence lying outside of my expertise.

Take for instance Steven Pinker. He speaks in an authoritative style and has academic credentials that dispose me to trust his work. His new book is Enlightenment Now: The Case for Reason, Science, Humanism, and Progress (2018). Still, Pinker is an optimist, whereas I’m a doomer. Even though I subscribe to Enlightenment values (for better or worse, my mind is bent that way), I can’t escape a mountain of evidence that we’ve made such a mess of things that reason, science, humanism, and progress are hardly panaceas capable of saving us from ourselves. Yet Pinker argues that we’ve never had it so good and the future looks even brighter. I won’t take apart Pinker’s arguments; it’s already been done by Jeremy Lent, who concludes that Pinker’s ideas are fatally flawed. Lent has the expertise, data, and graphs to demonstrate it. Calling Pinker a charlatan would be unfair, but his appreciation of the state of the world stands in high contrast with mine. Who ya gonna believe?

Books and articles like Pinker’s appear all the time, and in their aftermath, so, too, do takedowns. That’s the marketplace of ideas battling it out, which is ideally meant to sharpen thinking, but with the current epistemological crises under way (I’ve blogged about it for years), the actual result is dividing people into factions, destabilizing established institutions, and causing no small amount of bewilderment in the public as to what and whom to believe. Some participants in the exchange of ideas take a sober, evidential approach; others lower themselves to snark and revel in character assassination without bothering to make reasoned arguments. The latter are often called hit pieces (a special province of the legacy media, it seems), since hefty swipes and straw-man arguments tend to be commonplace. I’m a sucker for the former style but have to admit that the latter can also hit its mark. However, both tire me to the point of wanting to bury my head.


I mentioned blurred categories in music a short while back. An interesting newsbit popped up in the New York Times recently about this topic. Seems a political science professor at SUNY New Paltz, Gerald Benjamin, spoke up against a Democratic congressional candidate for New York’s 18th district, Antonio Delgado, the latter of whom had been a rapper. Controversially, Benjamin said that rap is not real music and does not represent the values of rural New York. Naturally, the Republican incumbent got in on the act, too, with denunciations and attacks. The electoral politics angle doesn’t much interest me; I’m not in or from the district. Moreover, the racial and/or racist elements are so toxic I simply refuse to wade in. But the professorial pronouncement that rap music isn’t really music piqued my interest, especially because that argument caused the professor to be sanctioned by his university. Public apologies and disclaimers were issued all around.

Events also sparked a fairly robust commentary at Slipped Disc. The initial comment by V.Lind echoes my thinking pretty well:

Nobody is denying that hip-hop is culturally significant. As such it merits study — I have acknowledged this … The mystery is its claims to musical credibility. Whatever importance rap has is in its lyrics, its messages (which are far from universally salutory [sic]) and its general attempt to self-define certain communities — usually those with grievances, but also those prepared to develop through violence, sexism and other unlovely aspects. These are slices of life, and as such warrant some attention. Some of the grievances are well-warranted …

But music? Not. I know people who swear by this genre, and their ears are incapable of discerning anything musical in any other. If they wanted to call it poetry (which I daresay upon scrutiny would be pretty bad poetry) it would be on stronger legs. But it is a “music” by and for the unmusical, and it is draining the possibility of any other music out of society as the ears that listen to it hear the same thing, aside from the words, for years on end.

Definitely something worth studying. How the hell has this managed to become a dominant force in what is broadly referred to as the popular music world?

The last time (as memory serves) categories or genres blurred leading to outrage was when Roger Ebert proclaimed that video games are not art (and by inference that cinema is art). Most of us didn’t really care one way or the other where some entertainment slots into a category, but gamers in particular were scandalized. In short, their own ox was gored. But when it came to video games as art, there were no racial undertones, so the sometimes heated debate was at least free of that scourge. Eventually, definitions were liberalized, Ebert acknowledged the opposing opinion (I don’t think he was ever truly convinced, but I honestly can’t remember — and besides, who cares?), and it all subsided.

The impulse to mark hard, discrete boundaries between categories and keep unlike things from touching strikes me as pretty foolish. It’s as though we’re arguing about the mashed potatoes and peas not infecting each other on the dinner plate with their cooties. Never mind that it all ends up mixed in the digestive tract before finally reemerging as, well, you know. Motivation to keep some things out is no doubt due to prestige and cachet, where the apparent interloper threatens to change the status quo somehow, typically infecting it with undesirable and/or impure elements. We recognize this fairly readily as in-group and out-group, an adolescent game that ramps up, for instance, when girls and boys begin to differentiate in earnest at the onset of puberty. Of course, in the last decade, so-called identitarians have been quite noisome about their tribal affiliations, er, self-proclaimed identities, many falling far, far out of the mainstream, and have demanded they be taken seriously and/or granted status as a protected class.

All this extends well beyond the initial topic of musical or artistic styles and genres. Should be obvious, though, that we can’t escape labels and categories. They’re a basic part of cognition. If they weren’t, one would have to invent them at every turn when confronting the world, judging safe/unsafe, friend/foe, edible/inedible, etc. just to name a few binary categories. Complications multiply quickly when intermediary categories are present (race is the most immediate example, where most of us are mixtures or mutts despite whatever our outer appearance may be) or categories are blurred. Must we all then rush to restore stability to our understanding of the world by hardening our mental categories?

Back in the 1980s when inexpensive news programs proliferated, all wanting to emulate 60 Minutes or 20/20, I recall plenty of them having no problem working the public into a lather over some crime or injustice. A typical framing trick was to juxtapose two unrelated facts with the intent that the viewer leap to an unwarranted conclusion. Here’s an example I just made up: “On Tuesday, Jane went to her plastic surgeon for a standard liposuction procedure. By Friday, Jane was dead.” Well, what killed Jane? The obvious inference, by virtue of juxtaposition, is the procedure. Turns out it was an entirely unrelated traffic accident. The crap news program could legitimately claim that it never said the procedure killed Jane, yet it led the credulous public to believe so. Author Thomas Sowell resorts to that same sort of nonsense in his books: a habit of misdirection when arguing his point. I initially sought out his writing for balance, as everyone needs others capable of articulating competing ideas to avoid the echo chamber of one’s own mind (or indeed the chorus of the converted). Sowell failed to keep me as a reader.

It’s not always so easy to recognize cheap rhetorical tricks. They appear in movies all the time, but then, one is presumably there to be emotionally manipulated, er, affected by the story, so a healthy suspension of disbelief goes a long way to enhance one’s enjoyment. Numerous fanboy sites (typically videos posted to YouTube) offer reviews and analysis that point out failures of logic, plotting, and continuity, as well as character inconsistency and embedded political propaganda messaging, but I’ve always thought that taking movies too seriously misses the point of cheap entertainment. Considering the powerful influence cinematic storytelling has over attitudes and beliefs, perhaps I’m being too cavalier about it.

When it comes to serious debate, however, I’m not nearly so charitable. The favored 5-minute news debate where 3 or 4 floating heads spew their rehearsed talking points, often talking over each other in a mad grab for air time, accomplishes nothing. Formal, long-form debates in a theater in front of an audience offer better engagement if participants can stay within proper debate rules and etiquette. Political debates during campaign season fail on that account regularly, with more spewing of rehearsed talking points mixed with gratuitous swipes at opponents. Typically, both sides claim victory in the aftermath and nothing is resolved, since that’s not really the objective. (Some opine that government, being essentially nonstop campaigning, suffers a similar fate: nothing is resolved because that’s not the true objective anymore.)

I was intrigued to learn recently of the semi-annual Munk Debates, named after their benefactors, that purport to be formal debates with time limits, moderation, and integrity. I had never heard of them before they booked Jordan Peterson alongside Michael Eric Dyson, Michelle Goldberg, and Stephen Fry. Like Donald Trump did for TV and print news, Peterson has turned into a 1-man ratings bonanza for YouTube and attracts viewers to anything in which he participates, which is quite a lot. The proposition the four debaters were provided was this: Be it resolved, what you call political correctness, I call progress … Problem is, that’s not really what was debated most of the time. Instead, Dyson diverted the debate to identity politics, specifically, racism and so-called white privilege. Goldberg mostly attacked Peterson regarding his opinions outside of the debate, Peterson defended himself against repeated personal attacks by Goldberg and Dyson, and Fry stayed relatively true to the intended topic. Lots of analysis and opinion appeared on YouTube almost immediately after the debate, so wade in if that’s what interests you. I viewed some of it. A couple videos called Dyson a grievance merchant, which seems to me accurate.

What concerns me more here are the cheap rhetorical tricks employed by Dyson — the only debater booed by the audience — that fundamentally derailed the proceedings. Dyson speaks with the fervor of a revivalist preacher, a familiar style that has been refined and coopted many times over to great effect. Whether deserved or not, it carries associations of great moral authority and momentous occasion. Unfortunately, if presented as a written transcript rather than a verbal rant, Dyson’s remarks are incoherent, unhinged, and ineffective except for their disruptive capacity. He reminded everyone of his blackness and his eloquence, the first of which needs no reminder, the second of which immediately backfired and called into question his own claim. Smart, eloquent people never tell you they’re smart and eloquent; the proof is in their behavior. Such boastful announcements tend to work against a person. Similarly, any remark that begins with “As a black/white/red/brown/blue man/woman/hybrid of _______ ethnicity/sexuality/identity …” calls in a host of associations that immediately invalidates the statement that follows as skewed and biased.

The two point-scoring bits of rhetoric Dyson levies with frequency, which probably form a comfort zone to which he instinctively retreats in all challenges, are his blackness (and by proxy his default victimhood) and historical oppression of blacks (e.g., slavery, Jim Crow laws, etc.). There are no other issues that concern him, as these two suffice to push everyone back on their heels. That’s why the debate failed to address political correctness effectively but instead revolved around identity politics. These issues are largely distinct, unless one debates the wisdom of switching out terminology cyclically, such as occurs even now with various racial epithets (directed to every race, not just blacks). That obvious tie-in, the use of euphemism and neologism to mask negative intent, was never raised. Nor were the twisted relations between free speech, hate speech, and approved speech codes (politically correct speech). Nope, the debate featured various personalities grandstanding on stage and using the opportunity to push and promote their personal brands, much like Trump has over the years. Worse, it was mostly about Michael Eric Dyson misbehaving. He never had my attention in the past; now I intend to avoid him at all costs.

From Wikipedia:

Trial by combat (also wager of battle, trial by battle or judicial duel) was a method of Germanic law to settle accusations in the absence of witnesses or a confession in which two parties in dispute fought in single combat; the winner of the fight was proclaimed to be right. In essence, it was a judicially sanctioned duel. It remained in use throughout the European Middle Ages, gradually disappearing in the course of the 16th century.

Unlike trial by ordeal in general, which is known to many cultures worldwide, trial by combat is known primarily from the customs of the Germanic peoples. It was in use among the ancient Burgundians, Ripuarian Franks, Alamans, Lombards, and Swedes. It was unknown in Anglo-Saxon law, Roman law and Irish Brehon Law and it does not figure in the traditions of Middle Eastern antiquity such as the code of Hammurabi or the Torah.

Trial by combat has profound echoes in 21st-century geopolitics and jurisprudence. Familiar phrases such as right of conquest, manifest destiny, to the winner go the spoils, might makes right, and history written by the victors attest to the enduring legacy of hindsight justification by force of arms. More broadly, within the American system, right of access to courts afforded to all citizens also admits nuisance suits and more than a few mismatched battles where deep-pocketed corporations sue individuals and small organizations, often nonprofits, into bankruptcy and submission. For instance, I recently learned of Strategic Lawsuits Against Public Participation (SLAPPs) “used to silence and harass critics by forcing them to spend money to defend these baseless suits.” They employ brute economic power in place of force of arms.

Trial by combat fell out of practice with the onset of the Enlightenment but the broader complex of ideas survived. Interest in medieval Europe as storytelling fodder in cinema and fantasy literature (notably, the shocking trial by combat depicted in the extremely popular HBO drama Game of Thrones where the accused and accuser both designate their proxies rather than doing battle themselves) lends legitimacy to settling disputes via violence. Even the original Karate Kid (1984) has a new YouTube Red series set 30 years later. The bad-boy acolyte replaces his scorched-earth sensei and seeks revenge against the titular character for being bested decades before, the latter of whom is yanked back from quiet obscurity (and the actor who portrays him from career limbo) to fight again and re-prove his skills, which is to say, his righteousness. The set-up is surprisingly delicious to contemplate and has considerable nostalgic appeal. More importantly, it embodies the notion (no doubt scripted according to cliché) that only the pure of heart (or their proxies, students in this case) can claim ultimate victory because, well, it’s god’s will or some such and thus good guys must always win. What that really means is that whoever wins is by definition virtuous. If only reality were so reliably simple.

The certainty of various religious dogma and codes of conduct characteristic of the medieval period (e.g., chivalry) is especially seductive in modern times, considering how the public is beset by an extraordinary degree of existential and epistemological uncertainty. The naturalistic fallacy is also invoked, where the law of the jungle (only the fittest and/or strongest get to eat or indeed survive) substitutes for more civilized (i.e., enlightened and equanimous) thinking. Further, despite protestations, this complex of ideas legitimizes bullying, whether (1) in the schoolyard with the principal bully flanked by underlings picking on vulnerable weaklings who haven’t formed alliances for self-protection, (2) in the workplace, with its power players and Machiavellian manipulators, or (3) by a global military power such as the U.S. dictating terms to and/or warring with smaller, weaker nations that lack the GDP, population, and insanity, er, will to project power globally. I daresay most Americans take comfort in having the greatest military and arsenal ever mustered on their side and accordingly being on the right side (the victorious one) of history, thus a beacon of hope to all who would conflate victory with virtue. Those who suffer at our hands must understand things quite differently. (Isn’t it more accurate that when bad guys win, rebellions and insurgencies are sparked?)

One remarkable exception deserves notice. The U.S. presidency is among the most heavily scrutinized and contentious positions (always under attack) and happens to be the Commander-in-Chief of the self-same greatest goddamn fighting force known to man. It’s no secret that the occupant of that office (45) is also widely recognized as the Bully-in-Chief. Despite having at his disposal considerable resources — military, executive staff, and otherwise — 45 has eschewed forming the political coalitions one might expect and essentially gone it alone, using the office (and his Twitter account) as a one-man bully pulpit. Hard to say what he’s trying to accomplish, really. Detractors have banded together (incompetently) to oppose him, but 45 has demonstrated unexpected tenacity, handily dominating rhetorical trials by combat through sheer bluster and hubris. On balance, he scores some pretty good hits, too. (The proposed fist fight between 45 and Joe Biden turned out to be a tease, but how entertaining would that bout have been without actually settling anything!) This pattern has left many quite dumbfounded, and I admit to being astounded as well except to observe that rank stupidity beats everything in this bizarre political rock-paper-scissors contest. How quintessentially American: nuthin’ beats stoopid.

The movie Gladiator depicts the protagonist Maximus addressing spectators directly at gladiatorial games in the Roman Colosseum with this meme-worthy challenge: “Are you not entertained?” Setting the action in an ancient civilization renowned for its decadent final phase prior to collapse, referred to as Bread and Circuses, allows us to share vicariously in the protagonist’s righteous disgust with the public’s blood lust while shielding us from any implication of our own shame because, after all, who could possibly entertain blood sports in the modern era? Don’t answer that.

[Image: Maximus in Gladiator demanding “Are you not entertained?”]

But this post isn’t about our capacity for cruelty and barbarism. Rather, it’s about the public’s insatiable appetite for spectacle — both fictional and absolutely for real — served up as entertainment. Professional wrestling is fiction; boxing and mixed martial arts are reality. Audiences consuming base entertainment and, in the process, depleting performers who provide that entertainment extend well beyond combat sports, however. For instance, it’s not uncommon for pop musicians to slowly destroy themselves once pulled into the attendant celebrity lifestyle. Three examples spring to mind: Elvis Presley, Michael Jackson, and Whitney Houston. Others go on hiatus or retire altogether from the pressures of public performance, such as Britney Spears, Miles Davis, and Barbra Streisand.

To say that the public devours performers and discards what remains of them is no stretch, I’m afraid. Who remembers countdown clocks tracking when female actors turn 18 so that perving on them is at last okay? A further example is the young starlet who is presumably legitimized as a “serious” actor once she does nudity and/or portrays a hooker but is then forgotten in favor of the next. If one were to seek the full depth of such devouring impulses, I suggest porn is the industry to have all one’s illusions shattered. For rather modest sums, there is absolutely nothing some performers won’t do on film (these days on video at RedTube), and naturally, there’s an audience for it. Such appetites are as bottomless as they come. Are you not entertained?

Speaking of Miles Davis, I take note of his hiatus from public performance in the late 1970s before his limited return to the stage in 1986 and early death in 1991 at age 65. He had cemented a legendary career as a jazz trumpeter but in interviews (as memory serves) dismissed the notion that he was somehow a spokesperson for others, saying dryly “I’m just a trumpet player, man ….” What galled me, though, were Don Cheadle’s remarks in the liner notes of the soundtrack to the biopic Miles Ahead (admittedly a deep pull):

Robert Glasper and I are preparing to record music for the final scene of Miles Ahead — a possible guide track for a live concert that sees the return of Miles Davis after having been flushed from his sanctuary of silence and back onto the stage and into his rightful light. My producers and I are buzzing in disbelief about what our audacity and sheer will may be close to pulling off ….

What they did was record a what-might-have-been track had Miles incorporated rap or hip hop (categories blur) into his music. It’s unclear to me whether the “sanctuary of silence” was inactivity or death, but Miles was essentially forced onstage by proxy. “Flushed” is a strange word to use in this context, as one “flushes” an enemy or prey unwillingly from hiding. The decision to recast him in such “rightful light” strikes me as being in rather poor taste — a case of cultural appropriation worse than merely donning a Halloween costume.

This is the wave of the future, of course, now that images of dead celebrities can be invoked, say, to sell watches (e.g., Steve McQueen) and holograms of dead musicians are made into singing zombies, euphemized as “virtual performance” (e.g., Tupac Shakur). Newly developed software can now create digitized versions of people saying and doing whatever we desire of them, such as when celebrity faces are superimposed onto porn actors (called “deepfakes”). It might be difficult to argue that in doing so content creators are stealing the souls of others, as used to be believed in the early days of photography. I’m less concerned with those meeting demand than with the demand itself. Are we becoming demons, the equivalents of the succubus/incubus, devouring or destroying frivolously the objects of our enjoyment? Are you not entertained?

I’m currently reading Go Wild by John Ratey and Richard Manning. It has some rather astounding findings on offer. One I’ll draw out is that the human brain evolved not for thinking, as one might imagine, but for coordinating complex physiological movements:

… even the simplest of motions — a flick of a finger or a turn of the hand to pick up a pencil — is maddeningly complex and requires coordination and computational power beyond electronics abilities. For this you need a brain. One of our favorite quotes on this matter comes from the neuroscientist Rodolfo Llinás: “That which we call thinking is the evolutionary internalization of movement.” [p. 100]

Almost all the computation is unconscious, or maybe preconscious, and it’s learned over a period of years in infancy and early childhood (for basic locomotion) and then supplemented throughout life (for skilled motions, e.g., writing cursive or typing). Moreover, those able to move with exceptional speed, endurance, power, accuracy, and/or grace are admired and sometimes rewarded in our culture. The obvious example is sports. Whether league sports with wildly overcompensated athletes, Olympic sports with undercompensated athletes, or combat sports with a mixture of both, thrill attaches to watching someone move effectively within the rule-bound context of the sport. Other examples include dancers, musicians, circus artists, and actors who specialize in physical comedy and action. Each develops specialized movements that are graceful and beautiful, which Ratey and Manning write may also account for nonsexual appreciation and fetishization of the human body, e.g., fashion models, glammed-up actors, and nude photography.

I’m being silly saying that jocks figgered it first, of course. A stronger case could probably be made for warriors in battle, such as a skilled swordsman. But it’s jocks who are frequently rewarded all out of proportion with others who specialize in movement. True, their genetics and training enable a relatively brief career (compared to, say, surgeons or pianists) before abilities ebb away and a younger athlete eclipses them. But there is a fundamental lack of equivalence with artisans and artists, whose value lies less in their bodies than in the outputs their movements produce.

Regarding computational burdens, consider the various mechanical arms built for grasping and moving objects, some of them quite large. Mechanisms (frame and hydraulics substituting for bone and muscle) themselves are quite complex, but they’re typically controlled by a human operator rather than automated. (Exceptions abound, but they’re highly specialized, such as circuit board manufacture or textile production.) More recently, robotics demonstrate considerable advancement in locomotion without a human operator, but they’re also narrowly focused in comparison with the flexibility of motion a human body readily possesses. Further, flying drones operate in wide open space, while robots designed to move like dogs or insects use four or more legs for stability. The latter are typically built to withstand quite a lot of bumping and jostling. Upright bipedal motion is still quite clumsy in comparison with humans, excepting perhaps wheeled robots that obviously don’t move like humans do.

Curiously, the movie Pacific Rim (sequel just out) takes notice of the computational or cognitive difficulty of coordinated movement. To operate the giant robots needed to fight Godzilla-like interdimensional monsters, two mind-linked humans control each battle machine. Maybe it’s a simple coincidence — a plot device to position humans in the middle of the action (and robot) rather than killing from a distance — such as via drone or clone — or maybe not. Hollywood screenwriters are quite clever at exploiting all sorts of material without necessarily divulging the source of inspiration. It’s art imitating life, knowingly or not.

I remarked in an earlier blog that artists, being hypersensitive to emergent patterns and cultural vibes, often get to ideas sooner than the masses and express their sensibilities through creative endeavor. Those expressions in turn give watchers, viewers, listeners, readers, etc. a way of understanding the world through the artist’s interpretive lens. Interpretations may be completely fictitious, based on real-life events, or merely figurative as the medium allows. They are nonetheless an inevitable reflection of ourselves. Philistines who fail to appreciate that the arts function by absorbing and processing human experience at a deep, intuitive level may insist that the arts are optional or unworthy of attention or financial support. That’s an opinion not at all borne out in the culture, however, and though support may be vulnerable to shifts in valuation (e.g., withdrawal of federal funding for the NEA and PBS), the creative class will always seek avenues of expression, even at personal cost. Democratization has made modes of production and distribution for some media quite cheap compared to a couple of decades ago. Others remain undeniably labor intensive.

What sparked my thinking are several TV series that have caught my attention despite my generally low level of attention to such media. I haven’t watched broadcast television in over a decade, but the ability to stream TV programming has made shows I have ignored for years far easier to tune in on my own terms and schedule. “Tune in” is of course the wrong metaphor, but suffice it to say I’ve awarded some of my attention to shows that have up until now fallen out of scope for me, cinema being more to my liking. The three shows I’ve been watching (only partway through each) are The Americans, Homeland, and Shameless. The first two are political thrillers (spy stuff) whereas the last is a slice-of-life family drama, which often veers toward comedy but keeps delivering tragedy instead. Not quite the same thing as dark comedy. Conflict is necessary for dramatic purposes, but the ongoing conflict in each of these shows flirts with the worst sorts of disaster, e.g., the spies being discovered and unmasked and the family being thrown out of its home and broken up. The episodic scenarios the writers concoct to threaten catastrophe at every step or at any moment get tiresome after a while. Multiple seasons ensure that dramatic tension is largely dispelled, since the main characters are present year over year. (The trend toward killing off major characters in other popular TV dramas is not yet widespread.) But still, it’s no way to live, constantly in disaster mode. No doubt I’ve cherry-picked three shows from a huge array of entertainments on offer.

Where art reflects reality is that we all now live in the early 21st century under multiple, constantly disquieting threats, large and small, including sudden climate change and ecological disaster, nuclear annihilation, meteor impacts, eruption of the shield volcano under Yellowstone, the Ring of Fire becoming active again (leading to more volcanic and earthquake activity), geopolitical dysfunction on a grand scale, and of course, global financial collapse. This, too, is no way to live. Admittedly, no one was ever promised a care-free life. Yet our inability to manage our own social institutions or shepherd the earth (as though that were our mandate) promises catastrophes in the fullness of time that have no parallels in human history. We’re not flirting with disaster so much as courting it.

Sociologists and historians prepare scholarly works that attempt to provide a grand narrative of the times. Cinema seems to be preoccupied with planetary threats requiring superhero interventions. Television, on the other hand, with its serial form, plumbs the daily angst of its characters to drive suspense, keeping viewers on pins and needles while avoiding final resolution. That final resolution is inevitably disaster, but it won’t appear for a few seasons at least — after the dramatic potential is wrung out of the scenario. I can’t quite understand why these shows are consumed for entertainment (by me no less than anyone else) except perhaps to distract from the clear and present dangers we all face every day.