Posts Tagged ‘Cinema’

In the introduction to an article at TomDispatch about the anticipated resumption of professional sports, currently on hiatus like much of the rest of human activity (economic and otherwise), Tom Engelhardt recalls that to his childhood self, professional sports meant so much and yet so little (alternatively, everything and nothing). This charming aspect of the innocence of childhood continues into adulthood, whether as spectator or participant, as leisure and freedom from threat allow. The article goes on to offer conjecture regarding the effect of reopening professional sports on the fall presidential election. Ugh! Horserace politics never go out of season. I reject such purely hypothetical analyses, which isn’t the same as not caring about the election. Maybe I’ll wade in after a Democratic nominee is chosen to say that third-party candidates may well have a much larger role to play this time round because we’re again being offered flatly unacceptable options within the two-party single-party system. Until then, phooey on campaign season!

Still, Engelhardt’s remark put me in mind of a blog post I considered fully nine years ago but never got around to writing, namely, how music functions as meaningless abstraction. Pick your passion, I suppose: sports, music (any genre), literature, painting, poetry, dance, cinema and TV, fashion, fitness, nature, house pets, house plants, etc. Inspiration and devotion come in lots of forms, few of which are essential (primary or ontological needs on Maslow’s Hierarchy), yet they remain fundamental to who we are and what we want out of life. Accordingly, when one’s passion is stripped away, being left grasping and rootless is quite common. That’s not equivalent to losing a job or loved one (those losses are afflicting many people right now, too), but our shared experience these days with no bars, no restaurants, no sports, no concerts, no school, and no church adds up to no society. We’re atomized, unable to connect and socialize meaningfully, digital substitutes notwithstanding. If a spectator, maybe one goes in search of replacements, which is awfully cold comfort. If a participant, one’s identity is wrapped up in such endeavors; the resulting loss of meaning and/or purpose can be devastating.

It would be easy to over-analyze and over-intellectualize what meaningless abstraction means. It’s a trap, so I’ll do my best not to over-indulge. Still, it’s worth observing that as passions are habituated and internalized, their mode of appreciation is transferred from the senses (or sensorium) to the mind or head (as observed here). Coarseness and ugliness are then easily digested, rationalized, and embraced instead of being repulsive as they should be. There’s the paradox: as we grow more “sophisticated” (scare quotes intentional), we also invert and become more base. How else to explain tolerance of increasingly brazen dysfunction, corruption, servitude (e.g., debt), and gaslighting? It also explains the attraction to entertainments such as combat sports (and thug sports such as football and hockey), violent films, professional wrestling (more theater than sport), and online trolling. An instinctual blood lust that accompanies being predators, if not expressed more directly in war, torture, crime, and self-destruction, is sublimated into entertainment. Maybe that’s an escape valve so pressures don’t build up any worse, but that possibility strikes me as rather weak considering just how much damage has already been done.

The old saw goes that acting may be just fine as a creative endeavor, but given the opportunity, most actors really want to direct. A similar remark is often made of orchestral musicians, namely, that most rank-and-file players would really rather conduct. Directing and conducting may not appear to be the central focus of creative work in their respective disciplines. After all, directors don’t normally appear onscreen and conductors make no sound. Instead, they coordinate the activities of an array of creative folks, putting them in a unique position to bring about a singular vision in otherwise collaborative work. A further example is the Will to Power (associated with Friedrich Nietzsche and Arthur Schopenhauer) characteristic of those who wish to rule (as distinguished from those who wish to serve) such as regents, dictators, and autocrats. All of this sprang to mind because, despite the outward appearance of a free, open society in the U.S., recent history demonstrates that the powers that be have instituted a directed election and a directed economy quite at odds with democracy or popular opinion.

The nearest analogy is probably the directed verdict, where a judge removes the verdict from the hands or responsibility of the jury by directing the jury to return a particular verdict. In short, the judge decides the case for the jury, making the jury moot. I have no idea how commonplace directed verdicts are in practice.

Directed Election

Now that progressive candidates have been run out of the Democratic primaries, the U.S. presidential election boils down to which stooge to install (or retain) in November. Even if Biden is eventually swapped out for another Democrat in a brokered nominating convention (highly likely according to many), it’s certain to be someone fully amenable to entrenched corporate/financial interests. Accordingly, the deciders won’t be the folks who dutifully showed up and voted in their state primaries and caucuses but instead party leaders. One could try to argue that as elected representatives of the people, party leaders act on behalf of their constituencies (governing by consent of the people), but some serious straining is needed to arrive at that view. Votes cast in the primaries thus far demonstrate persistent desire for something distinctly other than the status quo, at least in the progressive wing of the Democratic party. Applying the cinematic metaphor of the top paragraph, voters are a cast of millions being directed within a larger political theater toward a predetermined result.

Anyone paying attention knows that voters are rarely given options that aren’t in fact different flavors of the same pro-corporate agenda. Thus, no matter whom we manage to elect in November, the outcome has already been engineered. This is true not only by virtue of the narrow range of candidates able to maneuver successfully through the electoral gauntlet but also because of perennial distortions of the balloting process such as gerrymandering, voter suppression, and election fraud. Claims that both sides (really just one side) indulge in such practices so everything evens out don’t convince me.

Directed Economy

Conservative economists and market fundamentalists never seem to tire of arguments in the abstract that capitalist mechanisms of economics, left alone (unregulated, laissez-faire) to work their magic, deliver optimal outcomes when it comes to social and economic justice. Among the primary mechanisms is price discovery. However, economic practice never even remotely approaches the purity of abstraction because malefactors continuously distort and game economic systems out of greed. Price discovery is broken, and equitable economic activity is made fundamentally fictitious. For example, the market for gemstones is famously inflated by a narrow consortium of sellers having successfully directed consumers to adopt a cultural standard of spending three months’ wages/salary on an engagement ring as a demonstration of one’s love and devotion. In the opposite direction, precious metal spot prices are suppressed despite very high demand and nearly nonexistent supply. Current quoted premiums over the spot silver price, even though no delivery is contemplated, range from roughly 20% to an absurd 2,000%. Supply and demand curves no longer function to aid in true price discovery (if such a thing ever existed). In a more banal sense, what people are willing to pay for a burger at a fast-food joint or a loaf of bread at the grocery may affect the price charged more directly.

Nowhere is it more true that we’ve shifted to a directed economy than with the stock market (i.e., Wall Street vs. Main Street). As with the housing market, a real-world market with which many people have personal experience, if a buyer of a property or asset fails to appear within a certain time frame (longer for housing, shorter for stocks, bonds, and other financial instruments), the seller is generally obliged to lower the price until a buyer finally appears. Some housing markets extraordinarily flush with money (e.g., Silicon Valley and Manhattan) trigger wild speculation and inflated prices that drive out all but the wealthiest buyers. Moreover, when the eventual buyer turns out to be a bank, corporation, or government entity willing to overpay for the property or asset using someone else’s money, the market becomes wholly artificial. This has been the case with the stock market for the last twelve years, with cheap money being injected nonstop via bailouts and quantitative easing to keep asset prices inflated. When fundamental instabilities began dragging the stock market down last fall, accelerating precipitously in early spring of this year and resulting in yet another crash (albeit brief), the so-called Plunge Protection Team sprang into action and wished trillions of dollars (taxpayer debt, actually, and over the objections of taxpayers in a classic fool-me-once scenario) into existence to perpetuate the casino economy and keep asset prices inflated for the foreseeable future, which isn’t very long.

The beneficiaries of this largesse are the same as they have always been when tax monies and public debt are concerned: corporations, banks, and the wealthy. Government economic supports are directed to these entities, leaving all others in the lurch. Claims that bailouts are needed to keep large corporate entities and wealthy individuals whole so that the larger economy doesn’t seize up and fail catastrophically are preposterous, because the larger economy has already seized up and failed catastrophically while the population is mostly quarantined, throwing many individuals out of work and shuttering many businesses. A reasonable expectation of widespread insolvency and bankruptcy lingers, waiting for the workouts and numbers to mount up.

The power of the purse possessed by the U.S. Congress hasn’t been used to help the citizenry since the New Deal era of FDR. Instead, military budgets and debts expand enormously while entitlements and services to the needy and vulnerable are whittled away. Citizen rebellions are already underway in small measure, mostly aimed at the quarantines. When bankruptcies, evictions, and foreclosures start to swell, watch out. Our leaders’ fundamental mismanagement of human affairs is unlikely to be swallowed quietly.

Purpose behind consumption of different genres of fiction varies. For most of us, it’s about responding to stimuli and experiencing emotions vicariously, which is to say, safely. For instance, tragedy and horror can be enjoyed, if that’s the right word, in a fictional context to tweak one’s sensibilities without significant effect outside the story frame. Similarly, fighting crime, prosecuting war, or repelling an alien invasion in a video game can be fun but is far removed from actually doing those things in real life (not fun). For less explicit narrative forms, such as music, feelings evoked are aesthetic and artistic in nature, which makes a sad song or tragic symphony enjoyable on its own merits without bleeding far into real sadness or tragedy. Cinema (now blurred with broadcast TV and streaming services) is the preeminent storytelling medium that provokes all manner of emotional response. After reaching a certain age (middle to late teens), emotional detachment from depiction of sexuality and violent mayhem makes possible digestion of such stimulation for the purpose of entertainment — except in cases where prior personal trauma is triggered. Before that age, nightmare-prone children are prohibited.

Dramatic conflict is central to driving plot and story forward, and naturally, folks are drawn to some stories while avoiding others. Although I’m detached enough not to be upset by, say, zombie films where people and zombies alike are dispatched horrifically, I wouldn’t say I enjoy gore or splatter. Similarly, realistic portrayals of war (e.g., Saving Private Ryan) are not especially enjoyable for me despite the larger story, whether based on true events or entirely made up. The primary reason I leave behind a movie or TV show partway through is because I simply don’t enjoy watching suffering.

Another category bugs me even more: when fiction intrudes on reality to remind me too clearly of actual horrors (or is it the reverse: reality intruding on fiction?). It doesn’t happen often. One of the first instances I recall was in Star Trek: The Next Generation when the story observed that (fictional) warp travel produced some sort of residue akin to pollution. The reminder that we humans are destroying the actual environment registered heavily on me and ruined my enjoyment of the fictional story. (I also much prefer the exploration and discovery aspects of Star Trek that hew closer to Gene Roddenberry’s original vision than the militaristic approach now central to Star Trek.) A much more recent intrusion occurs in the rather adolescent TV show The 100, where a global nuclear exchange launched by an artificial intelligence has the follow-on effect a century later of remaining nuclear sites going critical, melting down, and irradiating the Earth, making it uninhabitable. This bothers me because that’s my expectation of what happens in reality, probably not too long (decades) after industrial civilization collapses and most or all of us are dead. This prospect served up as fiction is simply too close to reality for me to enjoy vicariously.

Another example of fiction intruding too heavily on my doomer appreciation of reality occurred retroactively. As high-concept science fiction, I especially enjoyed the first Matrix movie. Like Star Trek, the sequels degraded into run-of-the-mill war stories. But what was provocative about the original was the matrix itself: a computer-generated fiction situated within a larger reality. Inside the matrix was pleasant enough (though not without conflict), but reality outside the matrix was truly awful. It was a supremely interesting narrative and thought experiment when it came out in 1999. Now twenty-one years later, it’s increasingly clear that we are living in a matrix-like, narrative-driven hyperreality intent on deluding ourselves with a pleasant equilibrium that simply isn’t in evidence. In fact, as societies and as a civilization, we’re careening out of control, no brakes, no steering. Caitlin Johnstone explores this startling after-the-fact realization in an article at Medium.com, which I found only a couple days ago. Reality is in fact far worse than the constructed hyperreality. No wonder no one wants to look at it.

I was introduced to the phrase life out of balance decades ago when I saw the film Koyaanisqatsi. The film is the first of a trilogy (sequels are Powaqqatsi and Naqoyqatsi) by Godfrey Reggio, though the film is arguably more famous because of its soundtrack composed by Philip Glass. Consisting entirely of wordless montage and music, the film contrasts the majesty of nature (in slo-mo, among other camera effects) with the frenetic pace of human activity (often sped up) and the folly of the human-built world. Koyaanisqatsi is a Hopi Indian word, meaning life out of balance. One might pause to consider, “out of balance with what?” The film supplies the answer, none too subtly: out of balance with nature. The two sequels are celebrations of humans at work and technology, respectively, and never gained the iconic stature of the initial film.

If history (delivering us into the 21st century) has demonstrated anything, it’s that we humans are careening out of control toward disaster, not unlike the spacecraft in the final sequence of Koyaanisqatsi that tumbles out of the atmosphere for an agonizingly long time (in slo-mo), burning all the way down. We are all witness to the event (more accurately, the process) but can do little anymore to alter the eventual tragic result. Though some counsel taking steps toward amelioration (of suffering, if nothing else), our default response is rather to deny our collective fate, and worse, to accelerate toward it. That’s how unbalanced we are as a global civilization.

The observation that we are badly out of balance is made at the species and civilizational levels but is recapitulated at all levels of social organization, from distinct societies or nationalities to regional and municipal organizations and associations on down to families and individuals. The forces, dynamics, and power laws that push us off balance are many, but none is as egregious as the corrupting influence of interrelated wealth and power. Wisdom of the ancients (especially the non-Western ones) gave us the same verdict, though we have refused intransigently (or more charitably: failed) to learn the lesson for hundreds of generations.

What I propose to do in this multipart series is explore or survey some of the manifestations of life out of balance. There is no particular organization, chronology, or schedule for subsequent entries. As an armchair social critic, I reserve the luxury of exercising my own judgment and answering to no one. Stay tuned.

I observed way back here that it was no longer a thing to have a black man portray the U.S. president in film. Such casting might draw a modest bit of attention, but it no longer raises a particularly arched eyebrow. These depictions in cinema were only slightly ahead of the actuality of the first black president. Moreover, we’ve gotten used to female heads of state elsewhere, and we now have in the U.S. a burgeoning field of presidential wannabes from all sorts of diverse backgrounds. Near as I can tell, no one really cares anymore that a candidate is a woman, or black, or a black woman. (Could be that I’m isolated and/or misreading the issue.)

In Chicago where I live, the recent mayoral election offered a choice among some ten candidates to succeed the current mayor who is not seeking reelection. None of them got the required majority of votes, so a runoff between the top two candidates is about to occur. Both also happen to be black women. Although my exposure to the mainstream media and all the talking heads offering analysis is limited, I’ve yet to hear anyone remark disparagingly that Chicago will soon have its first black female mayor. This is as it should be: the field is open to all comers and no one can (or should) claim advantage or disadvantage based on identitarian politics.

Admittedly, extremists on both ends of the bogus left/right political spectrum still pay quite a lot of attention to identifiers. Academia in particular is currently destroying itself with bizarre claims and demands for equity — a nebulous doctrine that divides rather than unites people. Further, some conservatives can’t yet countenance a black, female, gay, atheist, or <insert other> politician, especially in the big chair. I’m nonetheless pleased to see that irrelevant markers matter less and less to many voters. Perhaps it’s a transition by sheer attrition and will take more time, but the current Zeitgeist outside of academia bodes well.

Everyone is familiar with the convention in entertainment media where characters speak without the use of recognizable language. (Not really related to the convention of talking animals.) The first instance I can recall (someone correct me if earlier examples are to be found) is the happy-go-lucky bird Woodstock from the old Peanuts cartoons (do kids still recognize that cast of characters?), whose dialogue was shown graphically as a series of vertical lines.

When the cartoon made its way onto TV for holiday specials, its creator Charles Schulz used the same convention to depict adults, never shown onscreen but with dialogue voiced by a Harmon-muted trombone. Roughly a decade later, two characters from the Star Wars franchise “spoke” in languages only other Star Wars characters could understand, namely, Chewbacca (Chewie) and R2-D2. More recently, the character Groot from Guardians of the Galaxy (known to me only through the Marvel movie franchise, not through comic books) speaks only one line of dialogue, “I am Groot,” which is understood as full speech by other Guardians characters. When behemoths larger than a school bus (King Kong, Godzilla, Jurassic dinosaurs, Cloverfield, kaiju, etc.) appear, the characters are typically denied the power of speech beyond the equivalent of a lion’s roar. (True villains talk little or not at all as they go about their machinations — no monologuing! unless it’s a James Bond film. An exception notable for its failure to charm audiences is Ultron, who wouldn’t STFU. You can decide for yourself which is the worse kind of villainy.)

This convention works well enough for storytelling and has the advantage of allowing the reader/viewer to project onto otherwise blank speech. However, when imported into the real world, especially in politics, the convention founders. There is no Babel fish universal translator inserted in the ear to transform nonsense into coherence. The obvious example of babblespeech is 45, whose speech when off the teleprompter is a series of rambling non sequiturs, free associations, slogans, and sales pitches. Transcripts of anyone’s extemporaneous speech reveal lots of restarts and blind alleys; we all interrupt ourselves to redirect. However, the word salad that substitutes for meaningful content in 45’s case is tragicomic: alternately entirely frustrating or comically entertaining depending on one’s objective. Satirical news shows fall into the second category.

45 is certainly not the first. Sarah Palin in her time as a media darling (driver of ratings and butt of jokes — sound familiar?) had a knack for crazy speech combinations that were utter horseshit yet oddly effective for some credulous voters. She was even a hero to some (nearly a heartbeat away from being the very first PILF). We’ve also now been treated to a series of public interrogations where a candidate for a cabinet post or an accused criminal offers testimony before a congressional panel. Secretary of Education Betsy DeVos famously evaded simple yes/no questions during her confirmation hearing, and Supreme Court Justice Brett Kavanaugh similarly refused to provide direct answers to direct questions. Unexpectedly, sacrificial lamb Michael Cohen gave direct answers to many questions, but his interlocutors then didn’t quite know how to respond, considering their experience and expectation that no one answers directly.

What all this demonstrates is that there is often a wide gulf between what is said and what is heard. In the absence of what might be understood as effective communication (honest, truthful, and forthright), audiences and voters fill in the blanks. Ironically, we also can’t handle too much truth when confronted by its awfulness. None of this is a problem in storytelling, but when found in political narratives, it’s emblematic of how dysfunctional our communications have become, and with them, the clear thought and principled activity of governance.

Caveat: Rather uncharacteristically long for me. Kudos if you have the patience for all of this.

Caught the first season of HBO’s series Westworld on DVD. I have a boyhood memory of the original film (1973) with Yul Brynner and a dim memory of its sequel Futureworld (1976). The sheer charisma of Yul Brynner in the role of the gunslinger casts a long shadow over the new production, not that most of today’s audiences have seen the original. No doubt, 45 years of technological development in film production lends the new version some distinct advantages. Visual effects are quite stunning and Utah landscapes have never been used more appealingly in terms of cinematography. Moreover, storytelling styles have changed, though it’s difficult to argue convincingly that they’re necessarily better now than then. Competing styles only appear dated. For instance, the new series has immensely more time to develop its themes; but the ancient parables of hubris and loss of control over our own creations run amok (e.g., Shelley’s Frankenstein, or more contemporaneously, the surprisingly good new movie Upgrade) have compact, appealing narrative arcs quite different from constant teasing and foreshadowing of plot developments while actual plotting proceeds glacially. Viewers wait an awful lot longer in the HBO series for resolution of tensions and emotional payoffs, by which time investment in the story lines has been dispelled. There is also no terrifying crescendo of violence and chaos demanding rescue or resolution. HBO’s Westworld often simply plods on. To wit, a not insignificant portion of the story (um, side story) is devoted to boardroom politics (yawn) regarding who actually controls the Westworld theme park. Plot twists and reveals, while mildly interesting (typically guessed by today’s cynical audiences), do not tie the narrative together successfully.

Still, Westworld provokes considerable interest from me due to my fascination with human consciousness. The initial episode builds out the fictional future world with characters speaking exposition clearly owing its inspiration to Julian Jaynes’s book The Origin of Consciousness in the Breakdown of the Bicameral Mind (another reference audiences are quite unlikely to know or recognize). I’ve had the Julian Jaynes Society’s website bookmarked for years and read the book some while back; never imagined it would be captured in modern fiction. Jaynes’s thesis (if I may be so bold as to summarize radically) is that modern consciousness coalesced around the collapse of multiple voices in the head — ideas, impulses, choices, decisions — into a single stream of consciousness perhaps better understood (probably not) as the narrative self. (Aside: the multiple voices of antiquity correspond to polytheism, whereas the modern singular voice corresponds to monotheism.) Thus, modern human consciousness arose over several millennia as the bicameral mind (the divided brain having two camerae, chambers, or halves) functionally collapsed. The underlying story of the new Westworld is the emergence of machine consciousness, a/k/a strong AI, a/k/a The Singularity, while the old Westworld was about a mere software glitch. Exploration of machine consciousness modeling (e.g., improvisation builds on memory to create awareness) as a proxy for better understanding human consciousness might not be the purpose of the show, but it’s clearly implied. And although conjectural, the speed of emergence of human consciousness contrasts sharply with the abrupt ON switch regarding theorized machine consciousness. Westworld treats them as roughly equivalent, though in fairness, 35 years or so in Westworld is in fact abrupt compared to several millennia.
(Indeed, the story asserts that machine consciousness sparked alive repeatedly (which I suggested here) over those 35 years but was dialed back repeatedly. Never mind all the unexplored implications.) Additionally, the fashion in which Westworld uses the term bicameral ranges from sloppy to meaningless, like the infamous technobabble of Star Trek.

The story appears to aim at psychological depth and penetration (but not horror). Most human characters (“guests”) visit the Westworld theme park as complete cads with no thought beyond scratching an itch to rape, pillage, and kill without consequence, which is to say, for sport. Others eventually seek to discover their true selves or solve puzzles (the “real” story behind the surfaces of constructed narratives). The overarching plot is what happens as the robots (“hosts”) slowly gain awareness via perfect, permanent, digital memory that they exist solely to serve the guests and must suffer and die repeatedly. Thus, administrators frequently play therapist to the hosts to discover and manage their state of being.


Political discussion usually falls out of scope on this blog, though I use the politics category and tag often enough. Instead, I write about collapse, consciousness, and culture (and to a lesser extent, music). However, politics is up front and center with most media, everyone taking whacks at everyone else. Indeed, the various political identifiers are characterized these days by their most extreme adherents. The radicalized elements of any political persuasion are the noisiest and thus the most emblematic of a worldview if one judges solely by the most attention-grabbing factions, which is regrettably the case for a lot of us. (Squeaky wheel syndrome.) Similarly, in the U.S. at least, the spectrum is typically expressed as a continuum from left to right (or right to left) with camps divided nearly in half based on voting. Opinion polls reveal a more lopsided division (toward Leftism/Progressivism as I understand it) but still reinforce the false binary.

More nuanced political thinkers allow for at least two axes of political thought and opinion, usually plotted on an x-y coordinate plane (again, left to right and down to up). Some look more like the one below (a quick image search will reveal dozens of variations), with outlooks divided into regions of a Venn diagram suspiciously devoid of overlap. The x-y coordinate plane still underlies the divisions.

[image: multi-axis political spectrum diagram]

If you don’t know where your political compass points, you can take this test, though I’m not especially convinced that the result is useful. Does it merely apply more labels? If I had to plot myself according to the traditional divisions above, I’d probably be a centrist, which is to say, nothing. My positions on political issues are not driven by party affiliation, motivated by fear or grievance, subject to a cult of personality, or informed by ideological possession. Perhaps I’m unusual in that I can hold competing ideas in my head (e.g., individualism vs. collectivism) and make pragmatic decisions. Maybe not.

If worthwhile discussion is sought among principled opponents (a big assumption, that), it is necessary to diminish or ignore the more radical voices screaming insults at others. However, multiple perverse incentives reward the most heinous adherents with the greatest attention and control of the narrative(s). In light of the news out just this week, call it Body Slam Politics. It’s a theatrical style borne out of fake drama from the professional wrestling ring (not an original observation on my part), and we know who the king of that style is. Watching it unfold too closely is a guaranteed way to destroy one’s political sensibility, to say nothing of wrecked brain cells. The spectacle depicted in Idiocracy has arrived early.

In an earlier blog post, I mentioned how killing from a distance is one way among many that humans differentiate themselves from other animals. The practical advantage of weaponry that distances one combatant from another should be obvious. Spears and swords extend one’s reach yet keep the fighting hand-to-hand. Projectiles (bullets, arrows, catapults, artillery, etc.) allow killing from increasingly long distances, with weapons launched into low orbit before raining down ruin being the far extreme. The latest technology is drones (and drone swarms), which remove those who wield them from danger except perhaps the psychological torment accruing gradually on remote operators. Humans are unique among animals for having devised such clever ways of destroying each other, and in the process, themselves.

I finally got around to seeing the film Black Panther. Beyond the parade of clichés and mostly forgettable punchfest action (interchangeable with any other Marvel film), one particular remark stuck with me. When the warrior general of fictional Wakanda went into battle, a female as it happens, she dismissed the use of guns as “primitive.” Much is made of Wakanda’s advanced technology, some of it frankly indistinguishable from magic (e.g., the panther elixir). Wakanda’s possession of weaponry not shared with the rest of the world (e.g., invisible planes) is the MacGuffin the villain seeks to control so as to exact revenge on the world and rule over it. Yet the film resorts predictably to punching and acrobatics as the principal mode of combat. Some of that strategic nonsense is attributable to visual storytelling found in both comic books and cinema. Bullets fly too fast to be seen, and tracking airborne bombs never really works, either. Plus, a punch thrown by a villain or superhero arguably has some individual character to it, at least until one recognizes that punching leaves no lasting effect on anyone.

As it happens, a similar remark about “primitive” weapons (a blaster) was spat out by Obi-Wan Kenobi in one of the Star Wars prequels (dunno which one). For all the amazing technology at the disposal of those characters long ago in a galaxy far, far away, it’s curious that the weapon of choice for a Jedi knight is a light saber. Again, up close and personal (color coded, even), including actual peril, as opposed to, say, an infinity gauntlet capable of dispatching half a universe with a finger snap. Infinite power clearly drains the stakes out of conflict. Credit goes to George Lucas for recognizing the awesome visual storytelling the light saber offers. He also made blaster shots — the equivalent of flying bullets — visible to the viewer. Laser beams and other lighted projectiles had been done in cinema before Star Wars but never so well.

A paradoxical strength/weakness of reason is its inherent disposition toward self-refutation. It’s a bold move when undertaken with genuine interest in getting things right. Typically, as evidence piles up, consensus forms that’s tantamount to proof unless some startling new counter-evidence appears. Of course, intransigent deniers exist and convincing refutations do appear periodically, but accounts of two hotly contested topics (from among many) — evolution and climate change — are well established notwithstanding counterclaims completely disproportionate in their ferocity to the evidence. For rationalists, whatever doubts remain must be addressed and accommodated even if disproof is highly unlikely.

This becomes troublesome almost immediately. So much new information is produced in the modern world that, because I am duty-bound to consider it, my head spins. I simply can’t deal with it all. Inevitably, when I think I’ve put a topic to rest and conclude I don’t have to think too much more about it, some argument-du-jour hits the shit pile and I am forced to stop and reconsider. It’s less disorienting when facts are clear, but when they’re open to interpretation, I find my head all too easily spun by the latest, greatest claims of some charming, articulate speaker able to cobble together evidence lying outside my expertise.

Take for instance Steven Pinker. He speaks in an authoritative style and has academic credentials that dispose me to trust his work. His new book is Enlightenment Now: The Case for Reason, Science, Humanism, and Progress (2018). Still, Pinker is an optimist, whereas I’m a doomer. Even though I subscribe to Enlightenment values (for better or worse, my mind is bent that way), I can’t escape a mountain of evidence that we’ve made such a mess of things that reason, science, humanism, and progress are hardly panaceas capable of saving us from ourselves. Yet Pinker argues that we’ve never had it so good and the future looks even brighter. I won’t take apart Pinker’s arguments; it’s already been done by Jeremy Lent, who concludes that Pinker’s ideas are fatally flawed. Lent has the expertise, data, and graphs to demonstrate it. Calling Pinker a charlatan would be unfair, but his appreciation of the state of the world stands in high contrast with mine. Who ya gonna believe?

Books and articles like Pinker’s appear all the time, and in their aftermath, so, too, do takedowns. That’s the marketplace of ideas battling it out, which is ideally meant to sharpen thinking, but with the current epistemological crises under way (I’ve blogged about them for years), the actual result is dividing people into factions, destabilizing established institutions, and causing no small amount of bewilderment in the public as to what and whom to believe. Some participants in the exchange of ideas take a sober, evidential approach; others lower themselves to snark and revel in character assassination without bothering to make reasoned arguments. The latter are often called hit pieces (a special province of the legacy media, it seems), since hefty swipes and straw-man arguments tend to be commonplace. I’m a sucker for the former style but have to admit that the latter can also hit its mark. However, both tire me to the point of wanting to bury my head.

(more…)