At last, getting to my much, much delayed final book blogs (three parts) on Iain McGilchrist’s The Master and His Emissary. The book came out in 2010, I picked it up in 2012 (if memory serves), and it took me nearly two years to read in its entirety, during which time I blogged my observations. I knew at the time of my previous post on the book that there would be more to say, and it’s taken considerable time to get back to it.

McGilchrist ends with a withering criticism of the Modern and Postmodern (PoMo) Eras, which I characterized as an account of how the world went mad. That still seems accurate to me: the madness that overtook us in the Modern Era led to world wars, genocides, and the systematic reduction of humanity to mere material and mechanism, what Ortega y Gasset called Mass Man. Reduction of the rest of the living world to resources to be harvested and exploited by us is a worldview often called instrumental rationality. From my armchair, I sense that our societal madness has shape-shifted a few times since the fin de siècle 1880s and 90s. Let’s start with quotes from McGilchrist before I extend into my own analysis. Here is one of his many descriptions of the left-hemisphere paradigm under which we now operate:

In his book on the subject, Modernity and Self-identity, Anthony Giddens describes the characteristic disruption of space and time required by globalisation, itself the necessary consequence of industrial capitalism, which destroys the sense of belonging, and ultimately of individual identity. He refers to what he calls ‘disembedding mechanisms’, the effect of which is to separate things from their context, and ourselves from the uniqueness of place, what he calls ‘locale’. Real things and experiences are replaced by symbolic tokens; ‘expert’ systems replace local know-how and skill with a centralised process dependent on rules. He sees a dangerous form of positive feedback, whereby theoretical positions, once promulgated, dictate the reality that comes about, since they are then fed back to us through the media, which form, as much as reflect, reality. The media also promote fragmentation by a random juxtaposition of items of information, as well as permitting the ‘intrusion of distant events into everyday consciousness’, another aspect of decontextualisation in modern life adding to loss of meaning in the experienced world. [p. 390]

Reliance on abstract, decontextualized tokens having only figurative, nonintrinsic power and meaning is a specific sort of distancing, isolation, and reduction that describes much of modern life and shares many characteristics with schizophrenia, as McGilchrist points out throughout the chapter. That was the first shape-shift of our madness: full-blown mechanization born of reductionism and materialism, perspectives bequeathed to us by science. The slow process had been underway since the invention of the mechanical clock and the discovery of heliocentrism, but it gained steam (pun intended) as the Industrial Revolution matured in the late 19th century.

The PoMo Era is recognized as having begun just after the middle of the 20th century, though its attributes are only loosely defined and poorly understood. That said, the most damning criticism leveled at PoMo is its hall-of-mirrors effect, which renders objects in the mirrors meaningless because the original reference point is obscured or lost. McGilchrist also refers repeatedly to loss of meaning resulting from the ironizing effect of left-brain dominance. The corresponding academic fad was PoMo literary criticism (deconstruction) in the 1970s, but it had antecedents in quantum theory. Here is McGilchrist on PoMo:

With post-modernism, meaning drains away. Art becomes a game in which the emptiness of a wholly insubstantial world, in which there is nothing beyond the set of terms we have in vain used to ‘construct’ meaning, is allowed to speak for its own vacuity. The set of terms are now seen simply to refer to themselves. They have lost transparency; and all conditions that would yield meaning have been ironized out of existence. [pp. 422–423]

This was the second shape-shift: loss of meaning in the middle of the 20th century as purely theoretical formulations, which is to say, abstraction, gained adherents. He goes on:

Over-awareness … alienates us from the world and leads to a belief that only we, or our thought processes, are real … The detached, unmoving, unmoved observer feels that the world loses reality, becomes merely ‘things seen’. Attention is focussed on the field of consciousness itself, not on the world beyond, and we seem to experience experience … [In hyperconsciousness, elements] of the self and of experience which normally remain, and need to remain, intuitive, unconscious, become the objects of a detached, alienating attention, the levels of consciousness multiply, so that there is an awareness of one’s own awareness, and so on. The result of this is a sort of paralysis, in which even everyday ‘automatic’ actions such as moving one leg in front of another in order to walk can become problematic … The effect of hyperconsciousness is to produce a flight from the body and from its attendant emotions. [pp. 394–396]

Having devoted a fair amount of my intellectual life to trying to understand consciousness, I immediately recognized the discussion of hyperconsciousness (derived from Louis Sass) as what I often call recursion error, where consciousness becomes the object of its own contemplation, with obvious consequences. Modern, first-world people all suffer from this effect to varying degrees because that is how modern consciousness is shaped, or rather warped.

I believe we can observe now two more characteristic extensions or variations of our madness, probably overlapping, not discrete, following closely on each other: the Ironic and Post-Ironic. The characteristics are these:

  • Modern — reductive, mechanistic, instrumental interpretation of reality
  • Postmodern — self-referential (recursive) and meaningless reality
  • Ironic — reversed reality
  • Post-Ironic — multiplicity of competing meanings/narratives, multiple realities

All this is quite enough to chew on for a start. I plan to continue in pts. 2 and 3 with descriptions of the Ironic and Post-Ironic.

Updates to my blogroll are infrequent. I only add blogs that present interesting ideas (with which I don’t always agree) and/or admirable writing. Deletions are typically the result of a change of focus at the linked blog, or regrettably, the result of a blogger becoming abusive or self-absorbed. This time, it’s the latter. So alas, another one bites the dust. Dropping off my blogroll — no loss since almost no one reads my blog — is On an Overgrown Path (no link), which is about classical music.

My indignation isn’t about disagreements (we’ve had a few); it’s about inviting discussion in bad faith. I’m very interested in contributing to discussion and don’t mind moderated comments to contend with trolls. However, my comments drive at ideas, not authors, and I’m scarcely a troll. Here’s the disingenuously titled blog post, “Let’s Start a Conversation about Concert Hall Sound,” where the blogger declined to publish my comment, handily blocking conversation. So for maybe the second time in the nearly 10-year history of this blog, I am reproducing the entirety of another’s blog post (minus the profusion of links, since that blogger tends to create link mazes, defying readers to actually explore) followed by my unpublished comment, and then I’ll expound and perhaps rant a bit. Apologies for the uncharacteristic length.

Everyone knows how to play Rock, Paper, Scissors, which typically comes up as a quick means of settling some minor negotiation, with the caveat that the winner is entirely arbitrary. The notion of a Rock, Paper, Scissors tournament is therefore a non sequitur, since the winner possesses no skill, no strategic combination of throws devised to reliably defeat opponents. Rather, winners are the fortunate recipients of a blind but lucky sequence, an algorithm, that produces an eventual winner yet is indifferent to the outcome. I can’t say exactly why, but I’ve been puzzling over how three-way conflicts might be decided were the categories instead Strong, Stupid, and Smart, respectively.

Rock is Strong, obviously, because it’s blunt force, whereas Paper is Stupid because it’s blank, and Scissors is Smart because it’s the only one that has any design or sophistication. For the reassignments to work, however, the circle of what beats what would have to be reversed: Strong beats Stupid, Stupid beats Smart, and Smart beats Strong. One could argue that Strong and Stupid are equally dense, but arguendo, let’s grant Strong supremacy in that contest. Interestingly, Stupid always beats Smart because Smart’s advantage is handily nullified by Stupid. Finally, Smart beats Strong because the David and Goliath parable has some merit. Superhero fanboys are making similar arguments with respect to the hotly anticipated Batman v. Superman (v. Wonder Woman) film to be released in 2016. The Strong argument is that Superman need land only one punch to take out Batman (a mere human with gadgets and a bad-ass attitude), but the Smart argument is that Batman will outwit Superman by, say, deploying kryptonite or exploiting Superman’s inherent good-guyness to defeat him.
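For the programmatically inclined, the reversed cycle is trivial to write down. Here is a throwaway Python sketch, entirely my own toy (the names and the random matchups are illustrative, not anything canonical), that encodes who beats whom and settles a few blind rounds:

    import random

    # Reversed cycle: each key beats its value (Strong > Stupid > Smart > Strong).
    BEATS = {"Strong": "Stupid", "Stupid": "Smart", "Smart": "Strong"}

    def play_round(a, b):
        """Return the winner of a single throw, or None on a tie."""
        if a == b:
            return None
        return a if BEATS[a] == b else b

    # A blind but lucky sequence -- the "winner" possesses no skill whatsoever.
    throws = list(BEATS)
    for _ in range(5):
        a, b = random.choice(throws), random.choice(throws)
        print(a, "vs.", b, "->", play_round(a, b) or "tie")

Swap the dictionary back to the usual rock-beats-scissors mapping and the original game returns, which is rather the point: the whole difference between the two worlds is one line.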

A further puzzle is how the game Strong, Stupid, Smart works out in geopolitics. The U.S. is clearly Strong, the last remaining world superpower (though still dense as a board — new revelations keep reinforcing that judgment), and uses its strength to bully Stupids into submission. Numerous countries have shifted categories from Strong to Stupid over time — quite a few, in fact, if one surveys more than a few decades of world history. Stupids have also fought each other to effective stalemate in most of the world, though not without a few wins and losses chalked up. What remains, however, is for a truly Smart regime to emerge to take down Strong. The parody version of such a match-up is told in the book The Mouse That Roared (also a movie with Peter Sellers). But since Smart is vanquished by Stupid, and the world has an overabundance of Stupids, it is unlikely that Smart can ever do better than momentary victory.

Our current slate of presidential candidates is mostly a field of Stupids with a couple of Strongs thrown in (remember: still just as dense as Stupid). Then there are a couple of insanely Stupids who distort the circle into an out-of-kilter bizarro obloid. As with geopolitics, a Smart candidate has yet to emerge, but such a candidate would only defeat Strongs, clearing the way for a Stupid victory. This should be obvious to any strategist, and indeed, no truly Smart candidates have declared, knowing full well that they would gain no traction with the half-literate, mouth-breathing public composed largely of Stupids who predictably fall in love with the most insanely Stupid candidate out there. An engaged Smart candidate would thus hand the victory to the insanely Stupid, who should be unelectable from the outset, but go figger. So then the deep strategy (gawd, how I hate this) would be to go with the devil you know, since a saint could never prevail against all the demons.

“Any sufficiently advanced technology is indistinguishable from magic.” –Arthur C. Clarke

/rant on

Jon Evans at TechCrunch has an idiot opinion article titled “Technology Is Magic, Just Ask The Washington Post” that has gotten under my skin. His risible assertion that the WaPo editorial board uses magical thinking misframes the issue of whether police and other security agencies ought to have backdoor or golden-key access to end-users’ communications carried over electronic networks. He marshals a few experts in the field of encryption and information security (shortened to “infosec” — my, how hep) who insist that even if such a thing (security that is porous to select people or agencies only) were possible, the demand would be incompatible with the whole idea of security, and indeed privacy. The whole business strikes me as a straw man argument. Here is Evans’ final paragraph:

If you don’t understand how technology works — especially a technical subgenre as complex and dense as encryption and information security — then don’t write about it. Don’t even have an opinion about what is and isn’t possible; just accept that you don’t know. But if you must opine, then please, at least don’t pretend technology is magic. That attitude isn’t just wrong, it’s actually dangerous.

Evans is pushing on a string, making it seem as though agencies that simply want what they want believe in turn that those things come into existence at the snap of one’s fingers, or magically. But in reality, hyperbole aside, absolutely no one believes that science and technology are magic. Rather, software and human-engineered tools are plainly understood as mechanisms we design and fabricate through our own effort, even if we don’t understand the complexity of the mechanism under the hood. Further, everyone beyond the age of 5 or 6 loses faith in magical entities such as the Tooth Fairy, unicorns, fairy godmothers, etc. at about the same time that Santa Claus is revealed to be a cruel hoax. A sizable segment of the population for whom the Reality Principle takes firm root goes on to lose faith in progress, humanity, religion, and god (which version becomes irrelevant at that point). Ironically, the typically unchallenged belief that technology delivers, among other things, knowledge, productivity, leisure, and other wholly salutary effects — the very thinking a writer for TechCrunch might exhibit — falls into the same category of unexamined faith.

Who are these magical creatures who believe their smartphones, laptops, TVs, vehicles, etc. are themselves magical simply because their now routine operations lie beyond the typical end-user’s technical knowledge? And who besides Arthur C. Clarke and a few ideologues is prone to calling out the bogus substitution of magic for mechanism? No one, really. Jon Evans does no one any favors by raising this argument — presumably just to puncture it.

If one were to observe how people actually use the technology now available in, say, handheld devices with 24/7/365 connection to the Internet (so long as the batteries hold out, anyway), it’s not the device that seems magical but the feeling of being connected, knowledgeable, and at the center of activity, with a constant barrage of information (noise, mostly) barreling at them and defying them to turn attention away lest something important be missed. People are so dialed into their devices, they often lose touch with reality, much like the politicians who no longer relate to or empathize with voters, preferring to live in their heads with all the chatter, noise, news, and entertainment fed to them like an endorphin drip. Who cares how the mechanism delivers, so long as supply is maintained? Similarly, who cares how vaporware delivers unjust access? Just give us what we want! Evans would do better to argue against the unjust desire for circumvention of security rather than its presumed magical mechanism. But I guess that idea wouldn’t occur to a technophiliac.

/rant off

The phrase enlightened self-interest has been used to describe and justify supposed positive results arising over time from individuals acting competitively, as opposed to cooperatively, using the best information and strategies available. One of the most enduring examples is the prisoner’s dilemma. Several others have dominated news cycles lately.

Something for Nothing

At the Univ. of Maryland, a psychology professor has been offering extra credit on exams of either 2 or 6 points if no more than 10 percent of students elect to receive the higher amount. If more than 10% go for the higher amount, no one gets anything. The final test question, which fails as a marker of student learning or achievement and doesn’t really work as a psychology or object lesson either, went viral when a student tweeted it out, perplexed by the prof’s apparent cruelty. Journalists then polled average people and found divergence (duh!) between those who believe the obvious choice is 6 pts (or reluctantly, none) and those who righteously characterize 2 pts as “the right thing to do.” It’s unclear what conclusion to draw, but the prof reports that since 2008, only one class got any extra credit by not exceeding the 10% cutoff.
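Out of idle curiosity, the setup is easy to model as a bare-bones threshold game. The sketch below is mine alone (the class size and the rates at which students grab for 6 points are guesses, not the professor’s data); it simply shows how quickly a 10% cutoff gets blown once even a modest fraction of students reaches for the higher amount:

    import random

    def run_class(n_students=100, p_greedy=0.2, cutoff=0.10):
        """One simulated class: if more than `cutoff` choose 6 points, nobody gets anything."""
        choices = [6 if random.random() < p_greedy else 2 for _ in range(n_students)]
        share_greedy = choices.count(6) / n_students
        if share_greedy > cutoff:
            return [0] * n_students, share_greedy
        return choices, share_greedy

    random.seed(1)
    for p in (0.05, 0.15, 0.30):  # guessed rates of 6-point choosers
        awarded, share = run_class(p_greedy=p)
        print(f"{share:.0%} chose 6 pts -> average credit {sum(awarded) / len(awarded):.1f} pts")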

Roping One’s Eyeballs

This overlong opinion article found in the Religion and Ethics section of the Australian Broadcasting Corporation (ABC) website argues that advertising is a modern-day illustration of the tragedy of the commons:

Expensively trained human attention is the fuel of twenty-first century capitalism. We are allowing a single industry to slash and burn vast amounts of this productive resource in search of a quick buck.

I practice my own highly restrictive media ecology, defending against the fire hose of information and marketing aimed at me (and everyone else) constantly, machine-gun style. So in a sense, I treat my own limited time and attention as a resource not to be squandered on nonsense, but when the issue is scaled up to the level of society, the metaphor is inapt and breaks down. I assert that attention as an exploitable resource functions very differently when considering an individual vs. the masses, which have unique behavioral properties. Still, it’s an interesting idea to consider.

No One’s Bank Run

My last example is the entirely predictable bank runs in Greece that were forestalled when banks closed for three weeks and placed withdrawal limits (euphemism: capital controls) on the cash deposits actually held in the vaults. Greek banks have appealed to depositors to trust them — that their deposits are guaranteed and there will be no “haircut” such as occurred in Cyprus — but the appeals were met with laughter and derision. Intolerance of further risk is an entirely prudent response, and a complete and rapid flight of capital would no doubt have ensued had it not been disallowed.

What these three examples have in common is simple: it matters little what any individual may do, but it matters considerably what everyone does. Institutions and systems typically have enough resilience to weather a few outliers who exceed boundaries (opting for 6 pts, pushing media campaigns to the idiotic point of saturation, or withdrawing all of one’s money from a faltering bank), but when everyone acts according to enlightened self-interest, well, it’s obvious that something’s gotta give. In the examples above, no one gets extra points, no one pays heed to much of anything anymore (or perhaps more accurately, attention is debased and diluted to the point of worthlessness), and banks fail. In the professor’s example, the threshold for negative results is only 10%. The threshold probably varies across operating environments, but the modesty of that number is instructive.

More than a few writers have interpreted the tragedy of the commons on a global level. As a power law, it probably functions better at a feudal level, where resources are predominantly local and society is centered around villages rather than megalopolises and/or nation-states. However, it’s plain to observe, if one pays any attention (good luck with that in our new age of distraction, where everyone is trained to hear only what their own culture instructs, ignoring what nature tells us), that interlocking biological systems worldwide are straining and failing under the impacts of anthropogenic climate change. Heating at the poles and deep in the oceans is where the worst effects are currently being felt, but climate chaos is certainly not limited to out-of-sight, out-of-mind locations. What’s happening in the natural world, however, is absolutely and scarily for real, unlike the bogus stress tests banks undergo to buttress consumer sentiment (euphemism: keep calm and carry on). Our failure to listen to the right messages and heed warnings properly will be our downfall, but we will have lots of company because everyone is doing it.

Among numerous elephants in the room, trampling everything in sight and leaving behind giant, steaming piles of shit, the one that galls me the most is the time, effort, expense, and lives we Americans sacrifice to the Dept. of Defense, Dept. of Homeland Security, and various other government agencies. The gargantuan corporate-military-industrial complex they have grown into over the past 65 years diverts our attention away from other honorable and worthwhile endeavors we might undertake if we weren’t instead so consumed with blowing people up and taking their stuff while playing bully-as-victim. I’m a little too young to have been scarred the way many of my elders were, ducking, covering, and cowering under schoolroom desks, so I never formed a worldview based on bogeymen. Yet that is the prevailing view, and we currently have the capacity to interfere and cause mischief globally. Impunity for doing so cannot be expected to last. Indeed, many of the current crop of clown presidential candidates see use of force to redistribute their (furriners’) resources to us (Murricans) as the best option as eroding wealth and increasing scarcity make the vaunted American way of life ever harder to maintain. Blowhard candidate Donald Trump is probably most honest about it, promising that as president he would basically forgo diplomacy in favor of smash-and-grab escalation. Pretty fucking scary, if you ask me.

One of my favorite films is The Hunt for Red October, a taut thriller balancing on the edge of nuclear Armageddon. That clever analysts might assess situations for what they truly are and steer geopolitics away from unnecessary bombing (and concomitant self-annihilation) is especially appealing to me. However, if those people exist beyond fiction, they are below my radar. Instead, in the marketplace of ideas, we have unsubtle thinkers committed to the same useless conventions (bombing didn’t work? then we need more bombing!) that Robert McNamara finally(!) recognized and admitted to late in life, as described in the documentary film The Fog of War. Yet as much as unconventional thinking is admired (some bloggers have made themselves into clichés with their predictable topsy-turvy argumentation), operationally, we’re stuck with Cold War strategizing, not least because minor powers threaten to become irrational, world-ending demons should any acquire a nuclear bomb. Current negotiations with Iran to limit its nuclear ambitions are of just that sort, and America never fails to rise to the bait. However, as attractive as nuclear capability must seem to those not yet in the club, weaponized versions offer little or no practical utility, even as deterrents, in an age of mutually assured destruction (a MAD world, quite literally) should that genie be let back out of the bottle. Any analyst can recognize that.

One striking act of unconventional thinking is Pres. Obama’s recent step toward ending the U.S. embargo of Cuba. Thus far, economic sanctions are still in place, and travel restrictions have been relaxed only in the case of missionary or educational work. Still, even minor revisions to this Cold War relic suggest further changes may be in store. I’m of mixed opinion about it; I expect Cuba to be ruined if overrun by American tourists and capital. It would be a different kind of bomb exploded on foreign soil, but no less destructive.

Lastly, Greece is the current trial balloon (one that bursts) for exit from the European common currency, if not the Union itself. The trope about the historical seat of modern democracy being the first to fail is a red herring; pay it no attention. We’re all failing in this best of all possible worlds. Thus far, events have been relatively orderly, at least so far as media reports portray them. Who can know just how disruptive, violent, and ghastly things will get when the gears of industrial machinery seize up and stop providing everything we have come to expect as normal? Some countries are better equipped psychologically to handle such setbacks. Least among them is the U.S. With Bastille Day just past on the calendar, it occurred to me that it has been many generations since the U.S. has seen blood flowing in the streets (not counting a spate of massacres and police murders of civilians, which show no signs of abating), but considering how we are armed to the teeth and have the impulse control of a typical three-year-old, going positively apeshit is pretty much guaranteed when, say, food supplies dwindle. I’m hardly alone in saying such things, and it seems equally obvious that over the past decade or more, the federal government has been not-so-quietly preparing for that eventuality. How the mob is managed will be ugly, and one has to pause and wonder how far things will go before a complete crack-up occurs.

The English language has words for everything, and whenever something new comes along, we coin a new word. The latest neologism I heard is bolthole, which refers to the location one bolts to when collapse and civil unrest reach intolerable proportions. At present, New Zealand is reputed to be the location of boltholes purchased and kept by the ultrarich; it has the advantage of being in the Southern Hemisphere, meaning remote from the hoi polloi yet reachable by private plane or oceangoing yacht. Actually, bolthole is an older term now being repurposed, but it seems hip and current enough to pass for new coinage.

Banned words are the inverse of neologisms, not in the normal sense that they simply fall out of use but in that their use is actively discouraged. Every kid learns this early on when a parent or older sibling slips and lets an “adult” word pass his or her lips that the kid isn’t (yet) allowed to use. (“Mom, you said fuck!”) George Carlin made a whole routine out of dirty words (formerly) banned from TV. Standards have been liberalized since the 1970s, and now people routinely swear or refer to genitalia on TV and in public. Sit in a restaurant or ride public transportation (as I do), eavesdrop a little on the speech within easy earshot (especially private cellphone conversations), and just count the casual F-bombs.

The worst field of banned-words nonsense is political correctness, which is intertwined with identity politics. All the slurs and epithets directed at, say, racial groups ought to be disused, no doubt, but we overcompensate by renaming everyone (“____-American”) to avoid terms that have little or no derogation. Even more ridiculous, at least one egregiously insulting term has been reclaimed as a badge of honor, an unbanned banned word, by the very group it was used to oppress. It takes Orwellian doublethink to hear that term — you all know what it is — used legitimately yet exclusively by those allowed to use it. (I find it wholly bizarre yet fear to wade in with my own prescriptions.) Self-disparaging language, typically in a comedic context, gets an unwholesome pass, but only if one is within the identity group. (Women disparage women, gays trade on gay stereotypes, Jews indulge in jokey anti-Semitism, etc.) We all laugh and accept it as safe, harmless, and normal. President Obama is continually getting mixed up in questions of appearances (“optics”), or of what to call things — or not call them, as the case may be. For instance, his apparent refusal to call terrorism originating in the Middle East “Muslim terrorism” has been met with controversy.

I’m all for calling a thing what it is, but the term terrorism is too loosely applied to any violent act committed against (gasp!) innocent Americans. Recent events in Charleston, SC, garnered the terrorism label, though other terms would be more apt. Further, there is nothing intrinsically Muslim about violence and terrorism. Yeah, sure, Muslims have a word or doctrine — jihad — but it doesn’t mean what most think or are led to believe it means. Every religion across human history has some convenient justification for the use of force, mayhem, and nastiness to promulgate its agenda. Sometimes it’s softer and inviting, other times harder and more militant. Unlike Bill Maher, however, circumspect thinkers recognize that violence used to advance an agenda, like words used to shape narratives, is not the province of any particular hateful or hate-filled group. Literally everyone does it to some extent. Indeed, the passion with which anyone pursues an agenda is paradoxically celebrated and reviled depending on content and context, and it’s a long, slow, ugly process of sorting to arrive at some sort of Rightthink®, which then becomes conventional wisdom before crossing over into political correctness.

If I were to get twisted and strained over every example of idiocy on parade, I’d be permanently distorted. Still, a few issues have crossed my path that might be worth bringing forward.

Fealty to the Flag

An Illinois teacher disrespected the American flag during a classroom lesson on free speech. Context provided in this article is pretty slim, but it would seem to me that a lesson on free speech might be precisely the opportunity to demonstrate that tolerance of discomfiting counter-opinion is preferable to the alternative: squelching it. Yet in response to complaints, the local school board voted unanimously to fire the teacher over the offending lesson. The ACLU ought to have a field day with this one, though I must admit there is no convincing some people that desecrating the flag is protected free speech. Some will remember going round and round on this issue a few years ago over a proposed Constitutional amendment. Patriots stupidly insist on carving out an exception to free speech protections when it comes to the American flag, which shows quite clearly that they are immune to the concept behind the 1st Amendment, which says this:

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances. [emphasis added]

Naturally, interpretations of the Bill of Rights vary widely, but it doesn’t take a constitutional scholar to parse the absolute character of these rights. Rights are trampled all the time, of course, as the fired Illinois teacher just found out.

Fealty to the Wrong Flag

The Confederate battle flag has come back into the national spotlight following the racially motivated events in Charleston, SC. (Was it ever merely a quaint, anachronistic, cultural artifact of the American South?) CNN has a useful article separating fact from fiction, yet some Southerners steadfastly defend the flag. As a private issue of astonishingly poor taste, idiocy, and free speech, individuals should be allowed to say what they want and fly their flags at will, but as a public issue for states and/or institutions that still fly the flag or emblazon it on websites, letterhead, etc., it’s undoubtedly better to give up this symbol and move on.

I get exasperated when I read someone insisting dogmatically upon ideological purity. No such purity exists, as we are all participants, in varying degrees, in the characteristics of global civilization. One of those characteristics is the thermodynamic cycle of energy use and consumption that gradually depletes available energy. The Second Law guarantees depletion, typically over cosmological time frames, but we are seeing it manifest over human history as EROI (energy returned on energy invested) has decreased dramatically since the start of the fossil fuel era. So playing gotcha by arguing, for instance, “You use electricity, too, right? Therefore, you have no right to tell me what I can and can’t do with electricity!” is laughably childish. Or put another way, if even an inkling of agreement exists that maybe we should conserve, forgo needless waste, and accept some discomfort and hardship, then it’s typically “you first” whenever the issue is raised in the public sphere.

In a risible article published at Townhall.com, Michelle Malkin calls the Pope a hypocrite for having added his authority to what scientists and environmentalists have been saying: we face civilization-ending dangers from having fouled our own nest, or “our common home” as the Pope calls it. As though that disrespect were not yet enough, Malkin also tells the Pope essentially to shut it:

If the pontiff truly believes “excessive consumption” of modern conveniences is causing evil “climate change,” will he be shutting down and returning the multi-million-dollar system Carrier generously gifted to the Vatican Museums?

If not, I suggest, with all due respect, that Pope Francis do humanity a favor and refrain from blowing any more hot air unless he’s willing to stew in his own.

The disclaimer “with all due respect” does nothing to ease the audacity of a notorious ideologue columnist picking a fight over bogus principles with the leader of the world’s largest church, who (I might add) is slowly regaining some of the respect the Catholic Church lost over the past few scandalous decades. I suspect Malkin is guilelessly earnest in the things she writes and found a handy opportunity to promote the techno-triumphalist book she researched and wrote for Mercury Ink (owned by Glenn Beck). However, I have no trouble ignoring her completely, since she clearly can’t think straight.

Plenty of other controversy followed in the wake of the latest papal encyclical, Laudato Si’. That’s to be expected, I suppose, but religious considerations and gotcha arguments aside, the Pope is well within the scope of his official concern to sound the alarm alongside the scientific community that was once synonymous with the Church before they separated. If indeed Pope Francis has concluded that we really are in the midst of both an environmental disaster and a mass extinction (again, more process than event), it’s a good thing that he’s bearing witness. Doomers like me believe it’s too little, too late, and that our fate is already sealed, but there will be lots of ministry needed when human die-offs get rolling. Don’t bother seeking any sort of grace from Michelle Malkin.

I’m not a serious cineaste, but I have offered a few reviews on The Spiral Staircase. There are many, many cineastes out there, though, and although cinema is now an old medium (roughly 100 years old), cineastes tend to be on the young side of 35 years. Sure, lots of established film critics are decidedly older, typically acting under the aegis of major media outlets, but I’m thinking specifically of the cohort who use new, democratized media (e.g., cheap-to-produce and -distribute YouTube channels) to indulge their predilections. For example, New Media Rockstars has a list of their top 100 YouTube channels (NMR No. 1 contains links to the rest). I have heard of almost none of them, since I don’t live online like so many born after the advent of the Information/Communications Age. The one I pay particular attention to is Screen Junkies (which includes Honest Trailers, the Screen Junkies Show, and Movie Fights), and I find their tastes run toward childhood enthusiasms that mire their criticism in a state of permanent adolescence and self-mocking geekdom. The preoccupation with cartoons, comic books, action figures, superheroes, and popcorn films couldn’t be clearer. Movie Fights presumes to award points on the passion, wit, and rhetoric of the fighters rather than the quality of the films they choose to defend. However, adjudication is rarely neutral, since trump cards tend to get played when a superior film or actor is cited against an inferior one.

So I happened to catch three recent flicks that are central to the Screen Junkies canon: Captain America: The Winter Soldier, Avengers: Age of Ultron, and Transformers: Age of Extinction (links unnecessary). They all qualify as CGI festivals — films centered on hyperkinetic action rather than story or character (opinions differ, naturally). The first two originate from the MCU (acronym alert: MCU = Marvel Cinematic Universe, which is lousy with comic book superheroes) and the last is based on a Saturday-morning children’s cartoon. Watching grown men and a few women on Screen Junkies getting overexcited about content originally aimed at children gives me pause, yet I watch them to see what the fighters say, knowing full well that thoughtful remarks are infrequent.

Were I among the fighters (no chance, since I don’t have my own media fiefdom), I would likely be stumped when a question demands immediate recall (by number, as in M:I:3 for the third Mission Impossible film) of a specific entry from any of the numerous franchises pumping out films regularly, like those named above. Similarly, my choices would not be limited, as theirs are, to films released after 1990, that year falling within the childhood of most of the fighters who appear. Nor would my analysis be so embarrassingly visual in orientation, since I understand good cinema to be more about story and character than whiz-bang effects.

Despite the visual feast fanboys adore (what mindless fun!), lazy CGI festivals suffer most from overkill, far outstripping the eye’s ability to absorb onscreen action fully or effectively. Why bother with repeat viewing of films with little payoff in the first place? CGI characters were interesting in and of themselves the first few times they appeared in movies without breaking suspension of disbelief, but now they’re so commonplace that they feel like cheating. Worse, moviegoers are now faced with so many CGI crowds, clone and robot armies, zombie swarms, human-animal hybrids, et cetera ad nauseam, that little holds the interest of jaded viewers. Thus, because so few scenes resonate emotionally, sheer novelty substitutes (ineffectively) for meaning, not that most chases or slugfests in the movies offer much that’s truly original. The complaint is heard all the time: we’ve seen it before.

Here’s my basic problem with the three CGI-laden franchise installments I saw recently: their overt hypermilitarism. When better storytellers such as Kubrick or Coppola make films depicting the horrors of war (or other existential threats, such as the ever-popular alien invasion), their perspective is indeed that war is horrible, and obvious moral and ethical dilemmas flow from there. When hack filmmakers pile up frenzied depictions of death and destruction, typically with secondary or tertiary characters whose dispatch means and feels like nothing, and with cities destroyed eliciting no emotional response because it’s pure visual titillation, they have no useful, responsible, or respectable commentary. Even the Screen Junkies recognize that, unlike, say, Game of Thrones, none of their putative superheroes really face much more than momentary distress before saving the day in the third act and certainly no lasting injury (a little make-up blood doesn’t convince me). Dramatic tension simply drains away, since happy resolutions are never in doubt. Now, characters taking fake beatdowns are laughter inducing, sorta like professional wrestling after the sheepish admission that they’ve been acting all along. Frankly, pretend drama with nothing at stake is a waste of effort and the audience’s time and trust. That so many fanboys enjoy being goosed or that some films make lots of money is no justification. The latter is one reason why cinema so often fails to rise to the aspiration of art: it’s too bound up in grubbing for money.