I’ve been puzzling for some time over my increasingly visceral aversion to folks face-planted in their phones and tablets. It’s not merely their often being stumblebums clogging hallways, corridors, sidewalks, and elevators with rank inattention to traffic flows, though that public nuisance has been jangling my nerves to a startling degree. The answer, I was surprised to discover when I began reading Matthew Crawford’s highly regarded book The World Beyond Your Head, is my sense that those staring unwaveringly at their screens are in effect denying sociability in the most ordinary of ways by failing to acknowledge my presence with a nod or even eye contact. This is most surprising to me because I used to eschew common social graces (a couple decades ago) but have revised my thinking through recognition that, as social creatures, we take cues from each other ranging from inconsequential to life and death. None should be discarded. Even though I don’t expect soul-felt validation of my very person in day-to-day interactions, the notable absence of any acknowledgement whatsoever feels less passively neutral, more aggressively hostile. Indeed, I’ve heard stories of people wearing earbuds (without being jacked into anything) precisely to forestall anyone striking up a conversation. I call that protective headgear.

Crawford describes a familiar scene: travelers in an airport gate lounge zoned maniacally into one type of screen or another, some handheld, others mounted overhead, but in either case oblivious to each other in what might have been a social milieu in the day before electronic gadgetry and TVs (e.g., a train depot). Social conduct even in the traditional liquor bar is difficult to maintain when so many screens commandeer one’s attention. Blanket disregard for each other is understandable to Crawford because we now face so many arbitrary demands on our attention (e.g., advertising everywhere, now even on the trays the TSA uses at security checkpoints) that the response is often to cocoon oneself away from the world. Thus, according to Crawford, “we engage less than we once did in everyday activities that structure our attention.” His antidote to living in our heads, transfixed by representations of reality (as opposed to actuality), is to develop skilled practices that focus and refine attention. This is his subject in his previous book Shop Class as Soulcraft, which I have not read. Thus, to be more authentically human, or to be “a powerful, independent mind working at full song,” is to be situated within “narrow and highly structured patterns of attention” that require bodily engagement and submission to constraints that remove the faux freedom of choice.

Here I must pause to register my dismay that Crawford fails to acknowledge Albert Borgmann and his description of focal practices. Many philosophers and their ideas are cited in the book so far (I’m up to p. 95), but to omit Borgmann is an egregious error someone should have caught. I also find it astonishing that Crawford quite clearly speaks my language and makes many of the same points I have been making here at The Spiral Staircase, though with far greater detail and thoroughness as book form requires. However, the language is often clunky and reads too much like a psychology text (which is why I stopped reading books by Robert Putnam — and Albert Borgmann — partway through). I will read to the end of The World Beyond Your Head, but I won’t turn it into a book blogging project since it’s so close to the things I already write about.

So what’s one to do in the presence of others who are steadfastly disengaged from everyone else? I recall last month stepping onto an elevator with maybe five others on the way home from work where no one had yet face-planted into a phone. We all looked at each other briefly, relaxed, not in that awkward elevator way, when inevitably one fellow dove into his pocket and produced a phone. I blurted out without exercising my usual self-editing restraint, “So you’re the one who just had to whip out his pacifier.” Luckily, it came across as a joke and everyone laughed, but I think my intent was really sanction. I resist the pull of electronics as much as I can (I have a cell phone but no data line), but I recognize that though I may be swimming upstream, I cannot redirect the flow of everyone’s attention inexorably into personal screens. Storms along the eastern seaboard last week knocked out power for many for a few hours. No doubt some (re)discovered what it means to be with their families (or alone with their thoughts) without electronic mediation. Did they look dumbly at each other and say “What now?” (as was reported to me), or did they become sociable?

I picked up and read Christopher McDougall’s Born to Run (2009), which achieved immense popularity after publication but only came into view for me quite recently. I can’t help but project onto the book, or draw from it, considerable resonance with themes I have been developing on this blog over the years. By my reading, the book is fundamentally about resolving the mind-body disconnect commonplace in modern, post-industrial society. (The author may not see it that way at all.) The point of entry is the conflict between what our bodies are evolved to do — running long distances in persistence hunts (which I blogged about here) — and what modern medicine insists is highly destructive to the body if not impossible, namely, ultrarunning. However, running is merely the context in which the larger goal to reconnect mind and body occurs. One might even say that the mechanics of our bodies make running something elemental, undeniable, and thus a natural bridge. Though we (post-)moderns have often lost touch with the sense of our bodies as our selves, locating identity instead in the brain/mind, runners sometimes regain that connection, albeit temporarily; the Tarahumara people of the Copper Canyons of Mexico, a people characterized by their running ability, may never have lost it.

If you don’t recognize the notion of mind-body duality, you’re hardly alone. My contention for some years now is that we live too much in our heads and that mental noise is increasingly drowning out what the body supplies in day-to-day life with respect to self-knowledge, contentment, and serenity. Think of the Zen of the cat. There may be a spiritual aspect, too, but that lies beyond my sensibilities and does not seem to be within the scope of the book, either. Sensitive folks, or sometimes those who have simply been left behind by the incessant struggle of getting and having, may reach toward an unknown horizon in search of something otherwise fulfilling; running is a natural candidate. McDougall tells what amounts to an underground history of the second and third incarnations of U.S. running crazes, not unlike cyclical religious and political awakenings and reawakenings. One can argue whether such fads are part of our deep culture (it often feels that way), responsive to social turmoil, or merely more surface noise. There is nothing conspiratorial about it, but McDougall revels in the lost secrets and unheeded goings-on that constitute the subculture of ultrarunning. Runners’ athletic prowess is hard not to admire, but their compulsive pursuit often feels more than a little unhinged, not at all Zen.

The author employs a whole bag of writerly tricks to engage the reader, as though he distrusts his own subject matter and must resort to storytelling clichés to keep readers teased and entrained. Irritatingly, he starts one story but smash cuts to other tangential stories repeatedly within the larger structure. The stories all tie together, but the thread is interrupted so often that I felt my chain being yanked, which ejected me from the flow to contemplate the seams and joints within the narrative. That style probably works for dull readers, much like the TV news constantly strings viewers from segment to segment with rapid-fire disorientation and discontinuity mixed with flashes of what news is about to be reported, but first … this (typically, a word from sponsors). The best aspects of McDougall’s many diversions from the main story arc are what amount to detective stories behind body mechanics, running-shoe design, persistence hunting, etc. Although they suggest we have arrived at a final understanding of such topics, handily turning conventional wisdom on its head in most cases, I rather suspect that further refinements are inevitable, especially if the real story is mind-body rather than running.

Most of the people profiled in the book suffer from mild to severe character distortion (by modern, post-industrial standards). Perhaps it’s exactly those who abandon or rebel against the dominant paradigm who are redeemed by what they (re)discover beyond it. More than a couple of them are just assholes, though. Having done a little additional Internet research on some of the characters, it’s difficult to decide whether ultrarunning is indeed for them redemption or merely the fruitless chasing of lost souls. The unifying character (besides the author), Caballo Blanco or Micah True, died while running wilderness trails, though his autopsy revealed he died from heart disease. Others achieved a mixture of notoriety and infamy both before and after the publication of the book. McDougall appears to have become a running guru and inspirational speaker, with numerous YouTube videos promoting his books and findings.

In summary, I’m glad to have read the book. Its potboiler style aside, the book has fascinating content and fairly good storytelling. The denouement felt a little incomplete, but then, many detours from the main story were left hanging, so finishing with a 50-mile race in the Copper Canyons may be as good a finale as anything. Being a (lousy) endurance athlete myself, I was encouraged to learn that there is still some opportunity for me to improve my athletic ability. According to the book, the human body doesn’t have to slow down appreciably until the middle 60s. So I’m encouraged that, unlike the steep drop-off in participation typical of endurance athletes in their late fifties, I can keep going a while longer. However, I am searching elsewhere for the mind-body connection.

Every blog post I write suffers from the same basic problem: drawing disparate ideas together in succinct, cogent form that expresses enough of the thesis to make sense while leaving room for commentary, discussion, and development. Alas, commentary and discussion are nearly nonexistent, but that’s always been my expectation and experience given my subjects. When expanding a blog post into several parts, the greatest risk is that ideas fail to coalesce legibly, compounded by the unlikelihood that readers who happen to navigate here will bother to read all the parts. (I suspect this is due in part to most readers’ inability to comprehend complex, multipart writing, as discussed in this blog post by Ugo Bardi describing surprising levels of functional illiteracy.) So this addendum to my three-part blog on Dissolving Reality is doomed, like the rest of my blog, to go unread and ignored. Plus ça change …

Have you had the experience of buying a new model of vehicle and suddenly noticed other vehicles of the same model on the road? That’s what I’ve been noticing since I hatched my thesis (noting with habitual resignation that nothing is new under the sun), which is that the debased information environment now admits multiple interpretations of reality, none of which can lay exclusive claim to authority as an accurate account. Reality has instead dissolved into a stew of competing arguments, often extremely politicized, which typically appeal to emotion. Historically, the principal conflict was between different ways of knowing exemplified by faith and reason, perhaps better understood as the church (in the West, the Catholic Church) vs. science. Floodgates have now opened to any wild interpretation one might concoct, all of which coexist on roughly equal footing in the marketplace of ideas.

Continuing from part 1 and part 2, let me add one further example of how meaning is reversed under the Ironic perspective. At my abandoned group blog, Creative Destruction, which garners more traffic than The Spiral Staircase despite being woefully out of date, the post that gets the most hits argues (without irony) that, in the Star Wars universe, the Empire represents the good guys and the Jedi are the terrorists despite the good vs. evil archetypes being almost cartoonishly drawn, with the principal villain having succumbed to the dark side only to be redeemed by his innate goodness at the 11th hour. The reverse argument undoubtedly has some merit, but it requires overthinking and outsmarting oneself to arrive at the backwards conclusion. A similar dilemma of competing perspectives is present in The Avengers, where Captain America is unconflicted in his all-American goodness and straightforward identification of villainy but is surrounded by other far-too-clever superheroes who overanalyze (snarkily so), cannot agree on strategy, and/or question motivations and each other’s double or triple agency. If I understand correctly, this plot hook is the basis for the civil war among allies in the next Avengers movie.

The Post-Ironic takes the reversal of meaning and paradoxical retention of opposites that characterizes the Ironic and expands issues from false dualisms (e.g., either you’re with us or against us) to multifaceted free-for-alls where anyone’s wild interpretation of facts, events, policy, and strategy has roughly equal footing with another’s precisely because no authority exists to satisfy everyone as to the truth of matters. The cacophony of competing viewpoints — the multiplicity of possible meanings conjured from any collection of evidence — virtually guarantees that someone out there (often someone loony) will speak as though reading your mind. Don’t trust politicians, scientists, news anchors, pundits, teachers, academics, your parents, or even the pope? No problem. Just belly up to the ideological buffet and cherry-pick from any of a multitude of viewpoints, few of which have much plausibility. But no matter: it’s a smorgasbord of options, and almost none of them can be discarded out of hand for being too beyond the pale. All must be tried and entertained.

One of the themes of this blog is imminent (i.e., occurring within the lifetimes of most readers) industrial collapse resulting from either financial collapse or loss of habitat for humans (or a combination of factors). Either could happen first, but my suspicion is that financial collapse will be the lit fuse leading to explosion of the population bomb. Collapse is quite literally the biggest story of our time despite its being prospective. However, opinion on the matter is loose, undisciplined, and ranges all over the map. Consensus within expert bodies such as the Intergovernmental Panel on Climate Change, assembled specifically to study climate change and report its findings, ought to put an end to controversy, yet the waters have been so muddied by competing narratives that credulous folks, if they bother paying attention at all, can’t really tell whom to believe. It doesn’t help that even well-educated folks, including many professionals, often lack the critical thinking skills with which to evaluate evidence. So instead, wishy-washy emotionalism and psychological vulnerability award hearts and minds to the most charismatic storyteller, not the truth-teller.

Perhaps the best instance of multiple meanings being simultaneously present and demanding consideration is found in the game of poker, which has become enormously popular in the past decade. To play the game effectively, one must weigh the likelihood and potential payoff of any one of several competing narratives based on opponents’ actions. Mathematical analysis and intuition combine to recommend which scenario is most likely true and whether the risk is worth it (pot odds). If, for just one example, an opponent bets big at any point in the poker hand, several scenarios must be considered:

  • the opponent has made his hand and cannot be beaten (e.g., nut flush, full house)
  • the opponent has a dominating hand and can be beaten only if one draws to make a better hand (e.g., top pair with high kicker or two pair)
  • the opponent has not yet fully made his hand and is on a draw (open-ended straight or four cards to a flush)
  • the opponent has a partial or weak hand and is bluffing at the pot
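The pot-odds arithmetic behind that weighing can be sketched in a few lines of Python. This is my own illustration, not anything from the poker literature, and the dollar figures are invented; the point is simply that a call shows a profit only when your estimated chance of winning exceeds the share of the final pot your call represents.

```python
def pot_odds(pot: float, call: float) -> float:
    """Price the pot is offering: the fraction of the final pot
    (what's in the middle plus your call) that your call represents."""
    return call / (pot + call)

def should_call(pot: float, call: float, equity: float) -> bool:
    """Calling is profitable when estimated equity (your chance of
    winning the hand) exceeds the pot odds."""
    return equity > pot_odds(pot, call)

# Opponent bets $50 into a $100 pot: $150 now sits in the middle, so a
# $50 call needs to win more than 50 / (150 + 50) = 25% of the time.
print(pot_odds(150, 50))           # 0.25
print(should_call(150, 50, 0.35))  # True: a flush draw (~35%) is priced in
```

Against the four scenarios above, intuition supplies the equity estimate; the arithmetic only says whether the price justifies the call.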

Take note that, as with climate change, evaluation in poker is prospective. Sometimes an opponent’s betting strategy is discovered in a showdown where players must reveal their cards; but often, one player or another mucks or folds and the actual scenario is undisclosed. The truth of climate change, until the future manifests, is, to some, tantalizingly unknown and contingent, as though it could be influenced by belief, hope, and/or faith. To rigorous thinkers, however, the future is charted for us with about the same inevitability as the sun rising in the morning — the biggest remaining unknown being timing.

Habitual awareness of multiple, competing scenarios extends well beyond table games and climate change. In geopolitics, the refusal to rule out the nuclear option, even when it would be completely disproportionate to a given provocation, is reckless brinkmanship. The typical rhetoric is that, as with fighting dogs, any gesture of backing down would be interpreted as a display of submission or weakness and thus invite attack. So is the provocation or the response a bluff, a strong hand, or both? Although it is difficult to judge how U.S. leadership is perceived abroad (since I’m inside the bubble), the historical record demonstrates that the U.S. never hesitates to get mixed up in military action and adopts overweening strategies to defeat essentially feudal societies (e.g., Korea and Vietnam). Never mind that those strategies have been shown to fail or that those countries represented no credible threat to the U.S. Our military escapades in the 21st century are not so divergent, with the perception of threats being raised well beyond their true proportions relative to any number of health and social scourges that routinely kill many more Americans than terrorism ever did.

Because this post is already running long, conclusions will be in an addendum. Apologies for the drawn-out posts.

Continuing from part 1, the Ironic is characterized by (among other things) reversal of meaning, sometimes understood as the unexpected made manifest but more commonly as sarcasm. The old joke goes that in pompous, authoritarian fashion, the language/semiotics professor says to his class of neophytes, “In many languages, a double negative equals a positive, but in no language does a double positive make a negative.” In response, a student mutters under his breath, “yeah, right ….” Up to a certain age and level of cognitive development, children don’t process sarcasm; they are literal-minded and don’t understand subtext. Transcripts and text (e.g., blog posts and comments) also typically fail to transmit nonverbal cues that one may be less than earnest in making certain statements. Significantly, no one is allowed to make offhand jokes in line at security checkpoints because, in that context, remarks such as “yeah, like my shoes are full of C4” are treated quite literally.

I have a vague memory of the period in my adolescence when I discovered sarcasm, at which time it was deployed almost continuously, saying the opposite of what I meant with the expectation that others (older than me) would understand the implied or latent meaning. I also adopted the same mock abuse being used elsewhere, which regrettably lasted into my late 20s. Maybe it’s a phase everyone must go through, part of growing up, and as a society, our cultural development must also pass through that phase, though I contend we remain mired in irony or ironic posturing.

The model for me was insult comedy, still in style now but more familiar from my childhood. Like most during this developmental phase, I accepted the TV as social tutor for how people communicate and what’s acceptable to say. So who can blame me or other children, fed a diet of snark and attitude (adult writers of TV shows being a lot more clever than the adolescent actors who voice the lines), for speaking the same way? But to appreciate irony more directly, consider the comedian (then and now) who levies criticism using clichés drawn from his or her own gender, race, religion, social class, etc. In comedy, sexism, racism, and class conflict are not just joke fodder but stereotyped bigotry that reinforces the very scourges it ostensibly criticizes. Oh, sure, the jokes are often funny. We all know to laugh at the black comedian who trades nonstop in nigger jokes or the female comedian who complains of being nothing more than an object for male titillation. Comedians (and special interest groups — minority or not — that lay claim to victimhood) may coopt the language of their oppressors (some actual, some imagined — see for instance those complaining about the War on Christmas), but the language and attitudes are broken down and reinforced at the same time.

This isn’t solely the domain of comedy, either. Whereas TV sitcoms are ruled by hip, ironic posturing — the show about nothing that plumbs the surprising depths inside everything trivial, banal, and inane, the show full of nerd archetypes who rise above their inherent nerdiness to be real people worthy of respect (or not surprisingly, not so worthy after all), or the endless parade of sitcom families with unrealistically precocious, smart aleck kids who take aim at everyone with a continuous stream of baleful insults, take-downs, and mockery but are, despite truly cretinous behavior, always forgiven (or passed over because another joke is imminent) and still lovable — in the virtual world (the Internet, where you are reading this), sarcasm, snark, irony, abuse, and corrosive jokiness are legion. Take, for instance, this video at Military.com and tell me there isn’t something deeply wrong with it:

One might wonder whether the intent is interdiction or recruitment (or both at once), especially if one acknowledges that most of the awful things depicted in the video are precisely what the U.S. military has been doing in the Middle East for well over a decade. The Fox News blurb linked below the video says, “The State Department is launching a tough and graphic propaganda counteroffensive against the Islamic State, using some of the group’s own images of barbaric acts against fellow Muslims to undercut its message.” Maybe the word propaganda is a mistake and publicity was intended, but I suspect that propaganda is the right word precisely because it’s understood as both pejorative and superlative. As with everything else, meaning has become polysemous.

Iain McGilchrist illustrates this with special emphasis on the arts and how substitution of symbolic tokens normalizes distortion. For instance, art theory of the Aesthetes contains a fundamental paradox:

The Aesthetes’ creed of ‘art for art’s sake’, while it sounds like an elevation of the value of art, in that it denies that it should have an ulterior purpose beyond itself — so far so good — is also a devaluation of art, in that it marginalizes its relationship with life. In other words it sacrifices the betweenness of art with life, instead allowing art to become self-reflexively fulfilled. There is a difference between the forlorn business of creating ‘art for art’s sake’, and art nonetheless being judged solely ‘as art’, not as for another purpose. [p. 409]

Isolating artistic creation in a mental or virtual transactional space ought to be quite familiar (or perhaps more accurately, assumed and thus invisible) to 21st-century people, but it was a new concept at the outset of the 20th century. The paradox is that the doctrine is both a reversal of meaning and retention of opposites together. Over the course of the 20th century, we became habituated to such thinking, namely, that a thing automatically engenders its opposite and is both things at once. For instance, what used to be called the War on Poverty, meant to help those suffering deprivation, is now also its reverse: literally a war on the poverty-stricken. Similarly, the War on Drugs, meant to eradicate drug use as a social ill, is also quite literally a war against drug users, who are a large and improper part of the bloated U.S. prison population. Reduction of government services to the poor and rampant victim-blaming demonstrate that programs once meant to assist those in need now often instead leave them to fend for themselves, or worse, pile on with criminal charges. Disinformation campaigns about welfare cheats and the minimum wage are further examples of information being distorted and used to serve an unwholesome agenda.

My conclusion is not yet ready to be drawn; it’s far too subtle to fit in a Tweet or even a series of blog posts. However, consider what it means when the language we use is laden with ironic twists that force recipients of any message to hold simultaneously forward/backward, up/down, left/right, and true/false meanings. Little can be established beyond reasonable doubt, not just because so many of us have been so poorly served by educational institutions (or is it the students themselves? sort of a chicken-and-egg question) more interested in business and credentialing than in teaching and learning that few possess the ability to assess and evaluate information (ironically, from a variety of perspectives) being spun and spoon-fed to us by omnimedia, but because the essential underlying structure of language and communications has been corrupted by disembedding, decontextualization, and deconstruction, which relegate reality to something to be dreamt up and then used to convince others. In the end, of course, we’re only fooling ourselves.

At last, getting to my much, much delayed final book blogs (three parts) on Iain McGilchrist’s The Master and His Emissary. The book came out in 2010, I picked it up in 2012 (as memory serves), and it took me nearly two years to read its entirety, during which time I blogged my observations. I knew at the time of my previous post on the book that there would be more to say, and it’s taken considerable time to get back to it.

McGilchrist ends with a withering criticism of the Modern and Postmodern (PoMo) Eras, which I characterized as an account of how the world went mad. That still seems accurate to me: the madness that overtook us in the Modern Era led to world wars, genocides, and systematic reduction of humanity to mere material and mechanism, what Ortega y Gasset called Mass Man. Reduction of the rest of the living world to resources to be harvested and exploited by us is a worldview often called instrumental rationality. From my armchair, I sense that our societal madness has shape-shifted a few times since the fin de siècle 1880s and 90s. Let’s start with quotes from McGilchrist before I extend into my own analysis. Here is one of his many descriptions of the left-hemisphere paradigm under which we now operate:

In his book on the subject, Modernity and Self-identity, Anthony Giddens describes the characteristic disruption of space and time required by globalisation, itself the necessary consequence of industrial capitalism, which destroys the sense of belonging, and ultimately of individual identity. He refers to what he calls ‘disembedding mechanisms’, the effect of which is to separate things from their context, and ourselves from the uniqueness of place, what he calls ‘locale’. Real things and experiences are replaced by symbolic tokens; ‘expert’ systems replace local know-how and skill with a centralised process dependent on rules. He sees a dangerous form of positive feedback, whereby theoretical positions, once promulgated, dictate the reality that comes about, since they are then fed back to us through the media, which form, as much as reflect, reality. The media also promote fragmentation by a random juxtaposition of items of information, as well as permitting the ‘intrusion of distant events into everyday consciousness’, another aspect of decontextualisation in modern life adding to loss of meaning in the experienced world. [p. 390]

Reliance on abstract, decontextualized tokens having only figurative, nonintrinsic power and meaning is a specific sort of distancing, isolation, and reduction that describes much of modern life and shares many characteristics with schizophrenia, as McGilchrist points out throughout the chapter. That was the first shape-shift of our madness: full-blown mechanization borne out of reductionism and materialism, perspectives bequeathed to us by science. The slow process had been underway since the invention of the mechanical clock and discovery of heliocentrism, but it gained steam (pun intended) as the Industrial Revolution matured in the late 19th century.

The PoMo Era is recognized as having begun just after the middle of the 20th century, though its attributes are questionably defined or understood. That said, the most damning criticism leveled at PoMo is its hall-of-mirrors effect that renders objects in the mirrors meaningless because the original reference point is obscured or lost. McGilchrist also refers repeatedly to loss of meaning resulting from the ironizing effect of left-brain dominance. The corresponding academic fad was PoMo literary criticism (deconstruction) in the 1970s, but it had antecedents in quantum theory. Here is McGilchrist on PoMo:

With post-modernism, meaning drains away. Art becomes a game in which the emptiness of a wholly insubstantial world, in which there is nothing beyond the set of terms we have in vain used to ‘construct’ meaning, is allowed to speak for its own vacuity. The set of terms are now seen simply to refer to themselves. They have lost transparency; and all conditions that would yield meaning have been ironized out of existence. [pp. 422–423]

This was the second shape-shift: loss of meaning in the middle of the 20th century as purely theoretical formulations, which is to say, abstraction, gained adherents. He goes on:

Over-awareness … alienates us from the world and leads to a belief that only we, or our thought processes, are real … The detached, unmoving, unmoved observer feels that the world loses reality, becomes merely ‘things seen’. Attention is focussed on the field of consciousness itself, not on the world beyond, and we seem to experience experience … [In hyperconsciousness, elements] of the self and of experience which normally remain, and need to remain, intuitive, unconscious, become the objects of a detached, alienating attention, the levels of consciousness multiply, so that there is an awareness of one’s own awareness, and so on. The result of this is a sort of paralysis, in which even everyday ‘automatic’ actions such as moving one leg in front of another in order to walk can become problematic … The effect of hyperconsciousness is to produce a flight from the body and from its attendant emotions. [pp. 394–396]

Having devoted a fair amount of my intellectual life to trying to understand consciousness, I immediately recognized the discussion of hyperconsciousness (derived from Louis Sass) as what I often call recursion error, where consciousness becomes the object of its own contemplation, with obvious consequences. Modern, first-world people all suffer from this effect to varying degrees because that is how modern consciousness is shaped.

I believe we can observe now two more characteristic extensions or variations of our madness, probably overlapping, not discrete, following closely on each other: the Ironic and Post-Ironic. The characteristics are these:

  • Modern — reductive, mechanistic, instrumental interpretation of reality
  • Postmodern — self-referential (recursive) and meaningless reality
  • Ironic — reversed reality
  • Post-Ironic — multiplicity of competing meanings/narratives, multiple realities

All this is quite enough to chew on for a start. I plan to continue in pts. 2 and 3 with descriptions of the Ironic and the Post-Ironic.

Updates to my blogroll are infrequent. I only add blogs that present interesting ideas (with which I don’t always agree) and/or admirable writing. Deletions are typically the result of a change of focus at the linked blog, or regrettably, the result of a blogger becoming abusive or self-absorbed. This time, it’s the latter. So alas, another one bites the dust. Dropping off my blogroll — no loss since almost no one reads my blog — is On an Overgrown Path (no link), which is about classical music.

My indignation isn’t about disagreements (we’ve had a few); it’s about inviting discussion in bad faith. I’m very interested in contributing to discussion and don’t mind moderated comments to contend with trolls. However, my comments drive at ideas, not authors, and I’m scarcely a troll. Here’s the disingenuously titled blog post, “Let’s Start a Conversation about Concert Hall Sound,” where the blogger declined to publish my comment, handily blocking conversation. So for maybe the second time in the nearly 10-year history of this blog, I am reproducing the entirety of another’s blog post (minus the profusion of links, since that blogger tends to create link mazes, defying readers to actually explore) followed by my unpublished comment, and then I’ll expound and perhaps rant a bit. Apologies for the uncharacteristic length.

Everyone knows how to play Rock, Paper, Scissors, which typically comes up as a quick means of settling some minor negotiation with the caveat that the winner is entirely arbitrary. The notion of a Rock, Paper, Scissors tournament is therefore a non sequitur, since the winner by no means possesses skill, or strategic combinations of throws devised to reliably defeat opponents. Rather, winners are the unfortunate recipients of a blind but lucky sequence, an algorithm, that produces an eventual winner yet is indifferent to the outcome. I can’t say quite why, exactly, but I’ve been puzzling over how three-way conflicts might be decided were the categories instead Strong, Stupid, and Smart, respectively.

Rock is Strong, obviously, because it’s blunt force, whereas Paper is Stupid because it’s blank, and Scissors is Smart because it’s the only one that has any design or sophistication. For the reassignments to work, however, the circle of what beats what has to be reversed: Strong beats Stupid, Stupid beats Smart, and Smart beats Strong. One could argue that Strong and Stupid are equally dense, but arguendo, let’s grant Strong supremacy in that contest. Interestingly, Stupid always beats Smart because Smart’s advantage is handily nullified by Stupid. Finally, Smart beats Strong because the David and Goliath parable has some merit. Superhero fanboys are making similar arguments with respect to the hotly anticipated Batman v. Superman (with Wonder Woman) film to be released in 2016. The Strong argument is that Superman need land only one punch to take out Batman (a mere human with gadgets and bad-ass attitude), but the Smart argument is that Batman will outwit Superman by, say, deploying kryptonite or exploiting Superman’s inherent good-guyness to defeat him.
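For the curious, the reversed circle is easy to sketch as a lookup table. This is just my own toy illustration of the cycle described above (the names and functions are mine, not anything standard):

```python
# Reassigned cycle: Strong beats Stupid, Stupid beats Smart, Smart beats Strong.
BEATS = {
    "Strong": "Stupid",
    "Stupid": "Smart",
    "Smart": "Strong",
}

def winner(a, b):
    """Return the winning throw of a vs. b, or None on a tie."""
    if a == b:
        return None
    return a if BEATS[a] == b else b
```

Like the original game, the relation is cyclic and intransitive: no single throw dominates, which is exactly why the three-way standoff stays interesting.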

A further puzzle is how the game Strong, Stupid, Smart works out in geopolitics. The U.S. is clearly Strong, the last remaining world superpower (though still dense as a board — new revelations keep reinforcing that judgment), and uses its strength to bully Stupids into submission. Numerous countries have shifted categories from Strong to Stupid over time — quite a few in fact if one surveys more than a few decades of world history. Stupids have also fought each other to effective stalemate in most of the world, though not without a few wins and losses chalked up. What remains, however, is for a truly Smart regime to emerge to take down Strong. The parody version of such a match-up is told in the book The Mouse That Roared (also a movie with Peter Sellers). But since Smart is vanquished by Stupid, and the world has an overabundance of Stupids, it is unlikely that Smart can ever do better than momentary victory.

Our current slate of presidential candidates is mostly a field of Stupids with a couple Strongs thrown in (remember: still equally dense as Stupid). Then there are a couple insanely Stupids who distort the circle into an out-of-kilter bizarro obloid. As with geopolitics, a Smart candidate has yet to emerge, but such a candidate would only defeat Strongs, clearing the way for a Stupid victory. This should be obvious to any strategist, and indeed, no truly Smart candidates have declared, knowing full well that they would gain no traction with the half-literate, mouth-breathing public composed largely of Stupids who predictably fall in love with the most insanely Stupid candidate out there. An engaged Smart candidate would thus hand the victory to the insanely Stupid, who should be unelectable from the outset, but go figger. So then the deep strategy (gawd, how I hate this) would be to go with the devil you know, since a saint could never prevail against all the demons.

“Any sufficiently advanced technology is indistinguishable from magic.” –Arthur C. Clarke

/rant on

Jon Evans at TechCrunch has an idiot opinion article titled “Technology Is Magic, Just Ask The Washington Post” that has gotten under my skin. His risible assertion that the WaPo editorial board uses magical thinking misframes the issue of whether police and other security agencies ought to have backdoor or golden-key access to end-users’ communications carried over electronic networks. He marshals a few experts in the field of encryption and information security (shortened to “infosec” — my, how hep) who insist that even if such a thing (security that is porous to select people or agencies only) were possible, the demand is incompatible with the whole idea of security and indeed privacy. The whole business strikes me as a straw man argument. Here is Evans’ final paragraph:

If you don’t understand how technology works — especially a technical subgenre as complex and dense as encryption and information security — then don’t write about it. Don’t even have an opinion about what is and isn’t possible; just accept that you don’t know. But if you must opine, then please, at least don’t pretend technology is magic. That attitude isn’t just wrong, it’s actually dangerous.

Evans is pushing on a string, making the issue seem as though agencies that simply want what they want believe in turn that those things come into existence by the snap of one’s fingers, or magically. But in reality beyond hyperbole, absolutely no one believes that science and technology are magic. Rather, software and human-engineered tools are plainly understood as mechanisms we design and fabricate through our own effort even if we don’t understand the complexity of the mechanism under the hood. Further, everyone beyond the age of 5 or 6 loses faith in magical entities such as the Tooth Fairy, unicorns, Fairy Godmothers, etc. at about the same time that Santa Claus is revealed to be a cruel hoax. A sizable segment of the population for whom the Reality Principle takes firm root goes on to lose faith in progress, humanity, religion, and god (which version becomes irrelevant at that point). Ironically, the typically unchallenged thinking that technology delivers, among other things, knowledge, productivity, leisure, and other wholly salutary effects — the very thinking a writer for TechCrunch might exhibit — falls under the same category.

Who are these magical creatures who believe their smartphones, laptops, TVs, vehicles, etc. are themselves magical simply because their now routine operations lie beyond the typical end-user’s technical knowledge? And who, besides ideologues, is prone to calling out the bogus substitution of magic for mechanism in the manner of Arthur C. Clarke? No one, really. Jon Evans does no one any favors by raising this argument — presumably just to puncture it.

If one were to observe how people actually use the technology now available in, say, handheld devices with 24/7/365 connection to the Internet (so long as the batteries hold out, anyway), it’s not the device that seems magical but the feeling of being connected, knowledgeable, and at the center of activity, with a constant barrage of information (noise, mostly) barreling at them and defying them to turn attention away lest something important be missed. People are so dialed into their devices, they often lose touch with reality, much like the politicians who no longer relate to or empathize with voters, preferring to live in their heads with all the chatter, noise, news, and entertainment fed to them like an endorphin drip. Who cares how the mechanism delivers, so long as supply is maintained? Similarly, who cares how vaporware delivers unjust access? Just give us what we want! Evans would do better to argue against the unjust desire for circumvention of security rather than its presumed magical mechanism. But I guess that idea wouldn’t occur to a technophiliac.

/rant off

The phrase enlightened self-interest has been used to describe and justify supposed positive results arising over time from individuals acting competitively, as opposed to cooperatively, using the best information and strategies available. One of the most enduring examples is the prisoner’s dilemma. Several others have dominated news cycles lately.

Something for Nothing

At the Univ. of Maryland, a psychology professor has been offering extra credit on exams of either 2 or 6 points, provided that no more than 10 percent of students elect to receive the higher amount. If more than 10% go for the higher amount, no one gets anything. The final test question, which fails as a marker of student learning or achievement and doesn’t really function so well as a psychology or object lesson, either, went viral when a student tweeted out the question, perplexed by the prof’s apparent cruelty. Journalists then polled average people and found divergence (duh!) between those who believe the obvious choice is 6 pts (or reluctantly, none) and those who righteously characterize 2 pts as “the right thing to do.” It’s unclear what conclusion to draw, but the prof reports that since 2008, only one class got any extra credit by not exceeding the 10% cutoff.
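The incentive structure is a simple threshold game, and it’s easy to see why classes almost always bust the cap. Here’s a toy simulation; the class size, greed probability, and function names are my own assumptions for illustration, not details from the professor’s experiment:

```python
import random

def run_class(n_students, p_greedy, cutoff=0.10):
    """Simulate one class: each student independently picks 6 points
    with probability p_greedy, otherwise 2. If more than `cutoff` of
    the class picks 6, everyone gets zero."""
    picks = [6 if random.random() < p_greedy else 2 for _ in range(n_students)]
    greedy_share = picks.count(6) / n_students
    if greedy_share > cutoff:
        return [0] * n_students  # the cap is busted; nobody gets credit
    return picks

# Even a modest taste for 6 points usually sinks a class of 100:
random.seed(42)
busted = sum(sum(run_class(100, p_greedy=0.2)) == 0 for _ in range(1000))
```

With just a 20% chance of any one student choosing 6, the expected greedy share (20 students) is double the cutoff (10 students), so nearly every simulated class ends up with nothing — consistent with only one class clearing the bar since 2008.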

Roping One’s Eyeballs

This overlong opinion article found in the Religion and Ethics section of the Australian Broadcasting Corporation (ABC) website argues that advertising is a modern-day illustration of the tragedy of the commons:

Expensively trained human attention is the fuel of twenty-first century capitalism. We are allowing a single industry to slash and burn vast amounts of this productive resource in search of a quick buck.

I practice my own highly restrictive media ecology, defending against the fire hose of information and marketing aimed at me (and everyone else) constantly, machine-gun style. So in a sense, I treat my own limited time and attention as a resource not to be squandered on nonsense, but when the issue is scaled up to the level of society, the metaphor is inapt and breaks down. I assert that attention as an exploitable resource functions very differently when considering an individual vs. the masses, which have unique behavioral properties. Still, it’s an interesting idea to consider.

No One’s Bank Run

My last example is the entirely predictable bank runs in Greece, forestalled when banks closed for three weeks and placed withdrawal limits (euphemism: capital controls) on what cash deposits are actually held in the vaults. Greek banks have appealed to depositors to trust them — that their deposits are guaranteed and there will be no “haircut” such as occurred in Cyprus — but appeals were met with laughter and derision. Intolerance of further risk is an entirely prudent response, and a complete and rapid flight of capital would no doubt have ensued if it weren’t disallowed.

What these three examples have in common is simple: it matters little what any individual may do, but it matters considerably what everyone does. Institutions and systems typically have enough resilience to weather a few outliers who exceed boundaries (opting for 6 pts, pushing media campaigns to the idiotic point of saturation, or withdrawing all of one’s money from a faltering bank), but when everyone acts according to enlightened self-interest, well, it’s obvious that something’s gotta give. In the examples above, no one gets extra points, no one pays heed to much of anything anymore (or perhaps more accurately, attention is debased and diluted to the point of worthlessness), and banks fail. In the professor’s example, the threshold for negative results is only 10%. Different operating environments probably vary, but the modesty of that number is instructive.

More than a few writers have interpreted the tragedy of the commons on a global level. As a power law, it probably functions better at a feudal level, where resources are predominantly local and society is centered around villages rather than megalopolises and/or nation-states. However, it’s plain to observe, if one pays any attention (good luck with that in our new age of distraction, where everyone is trained to hear only what our own culture instructs, ignoring what nature tells us), that interlocking biological systems worldwide are straining and failing under the impacts of anthropogenic climate change. Heating at the poles and deep in the oceans is where the worst effects are currently being felt, but climate chaos is certainly not limited to out-of-sight, out-of-mind locations. What’s happening in the natural world, however, is absolutely and scarily for real, unlike the bogus stress tests banks undergo to buttress consumer sentiment (euphemism: keep calm and carry on). Our failure to listen to the right messages and heed warnings properly will be our downfall, but we will have lots of company because everyone is doing it.