Archive for the ‘Nomenclature’ Category

Another modest surprise (to me at least) offered by Anthony Giddens (from The Consequences of Modernity) follows a discussion of reflexivity (what I call recursion when discussing consciousness), which is the dynamic of information and/or knowledge feeding back to influence later behavior and information/knowledge. His handy example is the populace knowing divorce rates, which has an obvious influence on those about to get married (who may then decide to defer or abjure marriage entirely). The surprise is this:

The discourse of sociology and the concepts, theories, and findings of the other social sciences continually “circulate in and out” of what it is that they are about. In so doing they reflexively restructure their subject matter, which itself has learned to think sociologically … Much that is problematic in the position of the professional sociologist, as the purveyor of expert knowledge about social life, derives from the fact that she or he is at most one step ahead of enlightened lay practitioners of the discipline. [p. 43]

I suppose “enlightened lay practitioners” are not the same as the general public, which I hold in rather low esteem as knowing (much less understanding) anything of value. Just consider electoral politics. Still, the idea that an expert in an academic field admits he is barely ahead of wannabes (like me) seems awfully damning. Whereas curious types will wade in just about anywhere, and in some cases, amateurs will indulge themselves enthusiastically in endeavors also practiced by experts (sports and music are the two principal examples that spring to mind), the gap (in both knowledge and skill) between experts and laypersons is typically quite wide. I suspect those with high intellect and/or genetic gifts often bridge that gap, but then they join the ranks of the experts, so the exception leads nowhere.


I revisit my old blog posts when I see some reader activity in the WordPress backstage, and I was curious to recall a long quote from Iain McGilchrist summarizing arguments put forth by Anthony Giddens in his book Modernity and Self-identity (1991). Giddens had presaged recent cultural developments, namely, the radicalization of nativists, supremacists, Social Justice Warriors (SJWs), and others distorted by absorption in identity politics. So I traipsed off to the Chicago Public Library (CPL) and sought out the book to read. Regrettably, CPL didn’t have a copy, so I settled on a slightly earlier book, The Consequences of Modernity (1990), which is based on a series of lectures delivered at Stanford University in 1988.

Straight away, the introduction provides a passage that goes to the heart of matters with which I’ve been preoccupied:

Today, in the late twentieth century, it is argued by many, we stand at the opening of a new era … which is taking us beyond modernity itself. A dazzling variety of terms has been suggested to refer to this transition, a few of which refer positively to the emergence of a new type of social system (such as the “information society” or the “consumer society”) but most of which suggest rather that a preceding state of affairs is drawing to a close … Some of the debates about these matters concentrate mainly upon institutional transformations, particularly those which propose that we are moving from a system based upon the manufacture of material goods to one concerned more centrally with information. More commonly, however, those controversies are focused largely upon issues of philosophy and epistemology. This is the characteristic outlook, for example, of the author who has been primarily responsible for popularising the notion of post-modernity, Jean-François Lyotard. As he represents it, post-modernity refers to a shift away from attempts to ground epistemology and from faith in humanly engineered progress. The condition of post-modernity is distinguished by an evaporating of the “grand narrative” — the overarching “story line” by means of which we are placed in history as beings having a definite past and a predictable future. The post-modern outlook sees a plurality of heterogeneous claims to knowledge, in which science does not have a privileged place. [pp. 1–2, emphasis added]

That’s a lot to unpack all at once, but the fascinating thing is that notions now manifesting darkly in the marketplace of ideas were already in the air in the late 1980s. Significantly, this was still several years before the Internet brought the so-called Information Highway to computer users, before the cell phone and smart phone were developed, and before social media displaced traditional media (TV was only 30–40 years old but had previously transformed our information environment) as the principal way people gather news. I suspect that Giddens has more recent work that accounts for the catalyzing effect of the digital era (including mobile media) on culture, but for the moment, I’m interested in the book in hand.

Regular readers of this blog (I know of one or two) already know my armchair social criticism directed at our developing epistemological crisis (challenges to authority and expertise, psychotic knowledge, fake news, alternative facts, dissolving reality, and science denial) as well as the Transhumanist fantasy of becoming pure thought (once we evolve beyond our bodies). Until that’s accomplished with imagined technology, we increasingly live in our heads, in the abstract, disoriented and adrift on a bewildering sea of competing narratives. Moreover, I’ve stated repeatedly that highly mutable story (or narrative) underlies human cognition and consciousness, making most of us easy marks for charismatic thought leaders and storytellers. Giddens was there nearly 30 years ago with these same ideas, though his terms differ.

Giddens dispels the idea of post-modernity and insists that, from a sociological perspective, the current period is better described as high modernism. This reminds me of Oswald Spengler and my abandoned book blogging of The Decline of the West. It’s unimportant to me who got it more correct but note that the term Postmodernism has been adopted widely despite its inaccuracy (at least according to Giddens). As I get further into the book, I’ll have plenty more to say.

The Internet is now a little more than two decades old (far more actually, but I’m thinking of its widespread adoption). Of late, it’s abundantly clear that, in addition to being a wholesale change in the way we disseminate and gather information and conduct business, we’re running live social experiments bearing psychological influence, some subtle, some invasive, much like the introduction of other media such as radio, cinema, and TV back in the day. About six years ago, psychologists coined the term digital crowding, which I just discovered, referring to an oppressive sense of knowing too much about people, which in turn provokes antisocial reactions. In effect, it’s part of the Dark Side of social media (trolling and comments sections being other examples), one of numerous live social experiments.

I’ve given voice to this oppressive knowing-too-much on occasion by wondering why, for instance, I know anything — largely against my will, mind you — about the Kardashians and Jenners. This is not the sole domain of celebrities and reality TV folks but extends to anyone who tends to overshare online, typically via social media such as Facebook, less typically in the celebrity news media. Think of digital crowding as the equivalent of seeing something you would really prefer not to have seen, something no amount of figurative eye bleach can erase, something that now simply resides in your mind forever. It’s the bell that can’t be unrung. The crowding aspect is that now everyone’s dirty laundry is getting aired simultaneously, creating pushback and defensive postures.

One might recognize in this the familiar complaint of Too Much Information (TMI), except that the information in question is not the discomfiting stuff such as personal hygiene, medical conditions, or sexual behaviors. Rather, it’s an unexpected over-awareness of everyone’s daily minutiae as news of it presses for attention and penetrates our defenses. Add it to the deluge that is causing some of us to adopt information avoidance.

I see plenty of movies over the course of a year but had not been to a theater since The Force Awakens came out slightly over a year ago. The reason is simple: it costs too much. With ticket prices nearing $15 and what for me had been obligatory popcorn and soda (too much of both the way they’re bundled and sold — ask anyone desperately holding back their pee until the credits roll!), the endeavor climbed to nearly $30 just for one person. Never mind that movie budgets now top $100 million routinely; the movie-going experience simply isn’t worth $30 a pop. Opening weekend crowds (and costumes)? Fuggedaboudit! Instead, I view films at home on DVD (phooey on Blu-ray) or via a streaming service. Although I admit I’m missing out on being part of an audience, which offers the possibility of being carried away on a wave of crowd emotion, I’m perfectly happy watching at home, especially considering most films are forgettable fluff (or worse) and filmmakers seem to have forgotten how to shape and tell good stories.

So a friend dragged me out to see Rogue One, somewhat late after its opening by most standards. Seeing Star Wars and other franchise installments now feels like an obligation just to stay culturally relevant. Seriously, soon enough it will be Fast & Furious Infinitum. We went to a newly built theater with individual recliners and waiters (no concession stands). Are film-goers no longer satisfied by popcorn and Milk Duds? No way would I order an $80 bottle of wine to go with Rogue One. It’s meant to be a premium experience, with everything served to you in the recliner, and accordingly, it charges premium prices. Too bad most films don’t warrant such treatment. All this is preliminary to the actual review, of course.

I had learned quite a bit about Rogue One prior to seeing it, not really caring about spoilers, and was pleasantly surprised it wasn’t as bad as some complain. Rogue One brings in all the usual Star Wars hallmarks: storm troopers, the Force, X-Wings and TIE Fighters, ray guns and light sabers, the Death Star, and familiar characters such as Grand Moff Tarkin, Darth Vader, Princess Leia, etc. Setting a story within the Star Wars universe makes most of that unavoidable, though some specific instances did feel like gratuitous fan service, such as the 3-second (if that) appearance of C3PO and R2D2. The appearance of things and characters I already knew about didn’t feel to me like an extra thrill, but how much I needed to already know about Star Wars just to make sense of Rogue One was a notable weakness. Thus, one could call Rogue One a side story, but it was by no means a stand-alone story. Indeed, characters old and new were given such slipshod introductions (or none at all!) that they functioned basically as chess pieces moved around to drive the game forward. Good luck divining their characteristic movements and motivations. Was there another unseen character manipulating everyone? The Emperor? Who knows? Who cares! It was all a gigantic, faceless, pawn sacrifice. When at last the main rebels died, there was no grief or righteousness over having at least accomplished their putative mission. Turns out the story was all about effects, not emotional involvement. And that’s how I felt: uninvolved. It was a fireworks display ending with a pointless though clichéd grand finale. Except I guess that watching a bunch of fake stuff fake blow up was the fake point.

About what passed for a story: the Rebellion learns (somehow?!) that they face total annihilation from a new superweapon called the Death Star. (Can’t remember whether that term was actually used in the film.) While the decision of leadership is to scatter and flee, a plucky band of rebels within the rebellion insist on flinging themselves against the enemy without a plan except to improvise once on site, whereupon leadership decides irrationally to do the same. The lack of strategy is straight out of The Return of the King, distracting the enemy from the true mission objective, but the visual style is more like the opening of Saving Private Ryan, which is to say, full, straight-on bombardment and invasion. Visual callbacks to WWII infantry uniforms and formations couldn’t be more out of place. To call these elements charmless is to give them too much credit. Rather, they’re hackneyed. However, they probably fit well enough within the Saturday-morning cartoon, newsreel, swashbuckler sensibility that informed the original Star Wars films from the 1970s. Problem is, those 1970s kids are grown and want something with greater gravitas than live-action space opera. Newer Star Wars audiences are stuck in permanent adolescence because of what cinema has become, with its superhero franchises and cynical money grabs.

As a teenager when the first trilogy came out, I wanted more of the mystical element — the Force — than I wanted aerial battles, sword fights, or chase scenes. The goofy robots, reluctant heroes, and bizarre aliens were fun, but they were balanced by serious, steady leadership (the Jedi) and a couple of really bad-ass villains. While it’s known George Lucas had the entire character arc of Anakin Skywalker/Darth Vader in mind from the start, it’s also fair to say that no one quite knew in Episode 4 just how iconic Vader the villain would become, which is why his story became the centerpiece of the first two trilogies (how many more to come?). However, Anakin/Vader struggled with the light/dark sides of the Force, which resonated with anyone familiar with the angel/demon nomenclature of Christianity. When the Force was misguidedly explained away as midi-chlorians (science, not mysticism), well, the bottom dropped out of the Star Wars universe. At that point, it became a grand WWII analogue populated by American GIs and Nazis — with some weird Medievalism and sci-fi elements thrown in — except that the wrong side develops the superweapon. Rogue One makes that criticism even more manifest, though it’s fairly plain to see throughout the Star Wars films.

Let me single out one actor for praise: Ben Mendelsohn as Orson Krennic. It’s hard for me to decide whether he chews the scenery, upstaging Darth Vader as a villain in the one scene they share, or he’s among a growing gallery of underactors whose flat line delivery and blandness invite viewers to project upon them characterization telegraphed through other mechanisms (costuming, music, plot). Either way, I find him oddly compelling and memorable, unlike the foolish, throwaway, sacrificial band of rebellious rebels against the rebellion and empire alike. Having seen Ben Mendelsohn in other roles, I can say he possesses an unusual screen magnetism that reminds me of Sean Connery. He tends to play losers and villains and be a little one-note (not a bag of tricks but just one trick), but he is riveting on-screen for the right reasons compared to, say, the ookiness of the two gratuitous CGI characters in Rogue One.

So Rogue One is a modestly enjoyable and ephemeral romp through the Star Wars universe. It delivers and yet fails to deliver, which is about as charitable as I can be.

Caveat: Apologies for this overlong post, which random visitors (nearly the only kind I have besides the spambots) may find rather challenging.

The puzzle of consciousness, mind, identity, self, psyche, soul, etc. is an extraordinarily fascinating subject. We use various terms, but they all revolve around a unitary property and yet come from different approaches, methodologies, and philosophies. The term mind is probably the most generic; I tend to use consciousness interchangeably and more often. Scientific American has an entire section of its website devoted to the mind, with subsections on Behavior & Society, Cognition, Mental Health, Neurological Health, and Neuroscience. (Top-level navigation offers links to these sections: The Sciences, Mind, Health, Tech, Sustainability, Education, Video, Podcasts, Blogs, and Store.) I doubt I will explore very deeply because science favors the materialist approach, which I believe misses the forest for the trees. However, the presence of this area of inquiry right at the top of the page indicates how much attention and research the mind/consciousness is currently receiving.

A guest blog at Scientific American by Adam Bear entitled “What Neuroscience Says about Free Will” makes the fashionable argument (these days) that free will doesn’t exist. The blog/article is disclaimed: “The views expressed are those of the author(s) and are not necessarily those of Scientific American.” I find that a little weaselly. Because the subject is still wide open to interpretation and debate, Scientific American should simply offer conflicting points of view without worry. Bear’s arguments rest on the mind’s ability to revise and redate experience occurring within the frame of a few milliseconds to allow for processing time, also known as the postdictive illusion (the opposite of predictive). I wrote about this topic more than four years ago here. Yet another discussion is found here. I admit to being irritated that the questions and conclusions stem from a series of assumptions, primarily that whatever free will is must occur solely in consciousness (whatever that is) as opposed to originating in the subconscious and subsequently transferring into consciousness. Admittedly, we use these two categories — consciousness and the subconscious — to account for the rather limited amount of processing that makes it all the way into awareness vs. the significant amount that remains hidden or submerged. A secondary assumption, the broader project of neuroscience in fact, is that, like free will, consciousness is housed somewhere in the brain or its categorical functions. Thus, fruitful inquiry results from seeking its root, seed, or seat as though the narrative constructed by the mind, the stream of consciousness, were on display to an inner observer or imp in what Daniel Dennett years ago called the Cartesian Theater. That time-worn conceit is the so-called ghost in the machine.

Lingua Nova 02

Posted: August 20, 2016 in Idle Nonsense, Nomenclature, Writing

From time to time, I indulge my predilection for nomenclature and neologism. Those collected below are not novel so much as repurposed.

code blue: the closing of ranks and circling of wagons undertaken by police departments in the wake of an officer (or officers) killing an unarmed citizen, usually black, accompanied by the insistence that the officer(s) by definition can do no wrong.

Christian fiction: an alternative narrative promulgated by Christian fundamentalists in a field of inquiry not based on faith and belief.

STEAM: the addition of Arts to STEM fields (Science, Technology, Engineering, Mathematics) to acknowledge the fundamental source of creativity and innovation.

nostalgia mining: the creation of entertainments that guilelessly seek to exploit the resource of fond remembrance of days past.

That’s a short list, to be sure. If you feel cheated, here’s an additional list of words sure to confound any interlocutors: espial, telic, desuetude, fubbed, girandoles, catarrh, avulse, gobbet, diaphanous, pipping, panicles, cark, cantilene, fisc, phylactery, princox, and funest. I knew only a few of them before stealing most from something I read and can’t imagine using any in speech unless I’m trying not to communicate.

In my travels and readings upon the Intertubes, which proceed in fits and starts, I stumbled across roughly the same term — The NOW! People — used in completely different contexts and with different meanings. Worth some unpacking for idle consideration.

Meaning and Usage the First: The more philosophical of the two, this refers to those who feel anxiety, isolation, estrangement, disenfranchisement, and alienation from the world in stark recognition of the self-other problem and/or mind-body dualism. They seek to lose their identity and the time-boundedness that goes with being a separate self by entering a mental state characterized by the eternal NOW, much as animals without consciousness are believed to think. Projection forward and back more than a few moments in time is foreclosed; one simply exists NOW! Seminars and YouTube videos on radical nonduality are offered by Tony Parsons, Jim Newman, Andreas Müller, and Kenneth Madden, but according to my source (unacknowledged and unlinked), they readily admit that despite study, meditation, openness, and desire to achieve this state of mind, it is not prone to being triggered. It either happens or it doesn’t. Nonetheless, some experiences and behaviors allow individuals to transcend themselves at least to some degree, such as music, dance, and sex.

Meaning and Usage the Second: The more populist and familiar of the two, this refers to people for whom NOW! is always the proper time to do whatever the hell they most urgently desire with no consideration given to those around them. The more mundane instance is someone stopping in a doorway or on an escalator to check their phone for, oh, I dunno, Facebook updates and new e-mail. A similar example is an automobile driver over whom traffic and parking controls have no effect: someone double-parked (flashers optional) in the middle of the road or in a fire lane, someone who executes a U-turn in the middle of traffic, or someone who pointlessly jumps the line in congestion just to get a few car lengths ahead only to sit in yet more traffic. The same disregard and disrespect for others is evident in those who insist on saving seats or places in line, or on the Chicago L, those who occupy seats with bags that really belong on their laps or stand blocking the doorways (typically arms extended, looking assiduously at their phones), making everyone climb past them to board or alight from the train. These examples are all about someone commandeering public space as personal space at the anonymous expense of anyone else unfortunate enough to be in the same location, but examples multiply quickly beyond these. Courtesy and other social lubricants be damned! I want what I want right NOW! and you can go pound sand.

Both types of NOW! behavior dissolve the thinking, planning, orchestrating, strategizing mind in favor of narrowing thought and perception to this very moment. The first gives up willfulness and desire in favor of tranquility and contentedness, whereas the second demonstrates single-minded pursuit of a single objective without thought of consequence, especially to others. Both types of NOW! People also fit within the Transhumanist paradigm, which has among its aims leaving behind worldly concerns to float freely as information processors. If I were charitable about The NOW! People, I might say they lose possession of themselves by absorption into a timeless, mindless present; if less charitable, I might say that annihilation of the self (however temporary) transforms them into automatons.

The sole appeal I can imagine to retreating from oneself to occupy the eternal moment, once one has glimpsed, sensed, or felt the bitter loneliness of selfhood, is cessation of suffering. To cross over into selflessness is to achieve liberation from want, or in the Buddhist sense, Nirvana. Having a more Romantic aesthetic, my inclination is instead to go deeper and to seek the full flower of humanity in all its varieties. That also means recognizing, acknowledging, and embracing darker aspects of human experience, and yes, no small amount of discomfort and suffering. Our psycho-spiritual capacity demands it implicitly. But it takes strong character to go toward extremes of light and dark. The NOW! People narrow their range radically and may well be the next phase of human consciousness if I read the tea leaves correctly.

An enduring trope of science fiction is the naming of newly imagined gadgets and technologies (often called technobabble with a mixture of humor and derision), as well as the naming of segments of human and alien societies. In practice, that means renaming already familiar things to add a quasi-futuristic gleam, and it’s a challenge faced by every story that adopts an alternative or futuristic setting: describing the operating rules of the fictional world but with reference to recognizable human characteristics and institutions. A variety of recent Young Adult (YA) fiction has indulged in this naming and renaming, some of which have been made into movies, mostly dystopic in tone, e.g., the Hunger Games tetralogy, the Twilight saga, the Harry Potter series, the Maze Runner, and the Divergent trilogy. (I cite these because, as multipart series, they are stronger cultural touchstones, e.g., Star Wars, than similar once-and-done adult cinematic dystopias, e.g., Interstellar and Elysium. Star Trek is a separate case, considering how it has devolved after being rebooted from its utopian though militaristic origins into a pointless series of action thrillers set in space.) Some exposition rises to the level of lore but is mostly mere scene-setting removed slightly from our own reality. Similar naming schemes are used in cinematic universes born of comic books, especially character names, powers, and origins. Because comic book source material is extensive, almost all of it becomes lore, which is enjoyed by longtime initiates who grew up with the alternate universes created by the writers and illustrators but mildly irritating to adult moviegoers like me.

History also has names for eras and events sufficiently far back in time for hindsight to provide a clear vantage point. In the U.S., we had the Colonial Era, the Revolutionary Period, the Frontier Era and Wild West, the Industrial/Mechanical Age, Modernism, and Postmodernism, to name a few but by no means all. Postmodernism is already roughly 40 years old, yet we have not named the era in which we now live. Indeed, because we’re the proverbial fish inside the fishbowl, unable to recognize the water in which we swim, the contemporary moment may have no need of naming, now or at any given time. That task awaits those who follow. We have, however, given names to the succession of generations following the Baby Boom. How well their signature characteristics fit their members is the subject of considerable debate.

As regular readers of this blog already know, I sense that we’re on the cusp of something quite remarkable, most likely a hard, discontinuous break from our recent past. Being one of the fish in the bowl, I probably possess no better understanding of our current phase of history than the next person. Still, if I had to choose one word to describe the moment, it would be dissolution. My 4-part blog post about dissolving reality is one attempt to provide an outline. A much older post called aged institutions considers the time-limited effectiveness of products of human social organization. The grand question of our time might be whether we are on the verge of breaking apart or simply transitioning into something new — will it be catastrophe or progress?

News this past week of Britain’s exit from the European Union may be only one example of break-up vs. unity, but the drive toward secession and separatism (tribal and ideological, typically based on bogus and xenophobic identity groups constantly thrown in our faces) has been gaining momentum even in the face of economic globalization (collectivism). Scotland very nearly seceded from the United Kingdom in 2014; Quebec has had multiple referenda about seceding from Canada, none yet successful; and Vermont, Texas, and California have all flirted with secession from the United States. No doubt some would argue that such examples of dissolution, actual or prospective, are actually transitional, meaning progressive. And perhaps they do in fact fulfill the need for smaller, finer, localized levels of social organization that many have argued are precisely what an era of anticipated resource scarcity demands. Whether what actually manifests will be catastrophe (as I expect it will be) or progress is, of course, something history and future historians will eventually name.

While I’m revisiting old posts, the term digital exhaust came up again in a very interesting article by Shoshana Zuboff called “The Secrets of Surveillance Capitalism,” published in Frankfurter Allgemeine in March 2016. I no longer remember how I came upon it, but its publication in an obscure (to Americans) German newspaper (German and English versions available) was easily located online with a simple title search. The article has certainly not gone viral the way social media trends do, but someone obviously picked it up, promoted it, and raised it to the level of awareness of lots of folks, including me.

My earlier remarks about digital exhaust were that the sheer volume of information produced and exchanged across digital media quickly becomes unmanageable, with the result that much of it becomes pointless ephemera — the useless part of the signal-to-noise ratio. Further, I warned that by turning our attention to digital sources of information of dubious value, quality, and authority, we face an epistemological shift that could take considerable hindsight to describe accurately. That was 2007. It may not yet be long enough to understand the effect(s) fully, or to crystallize the moment (to reuse my pet phrase yet again), but the picture is already clarifying, somewhat terrifyingly.


I don’t watch political debates. Being of sound mind and reason, I’m not part of the target audience. However, I do catch murmurs of the debates from time to time. Because torture is a sore subject with me, this excerpt (full transcript here) from the Feb. 6 debate moderated by World News Tonight anchor David Muir perked up my ears:

MUIR: … we’re going to stay on ISIS here and the war on terror, because as you know, there’s been a debate in this country about how to deal with the enemy and about enhanced interrogation techniques ever since 9/11.

So Senator Cruz, you have said, quote, “torture is wrong, unambiguously, period. Civilized nations do not engage in torture.” Some of the other candidates say they don’t think waterboarding is torture. Mr. Trump has said, I would bring it back. Senator Cruz, is waterboarding torture?

CRUZ: Well, under the definition of torture, no, it’s not. Under the law, torture is excruciating pain that is equivalent to losing organs and systems, so under the definition of torture, it is not. It is enhanced interrogation, it is vigorous interrogation, but it does not meet the generally recognized definition of torture.

MUIR: If elected president, would you bring it back?

CRUZ: I would not bring it back in any sort of widespread use. And indeed, I joined with Senator McCain in legislation that would prohibit line officers from employing it because I think bad things happen when enhanced interrogation is employed at lower levels.

But when it comes to keeping this country safe, the commander in chief has inherent constitutional authority to keep this country safe. And so, if it were necessary to, say, prevent a city from facing an imminent terrorist attack, you can rest assured that as commander in chief, I would use whatever enhanced interrogation methods we could to keep this country safe.

Cruz is obviously squirming to avoid answering the simple questions directly and unambiguously. Whose definition has Cruz cited? Certainly not one of these. Another page at the previous link says plainly that waterboarding is “torture plus” precisely because of its ability to inflict “unbearable suffering with minimal evidence” repeatedly. Relying on some unsubstantiated definition to keep waterboarding among available interrogation options and then invoking the ticking time bomb scenario is callous and inhumane. Cruz is unfit as a presidential candidate for lots of reasons, but his stance on torture is an automatic disqualification for me.

Muir then turns the same question(s) over to Trump:

MUIR: Senator Cruz, thank you. Mr. Trump, you said not only does it work, but that you’d bring it back.

TRUMP: Well, I’ll tell you what. In the Middle East, we have people chopping the heads off Christians, we have people chopping the heads off many other people. We have things that we have never seen before — as a group, we have never seen before, what’s happening right now.

The medieval times — I mean, we studied medieval times — not since medieval times have people seen what’s going on. I would bring back waterboarding and I’d bring back a hell of a lot worse than waterboarding.

Trump, in contrast to Cruz, doesn’t squirm at all (though he does struggle to complete a sentence, resorting instead to a stammering, repetitive word salad no one seems to mind). Instead, he goes full war criminal without hesitation (though at this point in time it’s only postulated). Trump’s polarizing, inflammatory style has earned him both severe disapprobation and earnest support. Like Cruz, Trump has a variety of automatic disqualifications as a presidential candidate. My thinking is that, even though I can’t peer into his mind and guess his true motivations (which may be as obvious as they appear) or anticipate his behavior should he attain office, his moral judgment vis-à-vis torture (and frankly, most other topics as well) is so impaired that I don’t trust him as a playground monitor.

In narrative, there are four essential types of conflict:

  1. man against man
  2. man against society
  3. man against nature
  4. man against self

One might argue that Cruz, Trump, and their supporters who applaud “get tough” rhetoric (add Hillary Clinton to this group) fall into the first category, ever battling enemies like besieged heroes. I would argue they fall into the fourth as well, battling their own inhumanity, though there is a notable lack of wrestling with anything approaching a conscience. But in truth, debate over torture might better be categorized as man against everything, considering who and what is destroyed even by entertaining the fantasy of torturing others. Some still argue that a strategic advantage can be retained using torture, whereas Trump (always the extremist) merely relishes the possibility of obliterating others. However, we become monsters by keeping the option alive.