Posts Tagged ‘Book Blogging’

Here’s the last interesting bit I am lifting from Anthony Giddens’s The Consequences of Modernity. Then I will be done with this particular book-blogging project. As part of Giddens’s discussion of the risk profile of modernity, he characterizes risk as either objective or perceived and further divides it into seven categories:

  1. globalization of risk (intensity)
  2. globalization of risk (frequency)
  3. environmental risk
  4. institutionalized risk
  5. knowledge gaps and uncertainty
  6. collective or shared risk
  7. limitations of expertise

Some overlap exists, and I will not distinguish them further. The first two are of primary significance today for obvious reasons. Although the specter of doomsday resulting from a nuclear exchange has been present since the 1950s, Giddens (writing in 1988) provides this snapshot of today’s issues:

The sheer number of serious risks in respect of socialised nature is quite daunting: radiation from major accidents at nuclear power-stations or from nuclear waste; chemical pollution of the seas sufficient to destroy the phytoplankton that renews much of the oxygen in the atmosphere; a “greenhouse effect” deriving from atmospheric pollutants which attack the ozone layer, melting part of the ice caps and flooding vast areas; the destruction of large areas of rain forest which are a basic source of renewable oxygen; and the exhaustion of millions of acres of topsoil as a result of widespread use of artificial fertilisers. [p. 127]

As I often point out, these dangers were known 30–40 years ago (in truth, much longer), but they have only worsened with time through political inaction and/or social inertia. After I began to investigate and better understand the issues roughly a decade ago, I came to the conclusion that the window of opportunity to address these risks and their delayed effects had already closed. In short, we’re doomed and living on borrowed time as the inevitable consequences of our actions slowly but steadily manifest in the world.

So here’s the really interesting part. The modern worldview bestows confidence borne out of expanding mastery of the built environment, where risk is managed and reduced through expert systems. Mechanical and engineering knowledge figure prominently and support a cause-and-effect mentality that has grown ubiquitous in the computing era, with its push-button inputs and outputs. However, the high modern outlook is marred by overconfidence in our competence to avoid disaster, often of our own making. Consider the abject failure of 20th-century institutions to handle geopolitical conflict without devolving into world war and multiple genocides. Or witness periodic crashes of financial markets, two major nuclear accidents, and numerous space shuttles and rockets destroyed. Though all entail risk, high-profile failures showcase our overconfidence. Right now, engineers (software and hardware) are confident they can deliver safe self-driving vehicles yet are blithely ignoring (says me, maybe not) major ethical dilemmas regarding liability and technological unemployment. Those are apparently problems for someone else to solve.

Since the start of the Industrial Revolution, we’ve barrelled headlong into one sort of risk after another, some recognized at the time, others only apparent after the fact. Nuclear weapons are the best example, but many others exist. The one I raise frequently is the live social experiment undertaken with each new communications technology (radio, cinema, telephone, television, computer, social networks) that upsets and destabilizes social dynamics. The current ruckus fomented by the radical left (especially in the academy but now infecting other environments) regarding silencing of free speech (thus, thought policing) is arguably one concomitant.

According to Giddens, the character of modern risk contrasts with that of the premodern. The scale of risk prior to the 17th century was contained, and expectation of social continuity was strong. Risk was also transmuted through magical thinking (superstition, religion, ignorance, wishfulness) into providential fortuna or mere bad luck, which led to feelings of relative security rather than despair. Modern risk has now grown so widespread, consequential, and soul-destroying, situated at such considerable remove that it produces feelings of helplessness and hopelessness, that those not numbed by the litany of potential worries afflicting daily life (existential angst or ontological insecurity) often develop depression and other psychological compulsions and disturbances. Most of us, if aware of globalized risk, set it aside so that we can function and move forward in life. Giddens says that this conjures up anew a sense of fortuna, that our fate is no longer within our control. This

relieves the individual of the burden of engagement with an existential situation which might otherwise be chronically disturbing. Fate, a feeling that things will take their own course anyway, thus reappears at the core of a world which is supposedly taking rational control of its own affairs. Moreover, this surely exacts a price on the level of the unconscious, since it essentially presumes the repression of anxiety. The sense of dread which is the antithesis of basic trust is likely to infuse unconscious sentiments about the uncertainties faced by humanity as a whole. [p. 133]

In effect, the nature of risk has come full circle (completed a revolution, thus, revolutionized risk) from fate to confidence in expert control and back to fate. Of course, a flexibility of perspective is typical as situation demands — it’s not all or nothing — but the overarching character is clear. Giddens also provides this quote by Susan Sontag that captures what he calls the low-probability, high-consequence character of modern risk:

A permanent modern scenario: apocalypse looms — and it doesn’t occur. And still it looms … Apocalypse is now a long-running serial: not ‘Apocalypse Now,’ but ‘Apocalypse from now on.’ [p. 134]


Here’s another interesting tidbit from Anthony Giddens’ book The Consequences of Modernity, which is the subject of a series of book blogs I’ve been writing. In his discussion of disembedding mechanisms, he introduces the idea of civil inattention (from Goffman, actually). This has partly to do with presence or absence (including inattention) in both public and private settings where face-to-face contact used to be the only option but modern technologies have opened up the possibility of faceless interactions over distance, such as with the telegraph and telephone. More recently, the face has been reintroduced with videoconferencing, but nonverbal cues such as body language are largely missing; the fullness of communication remains attenuated. All manner of virtual presence and telepresence are in fact cheap facsimiles of true presence and the social cohesion and trust enabled by what Giddens calls facework commitments. Of course, we delude ourselves that interconnectivity mediated by electronics is a reasonable substitute for presence and attention, which fellow blogger The South Roane Agrarian bemoans with this post.

Giddens’ meaning is more specific than this, though. The inattention of which Giddens writes is not the casual distraction of others with which we are all increasingly familiar. Rather, Giddens takes note of social behaviors embedded in deep culture having to do with signalling trust.

Two people approach and pass one another on a city sidewalk. What could be more trivial and uninteresting? … Yet something is going on here which links apparently minor aspects of bodily management to some of the most pervasive features of modernity. The “inattention” displayed is not indifference. Rather it is a carefully monitored demonstration of what might be called polite estrangement. As the two people approach one another, each rapidly scans the face of the other, looking away as they pass … The glance accords recognition of the other as an agent and as a potential acquaintance. Holding the gaze of the other only briefly, then looking ahead as each passes the other couples such an attitude with an implicit reassurance of lack of hostile intent. [p. 81]

It’s a remarkably subtle interaction: making eye contact to confirm awareness of another but then averting one’s eyes to establish that copresence poses no particular threat in either direction. Staring too fixedly at another communicates something quite else, maybe fear or threat or disapprobation. By denying eye contact — by keeping one’s eyes buried in a handheld device, for instance — the opportunity to establish a modicum of trust between strangers is missed. Intent (or lack thereof) is a mystery. In practice, such modern-day inattention is mere distraction, not a sign of malevolence, but the ingrained social cue is obviated and otherwise banal happenstances become sources of irritation, discomfort, and/or unease, as with someone who doesn’t shake hands or perform other types of greeting properly.

I wrote before about my irritation with others face-planted in their phones. It is not a matter of outright offense but rather a quiet sense of affront at failure to adopt accepted social behaviors (as I once did). Giddens puts it this way:

Tact and rituals of politeness are mutual protective devices, which strangers or acquaintances knowingly use (mostly on the level of practical consciousness) as a kind of implicit social contact. Differential power, particularly where it is very marked, can breach or skew norms …. [pp. 82–83]

That those social behaviors have adapted to omnipresent mobile media, everyone pacified or hypnotized within their individual bubbles, is certainly not a salutary development. It is, however, a clear consequence of modernity.

Another modest surprise (to me at least) offered by Anthony Giddens (from The Consequences of Modernity) follows a discussion of reflexivity (what I call recursion when discussing consciousness), which is the dynamic of information and/or knowledge feeding back to influence later behavior and information/knowledge. His handy example is the populace knowing divorce rates, which has an obvious influence on those about to get married (who may then decide to defer or abjure marriage entirely). The surprise is this:

The discourse of sociology and the concepts, theories, and findings of the other social sciences continually “circulate in and out” of what it is that they are about. In so doing they reflexively restructure their subject matter, which itself has learned to think sociologically … Much that is problematic in the position of the professional sociologist, as the purveyor of expert knowledge about social life, derives from the fact that she or he is at most one step ahead of enlightened lay practitioners of the discipline. [p. 43]

I suppose “enlightened lay practitioners” are not the same as the general public, which I hold in rather low esteem as knowing (much less understanding) anything of value. Just consider electoral politics. Still, the idea that an expert in an academic field admits he is barely ahead of wannabes (like me) seems awfully damning. Whereas curious types will wade in just about anywhere, and in some cases, amateurs will indulge themselves enthusiastically in endeavors also practiced by experts (sports and music are the two principal examples that spring to mind), the distance (in both knowledge and skill) between experts and laypersons is typically quite far. I suspect those with high intellect and/or genetic gifts often bridge that gap, but then they join the ranks of the experts, so the exception leads nowhere.

I revisit my old blog posts when I see some reader activity in the WordPress backstage and was curious to recall a long quote of Iain McGilchrist summarizing arguments put forth by Anthony Giddens in his book Modernity and Self-identity (1991). Giddens had presaged recent cultural developments, namely, the radicalization of nativists, supremacists, Social Justice Warriors (SJWs), and others distorted by absorption in identity politics. So I traipsed off to the Chicago Public Library (CPL) and sought out the book to read. Regrettably, CPL didn’t have a copy, so I settled on a slightly earlier book, The Consequences of Modernity (1990), which is based on a series of lectures delivered at Stanford University in 1988.

Straight away, the introduction provides a passage that goes to the heart of matters with which I’ve been preoccupied:

Today, in the late twentieth century, it is argued by many, we stand at the opening of a new era … which is taking us beyond modernity itself. A dazzling variety of terms has been suggested to refer to this transition, a few of which refer positively to the emergence of a new type of social system (such as the “information society” or the “consumer society”) but most of which suggest rather that a preceding state of affairs is drawing to a close … Some of the debates about these matters concentrate mainly upon institutional transformations, particularly those which propose that we are moving from a system based upon the manufacture of material goods to one concerned more centrally with information. More commonly, however, those controversies are focused largely upon issues of philosophy and epistemology. This is the characteristic outlook, for example, of the author who has been primarily responsible for popularising the notion of post-modernity, Jean-François Lyotard. As he represents it, post-modernity refers to a shift away from attempts to ground epistemology and from faith in humanly engineered progress. The condition of post-modernity is distinguished by an evaporating of the “grand narrative” — the overarching “story line” by means of which we are placed in history as beings having a definite past and a predictable future. The post-modern outlook sees a plurality of heterogeneous claims to knowledge, in which science does not have a privileged place. [pp. 1–2, emphasis added]

That’s a lot to unpack all at once, but the fascinating thing is that notions now manifesting darkly in the marketplace of ideas were already in the air in the late 1980s. Significantly, this was several years still before the Internet brought the so-called Information Highway to computer users, before the cell phone and smart phone were developed, and before social media displaced traditional media (TV was only 30–40 years old but had previously transformed our information environment) as the principal way people gather news. I suspect that Giddens has more recent work that accounts for the catalyzing effect of the digital era (including mobile media) on culture, but for the moment, I’m interested in the book in hand.

Regular readers of this blog (I know of one or two) already know my armchair social criticism directed to our developing epistemological crisis (challenges to authority and expertise, psychotic knowledge, fake news, alternative facts, dissolving reality, and science denial) as well as the Transhumanist fantasy of becoming pure thought (once we evolve beyond our bodies). Until that’s accomplished with imagined technology, we increasingly live in our heads, in the abstract, disoriented and adrift on a bewildering sea of competing narratives. Moreover, I’ve stated repeatedly that highly mutable stories (or narratives) underlie human cognition and consciousness, making most of us easy marks for charismatic thought leaders and storytellers. Giddens was there nearly 30 years ago with these same ideas, though his terms differ.

Giddens dispels the idea of post-modernity and insists that, from a sociological perspective, the current period is better described as high modernism. This reminds me of Oswald Spengler and my abandoned book blogging of The Decline of the West. It’s unimportant to me who got it more correct but note that the term Postmodernism has been adopted widely despite its inaccuracy (at least according to Giddens). As I get further into the book, I’ll have plenty more to say.

I picked up a copy of Daniel Siegel’s book Mind: A Journey to the Heart of Being Human (2017) to read and supplement my ongoing preoccupation with human consciousness. Siegel’s writing is the source of considerable frustration. Now about 90 pp. into the book (I am considering putting it aside), he has committed several grammatical errors (where are book editors these days?), doesn’t really know how to use a comma properly, and doesn’t write in recognizable paragraph form. He has a bad habit of posing questions to suggest the answers he wants to give and drops constant hints of something soon to be explored, like news broadcasts that tease the next segment. He also deploys a tired, worn metaphor that readers are on a journey of discovery with him, embarked on a path, exploring a subject, etc. Yecch. (A couple of Amazon reviews also note that grayish type on parchment (cream) paper poses a legibility problem due to poor contrast even in good light — undoubtedly not really Siegel’s fault.)

Siegel’s writing is also irritatingly circular, casting and recasting the same sentences in repetitious series of assertions that have me wondering frequently, “Haven’t I already read this?” Here are a couple examples:

When energy flows inside your body, can you sense its movement, how it changes moment by moment?

then only three sentences later

Energy, and energy-as-information, can be felt in your mental experience as it emerges moment by moment. [p. 52]

Another example:

Seeing these many facets of mind as emergent properties of energy and information flow helps link the inner and inter aspect of mind seamlessly.

then later in the same paragraph

In other words, mind seen this way could be in what seems like two places at once as inner and inter are part of one interconnected, undivided system. [p. 53]

This is definitely a bug, not a feature. I suspect the book could easily be condensed from 330 pp. to less than 200 pp. if the writing weren’t so self-indulgent. Indeed, while I recognize a healthy dose of repetition is an integral part of narrative form (especially in music), Siegel’s relentless repetition feels like propaganda 101, where guileless insistence (of lies or merely the preferred story one seeks to plant in the public sphere) wears down the reader rather than convinces him or her. This is also marketing 101 (e.g., Coca-Cola, McDonald’s, Budweiser, etc. continuing to advertise what are by now exceedingly well-established brands).


I finished Graham Hancock’s Fingerprints of the Gods (1995). He saved the best part of the book, an examination of Egyptian megalithic sites, for the final chapters and held back his final conclusion — more conjecture, really — for the tail end. The possible explanation Hancock offers for the destruction and/or disappearance of a supposed civilization long predating the Egyptian dynasties, the subject of the entire book, is earth-crust displacement, a theory developed by Charles Hapgood relating to polar shifts. Long story short, evidence demonstrates that the Antarctic continent used to be 2,000 miles away from the South Pole (about 30° from the pole) in a temperate zone and may have been, according to Hancock, the home of a seafaring civilization that had traveled and mapped the Earth. It’s now buried under ice. I find the explanation plausible, but I wonder how much the science and research has progressed since the publication of Fingerprints. I have not yet picked up Magicians of the Gods (2015) to read Hancock’s update but will get to it eventually.

Without having studied the science, several competing scenarios exist regarding how the Earth’s crust, the lithosphere, might drift, shift, or move over the asthenosphere. First, it’s worth recognizing that the Earth’s rotational axis defines the two poles, which are near but not coincident with magnetic north and south. Axial shifts are figured in relation to the crust, not the entire planet (crust and interior). From a purely geometric perspective, I could well imagine the crust and interior rotating at different speeds, but since we lack more than theoretical knowledge of the Earth’s liquid interior (the inner core is reputedly solid), only the solid portions at the surface of the sphere offer a useful frame of reference. The liquid surfaces (oceans, seas) obviously flow, too, but are also understood primarily in relation to the solid crust both below and above sea level.

The crust could wander slowly and continuously, shift all at once, or some combination of both. If all at once, the inciting event might be a sudden change in magnetic stresses that breaks the entire lithosphere loose or perhaps a gigantic meteor hit that knocks the planet as a whole off its rotational axis. Either would be catastrophic for living things that are suddenly moved into a substantially different climate. Although spacing of such events is unpredictable and irregular, occurring in geological time, Hancock assembles considerable evidence to conclude that the most recent such occurrence was probably about 12,000 BCE at the conclusion of the last glacial maximum or ice age. This would have been well within the time humans existed on Earth but long enough ago in our prehistory that human memory systems record events only as unreliable myth and legend. They are also recorded in stone, but we have yet to decipher their messages fully other than to demonstrate that significant scientific knowledge of astronomy and engineering was once possessed by mankind but then lost until redeveloped during the last couple of centuries.

As I read into Fingerprints of the Gods by Graham Hancock and learn more about antiquity, it becomes clear that weather conditions on Earth were far more hostile then (say, 15,000 years ago) than now. Looking way, way back into millions of years ago, scientists have plotted global average temperature and atmospheric carbon, mostly using ice cores as I understand it, yielding this graph:


I’ve seen this graph before, which is often used by climate change deniers to show a lack of correlation between carbon and temperature. That’s not what concerns me. Instead, the amazing thing is how temperature careens up and down quickly (in geological time) between two limits, 12°C and 22°C, and forms steady states known as Ice Age Earth and Hot House Earth. According to the graph, we’re close to the lower limit. It’s worth noting that because of the extremely long timescale, the graph is considerably smoothed.


Every blog post I write suffers from the same basic problem: drawing disparate ideas together in succinct, cogent form that expresses enough of the thesis to make sense while leaving room for commentary, discussion, and development. Alas, commentary and discussion are nearly nonexistent, but that’s always been my expectation and experience given my subjects. When expanding a blog into several parts, the greatest risk is that ideas fail to coalesce legibly, compounded by the unlikelihood that readers who happen to navigate here will bother to read all the parts. (I suspect this is due in part to most readers’ inability to comprehend complex, multipart writing, as discussed in this blog post by Ugo Bardi describing surprising levels of functional illiteracy.) So this addendum to my three-part blog on Dissolving Reality is doomed, like the rest of my blog, to go unread and ignored. Plus ça change

Have you had the experience of buying a new model of vehicle and suddenly noticed other vehicles of the same model on the road? That’s what I’ve been noticing since I hatched my thesis (noting with habitual resignation that there is nothing new under the sun), which is that the debased information environment now admits multiple interpretations of reality, none of which can lay exclusive claim to authority as an accurate account. Reality has instead dissolved into a stew of competing arguments, often extremely politicized, which typically appeal to emotion. Historically, the principal conflict was between different ways of knowing exemplified by faith and reason, perhaps better understood as the church (in the West, the Catholic Church) vs. science. Floodgates have now opened to any wild interpretation one might concoct, all of which coexist on roughly equal footing in the marketplace of ideas. (more…)

Continuing from part 1 and part 2, let me add one further example of how meaning is reversed under the Ironic perspective. At my abandoned group blog, Creative Destruction, which garners more traffic than The Spiral Staircase despite being woefully out of date, the post that gets the most hits argues (without irony) that, in the Star Wars universe, the Empire represents the good guys and the Jedi are the terrorists despite the good vs. evil archetypes being almost cartoonishly drawn, with the principal villain having succumbed to the dark side only to be redeemed by his innate goodness in the 11th hour. The reverse argument undoubtedly has some merit, but it requires overthinking and outsmarting oneself to arrive at the backwards conclusion. A similar dilemma of competing perspectives is present in The Avengers, where Captain America is unconflicted in his all-American goodness and straightforward identification of villainy but is surrounded by other far-too-clever superheroes who overanalyze (snarkily so), cannot agree on strategy, and/or question motivations and each other’s double or triple agency. If I understand correctly, this plot hook is the basis for the civil war among allies in the next Avengers movie.

The Post-Ironic takes the reversal of meaning and paradoxical retention of opposites that characterizes the Ironic and expands issues from false dualisms (e.g., either you’re with us or against us) to multifaceted free-for-alls where anyone’s wild interpretation of facts, events, policy, and strategy has roughly equal footing with another’s precisely because no authority exists to satisfy everyone as to the truth of matters. The cacophony of competing viewpoints — the multiplicity of possible meanings conjured from any collection of evidence — virtually guarantees that someone out there (often someone loony) will speak as though reading your mind. Don’t trust politicians, scientists, news anchors, pundits, teachers, academics, your parents, or even the pope? No problem. Just belly up to the ideological buffet and cherry pick from any of a multitude of viewpoints, few of which have much plausibility. But no matter: it’s a smorgasbord of options, and almost none of them can be discarded out of hand for being too beyond the pale. All must be tried and entertained.

One of the themes of this blog is imminent (i.e., occurring within the lifetimes of most readers) industrial collapse resulting from either financial collapse or loss of habitat for humans (or a combination of factors). Either could happen first, but my suspicion is that financial collapse will be the lit fuse leading to explosion of the population bomb. Collapse is quite literally the biggest story of our time despite its being prospective. However, opinion on the matter is loose, undisciplined, and ranges all over the map. Consensus within expert bodies such as the Intergovernmental Panel on Climate Change, assembled specifically to study climate change and report its findings, ought to put an end to controversy, yet waters have been so muddied by competing narratives that credulous folks, if they bother paying attention at all, can’t really tell whom to believe. It doesn’t help that even well-educated folks, including many professionals, often lack the critical thinking skills with which to evaluate evidence. So instead, wishy-washy emotionalism and psychological vulnerability award hearts and minds to the most charismatic storyteller, not the truth-teller.

Perhaps the best instance of multiple meanings being simultaneously present and demanding consideration is found in the game of poker, which has become enormously popular in the past decade. To play the game effectively, one must weigh the likelihood and potential for any one of several competing narratives based on opponents’ actions. Mathematical analysis and intuition combine to recommend which scenario is most likely true and whether the risk is worth it (pot odds). If, for just one example, an opponent bets big at any point in the poker hand, several scenarios must be considered:

  • the opponent has made his hand and cannot be beaten (e.g., nut flush, full house)
  • the opponent has a dominating hand and can be beaten only if one draws to make a better hand (e.g., top pair with high kicker or two pair)
  • the opponent has not yet fully made his hand and is on a draw (open-ended straight or four cards to a flush)
  • the opponent has a partial or weak hand and is bluffing at the pot

Take note that, as with climate change, evaluation in poker is prospective. Sometimes an opponent’s betting strategy is discovered in a showdown where players must reveal their cards; but often, one player or another mucks or folds and the actual scenario is undisclosed. The truth of climate change, until the future manifests, is to some tantalizingly unknown and contingent, as though it could be influenced by belief, hope, and/or faith. To rigorous thinkers, however, the future is charted for us with about the same inevitability as the sun rising in the morning — the biggest remaining unknown being timing.

Habitual awareness of multiple, competing scenarios extends well beyond table games and climate change. In geopolitics, the refusal to rule out the nuclear option, even when it would be completely disproportionate to a given provocation, is reckless brinkmanship. The typical rhetoric is that, like fighting dogs, any gesture of backing down would be interpreted as a display of submission or weakness and thus invite attack. So is the provocation or the response a bluff, a strong hand, or both? Although it is difficult to judge how U.S. leadership is perceived abroad (since I’m inside the bubble), the historical record demonstrates that the U.S. never hesitates to get mixed up in military action and adopts overweening strategies to defeat essentially feudal societies (e.g., Korea and Vietnam). Never mind that those strategies have been shown to fail or that those countries represented no credible threat to the U.S. Our military escapades in the 21st century are not so divergent, with the perception of threats being raised well beyond their true proportions relative to any number of health and social scourges that routinely kill many more Americans than terrorism ever did.

Because this post is already running long, conclusions will be in an addendum. Apologies for the drawn out posts.

Continuing from part 1, the Ironic is characterized by (among other things) reversal of meaning, sometimes understood as the unexpected manifested but more commonly as sarcasm. The old joke goes that in pompous, authoritarian fashion, the language/semiotics professor says to his class of neophytes, “In many languages, a double negative equals a positive, but in no language does a double positive make a negative.” In response, a student mutters under his breath, “yeah, right ….” Up to a certain age and level of cognitive development, children don’t process sarcasm; they are literal-minded and don’t understand subtext. Transcripts and text (e.g., blog posts and comments) also typically fail to transmit nonverbal cues that one may be less than earnest making certain statements. Significantly, no one is allowed to make offhand jokes in line at security checkpoints because, in that context, remarks such as “yeah, like my shoes are full of C4” are treated quite literally.

I have a vague memory of the period in my adolescence when I discovered sarcasm, at which time it was deployed almost continuously, saying the opposite of what I meant with the expectation that others (older than me) would understand the implied or latent meaning. I also adopted the same mock abuse being used elsewhere, which regrettably lasted into my late 20s. Maybe it’s a phase everyone must go through, part of growing up, and as a society, our cultural development must also pass through that phase, though I contend we remain mired in irony or ironic posturing.

The model for me was insult comedy, still in style now but more familiar from my childhood. Like most children during this developmental phase, I accepted the TV as a social tutor for how people communicate and what’s acceptable to say. So who can blame me or other children, fed a diet of snark and attitude (the adult writers of TV shows being a lot more clever than the adolescent actors who voice the lines), for speaking the same way? But to appreciate irony more directly, consider the comedian (then and now) who levies criticism using clichés drawn from his or her own gender, race, religion, social class, etc. In comedy, sexism, racism, and class conflict are not just joke fodder but stereotyped bigotry that reinforces the very scourges it ostensibly criticizes. Oh, sure, the jokes are often funny. We all know to laugh at the black comedian who trades nonstop in nigger jokes or the female comedian who complains of being nothing more than an object for male titillation. Comedians (and special interest groups — minority or not — that lay claim to victimhood) may co-opt the language of their oppressors (some actual, some imagined — see, for instance, those complaining about the War on Christmas), but the language and attitudes are broken down and reinforced at the same time.

This isn’t solely the domain of comedy, either. TV sitcoms are ruled by hip, ironic posturing: the show about nothing that plumbs the surprising depths inside everything trivial, banal, and inane; the show full of nerd archetypes who rise above their inherent nerdiness to become real people worthy of respect (or, not surprisingly, not so worthy after all); or the endless parade of sitcom families with unrealistically precocious, smart-aleck kids who take aim at everyone with a continuous stream of baleful insults, take-downs, and mockery but who are, despite truly cretinous behavior, always forgiven (or passed over because another joke is imminent) and still lovable. Meanwhile, in the virtual world (the Internet, where you are reading this), sarcasm, snark, irony, abuse, and corrosive jokiness are legion. Take, for instance, this video and tell me there isn’t something deeply wrong with it:

One might wonder whether the intent is interdiction or recruitment (or both at once), especially if one acknowledges that most of the awful things depicted in the video are precisely what the U.S. military has been doing in the Middle East for well over a decade. The Fox News blurb linked below the video says, “The State Department is launching a tough and graphic propaganda counteroffensive against the Islamic State, using some of the group’s own images of barbaric acts against fellow Muslims to undercut its message.” Maybe the word propaganda is a mistake and publicity was intended, but I suspect that propaganda is the right word precisely because it’s understood as both pejorative and superlative. As with everything else, meaning has become polysemous.

Iain McGilchrist illustrates this with special emphasis on the arts and how the substitution of symbolic tokens normalizes distortion. For instance, the art theory of the Aesthetes contains a fundamental paradox:

The Aesthetes’ creed of ‘art for art’s sake’, while it sounds like an elevation of the value of art, in that it denies that it should have an ulterior purpose beyond itself — so far so good — is also a devaluation of art, in that it marginalizes its relationship with life. In other words it sacrifices the betweenness of art with life, instead allowing art to become self-reflexively fulfilled. There is a difference between the forlorn business of creating ‘art for art’s sake’, and art nonetheless being judged solely ‘as art’, not as for another purpose. [p. 409]

Isolating artistic creation in a mental or virtual transactional space ought to be quite familiar (or perhaps more accurately, assumed and thus invisible) to 21st-century people, but it was a new concept at the outset of the 20th century. The paradox is that the doctrine is both a reversal of meaning and a retention of opposites together. Over the course of the 20th century, we became habituated to such thinking, namely, that a thing automatically engenders its opposite and is both things at once. For instance, what used to be called the War on Poverty, meant to help those suffering deprivation, is now also its reverse: literally a war on the poverty-stricken. Similarly, the War on Drugs, meant to eradicate drug use as a social ill, is also quite literally a war against drug users, who make up a disproportionate share of the bloated U.S. prison population. Reduction of government services to the poor and rampant victim-blaming demonstrate that programs once meant to assist those in need now often leave them to fend for themselves or, worse, pile on with criminal charges. Disinformation campaigns about welfare cheats and the minimum wage are further examples of information being distorted to serve an unwholesome agenda.

My conclusion is not yet ready to be drawn; it’s far too subtle to fit in a Tweet or even a series of blog posts. However, consider what it means when the language we use is laden with ironic twists that force recipients of any message to hold forward/backward, up/down, left/right, and true/false meanings simultaneously. Little can be established beyond reasonable doubt, and not only because so many of us have been poorly served by educational institutions more interested in business and credentialing than in teaching and learning (or is the fault with the students themselves — sort of a chicken-and-egg question), leaving few with the ability to assess and evaluate information (ironically, from a variety of perspectives) being spun and spoon-fed to us by omnimedia. More fundamentally, the underlying structure of language and communications has been corrupted by disembedding, decontextualization, and deconstruction, which relegate reality to something dreamt up and then used to convince others. In the end, of course, we’re only fooling ourselves.