Posts Tagged ‘Communications’

I saw something a short while back that tweaked my BS meter into the red: the learning pyramid. According to research done by The NTL Institute for Applied Behavioral Science in the 1960s (… behaviorists, ugh) and reported here (among other places), there are widely disparate rates of knowledge retention across different passive and active teaching methods:

[image: the learning pyramid]

Let me state first something quite obvious: learning and retention (memory) aren’t the same things. If one seeks sheer retention of information as a proxy for learning, that’s a gross misunderstanding of both cognition and learning. For example, someone who has managed to memorize, let’s say, baseball statistics going back to the 1950s or Bible verses, may have accomplished an impressive mental task not at all aligned with normal cognitive function (the leaky bucket analogy is accurate), but neither example qualifies someone as learned the way most understand the term. Information (especially raw data) is neither knowledge, understanding, nor wisdom. They’re related, sure, but not the same (blogged about this before here). Increasing levels of organization and possession are required to reach each threshold.

The passive/active (participatory) labels are also misleading. To participate actively, one must have something to contribute, to be in possession of knowledge/skill already. To teach something effectively, one must have greater expertise than one’s students. Undoubtedly, teaching others solidifies one’s understanding and expertise, and further learning is often a byproduct, but one certainly can’t begin learning a new subject area by teaching it. Information (input) needs to come from elsewhere, which understandably has a lower retention rate until it’s been gone over repeatedly and formed the cognitive grooves that represent acquisition and learning. This is also the difference between reception and expression in communications. One’s expressive vocabulary (the words one can actually deploy in speech and writing) is a subset of one’s receptive vocabulary (the words one can understand readily upon hearing or reading). The expressive vocabulary is predicated on prior exposure that imbues less common words with power, specificity, and nuance. While it’s possible to learn new words quickly (in small quantities), it’s not generally possible to skip repetition that enables memorization and learning. Anyone studying vocabulary lists for the SAT/ACT (as opposed to a spelling bee) knows this intuitively.

Lastly, where exactly is most prospective knowledge and skill located, inside the self or outside? One doesn’t simply peel back layers of the self to reveal knowledge. Rather, one goes out into the world and seeks it (or doesn’t, sadly), disturbing it from its natural resting place. The great repositories of knowledge are books and other people (especially those who write books — whoda thunk?). So confronting knowledge, depending on the subject matter, occurs more efficiently one-on-one (an individual reading a book) or in groups (25 or so students in a classroom headed by 1 teacher). The inefficient 1:1 ratio of student to teacher (a/k/a tutoring) is obviously available to those who place a high enough value on learning to hire a tutor. However, that’s not how education (primary through postgraduate) is undertaken in most cases. And just imagine the silliness of gathering a classroom of students just so the one person doing the teaching can learn with 90% retention, as the learning pyramid would suggest.

This is the inverse of a prior post called “Truth Based on Fiction.”

Telling stories about ourselves is one of the most basic of human attributes stretching across oral and recorded history. We continue today to memorialize events in short, compact tellings, frequently movies depicting real-life events. I caught two such films recently: Truth (about what came to be known as Rathergate) and Snowden (about whistle-blower Edward Snowden).

Although Dan Rather is the famous figure associated with Truth, the story focuses more on his producer Mary Mapes and the group decisions leading to the airing of a controversial news report about George W. Bush’s time in the Air National Guard. The film is a dramatization, not a documentary, and so is free to present the story with its own perspective and some embellishment. Since I’m not a news junkie, my memory of the events in 2004 surrounding the controversy is not especially well informed, and I didn’t mind the potential for the movie’s version of events to color my thinking. About some controversies and conspiracies, I feel no particular demand to adopt a strong position. The actors did well enough, but I felt Robert Redford was poorly cast as Dan Rather. Redford is too famous in his own right to succeed as a character actor playing a real-life person.

Debate over the patriotism or treason of Edward Snowden’s actions continues to swirl, but the film covers the issues pretty well, from his discovery of an intelligence services surveillance dragnet (in violation of the 4th Amendment to the U.S. Constitution) to his eventual disclosure of same to a few well-respected journalists. The film’s director and joint screenwriter, Oliver Stone, has made a career out of fiction based on truth, dramatizing many signal events from the nation’s history, repackaging them as entertainment in the process. I’m wary of his interpretations of history when presented in cinematic form, less so his alternative history lessons given as documentary. Unlike Truth, however, I have clear ideas in my mind regarding Snowden the man and Snowden the movie, so from a different standpoint, I was again unconcerned about potential bias. Joseph Gordon-Levitt does well enough as the titular character, though he doesn’t project nearly the same insight and keen intelligence as Snowden himself does. I suspect the documentary Citizenfour (which I’ve not yet seen), featuring Snowden doing his own talking, is a far better telling of the same episode of history.

In contrast, I have assiduously avoided several other recent films based on actual events. United 93, World Trade Center, and Deepwater Horizon spring to mind, but there are many others. The wounds and controversies stemming from those real-life events still smart too much for me to consider exposing myself to propagandistic historical fictions. Perhaps in a few decades, after living memory of such events has faded or disappeared entirely, such stories can be told effectively, though probably not accurately. A useful comparison might be any one of several films called The Alamo.

Violent events of the past week (Charlottesville, VA; Barcelona, Spain) and political responses to them have dominated the news cycle, pushing other newsworthy items (e.g., U.S.-South Korean war games and a looming debt ceiling crisis) off the front page and into the darker recesses of everyone’s minds (those paying attention, anyway). We’re absorbed instead with culture wars run amok. I’m loath to apply the term terrorism to regular periodic eruptions of violence, both domestic and foreign. That term carries with it intent, namely, the objective to create day-to-day terror in the minds of a population so as to interfere with proper functions of society. It’s unclear to me whether recent perpetrators of violence are coherent enough to formulate sophisticated motivations or plans. The dumb, obvious way of doing things — driving into crowds of people — takes little or no planning and may just as well be the result of inchoate rage boiling over in a moment of high stress and opportunity. Of course, it needn’t be all or nothing, and considering our reflexively disproportionate responses, the term terrorism and attendant destabilization is arguably accurate even without specified intent. That’s why in the wake of 9/11 some 16 years ago, the U.S. has become a security state.

It’s beyond evident that hostilities have been simmering below the not-so-calm surface. Many of those hostilities, typically born of economic woes but also part of a larger clash of civilizations, take the form of identifying an “other” presumably responsible for one’s difficulties and then victimizing the “other” in order to elevate oneself. Of course, the “other” isn’t truly responsible for one’s struggles, so the violent dance doesn’t actually elevate anyone, as in “supremacy”; it just wrecks both sides (though unevenly). Such warped thinking seems to be a permanent feature of human psychology and enjoys popular acceptance when the right “other” is selected and universal condemnation when the wrong one is chosen. Those doing the choosing and those being chosen haven’t changed much over the centuries. Historically, Anglo-Saxons and Teutons do the choosing and people of color (of all types) get chosen. Jews are also chosen with dispiriting regularity, which is an ironic inversion of being the Chosen People (if you believe in such things — I don’t). However, any group can succumb to this distorted power move, which is why so much ongoing, regional, internecine conflict exists.

As I’ve been saying for years, a combination of condemnation and RightThink has simultaneously freed some people from this cycle of violence but merely driven the holdouts underground. Supremacy in its various forms (nationalism, racism, antisemitism, etc.) has never truly been expunged. RightThink itself has morphed (predictably) into intolerance, which is now veering toward radicalism. Perhaps a positive outcome of this latest resurgence of supremacist ideology is that those infected with the character distortion have been emboldened to identify themselves publicly and thus can be dealt with somehow. Civil authorities and thought leaders are not very good at dealing with hate, often shutting people out of the necessary public conversation and/or seeking to legislate hate out of existence with restrictions on free speech. But it is precisely through free expression and diplomacy that we address conflict. Violence is a failure to remain civil (duh!), and war (especially the genocidal sort) is the extreme instance. It remains to be seen if the lid can be kept on this boiling pot, but considering cascade failures lined up to occur within the foreseeable future, I’m pessimistic that we can see our way past the destructive habit of shifting blame onto others who often suffer even worse than those holding the reins of power.

Back in undergraduate college, when just starting on my music education degree, I received an assignment where students were asked to formulate a philosophy of education. My thinking then was influenced by a curious textbook I picked up: A Philosophy of Music Education by Bennett Reimer. Of course, it was the wrong time for an undergraduate to perform this exercise, as we had neither maturity nor understanding equal to the task. However, in my naïveté, my answer was all about learning/teaching an aesthetic education — one that focused on appreciating beauty in music and the fine arts. This requires the cultivation of taste, which used to be commonplace among the educated but is now anathema. Money is the preeminent value now. Moreover, anything that smacks of cultural programming and thought control is now repudiated reflexively, though such projects are nonetheless undertaken continuously and surreptitiously through a variety of mechanisms. As a result, the typical American’s sense of what is beautiful and admirable is stunted. Further, knowledge of the historical context in which the fine arts exist is largely absent. (Children are ahistorical in this same way.) Accordingly, many Americans are coarse philistines whose tastes rarely extend beyond those acquired naturally during adolescence (including both biophilia and biophobia), thus the immense popularity of comic book movies, rock and roll music, and all manner of electronica.

When operating with a limited imagination and undeveloped ability to perceive and discern (and disapprove), one is a sitting duck for what ought to be totally unconvincing displays of empty technical prowess. Mere mechanism (spectacle) then possesses the power to transfix and amaze credulous audiences. Thus, the ear-splitting volume of amplified instruments substitutes for true emotional energy produced in exceptional live performance, ubiquitous CGI imagery (vistas and character movements, e.g., fight skills, that simply don’t exist in reality) in cinema produces wonderment, and especially, blinking lights and animated GIFs deliver the equivalent of a sugar hit (cookies, ice cream, soda) when they’re really placebos or toxins. Like hypnosis, the placebo effect is real and pronounced for those unusually susceptible to induction. Sitting ducks.

Having given the fine arts (including their historical contexts) a great deal of my academic attention and acquired an aesthetic education, my response to the video below fell well short of the blasé relativism most exhibit; I actively dislike it.

I have just one previous blog post referencing Daniel Siegel’s book Mind, in which I threatened to put the book aside owing to how badly it’s written. I haven’t yet turned in my library copy and have made only modest additional progress reading the book. However, Siegel came up over at How to Save the World, where at least one commentator was quite enthusiastic about Siegel’s work. In my comment there, I mentioned the book only to suggest that his appreciation of the relational nature of the mind (and cognition) reinforces my long-held intuition that the self doesn’t exist in an idealized vacuum, capable of modeling and eventually downloading to a computer or some other Transhumanist nonsense, but is instead situated as much between us as within us. So despite Siegel’s clumsy writing, this worthwhile concept deserves support.

Siegel goes on to wonder (without saying he believes it to be true — a disingenuous gambit) whether there exists an information field, not unlike the magnetic field or portions of the light spectrum, that affects us yet falls outside the scope of our direct perception or awareness. Credulous readers might leap to the conclusion that the storied collective consciousness is real. Some fairly trippy theories of consciousness propose that the mind is actually more like an antenna receiving signals from some noncorporeal realm (e.g., a quantum dimension) we cannot identify yet tap into constantly, measuring against and aligning with the wider milieu in which we function. Even without expertise in zoology, one must admit that humans are social creatures operating at various levels of hierarchy including individual, family, clan, pack, tribe, nation-state, etc. We’re less like mindless drones in a hive (well, some of us) and more like voluntary and involuntary members of gangs or communities formed along various familial, ethnic, regional, national, language group, and ideological lines. Unlike Siegel, I’m perfectly content with existing terminology and feel no compulsion to coin new lingo or adopt unwieldy acronyms to mark my territory.

What Siegel hasn’t offered is an observation on how our reliance on and indebtedness to the public sphere (via socialization) have changed with time as our mode of social organization has morphed from a predominantly localized, agrarian existence prior to the 20th century to a networked, high-density, information-saturated urban and suburban existence in the 21st century. The public sphere was always out there, of course, especially as embodied in books, periodicals, pamphlets, and broadsides (if one was literate and had reliable access to them), but the unparalleled access we now enjoy through various electronic devices has not only reoriented but disoriented us. Formerly slow, isolated information flow has become a veritable torrent or deluge. It’s not called the Information Age fer nuthin’. Furthermore, the bar to publication — or insertion into the public sphere — has been lowered to practical nonexistence as the democratization of production has placed the tools of widely distributed exposure into the hands of everyone with a blog (like mine) or Facebook/Instagram/Twitter/Pinterest/LinkedIn account. As a result, a deep erosion of authority has occurred, since any yahoo can promulgate the most reckless, uninformed (and disinformed) opinions. The public’s attention, riveted on celebrity gossip and House of Cards-style political wrangling, along with false narratives, fake news, alternative facts, and disinformation, also makes navigating the public sphere with much integrity impossible for most. For instance, the MSM and alternative media alike are busy selling a bizarre pageant of Russian collusion and interference with recent U.S. elections as though the U.S. were somehow innocent of even worse meddling abroad. Moreover, it’s naïve to think that the public sphere in the U.S. isn’t already completely contaminated from within by hucksters, corporations (including news media), and government entities with agendas ranging from mere profit seeking to nefarious deployment and consolidation of state power. For example, the oil and tobacco industries and the Bush Administration all succeeded in suppressing truth and selling rank lies that have landed us in various morasses from which there appears to be no escape.

If one recognizes his or her vulnerability to the depredations of info scammers of all types and wishes to protect oneself, there are two competing strategies: insulation and inoculation. Insulation means avoiding exposure, typically by virtue of mind-cleansing behaviors, whereas inoculation means seeking exposure in small, harmless doses so that one can handle a larger infectious attack. It’s a medical metaphor that springs from meme theory, where ideas propagate like viruses, hence, the notion of a meme “going viral.” Neither approach is foolproof. Insulation means plugging one’s ears or burying one’s head in the sand at some level. Inoculation risks spreading the infection. If one regards education as an inoculation of sorts, seeking more information of the right types from authoritative sources should provide means to combat the noise in the information signals received. However, as much as I love the idea of an educated, informed public, I’ve never regarded education as a panacea. It’s probably a precondition for sound thinking, but higher education in particular has sent an entire generation scrambling down the path of identity politics, which sounds like a good idea but leads inevitably to corruption via abstraction. That’s all wishful thinking, though; the public sphere we actually witness has gone haywire, a condition of late modernism and late-stage capitalism that has no known antidote. Enjoy the ride!

For a variety of reasons, I go to see movies in the theater only a handful of times any given year. The reasons are unimportant (and obvious) and I recognize that, by eschewing the theater, I’m giving up the crowd experience. Still, I relented recently and went to see a movie at a new AMC Dolby Cinema, which I didn’t even know existed. The first thing to appreciate was that it was a pretty big room, which used to be standard when cinema was first getting established in the 1920s but gave way sometime in the 1970s to multiplex theaters able to show more than one title at a time in little shoebox compartments with limited seating. Spaciousness was a welcome throwback. The theater also had oversized, powered, leather recliners rather than cloth, fold-down seats with shared armrests. The recliners were quite comfortable but also quite unnecessary (except for now typical Americans unable to fit their fat asses in what used to be a standard seat). These characteristics are shared with AMC Prime theaters that dress up the movie-going experience and charge accordingly. Indeed, AMC now offers several types of premium cinema, including RealD 3D, Imax, Dine-In, and BigD.

Aside I: A friend only just reported on her recent trip to the drive-in theater, a dated cinema experience that is somewhat degraded and unenhanced yet retains its nostalgic charm for those of us old enough to remember as kids the shabby chic of bringing one’s own pillows, blankets, popcorn, and drinks to a double feature and sprawling out on the hood and/or roof of the car (e.g., the family station wagon). My friend actually brought her dog to the drive-in and said she remembered and sorta missed the last call on dollar hot dogs at 11 PM that used to find all the kids madly, gleefully rushing the concession stand before food ran out.

What really surprised me, however, was how the Dolby Cinema experience turned into a visual, auditory, and kinesthetic assault. True, I was watching Wonder Woman (sorry, no review), which is set in WWI and features lots of gunfire and munitions explosions in addition to the usual invincible superhero punchfest, so I suppose the point is partly to be immersed in the environment, a cinematic stab at verisimilitude. But the immediacy of all the wham-bam, rock ’em-sock ’em action made me feel more like a participant in a theater of war than a viewer. The term shell shock (a/k/a battle fatigue a/k/a combat neurosis) refers to the traumatized disorientation one experiences in moments of high stress and overwhelming sensory input; it applies here. Even the promo before the trailers and feature, offered to demonstrate the theater’s capabilities, was itself off-putting because of unnecessary and overweening volume and impact. Unless I’m mistaken, the seats even have built-in subwoofers to rattle theatergoers from below when loud, concussive events occur, which is often because, well, filmmakers love their spectacle as much as audiences do.

Aside II: One real-life lesson to be gleaned from WWI, or the Great War as it was called before WWII, went well beyond the simplistic truism that war is hell. It was that civility (read: civilization) had failed and human progress was a chimera. Technical progress, however, had made WWI uglier in many respects than previous warfare. It was an entirely new sort of horror. Fun fact: there are numerous districts in France, known collectively as la Zone Rouge, where no one is allowed to live because of all the unexploded ordnance (100 years later!). Wonder Woman ends up having it both ways: acknowledging the horrific nature of war on the one hand yet valorizing and romanticizing personal sacrifice and eventual victory on the other. Worse, perhaps, it establishes that there’s always another enemy in the wings (otherwise, how could there be sequels?), so keep fighting. And for the average viewer, uniformed German antagonists are easily mistakable for Nazis of the subsequent world war, a historical gloss I’m guessing no one minds … because … Nazis.

So here’s my problem with AMC’s Dolby Cinema: why settle for a routine or standard theater experience when it can be amped up to the point of offense? Similarly, why be content with the tame and fleeting though reliable beauty of a sunset when one can enjoy a widescreen, hyperreal view of cinematic worlds that don’t actually exist? Why settle for the subtle, old-timey charm of the carousel (painted horses, dizzying twirling, and calliope music) when instead one can strap in and get knocked sideways by roller coasters so extreme that riders leave wobbly and crying at the end? (Never mind the risk of being stranded on the tracks for hours, injured, or even killed by a malfunction.) Or why bother attending a quaint symphonic band concert in the park or an orchestral performance in the concert hall when instead one can go to Lollapalooza and see/hear/experience six bands in the same cacophonous space grinding it out at ear-splitting volume, along with laser light shows and flash-pot explosions for the sheer sake of goosing one’s senses? Coming soon are VR goggles that trick the wearer’s nervous system into accepting that they are actually in the virtual game space, often first-person shooters depicting killing bugs or aliens or criminals without compunction. Our arts and entertainments have truly gotten out of hand.

If those criticisms don’t register, consider my post more than a decade ago on the Paradox of the Sybarite and Catatonic, which argues that our senses are so overwhelmed by modern life that we’re essentially numb from overstimulation. Similarly, let me reuse this Nietzsche quote (used before here) to suggest that on an aesthetic level, we’re not being served well in display and execution of refined taste so much as being whomped over the head and dragged willingly? through ordeals:

… our ears have become increasingly intellectual. Thus we can now endure much greater volume, much greater ‘noise’, because we are much better trained than our forefathers were to listen for the reason in it. All our senses have in fact become somewhat dulled because we always inquire after the reason, what ‘it means’, and no longer for what ‘it is’ … our ear has become coarsened. Furthermore, the ugly side of the world, originally inimical to the senses, has been won over for music … Similarly, some painters have made the eye more intellectual, and have gone far beyond what was previously called a joy in form and colour. Here, too, that side of the world originally considered ugly has been conquered by artistic understanding. What is the consequence of this? The more the eye and ear are capable of thought, the more they reach that boundary line where they become asensual. Joy is transferred to the brain; the sense organs themselves become dull and weak. More and more, the symbolic replaces that which exists … the vast majority, which each year is becoming ever more incapable of understanding meaning, even in the sensual form of ugliness … is therefore learning to reach out with increasing pleasure for that which is intrinsically ugly and repulsive, that is, the basely sensual. [italics not in original]

I picked up a copy of Daniel Siegel’s book Mind: A Journey to the Heart of Being Human (2017) to read and supplement my ongoing preoccupation with human consciousness. Siegel’s writing is the source of considerable frustration. Now about 90 pp. into the book (I am considering putting it aside), I find that he has committed several grammatical errors (where are book editors these days?), that he doesn’t really know how to use a comma properly, and that he doesn’t write in recognizable paragraph form. He has a bad habit of posing questions to suggest the answers he wants to give and drops constant hints of something soon to be explored, like news broadcasts that tease the next segment. He also deploys a tired, worn metaphor that readers are on a journey of discovery with him, embarked on a path, exploring a subject, etc. Yecch. (A couple Amazon reviews also note that grayish type on parchment (cream) paper poses a legibility problem due to poor contrast even in good light — undoubtedly not really Siegel’s fault.)

Siegel’s writing is also irritatingly circular, casting and recasting the same sentences in repetitious series of assertions that have me wondering frequently, “Haven’t I already read this?” Here are a couple examples:

When energy flows inside your body, can you sense its movement, how it changes moment by moment?

then only three sentences later

Energy, and energy-as-information, can be felt in your mental experience as it emerges moment by moment. [p. 52]

Another example:

Seeing these many facets of mind as emergent properties of energy and information flow helps link the inner and inter aspect of mind seamlessly.

then later in the same paragraph

In other words, mind seen this way could be in what seems like two places at once as inner and inter are part of one interconnected, undivided system. [p. 53]

This is definitely a bug, not a feature. I suspect the book could easily be condensed from 330 pp. to less than 200 pp. if the writing weren’t so self-indulgent. Indeed, while I recognize a healthy dose of repetition is an integral part of narrative form (especially in music), Siegel’s relentless repetition feels like propaganda 101, where guileless insistence (of lies or merely the preferred story one seeks to plant in the public sphere) wears down the reader rather than convinces him or her. This is also marketing 101 (e.g., Coca-Cola, McDonald’s, Budweiser, etc. continuing to advertise what are by now exceedingly well-established brands).

I pull in my share of information about current events and geopolitics despite a practiced inattention to mainstream media and its noisome nonsense. (See here for another who turned off the MSM.) I read or heard somewhere (can’t remember where) that most news outlets and indeed most other media, to drive traffic, now function as outrage engines, generating no small amount of righteousness, indignation, anger, and frustration at all the things so egregiously wrong in our neighborhoods, communities, regions, and across the world. These are all negative emotions, though legitimate responses to various scourges plaguing us currently, many of which are self-inflicted. It’s enough aggregate awfulness to draw people into the street again in principled protest, dissent, and resistance; it’s not yet enough to effect change. Alan Jacobs comments about outrage engines, noting that sharing via retweets is not the same as caring. In the Age of Irony, a decontextualized “yo, check this out!” is nearly as likely to be interpreted as support as condemnation (or mere gawking for entertainment value). Moreover, pointing, linking, and retweeting are each costless versions of virtue signaling. True virtue makes no object of publicity.

So where do I get my outrage quotient satisfied? Here is a modest linkfest, in no particular order, of sites not already on my blogroll. I don’t habituate these sites daily, but I drop in, often skimming, enough to keep abreast of themes and events of importance.

The Internet is now a little more than two decades old (far more actually, but I’m thinking of its widespread adoption). Of late, it’s abundantly clear that, in addition to being a wholesale change in the way we disseminate and gather information and conduct business, we’re running live social experiments bearing psychological influence, some subtle, some invasive, much like the introduction of other media such as radio, cinema, and TV back in the day. About six years ago, psychologists coined the term digital crowding, which I just discovered, referring to an oppressive sense of knowing too much about people, which in turn provokes antisocial reactions. In effect, it’s part of the Dark Side of social media (trolling and comments sections being other examples), one of numerous live social experiments.

I’ve given voice to this oppressive knowing-too-much on occasion by wondering why, for instance, I know anything — largely against my will, mind you — about the Kardashians and Jenners. This is not the sole domain of celebrities and reality TV folks but indeed of anyone who tends to overshare online, typically via social media such as Facebook, less typically in the celebrity news media. Think of digital crowding as the equivalent of seeing something you would really prefer not to have seen, something no amount of figurative eye bleach can erase, something that now simply resides in your mind forever. It’s the bell that can’t be unrung. The crowding aspect is that now everyone’s dirty laundry is getting aired simultaneously, creating pushback and defensive postures.

One might recognize in this the familiar complaint of Too Much Information (TMI), except that the information in question is not the discomfiting stuff such as personal hygiene, medical conditions, or sexual behaviors. Rather, it’s an unexpected over-awareness of everyone’s daily minutiae as news of it presses for attention and penetrates our defenses. Add it to the deluge that is causing some of us to adopt information avoidance.

I often review my past posts when one receives a reader’s attention, sometimes adding tags and fixing typos, grammar, and broken links. One of my greatest hits (based on voting, not traffic) is Low Points in Education. It was among the first to tackle what I have since called our epistemological crisis, though I didn’t begin to use the epistemology tag until later. The crisis has caught up with us with a vengeance, though I can’t claim I’m the first to observe the problem. That dubious honor probably goes to Stephen Colbert, who coined the word truthiness in 2005. Now that alternative facts and fake news have entered the lingo as well (gaslighting has been revived), everyone has jumped on the bandwagon questioning the truthfulness or falsity behind anything coughed up in our media-saturated information environment. But as suggested in the first item discussed in Low Points in Education, what’s so important about truth?

It would be obvious and easy yet futile to argue in favor of high-fidelity appreciation of the world, even if only within the surprisingly narrow limits of human perception, cognition, and memory (all interrelated). Numerous fields of endeavor rely upon consensus reality derived from objectivity, measurement, reason, logic, and, dare I say it, facticity. Regrettably, human cognition doesn’t adhere any too closely to those ideals except when trained to value them. Well-educated folks have better acquaintance with such habits of mind; folks with formidable native intelligence can develop true authority, too. For the masses, however, those attributes are elusive, even for those who have partied through earned college degrees. Ironically worse, perhaps, are specialists, experts, and overly analytical intellectuals who exhibit what the French call a déformation professionnelle. Politicians, pundits, and journalists are chief among the deformed and distorted. Mounting challenges to establishing truth now destabilize even mundane matters of fact, and it doesn’t help that myriad high-profile provocateurs (including the Commander in Chief, to whom I will henceforth refer only as “45”) are constantly throwing out bones for journalists to chase like so many unnourishing rubber chew toys.

Let me suggest, then, that human cognition, or more generally the mind, is an ongoing balancing act, making adjustments to stay upright and sane. Like the routine balance one keeps during locomotion, shifting weight from side to side and falling a bit only to catch oneself, the difficulty is not especially high. But with the foundation below one’s feet shaking furiously, so to speak, legs get wobbly and many end up (figuratively at least) ass over teakettle. Further, the mind is highly situational, contingent, and improvisational and is prone to notoriously faulty perception even before one gets to marketing, spin, and arrant lies promulgated by those intent on coopting or directing one’s thinking. Simply put, we’re not particularly inclined toward accuracy but instead operate within a wide margin of error. Accordingly, we’re quite strong at adapting to ever-changing circumstance.

That strength turns out to be our downfall. Indeed, rootless adjustment to changing narrative is now so grave that basic errors of attribution — which entities said and did what — make it impossible to distinguish allies from adversaries reliably. (Orwell captured this with his line from the novel 1984, “Oceania was at war with Eurasia; therefore Oceania had always been at war with Eurasia.”) Thus, on the back of a brazen propaganda campaign following 9/11, Iraq morphed from U.S. client state to rogue state demanding preemptive war. (Admittedly, the U.S. State Department had already lost control of its puppet despot, who in a foolish act of naked aggression tried to annex Kuwait, but that was a brief, earlier war quite unlike the undeclared one in which the U.S. has been mired for 16 years.) Even though Bush Administration lies have been unmasked and dispelled, many Americans continue to believe (incorrectly) that Iraq possessed WMDs and posed an existential threat to the U.S. The same type of confusion is arguably at work with respect to China, Russia, and Israel, which are mixed up in longstanding conflicts having significant U.S. involvement and provocation. Naturally, the default villain is always Them, never Us.

So we totter from moment to moment, reeling drunkenly from one breathtaking disclosure to the next, and are forced to reorient continuously in response to whatever the latest spin and spew happen to be. Some institutions retain the false sheen of respectability and authority, but for the most part, individuals are free to cherry-pick information and assemble their own truths, indulging along the way in conspiracy and muddle-headedness until at last almost no one can be reached anymore by logic and reason. This is our post-Postmodern world.