Posts Tagged ‘Epistemology’

Most poets in the West believe that some sort of democracy is preferable to any sort of totalitarian state and accept certain political obligations … but I cannot think of a single poet of consequence whose work does not, either directly or by implication, condemn modern civilisation as an irremediable mistake, a bad world which we have to endure because it is there and no one knows how it could be made into a better one, but in which we can only retain our humanity in the degree to which we resist its pressures. — W.H. Auden

A while back, I made an oblique reference (a comment elsewhere, no link) to a famous Krishnamurti quote: “It is no measure of health to be well adjusted to a profoundly sick society.” Taken on its face, who would agree to be swept up in the madness and absurdity of any given historical moment? Turns out, almost everyone — even if that means self-destruction. The brief reply to my comment was along the lines of “Why shouldn’t you or I also make mental adjustments to prevailing sickness to obtain peace of mind and tranquility amidst the tumult?” Such an inversion of what seems to me right, proper, and acceptable caused me to reflect and recall the satirical movie Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb. The full title is not often given, but the forgotten second part is what’s instructive (e.g., mutually assured destruction: MAD). Events spinning out of control? Nothing any individual can do to restore sanity? Stop squirming and embrace it.

That’s one option when faced with the prospect of futile resistance, I suppose. Give in, succumb, and join the party (more like a rager since the beginning of the Cold War). I also recognize that I’m not special enough to warrant any particular consideration for my intransigence. Yet it feels like self-betrayal to abandon the good character I’ve struggled (with mixed success) to build and maintain over the course of a lifetime. Why chuck all that now? Distinguishing character growth from decay is not always so simple. In addition, given my openness to new ideas and interpretations, established bodies of thought (often cultural consensus) are sometimes upended and destabilized by someone arguing cogently for or against something settled and unexamined for a long time. And then there is the epistemological crisis that has rendered sense-making nearly impossible. That crisis is intensified by a variety of character types acting in bad faith to pollute the public sphere and drive false narratives.

For instance, the show trial public hearings just begun regarding the January 6 attack on the U.S. Capitol (or whatever it’s being called, I prefer “Storming of the Capitol”) are commonly understood, at least from one side of the political spectrum, as a deliberate and brazen attempt to brainwash the public. I decline to tune in. But that doesn’t mean my opinions on that topic are secure any more than I know how true and accurate was the 2020 election that preceded and sparked the Jan. 6 attack. Multiple accounts of the election and subsequent attack aim to convert me (opinion-wise) to one exclusive narrative or another, but I have no way to evaluate narrative claims beyond whatever noise reaches me through the mainstream media I try to ignore. Indeed, those in the streets and Capitol building on Jan. 6 were arguably swept into a narrative maelstrom that provoked a fairly radical if ultimately harmless event. No one knew at the time, of course, exactly how it would play out.

So that’s the current state of play. Ridiculous, absurd events, each with competing narratives, have become the new normal. Your facts and beliefs do daily battle with my facts and beliefs in an ideological war of all against all — at least until individuals form into tribes, declare their political identity, and join that absurdity.

This intended follow-up has been stalled (pt. 1 here) for one simple reason: the premise presented in the embedded YouTube video is (for me at least) easy to dismiss out of hand and I haven’t wanted to revisit it. Nevertheless, here’s the blurb at the top of the comments at the webpage:

Is reality created in our minds, or are the things you can touch and feel all that’s real? Philosopher Bernardo Kastrup holds doctorates in both philosophy and computer science, and has made a name for himself by arguing for metaphysical idealism, the idea that reality is essentially a mental phenomenon.

Without going into point-by-point discussion, the top-level assertion, if I understand it correctly (not assured), is that material reality comes out of mental experience rather than the reverse. It’s a chicken-and-egg question with materialism and idealism (fancy technical terms not needed) each vying for primacy. The string of conjectures (mental gymnastics, really, briefly impressive until one recognizes how quickly they lose correlation with how most of us think about and experience reality) that inverts the basic relationship of inner experience to outer reality is an example of waaaay overthinking a problem. No doubt quite a lot of erudition can be brought to bear on such questions, but even if those questions were resolved satisfactorily on an intellectual level and an internally coherent structure or system were developed or revealed, it doesn’t matter or lead anywhere. Humans are unavoidably embodied beings. Each individual existence also occupies a tiny sliver of time (the timeline extending in both directions to infinity). Suggesting that mental experience is briefly instantiated in personhood but is actually drawn out of some well of souls, collective consciousness, or panpsychism and rejoins them in heaven, hell, or elsewhere upon expiration is essentially a religious claim. It’s also an attractive supposition, granting each of us not permanence or immortality but rather something somehow better (?) though inscrutable because it lies beyond perception (but not conceptualization). Except for an eternity of torments in hell, I guess, if one deserves that awful fate.

One comment about Kastrup. He presents his perspective (his findings?) with haughty derision of others who can’t see or understand what is so (duh!) obvious. He falls victim to the very same over-intellectualized flim-flam he mentions when dismissing materialists who need miracles and shortcuts to smooth over holes in their scientific/philosophical systems. The very existence of earnest disagreement by those who occupy themselves with such questions might suggest some humility, as in “here’s my explanation, they have theirs, judge for yourself.” But there’s a third option: the great unwashed masses (including nearly all our ancestors) for whom such questions are never even fleeting thoughts. It’s all frankly immaterial (funnily, both the right and wrong word at once). Life is lived and experienced fundamentally on its surface — unless, for instance, one has been incubated too long within the hallowed halls of academia, lost touch with one’s brethren, and become preoccupied with cosmic investigations. Something quite similar happens to politicians and the wealthy, who typically hyperfocus on gathering to themselves power and then exercising that power over others (typically misunderstood as simply pulling the levers and operating the mechanisms of society). No wonder their pocket of reality looks so strikingly different.

From Joseph Bernstein’s article “Bad News” in the Sept. 2021 issue of Harper’s Magazine:

Compared with other, more literally toxic corporate giants, those in the tech industry have been rather quick to concede the role they played in corrupting the allegedly pure stream of American reality. Only five years ago, Mark Zuckerberg said it was a “pretty crazy idea” that bad content on his website had persuaded enough voters to swing the 2016 election to Donald Trump. “Voters make decisions based on their lived experience,” he said. “There is a profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw fake news.” A year later, suddenly chastened, he apologized for being glib and pledged to do his part to thwart those who “spread misinformation.”

Denial was always untenable, for Zuckerberg in particular. The so-called techlash, a season of belatedly brutal media coverage and political pressure in the aftermath of Brexit and Trump’s win, made it difficult. But Facebook’s basic business pitch made denial impossible. Zuckerberg’s company profits by convincing advertisers that it can standardize its audience for commercial persuasion. How could it simultaneously claim that people aren’t persuaded by its [political] content?

I use the tag redux to signal that the topic of a previous blog post is being revisited, reinforced, and repurposed. The choice of title for this one could easily have gone instead to Your Brain on Postmodernism, Coping with the Post-Truth World, or numerous others. The one chosen, however, is probably the best fit given that compounding crises continue pushing us along the path of self-annihilation. Once one crisis grows stale — at least in terms of novelty — another is rotated in to keep us shivering in fear, year after year. The date of civilizational collapse is still unknown; collapse is really more of a process anyway, one of unknown duration. Before reading what I’ve got to offer, perhaps wander over to Clusterfuck Nation and read James Howard Kunstler’s latest take on our current madness.

/rant on

So yeah, various cultures and subcultures are either in the process of going mad or have already achieved that sorry state. Because madness is inherently irrational and unrestrained, specific manifestations are unpredictable. However, the usual trigger for entire societies to lose their tether to reality is relatively clear: existential threat. And boy howdy are those threats multiplying and gaining intensity. Pick which of the Four Horsemen of the Apocalypse to ride with to the grave, I guess. Any one will do; all four are galloping simultaneously, plus a few other demonic riders not identified in that mythological taxonomy. Kunstler’s focus du jour is censorship and misinformation (faux disambiguation: disinformation, malinformation, dishonesty, gaslighting, propaganda, fake news, falsehood, lying, cozenage, confidence games, fraud, conspiracy theories, psyops, personal facts), about which I’ve blogged repeatedly under the tag epistemology. Although major concerns, censorship and misinformation are outgrowths of spreading madness, not the things that will kill anyone directly. Indeed, humans have shown a remarkable capacity to hold in mind crazy belief systems or stuff down discomfiting and disapproved thoughts even without significant threat. Now that significant threats spark the intuition that time is running perilously short, no wonder so many have fled reality into the false safety of ideation. Inability to think and express oneself freely or to detect and divine truth does, however, block what few solutions to problems remain to be discovered.

Among recent developments I find unsettling and dispiriting is news that U.S. officials, in their effort to — what? — defeat the Russians in a war we’re not officially fighting, are just making shit up and issuing statements to their dutiful stenographers in the legacy press to report. As I understand it, there isn’t even any pretense about it. So to fight phantoms, U.S. leaders conjure out of nothingness justifications for involvements, strategies, and actions that are the stuff of pure fantasy. This is fully, recognizably insane: to fight monsters, we must become monsters. It’s also maniacally stupid. Further, it’s never been clear to me that Russians are categorically baddies. They have dealt with state propaganda and existential threats (e.g., the Bolshevik Revolution, WWII, the Cold War, the Soviet collapse, being hemmed in by NATO countries) far more regularly than most Americans and know better than to believe blindly what they’re told. On a human level, who can’t empathize with their plights? (Don’t answer that question.)

In other denial-of-reality news, demand for housing in Sun Belt cities has driven rent increases ranging between approximately 30% and 60% over the past two years, compared to well under 10% in many northern cities. Americans are migrating to the Sun Belt despite, for instance, catastrophic drought and wildfires. Lake Powell sits at an historically low level, threatening reductions in water and electrical power. What happens when desert cities in CA, AZ, NV, and NM become uninhabitable? Texas isn’t far behind. This trend has been visible for decades, yet many Americans (and immigrants, too) are positioning themselves directly in harm’s way.

I’ve been a doomsayer for over a decade now, reminding my two or three readers (on and off) that the civilization humans built for ourselves cannot stand much longer. Lots of people know this yet act as though concerns are overstated or irrelevant. It’s madness, no? Or is it one last, great hurrah before things crack up apocalyptically? On balance, what’s a person to do but to keep trudging on? No doubt the Absurdists got something correct.

/rant off

Continuing from the previous blog post, lengthy credit scrolls at the ends of movies have become a favorite hiding place for bloopers and teasers. The purpose of this practice is unclear, since I can’t pretend (unlike many reckless opinionators) to inhabit the minds of filmmakers, but it has become a fairly reliable afterthought for film-goers willing to wait out the credits. Those who depart the theater, change the channel, or click away to other content may know they are relinquishing some last tidbit to be discovered, but there’s no way to know in advance if one is being punked or pleased, or indeed if there is anything at all there. Clickbait news often employs this same technique, teasing some newsbit in the headline to entice readers to wade (or skim) through a series of (ugh!) one-sentence paragraphs to find the desired content, which sometimes is not even provided. At least one film (Monty Python’s The Secret Policeman’s Other Ball (1982), if memory serves) pranked those in a rush to beat foot traffic out of the theater (back when film-going meant visiting the cinema) by having an additional thirty minutes of material after the (first) credit sequence.

This also put me in mind of Paul Harvey radio broadcasts ending with the sign-off tag line, “… the rest of the story.” Harvey supplemented the news with obscure yet interesting facts and analysis that tended to reshape one’s understanding of consensus narrative. Such reshaping is especially important as an ongoing process of clarification and revision. When served up in delectable chunks by winning personalities like Paul Harvey, supplemental material is easily absorbed. When material requires effort to obtain and/or challenges something one believes strongly, well, the default response is probably not to bother. However, those possessing intellectual integrity welcome challenging material and indeed seek it out. Indeed, invalidation of a thesis or hypothesis is fundamental to the scientific method, and no body of work can be sequestered from scrutiny and then be held as legitimately authoritative.

Yet that’s what happens routinely in the contemporary infosphere. A government press office or corporate public relations officer issues guidance or policy in direct conflict with earlier guidance or policy and in doing so seeks to place any resulting cognitive dissonance beyond examination and out of scope. Simple matters of adjustment are not what concern me. Rather, it’s wholesale brainwashing that is of concern, when something is clear within one’s memory or plainly documented in print/video yet brazenly denied, circumvented, and deflected in favor of a new directive. The American public has contended with this repeatedly as each new presidential administration demonizes the policies of its predecessors but typically without demonstrating the self-reflection and -examination to admit wrongdoing, responsibility, or error on anyone’s part. It’s a distinctly American phenomenon, though others have cottoned onto it and adopted the practice for themselves.

Exhaustion from separating the spin-doctored utterances of one malefactor or another from one’s own direct experience and sense-making drives many to simply give up. “Whatever you say, sir. Lemme go back to my entertainments.” The prospect of a never-ending slog through evidence and analysis only to arrive on unsteady ground, due to shift underfoot again and again with each new revelation, is particularly unsatisfactory. And as discussed before, those who nonetheless strain to achieve knowledge and understanding that reach temporary sufficiency yet remain permanently, intransigently provisional find themselves thwarted by those in the employ of organizations willing and eager to game information systems in the service of their not-even-hidden agendas. Alternative dangers for the muddled thinker include retreating into fixed ideology or collapsing into solipsism. Maybe none of it matters in the end. We can choose our beliefs from the buffet of available options without adherence to reality. We can create our own reality. Of course, that’s a description of madness, to which many have already succumbed. Why aren’t they wearing straitjackets?

Let me first restate axioms developed in previous blog posts. Narrative is the essential outward form of consciousness. Cognition has many preverbal and nonverbal subtleties, but the exchange of ideas occurs predominantly through narrative, and the story of self (told to oneself) can be understood as stream of consciousness: ongoing self-narration of sensations and events. The principal characteristic of narrative, at least that which is not pure fantasy, is in-the-moment sufficiency. Snap-judgment heuristics are merely temporary placeholders until, ideally at least, thoughtful reconsideration and revision that take time and discernment can be brought to bear. Stories we tell and are told, however, often do not reflect reality well, partly because our perceptual apparatuses are flawed, partly because individuals are untrained and unskilled in critical thinking (or overtrained and distorted), and partly because stories are polluted with emotions that make clear assessments impossible (to say nothing of malefactors with agendas). Some of us struggle to remove confabulation from narrative (as best we can) whereas others embrace it because it’s emotionally gratifying.

A good example of the reality principle is the recognition, as during the 1970s energy crisis, that energy supplies don’t magically appear by simply digging and drilling more of the stuff out of the ground. Those easy-to-get resources have been plundered already. The term peak oil refers to eventual decline in energy production (harvesting, really) when the easy stuff is more than half gone and undiminished (read: increasing) demand impels energy companies to go in search of more exotic supply (e.g., underwater or embedded in shale). If that reality is dissatisfying, a host of dreamt-up stories offer us deliverance from inevitable decline and reduction of lifestyle prerogatives by positing extravagant resources in renewables, hydrogen fuel cells, fusion (not to be confused with fission), or as-yet unexploited regions such as the Arctic National Wildlife Refuge. None of these represent plausible realities (except going into heretofore protected regions and bringing ecological devastation).
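A minimal sketch of the math behind that “more than half gone” claim, assuming the standard Hubbert formulation (the paragraph above doesn’t commit to any particular model, and the symbols Q, Q_max, k, and t_peak are my own illustrative labels): cumulative extraction is modeled as a logistic curve, and the production rate is its derivative.

\[
Q(t) = \frac{Q_{\max}}{1 + e^{-k\,(t - t_{\mathrm{peak}})}},
\qquad
P(t) = \frac{dQ}{dt} = \frac{k\,Q(t)\,\bigl(Q_{\max} - Q(t)\bigr)}{Q_{\max}}
\]

Here Q(t) is cumulative production, Q_max the ultimately recoverable resource, k a steepness constant, and P(t) the production rate, which peaks exactly when Q(t) = Q_max/2, that is, when half the recoverable total is gone. Every unit after that comes harder and costs more, which is the whole argument in one curve.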

The relationship of fictional stories to reality is quite complex. For this blog post, a radically narrow description is that fiction is the imaginary space where ideas can be tried out and explored safely in preparation for implementation in reality. Science fiction (i.e., imagining interstellar space travel despite its flat impossibility in Newtonian physics) is a good example. Some believe humans can eventually accomplish what’s depicted in sci-fi, and in certain limited examples we already have. But many sci-fi stories simply don’t present a plausible reality. Taken as vicarious entertainment, they’re AOK superfine with me. But given that Western cultures (I can’t opine on cultures outside the West) have veered dangerously into rank ideation and believing their own hype, too many people believe fervently in aspirational futures that have no hope of ever instantiating. Just like giant pools of oil hidden under the Rocky Mountains (to cite something sent to me just today offering illusory relief from skyrocketing gasoline prices).

Among the many genres of narrative now on offer in fiction, there is no better example of sought-after power than the superhero story. Identifying with the technological and financial power of Ironman and Batman or the god-powers of Thor and Wonder Woman is thrilling, perhaps, but again, these are not plausible realities. Yet these superrich, superstrong, superintelligent superheroes are everywhere in fiction, attesting to liminal awareness of lack of power and indeed frailty. Many superhero stories are couched as coming-of-age stories for girls, who with grit and determination can fight toe-to-toe with any man and dominate. (Too many BS examples to cite.) Helps, of course, if the girl has magic at her disposal. Gawd, do I tire of these stories, told as origins in suffering, acquisition of skills, and coming into one’s own with the mature ability to force one’s will on others, often in the form of straight-up killing and assassination. Judge, jury, and executioner all rolled into one but entirely acceptable vigilantism if done wearing a supersuit and claiming spurious, self-appointed moral authority.

There are better narratives that don’t conflate power with force or lack plausibility in the world we actually inhabit. In a rather complicated article by Adam Tooze entitled “John Mearsheimer and the Dark Origins of Realism” at The New Statesman, after a lengthy historical and geopolitical analysis of competing narratives, a mode of apprehending reality is described:

… adopting a realistic approach towards the world does not consist in always reaching for a well-worn toolkit of timeless verities, nor does it consist in affecting a hard-boiled attitude so as to inoculate oneself forever against liberal enthusiasm. Realism, taken seriously, entails a never-ending cognitive and emotional challenge. It involves a minute-by-minute struggle to understand a complex and constantly evolving world, in which we are ourselves immersed, a world that we can, to a degree, influence and change, but which constantly challenges our categories and the definitions of our interests. And in that struggle for realism – the never-ending task of sensibly defining interests and pursuing them as best we can – to resort to war, by any side, should be acknowledged for what it is. It should not be normalised as the logical and obvious reaction to given circumstances, but recognised as a radical and perilous act, fraught with moral consequences. Any thinker or politician too callous or shallow to face that stark reality, should be judged accordingly.

When the Canadian Freedom Convoy appeared out of nowhere over a month ago and managed to bring the Canadian capital (Ottawa, Ontario) to a grinding halt, the news was reported with a variety of approaches. Witnessing “democracy” in action, even though initiated by a small but important segment of society, became a cause célèbre, some rallying behind the truckers as patriots and others deploring them as terrorists. Lots of onlookers in the middle ground, to be certain, but the extremes tend to define issues these days, dividing people into permafeuding Hatfields and McCoys. The Canadian government stupidly branded the truckers as terrorists, finally dispersing the nonviolent protest with unnecessary force. The Canadian model sparked numerous copycat protests around the globe.

One such copycat protest, rather late to the party, is The People’s Convoy in the U.S., which is still underway. Perhaps the model works only in the first instance, or maybe U.S. truckers learned something from the Canadian example, such as the illegal seizure of the Canadian convoy’s crowdfunded financial support. Or maybe the prospect of confronting the U.S. military in one of the most heavily garrisoned locations in the world gave pause. (Hard to imagine Ottawa, Ontario, being ringed by military installations like D.C. is.) Either way, The People’s Convoy has not attempted to blockade D.C. Nor has the U.S. convoy been widely reported as was the Canadian version, which was a grass-roots challenge to government handling of the pandemic. Yeah, there’s actually an underlying issue. Protesters are angry about public health mandates and so-called vaccine passports that create a two-tier society. Regular folks must choose between bodily autonomy and freedom of movement on one hand and on the other compliance with mandates that have yet to prove themselves effective against spread of the virus. Quite a few people have already chosen to do as instructed, whether out of earnest belief in the efficacy of mandated approaches or to keep from falling into the lower of the two tiers. So they socially distance, wear masks, take the jab (and follow-up boosters), and provide papers upon demand. Protesters are calling for all those measures to end.

If the Canadian convoy attracted worldwide attention, the U.S. convoy has hardly caused a stir and is scarcely reported outside the foreign press and a few U.S. superpatriot websites. I observed years ago about The Republic of Lakota that the U.S. government essentially stonewalled that attempt at secession. Giving little or no official public attention to the People’s Convoy, especially while attention has turned to war between Russia and Ukraine, has boiled down to “move along, nothing to see.” Timing for the U.S. truckers could not possibly be worse. However, my suspicion is that privately, contingency plans were made to avoid the embarrassment the Canadian government suffered, which must have included instructing the media not to report on the convoy and getting search engines to demote search results that might enable the movement to go viral, so to speak. The conspiracy of silence is remarkable. Yet people line the streets and highways in support of the convoy. Sorta begs the question “what if they threw a protest but no one came?” A better question might be “what if they started a war but no one fought?”

Gross (even criminal) mismanagement of the pandemic is quickly being shoved down the memory hole as other crises and threats displace a two-year ordeal that resulted in significant loss of life and even greater, widespread loss of livelihoods and financial wellbeing among many people who were already teetering on the edge. Psychological impacts are expected to echo for generations. Frankly, I’m astonished that a far-reaching civil crack-up hasn’t already occurred. Yet despite these foreground tribulations and more besides (e.g., inflation shifting into hyperinflation, food and energy scarcity, the financial system failing every few years, and the epistemological crisis that has made every institution flatly untrustworthy), the background crisis is still the climate emergency. Governments around the world, for all the pomp and circumstance of the IPCC and periodic cheerleading conferences, have stonewalled that issue, too. Some individuals take the climate emergency quite seriously; no government does, at least by their actions. Talk is comparatively cheap. Like foreground and background, near- and far-term prospects just don’t compete. Near-term appetites and desires always win. Psychologists report that deferred gratification (e.g., the marshmallow test) is among the primary predictors of future success for individuals. Institutions, governments, and societies are in aggregate mindless and can’t formulate plans beyond the next election cycle, academic year, or business quarter to execute programs that desperately need doing. This may well be why political theorists observe that liberal democracies are helpless to truly accomplish things, whereas authoritarian regimes centered on an individual (i.e., a despot) can get things done though at extreme costs to members of society.

As a sometime presenter of aphorisms, I take an interest in felicitous and humorous turns of phrase and logic as examples of heuristics aimed at parsimony and cognitive efficiency. Whether one recognizes those terms or not, everyone uses snap categorization and other shortcuts to manage and alleviate crowded thinking from overwhelming demands on perception. Most of us, most of the time, use sufficiency as the primary decision-making mode, which boils down to “close enough for horseshoes and hand grenades.” Emotion is typically the trigger, not rational analysis. After enough repetition is established, unthinking habit takes over. Prior to habituation, however, the wisdom of sages has provided useful rubrics to save unnecessary and pointless labor over casuistry flung into one’s way to impede, convince, or gaslight. (I previously wrote about this effect here.)

As categories, I pay close attention to razors, rules, laws, principles, and Zuihitsu when they appear as aphorisms in the writing of those I read and follow online. Famous rules, laws, and principles include Occam’s Razor, (Finagle’s Corollary to) Murphy’s Law, Godwin’s Law, Jevons’s Paradox, and the Dunning-Kruger Effect (do your own searches if these escape you). Some are quite useful at dispelling faulty thinking and argumentation. Café Bedouin (see blogroll) has an ongoing series of Zuihitsu, which has grown quite long. Many ring fundamentally true; others are either highly situational or wrong on their face, perhaps revealing the cardinal weakness of reduction of ideas to short, quotable phrases.

I recently learned of Hitchens’ Razor (after Christopher Hitchens), usually given as “What can be asserted without evidence can also be dismissed without evidence.” According to the Wikipedia entry, it may well have been reconstituted, repurposed, or revived from other sources stretching back into antiquity. Caitlin Johnson, a notable aphorist I’ve quoted numerous times, uses Hitchens’ Razor to put the lie to claims from the U.S. war machine and its dutiful media lapdogs that the “situation in Ukraine” (whatever that is) demands intervention by Western powers lest the utility bad guys of the moment, the Russians, be allowed to run roughshod over their neighbor Ukraine, which (significantly) used to be part of the now-defunct Soviet Union. As with many controversial, inflammatory claims and assertions continuously heaped like a dog pile on hapless U.S. citizens with little time, few resources, and no obligation to perform their own investigations and analyses, I have only weak opinions but very strong suspicions. That’s where Hitchens’ Razor comes in handy. Under its instruction, I can discard out of hand and in disbelief extraordinary claims designed to whip me and the wider public into an emotional frenzy and thus accept or support actions that shouldn’t just raise eyebrows but be met with considerable dissent, protest, and disobedience. Saves me a lot of time entertaining nonsense just because it gets repeated often enough to be accepted as truth (Bernays’ Principle).

I had that dream again. You know the one: I have to go take a final test in a class I forgot about, never attended, or dropped from my schedule. Most higher-ed students have this dream repeatedly, as do former students (or, for those who take the educational enterprise seriously as a life-long endeavor, perpetual students). The dream usually features open-ended anxiety because it’s all anticipation — one never steps into the classroom to sit for the test. But this time, the twist was that the final test transformed into a group problem-solving seminar. The subject matter was an arcane post-calculus specialty (maybe I’ve seen too many Big Bang Theory whiteboards strewn with undecipherable equations), and the student group was stumped trying to solve some sort of engineering problem. In heroic dream mode, I recontextualized the problem despite my lack of expertise, which propelled the group past its block. Not a true test of knowledge or understanding, since I hadn’t attended class and didn’t learn its subject matter, but a reminder that problem-solving is often not the straightforward application of factors easily set forth and manipulated.

Outside of the dream, in my morning twilight (oxymoron alert), I mused on the limitations of tackling social issues as though they were engineering problems, an approach that typically regards materials, processes, and personnel as mere resources to be marshaled and acted upon to achieve a goal but with little consideration — at least in the moment — of downstream effects or indeed human values. The Manhattan Project is a prime example, which (arguably) helped the Allied powers win WWII but launched the world into the Atomic Age, complete with its own Cold War and the awful specter of mutually assured destruction (MAD). Borrowing a term from economics, it’s easy to rationalize negative collateral effects in terms of creative destruction. I object: the modifier creative masks that the noun is still destruction (cracked eggs needed to make omelets, ya know). Otherwise, maybe the term would be destructive creation. Perhaps I misunderstand, but the breakthrough with the Manhattan Project came about through synthesis of knowledge that lay beyond the purview of most narrowly trained engineers.

That is precisely the problem with many social ills today, those that actually have solutions anyway. The political class meant to manage and administer them views problems primarily through a political lens (read: campaigning) and is not especially motivated to solve anything. Similarly, charitable organizations aimed at eradicating certain problems (e.g., hunger, homelessness, crime, educational disadvantage) can’t actually solve any problems because that would be the end of their fundraising and/or government funding, meaning that the organization itself would cease. Synthetic knowledge needed to solve a problem and then terminate the project is anathema to how society now functions; better that problems persist.

Past blog posts on this topic include “Techies and Fuzzies” and “The Man Who Knew Too Little,” each of which has a somewhat different emphasis. I’m still absorbed by the conflict between generalists and specialists while recognizing that both are necessary for full effectiveness. That union is the overarching message, too, of Iain McGilchrist’s The Master and His Emissary (2010), the subject of many past blog posts.

Further to this blog post, see this quote from Daniel Schwindt’s The Case Against the Modern World (2016), which will be the subject of a new book blogging project:

As Frank Herbert, the master of science fiction, once put it: “fear is the mind-killer.” And this is the precise truth, because a person acting in fear loses his capacity for judgment precisely insofar as he is affected by his fear. In fear, he does things that, in a peaceful frame of mind, he’d have found ridiculous. This is why we would expect that, if fear were to become a generalized condition in a civilization, knowledge itself would begin to deteriorate. [p. 35]