Archive for the ‘Idealism’ Category

I continue against my better judgment listening in fits and starts to Jordan Peterson on YouTube. No doubt he’s prolific, influential, interesting, infuriating, and by all accounts, earnest. I often come away frustrated, recognizing how I’ve been fed an extended line of BS in some sort of confidence game run by an overconfident intellectual bully. Because he’s the host inviting others onto his own platform, at least of late, everyone is very polite and disagreement — if it occurs — is quite tame, which allows Peterson to elide corrections smoothly. (Live conversation runs that way: piling on top of what was already said displaces and obscures ideas because memory is limited and the most recent utterance typically assumes primacy.) I avoid some topics on Peterson’s webcasts because they’re simply too far outside his expertise to be worthwhile, which he openly admits then stomps right in anyway. For example, Peterson has a series with the caption “Climategate” (putting the conclusion before the discussion, or is that biasing his audience?). Episode 329 (which I do not embed) is titled “The Models Are OK, the Predictions Are Wrong.” His guest is Dr. Judith Curry. I should have avoided this one, too. In the course of the 1.5-hour episode, Peterson repeatedly offers a characterization of some aspect of the climate emergency, to which Dr. Curry responds “I wouldn’t describe it quite that way.” Better characterizations may follow, but that’s neither the tone nor the takeaway.

One of Peterson’s contentions is that, if indeed humans inhabit and treat the surface of the planet problematically, the best way to address the problem is to raise out of poverty those billions of people still struggling to survive. Then they, too, will be ontologically secure and positioned to start caring more about the environment. Sure, just like all those secure, multimillionaire CEOs care while running corporations that extract resources and pollute. (Incidentally, someone in a recent DarkHorse Podcast Q&A asked if Peterson’s hypothetical solution makes any sense. Disappointingly, and perhaps because DarkHorse hosts are chummy with Peterson, they said it depends on how the solution is implemented, which I take to mean that the stars must align and everyone start rowing in unison. Yeah, right.) Peterson follows up his climate solution with the indignant question “Who are we to deny those struggling to raise themselves out of poverty their chance?” Which brings me round to the title of this multipart blog.

Survival is by no means an idle notion but poses a struggle everywhere, even in the affluent West. Just ask the burgeoning homeless population or those laboring frantically to keep mortgages or rent paid so they don’t also become homeless (unhoused is the new euphemism, fooling exactly no one). Even a casual look at history reveals that competition among peoples and nations to survive and prosper has wildly uneven and shifting results. Some “succeed” earlier than others or not at all, and winners may in time lose their preeminence. Never has there been an all-men-are-brothers approach to competition, though temporary alliances may form. Someone (us, not them) or something (profit, not unspoilt nature) is inevitably privileged. In this context, Peterson’s “Who are we to …?” question is a non sequitur, though it may pull on heartstrings because of the quite recent embrace of the idea of equity. A glib answer might be that “we are we, not them,” so of course “we” get available spoils before anyone else. Doesn’t the leader of a pack of wolves eat first? Isn’t that dynamic repeated throughout nature? Aren’t humans embedded in nature just like all other species? Don’t we privilege human life above, say, the food animals we farm for sustenance? (We eat them; they rarely eat us until we die and microbes — but nothing else — consume us. Or we give ourselves up to flames, denying even the microbes. We’re selfish that way.) It’s also why the rare individual who gives away all his or her money to charity and winds up penniless is regarded as mental. For nearly all of us, it’s always me (or my progeny) first. Another way to put this that Peterson should understand is that hierarchies exist in nature. Hierarchy and privilege are impossible to disentangle, and attempts at equitable redistribution born of ideology tend to devolve into tyranny.

On the opposite end of the spectrum, I heard another webcast where the interviewer (Nate Hagens, I believe) asked his guest, “What do you value (i.e., privilege) above all other things?” (The word all invites an unbalanced reply.) The extended answer rather took me aback. The guest values life in all its profundity yet declined to privilege human life. In the context of the webcast, which was about the climate emergency and anticipated human die-off and/or extinction, that answer sorta made sense. Should humans survive, even if we eventually sacrifice everything else (our current operational strategy)? Or do we leave the Earth to hardier competitors such as cockroaches and rats? Most people (humans) would unhesitatingly choose us over them as, well, um, always. It’s a strange hypothetical to ponder. Taken to its extreme, if one doesn’t privilege one life form over another, then what’s the problem with criminals, scavengers, and parasites winning the battle for survival? Or more colorfully, why not give zombies and vampires their bite at the apple? They may be undead, but their basic strategy for propagation is undoubtedly a winning one.

Continuing from pt. 01, the notion of privilege took an unexpected turn for me recently when the prospect, or rather inevitability, of demographic collapse came back onto my radar. I scoffed at the demographer’s crystal ball earlier not because I disagree with the assessments or numbers but because, like so many aspects of the collapse of industrial civilization, demographic collapse lies squarely beyond anyone’s control. My questions in reply are basically Yeah? And? So? A search on demographic collapse reveals a significant number of reports, with reporters themselves (including the worst contemporary media whore and glory hound, who shall forever remain unnamed at this site) freaking out and losing their collective minds over the hows, whys, wheres, and whens of the coming population crash. The most immediately worrisome aspect is the anticipated inversion from the youngest being the most numerous, the usual state of affairs, to the oldest being the most numerous and flatly unsupportable by the young. This has been the basic warning about Social Security for decades already: too few paying in, too many drawing benefits. See also this documentary film being prepared for imminent release. I suspect supporting annotations will appear in time.
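The Social Security warning is, at bottom, simple arithmetic. Here is a toy sketch (my own invented numbers, not drawn from any demographic report) of how shrinking successive birth cohorts invert the ratio of workers paying in to retirees drawing benefits:

```python
# Toy illustration with invented numbers: shrinking birth cohorts erode
# the ratio of workers paying in to retirees drawing benefits.

def project_cohorts(oldest_size, multiplier, generations):
    """Sizes of successive generations, oldest first; each cohort is
    the previous one scaled by a fertility multiplier."""
    sizes = [oldest_size]
    for _ in range(generations - 1):
        sizes.append(sizes[-1] * multiplier)
    return sizes

def support_ratio(cohorts):
    """Workers (all younger cohorts) per retiree (the oldest cohort)."""
    return sum(cohorts[1:]) / cohorts[0]

growing = project_cohorts(100, 1.25, 3)    # each generation larger
shrinking = project_cohorts(100, 0.8, 3)   # each generation smaller

# A growing population supports its elders far more easily:
# support_ratio(growing) ≈ 2.81 workers per retiree, while
# support_ratio(shrinking) ≈ 1.44 and keeps falling each generation.
```

The point of the sketch is only that the inversion is mechanical: once cohorts shrink, no policy lever changes the ratio quickly, since the workers of 2040 have already been (or not been) born.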

It’s probably not fair to call capitalism and its support structures a Ponzi scheme (latecomers to the scheme supporting earlier entrants), but the desire to perpetuate historical demographic distributions (as opposed to what? yielding to the drift of history?) is clearly part of the perpetual-growth mentality. Traditionally, prior to the 20th century in the West, when the vast majority of people participated in agrarian and/or subsistence economies instead of the money economy, children were desirable not least because they provided free labor until they were grown, were positioned to take over family farms and businesses, and cared for oldsters when that time came. Intergenerational continuity and stability were maintained, and it was generally accepted that each generation would supplant the previous through the seasons of life. The money economy destroyed most of that. Many young adult children now exercise their options (privilege) and break away as soon as possible (as I did) in search of economic opportunity in cities, forming their own families (or not, as I didn’t). Estrangement and abandonment may not be complete, but families being spread across the continent certainly limits extended-family cohesion to holidays and occasional visits. Oldsters (in the affluent West, anyway) are now typically shuttled off to (euphemism alert) retirement homes to be warehoused prior to dying. Easier to pay someone to perform that service than to do it oneself, apparently. So if older people are currently privileged over the young (in some ways at least), that condition is being reversed because, dammit, children are in short supply yet needed desperately to keep growth on track.

Demographers point to a number of factors that have conspired to create the crisis (one of many interlocking crises some intellectuals have begun calling the polycrisis). The two main factors are declining fertility and reproductive choice. My suspicion is that the toxic environment, one result of centuries of industrial activity with plastics and drugs now found in human bodies where they don’t belong, accounts for many fertility issues. Add to that poor food quality (i.e., malnutrition, not just poor diets) and it’s easy to understand why healthy pregnancies might be more difficult in the 21st century than before. I’m not qualified to support that assessment, so take it for what it’s worth. Reproductive choice, another recently acquired female privilege (in historical terms), is a function of several things: financial independence, educational attainment, and availability of birth control. Accordingly, more women are choosing either to defer having children while they establish careers or not to have children at all. (Men make those choices, too.) Delays unexpectedly leave lots of women unable to have children, whether for failure to find a suitable mate or for having spent their best reproductive years doing other things. As I understand it, these correlations are borne out in some harrowing statistics. As the polycrisis deepens, a philosophical choice not to bring children into the world (just to suffer and die young) is also a motivation to remain childless.

(more…)

My inquiries into media theory long ago led me to Alan Jacobs and his abandoned, reactivated, then reabandoned blog Text Patterns. Jacobs is a promiscuous thinker and even more promiscuous technologist in that he has adopted and abandoned quite a few computer apps and publishing venues over time, offering explanations each time. Always looking for better tools, perhaps, but this roving public intellectual requires persistent attention lest one lose track of him. His current blog (for now) is The Homebound Symphony (not on my ruthlessly short blogroll), which is updated roughly daily, sometimes with linkfests or simply an image, other times with thoughtful analysis. Since I’m not as available as most academics to spend all day reading and synthesizing what I’ve read to put into a blog post, college class, or book, I am not on any sort of schedule and only publish new blog posts when I’m ready. Discovered in my latest visit to The Homebound Symphony was a plethora of super-interesting subject matter, which I daresay is relevant to the more literate and literary among us. Let me draw out the one that most piqued my interest. (That was the long way of tipping my hat to Jacobs for the link.)

In an old (by Internet standards) yet fascinating book review by Michael Walzer of Siep Stuurman’s The Invention of Humanity: Equality and Cultural Difference in World History (2017), Walzer describes the four inequalities that have persisted throughout human history, adding a fifth identified by Stuurman:

  • geographic inequality
  • racial inequality
  • hierarchical inequality
  • economic inequality
  • temporal inequality

I won’t unpack what each means if they’re not apparent on their face. Read for yourself. Intersections and overlapping are common in taxonomies of this sort, so don’t expect categories to be completely separate and distinct. The question of equality (or its inverse, inequality) is a fairly recent development, part of a stew of 18th-century thought in the West that was ultimately distilled to the famous phrase “all men are created equal.” Seems obvious, but the phrase is fraught, and we’ve never really been equal, have we? So is it equal before god? Equal before the law? Equal in all opportunities and outcomes, as social justice warriors now insist? On a moment’s inspection, no one can possibly believe we’re all equal despite aspirations that everyone be treated fairly. The very existence of perennial inequalities puts the lie to any notion of equality trucked in with the invention of humanity during the Enlightenment.

To those inequalities I would add a sixth: genetic inequality. Again, overlap with the others is acknowledged, but it might be worth observing that divergent inherited characteristics (other than wealth) appear quite early in life among siblings and peers, before most others manifest. By that, I certainly don’t mean race or sex, though differences clearly exist there as well. Think instead of intelligence, height, beauty, athletic ability, charisma, health and constitution, and even longevity (life span). Each of us has a mixture of characteristics that are plainly different from those of others and that provide either springboards or disadvantages. Just as it’s unusual to find someone in possession of all positive characteristics at once — the equivalent of rolling an 18 for each attribute of a new D&D character — few possess all negatives (straight 3s), either. Also, there’s probably no good way to rank best to worst, strongest to weakest, or most to least successful. Bean counters from one discipline or another might try, but that runs counter to the mythology “all men are created equal” and thus becomes a taboo to acknowledge, much less scrutinize.
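For what it’s worth, the long odds of that lottery are easy to compute for the D&D analogy (my arithmetic, assuming only the standard 3d6 roll per ability score):

```python
from fractions import Fraction

# Odds of rolling the maximum ability score (three sixes on 3d6) once,
# and of doing so for all six attributes of a single character.
p_max_one = Fraction(1, 6) ** 3      # 1/216 for a single attribute
p_max_all = p_max_one ** 6           # roughly 1 in 10^14 characters

# By symmetry, the all-minimum roll (straight ones on every die) is
# equally unlikely: nearly everyone lands somewhere in the messy middle.
```

Which is the point of the analogy: the extremes at either end are vanishingly rare, and the rest of us carry mixed hands.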

What to do with the knowledge that all men are not in fact created equal and never will be? That some are stronger; more charming; smarter; taller with good teeth (or these days, dentists), hair, vision, and square jaws; luckier in the genetic lottery? Well, chalk it up, buddy. We all lack some things and possess others.

A bunch of unrelated things I have been reading and hearing suddenly and rather unexpectedly came together synthetically. The profusion is too great to provide a handy set of links, and backstage analytics indicate that almost no one follows the links I provide anyway, so this will be free form.

Hyperanalysis

As I have previously blogged, peering (or staring) too intently at a physical object or subject of inquiry tends to result in the object or subject being examined at the wrong resolution. An obtuse way of restating this is that one can’t study the cosmos under a microscope or cell biology through a telescope. The common mistake is to study minutiae and fail to perceive the whole, rarely the reverse. Iain McGilchrist suggests that hyperanalysis is a hallmark of an imbalance in brain lateralization, where the left brain (the Emissary) takes primacy over the right brain (the Master). Others building on McGilchrist’s thesis have theorized that human culture and styles of cognition have swung in and out of proper balance periodically throughout history. One theory used portraiture to support how depiction of the human face can be either humanistic or clinical in orientation. What artists perceive, then produce, yields divergent aesthetics in which the eyes and mouth in particular suggest different ways of encountering the world. Using art to develop theories of psychohistory fits well with one of the preoccupations of this blog, namely, the nature of consciousness.

Intervention

Armed with a precise appreciation of some small aspect of a whole, often abstracted and idealized independently of easily observed actuality, some make the leap that human affairs can be acted upon confidently with entirely foreseeable outcomes. Thus, incautious decision-makers intervene to correct, adjust, or game areas within their concern and control. The applicable aphorism that restrains cautious people is “fools rush in where angels fear to tread.” Perhaps the Dunning-Kruger Effect helps to explain (at least in part) why actors plunge forward despite ample evidence that outcomes don’t adhere reliably to planning. Indeed, the rise and fall of many institutions demonstrate the fallibility of interventions in dynamic systems. Just ask the Federal Reserve. Periodic recourse to warfare is also characterized as a breakdown of peaceful stratagems accompanied by a fog in which clarity of purpose is lost. Faced with deteriorating conditions, the demand that something be done, not nothing, also testifies to human frailty and frequent refusal to accept unknown contingency or inevitability.

Micromanagement

When ideology truly misfires, strenuous interventions take the form of tyranny in an attempt to force resistant outcomes into predetermined channels. There are too many awful examples of abject failure on that account to mention. Small-scale tyranny might be better recognized as micromanagement. Control freaks adopt that behavior out of any number of motivations, such as distrust of others’ competence, inability to delegate or share responsibility, and hero syndrome. Ironically, the targets of control are often petty and do not contribute meaningfully to enhanced behavior or function. Rather, micromanagers fixate on a subsidiary metric as a proxy for overall health, wellbeing, and effectiveness.

As people encounter ideologies and learn to systematize, they are especially prone to using incomplete understandings of complex systems and then seizing upon one intervention or another to attempt to solve problems. This is not limited to students in the early years of college who discover a supposed Rosetta stone for fixing ills but rather includes any reform political candidate or newly minted CEO with a mandate to remove corruption, dead weight, or inefficiency. This is also the domain of specialists, experts, and academics who have concentrated their efforts in a narrow subfield and feel confident generalizing that acquired skill and applied knowledge outside the area of focus. Of course, it’s warranted to rely on brain surgeons for brain surgery and car mechanics for automotive repairs. But no one expects them to offer advice on intractable social problems, correct historical wrongs, or develop a centered philosophy of anything. Indeed, institutions and societies are so complex and inherently unsteerable that, despite many futile attempts, no one has ever developed a comprehensive understanding sufficient to engineer programs that lead anywhere remotely in the vicinity of utopia. Yet with an abundance of confidence, agitators and activists — sometimes quite young but unafraid to instruct their elders — seek to implement their ideological designs, by force if necessary, to enact desired change. Maybe those changes are considered humble baby steps toward improved social justice and greater equity, tinkering around the edges perhaps, but I doubt most ambitions are so constrained. While that energy is absolutely necessary in a pluralistic society to avoid cynicism and stagnation, it often becomes a victim of its own success when radicals gain power and impose orthodoxies that are ultimately self-defeating and thus short-lived. History is full of movements, civil wars, and revolutions that demonstrate the point. Dystopian fiction also forewarns how tyrannies develop out of misguided application of ideology and power.

I probably haven’t described any too well the power law that coalesced in my thinking. Nor do I pretend to have solutions for anything. As I’ve often written, given the myriad problems global civilization now faces, it’s well nigh impossible to know what to wish for with much clarity since deliverance from one set of problems often exacerbates others. World systems theorists try to piece together various dynamics into a coherent unified theory, and I admire the effort to understand how the world really works, but I still subscribe to the precautionary principle when it comes to implementing globe-spanning programs.

This blog has never been obliged to observe every passing holiday or comment on celebrity deaths or public events via press release, public statement, command performance, ritual oversharing, or other characterization more closely associated with institutions and public figures who cannot keep from thrusting themselves wantonly onto the public despite having nothing of value to say. The chattering class maintains noise levels handily, so no need to add my voice to that cacophonous chorus. To wit, the recent Thanksgiving holiday prompts each of us every year to take stock anew and identify some area(s) of contentedness and gratitude, which can be challenging considering many Americans feel abandoned, betrayed, or worse as human history and civilization lurch despotically toward their end states. However, one overheard statement of gratitude this year made a strong impression on me, and as is my wont, I couldn’t help but connect a variety of disparate ideas. Let me digress, starting with music.

Decades ago, the London Philharmonic under Jorge Mester recorded a collection of fanfares commissioned during WWII. American composers represented include (in no particular order) Henry Cowell, Howard Hanson, Roy Harris, Morton Gould, Leonard Bernstein, Virgil Thomson, and Walter Piston. None of their respective fanfares has entered the standard repertoire. However, the sole composer whose stirring fanfare has become legitimate and instantly recognizable Americana is Aaron Copland. His fanfare celebrates no famous figure or fighting force but rather the common man. Copland’s choice to valorize the common man was a masterstroke, and the music possesses an appealing directness and simplicity that are unexpectedly difficult to achieve in composition. Far more, um, common is elaborate, noisy surface detail that fails to please the ear nearly so well as Copland’s stately fanfare. Indeed, the album is called Twenty Fanfares for the Common Man even though that title only applies to Copland’s entry.

The holiday comment that stuck with me was a son’s gratitude for the enduring example set by his father, a common man. Whether one is disposed to embrace or repudiate the patriarchy, there can be no doubt that a father’s role within a family and community is unique. (So, too, is the mother’s. Relax, it’s not a competition; both are important and necessary.) The father-protector and father-knows-best phase of early childhood is echoed in the humorous observation that a dog sees its master as a god. Sadly, the my-dad-can-beat-up-your-dad taunt lives on, transmuted in … superhero flicks. As most of us enter adulthood, coming to terms with the shortcomings of one or both parents (nobody’s perfect …) is part of the maturation process: establishing one’s own life and identity independent of yet somehow continuous with those of one’s parents. So it’s not unusual to find young men in particular striking out on their own, distancing from and disapproving of their fathers (sometimes sharply) but later circling back to reflect and reconcile. How many of us can honestly express unalloyed admiration for our fathers and their character examples? I suspect frustration when feet of clay are revealed is more typical.

(more…)

The difference between right and wrong is obvious to almost everyone by the end of kindergarten. Temptations persist and everyone does things great and small known to be wrong when enticements and advantages outweigh punishments. C’mon, you know you do it. I do, too. Only at the conclusion of a law degree or the start of a political career (funny how those two often coincide) do things get particularly fuzzy. One might add military service to those exceptions except that servicemen are trained not to think, simply do (i.e., follow orders without question). Anyone with functioning ethics and morality also recognizes that in legitimate cases of things getting unavoidably fuzzy in a hypercomplex world, the dividing line often can’t be established clearly. Thus, venturing into the wide, gray, middle area is really a signal that one has probably already gone too far. And yet, demonstrating that human society has not really progressed ethically despite considerable gains in technical prowess, egregiously wrong things are getting done anyway.

The whopper of which nearly everyone is guilty (thus, guilty pleasure) is … the Whopper. C’mon, you know you eat it. I know I do. Of course, the irresistible and ubiquitous fast-food burger is really only one example of a wide array of foodstuffs known to be unhealthy, cause obesity, and pose long-term health problems. Doesn’t help that, just like Big Tobacco, the food industry knowingly refines its products (processed foods, anyway) to be hyperstimuli impossible to ignore or resist unless one is iron-willed or develops an eating disorder. Another hyperstimulus most can’t escape is the smartphone (or a host of other electronic gadgets). C’mon, you know you crave the digital pacifier. I don’t, having managed to avoid that particular trap. For me, electronics are always only tools. However, railing against them with respect to how they distort cognition (as I have) convinces exactly no one, so that argument goes on the deferral pile.

Another giant example, not in terms of participation but in terms of effect, is the capitalist urge to gather to oneself as much filthy lucre as possible only to sit heartlessly on top of that nasty dragon’s hoard while others suffer in plain sight all around. C’mon, you know you would do it if you could. I know I would — at least up to a point. Periods of gross inequality come and go over the course of history. I won’t make direct comparisons between today and any one of several prior Gilded Ages in the U.S., but it’s no secret that the existence today of several hundy-billionaires and an increasing number of mere multibillionaires represents a gross misallocation of financial resources: funneling the productivity of the masses (and fiat dollars whiffed into existence with keystrokes) into the hands of a few. Fake philanthropy to launder reputations fails to convince me that such folks are anything other than miserly Scrooges fixated on maintaining and growing their absurd wealth, influence, and bogus social status at the cost of their very souls. Seriously, who besides sycophants and climbers would even want to be in the same room as one of those people (names withheld)? Maybe better not to answer that question.

(more…)

Cynics knew it was inevitable: weaponized drones and robots. Axon Enterprise, Inc., maker of police weaponry (euphemistically termed “public safety technologies”), announced its development of taser-equipped drones presumed capable of neutralizing an active shooter inside of 60 seconds. Who knows what sorts of operating parameters restrict their functions, or whether they can be made invulnerable to hacking or barred from use as offensive weapons?

A sane, civilized society would recognize that, despite bogus memes about an armed society being a polite society, the prospect of everyone being strapped (like the fabled Old American West) and public spaces (schools, churches, post offices, laundromats, etc.) each being outfitted with neutralizing technologies is fixing the wrong problem. But we are no longer a sane society (raising the question whether we ever were). So let me suggest something radical yet obvious: the problem is not technological, it’s cultural. The modern world has made no progress with respect to indifference toward the suffering of others. Dehumanizing attitudes and technologies are no longer, well, medieval, but they’re no less cruel. For instance, people are not put in public stocks or drawn and quartered anymore, but they are shamed, cancelled, tortured, terrorized, propagandized, and abandoned in other ways that allow maniacs to pretend to others and to themselves that they are part of the solution. Hard to believe that one could now feel nostalgia for the days when, in the aftermath of yet another mass shooting, calls for gun control were met with inaction (other than empty rhetoric) rather than escalation.

The problem with diagnosing the problem as cultural is that no one is in control. Like water, culture flows where it will, apparently seeking its lowest level. Attempts to channel, direct, and uplift culture might work on a small scale, but at the level of society — and with the distorted incentives freedom is certain to deliver — malefactors are guaranteed to appear. Indeed, anything that contributes to the arms race (now tiny, remote-controlled, networked killing devices rather than giant atomic/nuclear ones) only invites greater harm and is not a solution. Those maniacs (social and technical engineers promising safety) have the wrong things wrong.

Small, insular societies with strict internal codes of conduct may have figured out something that large, free societies have not, namely, that mutual respect, knowable communities, and repudiation of advanced technologies give individuals something and someone to care about, a place to belong, and things to do. When the entire world is thrown open, such as with social media, populations become atomized and anonymized, unable to position or understand themselves within a meaningful social context. Anomie and nihilism are often the rotten fruit. Splintered family units, erosion of community involvement, and dysfunctional institutions add to the rot. Those symptoms of cultural collapse need to be addressed even if they are among the most difficult wrong things to get right.

This intended follow-up has been stalled (pt. 1 here) for one simple reason: the premise presented in the embedded YouTube video is (for me at least) easy to dismiss out of hand and I haven’t wanted to revisit it. Nevertheless, here’s the blurb at the top of the comments at the webpage:

Is reality created in our minds, or are the things you can touch and feel all that’s real? Philosopher Bernardo Kastrup holds doctorates in both philosophy and computer science, and has made a name for himself by arguing for metaphysical idealism, the idea that reality is essentially a mental phenomenon.

Without going into point-by-point discussion, the top-level assertion, if I understand it correctly (not assured), is that material reality comes out of mental experience rather than the reverse. It’s a chicken-and-egg question with materialism and idealism (fancy technical terms not needed) each vying for primacy. The string of conjectures (mental gymnastics, really, briefly impressive until one recognizes how quickly they lose correlation with how most of us think about and experience reality) that inverts the basic relationship of inner experience to outer reality is an example of waaaay overthinking a problem. No doubt quite a lot of erudition can be brought to bear on such questions, but even if those questions were resolved satisfactorily on an intellectual level and an internally coherent structure or system were developed or revealed, it doesn’t matter or lead anywhere. Humans are unavoidably embodied beings. Each individual existence also occupies a tiny sliver of time (the timeline extending in both directions to infinity). Suggesting that mental experience is briefly instantiated in personhood but is actually drawn out of some well of souls, collective consciousness, or panpsychism, and rejoins it in heaven, hell, or elsewhere upon expiration, is essentially a religious claim. It’s also an attractive supposition, granting each of us not permanence or immortality but rather something somehow better (?) though inscrutable because it lies beyond perception (but not conceptualization). Except for an eternity of torments in hell, I guess, if one deserves that awful fate.

One comment about Kastrup. He presents his perspective (his findings?) with haughty derision of others who can’t see or understand what is so (duh!) obvious. He falls victim to the very same over-intellectualized flim-flam he mentions when dismissing materialists who need miracles and shortcuts to smooth over holes in their scientific/philosophical systems. The very existence of earnest disagreement by those who occupy themselves with such questions might suggest some humility, as in “here’s my explanation, they have theirs, judge for yourself.” But there’s a third option: the great unwashed masses (including nearly all our ancestors) for whom such questions are never even fleeting thoughts. It’s all frankly immaterial (funnily, both the right and wrong word at once). Life is lived and experienced fundamentally on its surface — unless, for instance, one has been incubated too long within the hallowed halls of academia, lost touch with one’s brethren, and become preoccupied with cosmic investigations. Something quite similar happens to politicians and the wealthy, who typically hyperfocus on gathering to themselves power and then exercising that power over others (typically misunderstood as simply pulling the levers and operating the mechanisms of society). No wonder their pocket of reality looks so strikingly different.

Let me first restate axioms developed in previous blog posts. Narrative is the essential outward form of consciousness. Cognition has many preverbal and nonverbal subtleties, but the exchange of ideas occurs predominantly through narrative, and the story of self (told to oneself) can be understood as stream of consciousness: ongoing self-narration of sensations and events. The principal characteristic of narrative, at least that which is not pure fantasy, is in-the-moment sufficiency. Snap-judgment heuristics are merely temporary placeholders until, ideally at least, thoughtful reconsideration and revision that take time and discernment can be brought to bear. Stories we tell and are told, however, often do not reflect reality well, partly because our perceptual apparatuses are flawed, partly because individuals are untrained and unskilled in critical thinking (or overtrained and distorted), and partly because stories are polluted with emotions that make clear assessments impossible (to say nothing of malefactors with agendas). Some of us struggle to remove confabulation from narrative (as best we can) whereas others embrace it because it’s emotionally gratifying.

A good example of the reality principle is the recognition, similar to that of the 1970s energy crisis, that energy supplies don't magically appear by simply digging and drilling more of the stuff out of the ground. Those easy-to-get resources have been plundered already. The term peak oil refers to the eventual decline in energy production (harvesting, really) when the easy stuff is more than half gone and undiminished (read: increasing) demand impels energy companies to go in search of more exotic supply (e.g., underwater or embedded in shale). If that reality is dissatisfying, a host of dreamt-up stories offer us deliverance from inevitable decline and reduction of lifestyle prerogatives by positing extravagant resources in renewables, hydrogen fuel cells, fusion (not to be confused with fission), or as-yet unexploited regions such as the Arctic National Wildlife Refuge. None of these represents a plausible reality (except going into heretofore protected regions and bringing ecological devastation).

The relationship of fictional stories to reality is quite complex. For this blog post, a radically narrow description is that fiction is the imaginary space where ideas can be tried out and explored safely in preparation for implementation in reality. Science fiction (i.e., imagining interstellar space travel despite its flat impossibility in Newtonian physics) is a good example. Some believe humans can eventually accomplish what's depicted in sci-fi, and in certain limited examples we already have. But many sci-fi stories simply don't present a plausible reality. Taken as vicarious entertainment, they're AOK superfine with me. But given that Western cultures (I can't opine on cultures outside the West) have veered dangerously into rank ideation and believing their own hype, too many people believe fervently in aspirational futures that have no hope of ever instantiating. Just like giant pools of oil hidden under the Rocky Mountains (to cite something sent to me just today offering illusory relief from skyrocketing gasoline prices).

Among the many genres of narrative now on offer in fiction, there is no better example of sought-after power than the superhero story. Identifying with the technological and financial power of Ironman and Batman or the god-powers of Thor and Wonder Woman is thrilling, perhaps, but again, these are not plausible realities. Yet these superrich, superstrong, superintelligent superheroes are everywhere in fiction, attesting to liminal awareness of lack of power and indeed frailty. Many superhero stories are couched as coming-of-age stories for girls, who with grit and determination can fight toe-to-toe with any man and dominate. (Too many BS examples to cite.) Helps, of course, if the girl has magic at her disposal. Gawd, do I tire of these stories, told as origins in suffering, acquisition of skills, and coming into one's own with the mature ability to force one's will on others, often in the form of straight-up killing and assassination. Judge, jury, and executioner all rolled into one but entirely acceptable vigilantism if done wearing a supersuit and claiming spurious, self-appointed moral authority.

There are better narratives that don’t conflate power with force or lack plausibility in the world we actually inhabit. In a rather complicated article by Adam Tooze entitled “John Mearsheimer and the Dark Origins of Realism” at The New Statesman, after a lengthy historical and geopolitical analysis of competing narratives, a mode of apprehending reality is described:

… adopting a realistic approach towards the world does not consist in always reaching for a well-worn toolkit of timeless verities, nor does it consist in affecting a hard-boiled attitude so as to inoculate oneself forever against liberal enthusiasm. Realism, taken seriously, entails a never-ending cognitive and emotional challenge. It involves a minute-by-minute struggle to understand a complex and constantly evolving world, in which we are ourselves immersed, a world that we can, to a degree, influence and change, but which constantly challenges our categories and the definitions of our interests. And in that struggle for realism – the never-ending task of sensibly defining interests and pursuing them as best we can – to resort to war, by any side, should be acknowledged for what it is. It should not be normalised as the logical and obvious reaction to given circumstances, but recognised as a radical and perilous act, fraught with moral consequences. Any thinker or politician too callous or shallow to face that stark reality, should be judged accordingly.

I had that dream again. You know the one: I have to go take a final test in a class I forgot about, never attended or dropped from my schedule. Most higher-ed students have this dream repeatedly, as do former students (or for those who take the educational enterprise seriously as a life-long endeavor, perpetual students). The dream usually features open-ended anxiety because it’s all anticipation — one never steps into the classroom to sit for the test. But this time, the twist was that the final test transformed into a group problem-solving seminar. The subject matter was an arcane post-calculus specialty (maybe I’ve seen too many Big Bang Theory whiteboards strewn with undecipherable equations), and the student group was stumped trying to solve some sort of engineering problem. In heroic dream mode, I recontextualized the problem despite my lack of expertise, which propelled the group past its block. Not a true test of knowledge or understanding, since I hadn’t attended class and didn’t learn its subject matter, but a reminder that problem-solving is often not straight application of factors easily set forth and manipulable.

Outside of the dream, in my morning twilight (oxymoron alert), I mused on the limitations of tackling social issues as though they were engineering problems, an approach that typically regards materials, processes, and personnel as mere resources to be marshaled and acted upon to achieve a goal but with little consideration — at least in the moment — of downstream effects or indeed human values. The Manhattan Project is a prime example, which (arguably) helped the Allied powers win WWII but launched the world into the Atomic Age, complete with its own Cold War and the awful specter of mutually assured destruction (MAD). Borrowing a term from economics, it's easy to rationalize negative collateral effects in terms of creative destruction. I object: the modifier creative masks that the noun is still destruction (cracked eggs needed to make omelets, ya know). Otherwise, maybe the term would be destructive creation. Perhaps I misunderstand, but the breakthrough with the Manhattan Project came about through synthesis of knowledge that lay beyond the purview of most narrowly trained engineers.

That is precisely the problem with many social ills today, those that actually have solutions anyway. The political class, meant to manage and administer, views problems primarily through a political lens (read: campaigning) and is not especially motivated to solve anything. Similarly, charitable organizations aimed at eradicating certain problems (e.g., hunger, homelessness, crime, educational disadvantage) can't actually solve any problems because that would be the end of their fundraising and/or government funding, meaning that the organizations themselves would cease. The synthetic knowledge needed to solve a problem and then terminate the project is anathema to how society now functions; better that problems persist.

Past blog posts on this topic include “Techies and Fuzzies” and “The Man Who Knew Too Little,” each of which has a somewhat different emphasis. I’m still absorbed by the conflict between generalists and specialists while recognizing that both are necessary for full effectiveness. That union is the overarching message, too, of Iain McGilchrist’s The Master and His Emissary (2010), the subject of many past blog posts.