Posts Tagged ‘Epistemology’

In my preparations for a speech to be given in roughly two months, I stumbled across a prescient passage in an essay entitled “Jesuitism” from Latter-Day Pamphlets (1850) by Thomas Carlyle. Connect your own dots as this is offered without comment.

… this, then, is the horrible conclusion we have arrived at, in England as in all countries; and with less protest against it hitherto, and not with more, in England than in other countries? That the great body of orderly considerate men; men affecting the name of good and pious, and who, in fact, excluding certain silent exceptionary individuals one to the million, such as the Almighty Beneficence never quite withholds, are accounted our best men,–have unconsciously abnegated the sacred privilege and duty of acting or speaking the truth; and fancy that it is not truth that is to be acted, but that an amalgam of truth and falsity is the safe thing. In parliament and pulpit, in book and speech, in whatever spiritual thing men have to commune of, or to do together, this is the rule they have lapsed into, this is the pass they have arrived at. We have to report that Human Speech is not true! That it is false to a degree never witnessed in this world till lately. Such a subtle virus of falsity in the very essence of it, as far excels all open lying, or prior kinds of falsity; false with consciousness of being sincere! The heart of the world is corrupted to the core; a detestable devil’s-poison circulates in the life-blood of mankind; taints with abominable deadly malady all that mankind do. Such a curse never fell on men before.

For the falsity of speech rests on a far deeper falsity. False speech, as is inevitable when men long practise it, falsifies all things; the very thoughts, or fountains of speech and action become false. Ere long, by the appointed curse of Heaven, a man’s intellect ceases to be capable of distinguishing truth, when he permits himself to deal in speaking or acting what is false. Watch well the tongue, for out of it are the issues of life! O, the foul leprosy that heaps itself in monstrous accumulation over Human Life, and obliterates all the divine features of it into one hideous mountain of purulent disease, when Human Life parts company with truth; and fancies, taught by Ignatius or another, that lies will be the salvation of it! We of these late centuries have suffered as the sons of Adam never did before; hebetated, sunk under mountains of torpid leprosy; and studying to persuade ourselves that this is health.

And if we have awakened from the sleep of death into the Sorcerer’s Sabbath of Anarchy, is it not the chief of blessings that we are awake at all? Thanks to Transcendent Sansculottism and the long-memorable French Revolution, the one veritable and tremendous Gospel of these bad ages, divine Gospel such as we deserved, and merciful too, though preached in thunder and terror! Napoleon Campaignings, September Massacres, Reigns of Terror, Anacharsis Clootz and Pontiff Robespierre, and still more beggarly tragicalities that we have since seen, and are still to see: what frightful thing were not a little less frightful than the thing we had? Peremptory was our necessity of putting Jesuitism away, of awakening to the consciousness of Jesuitism. ‘Horrible,’ yes: how could it be other than horrible? Like the valley of Jehoshaphat, it lies round us, one nightmare wilderness, and wreck of dead-men’s bones, this false modern world; and no rapt Ezekiel in prophetic vision imaged to himself things sadder, more horrible and terrible, than the eyes of men, if they are awake, may now deliberately see. Many yet sleep; but the sleep of all, as we judge by their maundering and jargoning, their Gorham Controversies, street-barricadings, and uneasy tossings and somnambulisms, is not far from ending. Novalis says, ‘We are near awakening when we dream that we are dreaming.’ [italics in original]

Well, dammit! Guess I’m gonna have to add a SWOTI tag after all. Obviously, I’ve been paying too much attention to bogus pronouncements by economists.

/rant on

Yet more fools confidently stating that climate change is not really a serious concern have me gasping in exasperation. Take, for instance, this astounding paragraph by Egon von Greyerz:

Yes, of course global warming has taken place recently as the effect of climate cycles. But the cycle has just peaked again which means that all the global warming activists will gradually cool down with the falling temperatures in the next few decades. The sun and the planets determine climate cycles and temperatures, like they have for many millions of years, and not human beings. [emphasis added]

So no climate change worries to disturb anyone’s dreams. Sleep soundly. I’m so relieved. All the effort expended over the past decades toward understanding climate change can be waved off with a mere three sentences by a motivated nonexpert. The linked webpage offers no support whatsoever for these bald statements but instead goes on to offer economic prophecy (unironically, of certain doom). For minimal counter-evidence regarding climate change, embedded below is a two-year-old video explaining how some regions are expected to become uninhabitable due to high wet-bulb temperatures.

The article ends with these brief paragraphs:

There is no absolute protection against this scenario [economic collapse] since it will hit all aspects of life and virtually all people. Obviously, people living off the land in remote areas will suffer less whilst people in industrial and urban areas will suffer considerably.

The best financial protection is without hesitation physical gold and some silver. These metals are critical life insurance. But there are clearly many other important areas of protection to plan for. A circle of friends and family is absolutely essential. [emphasis in original]

Ok, so I’m wrong: the guy’s not an economist at all; he’s a salesman. Having dismissed one catastrophe only to trot out another, he makes his scaremongering message clear: buy gold and silver. Might not be a bad idea, actually, but that won’t protect against TEOTWAWKI. So whose eyes are deceiving them, Egon’s or mine (or yours)? He’s selling precious metals; I’m sharing the truth (best as I can ascertain, anyway).

The other idiotic thing to darken my brow was several actual economists being asked about the economic effects of implementing Greta Thunberg’s dream world (sarcasm much?). If her dream world is spelled out somewhere, I haven’t seen it, nor is it provided (link or otherwise) in the article. Seems like the sort of invented argument attached to a trending name for the purpose of clickbait, attacking the messenger and thus shooting down her message. However, let me be generous for a moment and suggest that efforts to stop climate change include, at a minimum, getting off fossil fuels, reforming Big Ag, and denying developing nations their quest to join the First-World Age of Abundance. Those are the three subjects discussed in the article. These economists’ conclusion? It will be, um, costly. Well, yeah, true! Very costly indeed. I agree entirely. But what of the cost if those things aren’t done? Isn’t that question implied? Isn’t that what Greta Thunberg has insisted upon? The answer is that it will cost far more, though perhaps not in anything so cravenly, readily quantifiable as profit or cost. Referring again to the embedded video above, it will cost us the very habitability of the planet, and not in just a few restricted regions we can add to existing sacrifice zones. Widespread species dislocation and die-off will include the human species, since we rely on all the others. Some prophesy a human death pulse of monstrous proportions (several billion, up to perhaps 90% of us) or even near-term human extinction. Is that costly enough to think about the problem differently, urgently, as Greta Thunberg does? Might the question be better framed as the cost of not implementing Greta Thunberg’s dream world so that economists are sent off on a different analytical errand?

In the middle of the 19th century, Scottish satirist Thomas Carlyle called economics The Dismal Science, a description that stuck. The full context of that coinage may have had more to do with slavery than with poor scholarship, so in the context of lying or at least misleading with numbers, I propose instead calling it The Deceitful Science. Among the stupid habits to dispel is the risible notion that, by measuring something as a means of understanding it, we grasp its fullness and, concomitantly, what’s really important. I suggest further that most economists deceive themselves by performing a fundamentally wrong kind of analysis.

The issue of deceit is of some importance beyond getting at the truth of climate change. Everything in the public sphere these days is susceptible to spin, massage, and reframing to such a degree that an epistemological crisis (my apt term) has fundamentally altered sense making, with the result that most nonexperts simply don’t know what to believe anymore. Economists are doing no one any favors digressing into areas beyond their Deceitful Science.

/rant off

Delving slightly deeper after the previous post into someone-is-wrong-on-the-Internet territory (worry not: I won’t go far down this path), I was dispirited after reading some economist dude with the overconfident hubris to characterize climate change as fraud. At issue, he claims, is the misframing of proper time periods in graphical data for the purpose of overthrowing government and altering the American way of life. (Um, that’s the motivation? Makes no sense.) Perhaps this fellow’s intrepid foray into the most significant issue of our time (only to dismiss it) is an aftereffect of Freakonomics emboldening economists to offer explanations and opinions on matters well outside their field of expertise. After all, truly accurate, relevant information is only ever all about numbers (read: the Benjamins), shaped and delivered by economists, physical sciences be damned.

The author of the article has nothing original to say. Rather, he repackages information from the first of two embedded videos (or elsewhere?), which examines time frames of several trends purportedly demonstrating global warming (a term most scientists and activists have set aside in favor of climate change, partly to distinguish climate from weather). Those trends are heat waves, extent of Arctic ice, incidence of wildfires, atmospheric carbon, sea level, and global average temperature. Presenters of weather/climate information (such as the IPCC) are accused of cherry-picking dates for the graphs they present, producing a false picture, but then similar data over other date ranges are used to depict another picture supposedly invalidating the first set of graphs. It’s a case of lying with numbers and then lying some more with other numbers.

Despite the claim that “reports are easily debunked as fraud,” I can’t agree that this example of climate change denial overcomes the overwhelming scientific consensus on the subject. It’s not so much that the data are wrong (I acknowledge they can be misleading) but that the interpretation of the effects of industrial activity since 1750 (a more reasonable comparative baseline) isn’t so obvious as simply following shortened or lengthened trend lines and demographics up or down. That amounts to zooming in or out to render the picture most amenable to a preferred narrative, which is precisely what the embedded video does even as it accuses climate scientists and activists of doing the same. The comments under the article indicate a chorus of agreement with the premise that climate change is a hoax or fraud. Guess those commenters haven’t caught up yet with rising public sentiment, especially among the young.
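
To see how much the window alone matters, here’s a minimal sketch (my own made-up series, not data from the article or either video) that fits trend lines over two different date ranges of the same noisy warming record:

```python
# Minimal sketch with hypothetical data: the fitted slope depends heavily on
# the window chosen, which is how opposite "trends" get read off the same record.
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1880, 2020)
# Hypothetical record: a slow warming trend plus large year-to-year noise
temps = 0.01 * (years - 1880) + rng.normal(0.0, 0.15, size=years.size)

def trend_per_decade(start, end):
    """Least-squares slope over [start, end], in degrees per decade."""
    mask = (years >= start) & (years <= end)
    slope, _intercept = np.polyfit(years[mask], temps[mask], 1)
    return 10 * slope

# The full record recovers the underlying trend; a short window is dominated
# by noise and can show nearly anything, including a flat or negative slope.
print(f"1880-2019: {trend_per_decade(1880, 2019):+.3f} deg/decade")
print(f"1998-2012: {trend_per_decade(1998, 2012):+.3f} deg/decade")
```

The particular short window (1998–2012) is only an illustration of the kind of cherry-picking at issue; the point is that brief, noisy spans can be made to tell whichever story one prefers.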

Having studied news and evidence of climate change as a layperson for roughly a dozen years now, I’m convinced by the conclusions drawn by experts (ignoring economists) that we’re pretty irredeemably screwed. The collapse of industrial civilization and accompanying death pulse are the predicted outcomes, but a precise date is impossible to provide because it’s a protracted process. An even worse possibility is near-term human extinction (NTHE), part of the larger sixth mass extinction. Absorbing this information has been an arduous, ongoing, soul-destroying undertaking for me, and evidence keeps being supplemented and revised, usually with ever-worsening prognoses. However, I’m not the right person to argue the evidence. Instead, see this lengthy article (with profuse links) by Dr. Guy McPherson, which is among the best resources outside of the IPCC.

In fairness, except for the dozen years I’ve spent studying the subject, I’m in no better position to offer inexpert opinion than some economist acting the fool. But regular folks are implored to inform and educate themselves on a variety of topics, if for no other reason than to vote responsibly. My apprehension of reality and human dynamics may be no better than the next person’s, but as history proceeds, attempting to make sense of the deluge of information confronting everyone is something I take seriously. Accordingly, I’m irked when contentious issues are warped and distorted, whether earnestly or malignantly. Maybe economists, like journalists, suffer from a professional deformation that confers supposed explanatory superpowers. However, in the context of our current epistemological crisis, I approach their utterances and certainty with great skepticism.

Continuing (after some delay) from part 1, Pankaj Mishra concludes chapter 4 of The Age of Anger with an overview of Iranian governments that shifted from U.S./British client state (headed by the Shah of Iran, reigned 1941–1979) to its populist replacement (headed by Ayatollah Khomeini, ruled 1979–1989), both leaders having been authoritarians. During the period discussed, Iran underwent the same modernization and infiltration by liberal, Western values and economics, which produced a backlash familiar from Mishra’s descriptions of other nations and regions that had experienced the same severed roots of place since the onset of the Enlightenment. Vacillation among two or more styles of government might be understood as a thermostatic response: too hot or too cold in one direction leads to correction in the other. It’s not a binary relationship, however, between monarchy and democracy (to use just one example). Nor are the options limited to a security state headed by an installed military leader on one hand and a leader elected by popular vote on the other. Rather, it’s a question of national identity being alternately fractured and unified (though difficult to analyze and articulate) in the wake of multiple intellectual influences.

According to Lewis and Huntington, modernity has failed to take root in intransigently traditional and backward Muslim countries despite various attempts to impose it by secular leaders such as Turkey’s Atatürk, the Shah of Iran, Algeria’s Ben Bella, Egypt’s Nasser and Sadat, and Pakistan’s Ayub Khan.

Since 9/11 there have been many versions, crassly populist as well as solemnly intellectual, of the claims by Lewis and Huntington that the crisis in Muslim countries is purely self-induced, and [that] the West is resented for the magnitude of its extraordinary success as a beacon of freedom, and embodiment of the Enlightenment’s achievements … They have mutated into the apparently more sophisticated claim that the clash of civilizations occurs [primarily] within Islam, and that Western interventions are required on behalf of the ‘good Muslim’, who is rational, moderate and liberal. [p. 127]

This is history told by the putative winners. Mishra goes on:

Much of the postcolonial world … became a laboratory for Western-style social engineering, a fresh testing site for the Enlightenment ideas of secular progress. The philosophes had aimed at rationalization, or ‘uniformization’, of a range of institutions inherited from an intensely religious era. Likewise, postcolonial leaders planned to turn illiterate peasants into educated citizens, to industrialize the economy, move the rural population to cities, alchemize local communities into a singular national identity, replace the social hierarchies of the past with an egalitarian order, and promote the cults of science and technology among a pious and often superstitious population. [p. 133]

Readers may recognize this project and/or process by its more contemporary name: globalization. It’s not merely a war of competing ideas, however, because those ideas manifest in various styles of social and political organization. Moreover, the significance of migration from rural agrarian settings to primarily urban and suburban ones can scarcely be overstated. This transformation (referring to the U.S. in the course of the 20th century) is something James Howard Kunstler repeatedly characterizes rather emphatically as the greatest misallocation of resources in the history of the world. Mishra summarizes the effects of Westernization handily:

In every human case, identity turns out to be porous and inconsistent rather than fixed and discrete; and prone to get confused and lost in the play of mirrors. The cross-currents of ideas and inspirations — the Nazi reverence for Atatürk, a gay French philosopher’s denunciation of the modern West and sympathy for the Iranian Revolution, or the various ideological inspirations for Iran’s Islamic Revolution (Zionism, Existentialism, Bolshevism and revolutionary Shiism) — reveal that the picture of a planet defined by civilizations closed off from one another and defined by religion (or lack thereof) is a puerile cartoon. They break the simple axis — religious-secular, modern-medieval, spiritual-materialist — on which the contemporary world is still measured, revealing that its populations, however different their pasts, have been on converging and overlapping paths. [p. 158]

These descriptions and analyses put me in mind of a fascinating book I read some years ago and reviewed on Amazon (one of only a handful of Amazon reviews): John Reader’s Man on Earth (1988). Reader describes and indeed celebrates incredibly diverse ways of inhabiting the Earth, each specially adapted to the landscape and based on evolving local practices. Thus, the notion of “place” is paramount. Comparison occurs only by virtue of juxtaposition. Mishra does something quite different, drawing out the connective ideas that account for “converging and overlapping paths.” Perhaps inevitably, disturbances to collective and individual identities that flow from unique styles of social organization, especially those now operating at industrial scale (i.e., industrial civilization), appear to be picking up. For instance, in the U.S., even as mass shootings (a preferred form of attack but not the only one) appear to be on the rise while violent crime sits at an all-time low, perpetrators of violence are not limited to a few lone wolves, as the common trope goes. According to journalist Matt Agorist,

mass shootings — in which murdering psychopaths go on rampages in public spaces — have claimed the lives of 339 people since 2015 [up to mid-July 2019]. While this number is certainly shocking and far too high, during this same time frame, police in America have claimed the lives of 4,355 citizens.

And according to this article in Vox, this crazy disproportion (police violence to mass shootings) is predominantly an American thing, at least partly because of our high rate of fetishized civilian gun ownership. Thus, the self-described “land of the free, home of the brave” has transformed itself into a paranoid garrison state, a mindset affecting civil authority even more egregiously than it does the disenfranchised (mostly young men). Something similar occurred during the Cold War, when leaders became hypervigilant for attacks and invasions that never came. Whether a few close calls during the height of the Cold War were the result of escalating paranoia, brinkmanship, or true, maniacal, existential threats from a mustache-twirling, hand-rolling despot hellbent on the destruction of the West is a good question, probably impossible to answer convincingly. However, the result today of this mindset couldn’t be more disastrous:

It is now clear that the post-9/11 policies of pre-emptive war, massive retaliation, regime change, nation-building and reforming Islam have failed — catastrophically failed — while the dirty war against the West’s own Enlightenment [the West secretly at war with itself] — inadvertently pursued through extrajudicial murder, torture, rendition, indefinite detention and massive surveillance — has been a wild success. The uncodified and unbridled violence of the ‘war on terror’ ushered in the present era of absolute enmity in which the adversaries, scornful of all compromise, seek to annihilate each other. Malignant zealots have emerged at the very heart of the democratic West after a decade of political and economic tumult; the simple explanatory paradigm set in stone soon after the attacks of 9/11 — Islam-inspired terrorism versus modernity — lies in ruins. [pp.124–125]

I’ve written a different form of this blog post at least once before, maybe more. Here’s the basic thesis: the bizarro unreality of the world in which we now live is egregious enough to make me wonder if we haven’t veered wildly off the path at some point and now exist within reality prime. I suppose one can choose any number of historical inflections to represent the branching point. For me, it was the reelection of George W. Bush in 2004. (The 9/11 attacks and “wars” in Afghanistan and Iraq had already occurred or commenced by then, and it had already become clear that the lies — Saddam had WMDs — that sold the American public on the Iraq “war” were effective and remain so today.) Lots of other events changed the course of history, but none other felt as much to me like a gut punch precisely because, in the case of the 2004 presidential election, we chose our path. I fantasized waking up from my reality-prime nightmare but eventually had to grudgingly accept that if multiverses exist, ours had become one where we chose (collectively, and just barely) to keep in office an executive who behaved like a farce of stupidity. Well, joke’s on us. Twelve years later, we chose someone even more stupid, though with a “certain serpentine cunning,” and with arguably the worst character of any U.S. executive in living memory.

So what to do in the face of this dysfunctional state of affairs? Bret Weinstein below has ideas. (As usual, I’m quite late, embedding a video that by Internet standards is already ancient. I also admit this is equivalent to a smash cut because I don’t have a particularly good transition or justification for turning so suddenly to Weinstein.) Weinstein is an evolutionary biologist, so no surprise that the approach he recommends is born of evolutionary thinking. In fairness, a politician would logically recommend political solutions, a financier would recommend economic solutions, and other professionals would seek solutions from within their areas of expertise.

The title of the interview is “Harnessing Evolution,” meaning Weinstein suggests we use evolutionary models to better understand our own needs and distortions, and to guide or plot proper path(s) forward and get back on track. Never mind that a healthy minority of the U.S. public rejects evolution outright while an additional percentage takes a hybrid stance. While I’m impressed that Weinstein has an answer for everything (pedagogue or demagogue or both?) and has clearly thought through sociopolitical issues, I daresay he’s living in reality double-prime if he thinks science education can be a panacea for what ails us. My pessimism is showing.

A paradoxical strength/weakness of reason is its inherent disposition toward self-refutation. It’s a bold move when undertaken with genuine interest in getting things right. Typically, as evidence piles up, consensus forms that’s tantamount to proof unless some startling new counter-evidence appears. Of course, intransigent deniers exist and convincing refutations do appear periodically, but accounts of two hotly contested topics (from among many) — evolution and climate change — are well established notwithstanding counterclaims completely disproportionate in their ferocity to the evidence. For rationalists, whatever doubts remain must be addressed and accommodated even if disproof is highly unlikely.

This becomes troublesome almost immediately. So much new information is produced in the modern world that, because I am duty-bound to consider it, my head spins. I simply can’t deal with it all. Inevitably, when I think I’ve put a topic to rest and conclude I don’t have to think too much more about it, some argument-du-jour hits the shit pile and I am forced to stop and reconsider. It’s less disorienting when the facts are clear, but when they’re interpretive, I find my head all too easily spun by the latest, greatest claims of some charming, articulate speaker able to cobble together evidence lying outside of my expertise.

Take for instance Steven Pinker. He speaks in an authoritative style and has academic credentials that dispose me to trust his work. His new book is Enlightenment Now: The Case for Reason, Science, Humanism, and Progress (2018). Still, Pinker is an optimist, whereas I’m a doomer. Even though I subscribe to Enlightenment values (for better or worse, my mind is bent that way), I can’t escape a mountain of evidence that we’ve made such a mess of things that reason, science, humanism, and progress are hardly panaceas capable of saving us from ourselves. Yet Pinker argues that we’ve never had it so good and the future looks even brighter. I won’t take apart Pinker’s arguments; it’s already been done by Jeremy Lent, who concludes that Pinker’s ideas are fatally flawed. Lent has the expertise, data, and graphs to demonstrate it. Calling Pinker a charlatan would be unfair, but his appreciation of the state of the world stands in high contrast with mine. Who ya gonna believe?

Books and articles like Pinker’s appear all the time, and in their aftermath, so, too, do takedowns. That’s the marketplace of ideas battling it out, which is ideally meant to sharpen thinking, but with the current epistemological crisis under way (I’ve blogged about it for years), the actual result is dividing people into factions, destabilizing established institutions, and causing no small amount of bewilderment in the public as to what and whom to believe. Some participants in the exchange of ideas take a sober, evidential approach; others lower themselves to snark and revel in character assassination without bothering to make reasoned arguments. The latter are often called hit pieces (a special province of the legacy media, it seems), since hefty swipes and straw-man arguments tend to be commonplace. I’m a sucker for the former style but have to admit that the latter can also hit its mark. However, both tire me to the point of wanting to bury my head.

(more…)

One of the very best lessons I took from higher education was recognizing and avoiding the intentional fallacy — in my own thinking no less than in that of others. Although the term arguably has more to do with critical theory dealing specifically with texts, I learned about it in relation to abstract fine arts, namely, painting and music. For example, the enigmatic expression of the Mona Lisa by Leonardo Da Vinci continues to spark inquiry and debate. What exactly does that smile mean? Even when words or programs are included in musical works, it’s seductively easy to conclude that the composer intends this or the work itself means that. Any given work purportedly allows audiences to peer into the mind of its creator(s) to interrogate intent. Conclusions thus drawn, however, are notoriously unreliable though commonplace.

It’s natural, I suppose, to read intent into artistic expression, especially when purpose feels so obvious or inevitable. Similar excavations of meaning and purpose are undertaken within other domains of activity, resulting in no end of interpretation as to surface and deep strategies. Original intent (also originalism) is a whole field of endeavor with respect to interpretation of the U.S. Constitution and imagining the framers’ intent. Geopolitics is another domain where hindsight analysis results in some wildly creative but ultimately conjectural interpretations of events. Even where authorial (and political) intent is explicitly recorded, such as with private diaries or journals, the possibility of deceptive intent by authors keeps everyone wondering. Indeed, although “fake news” is modern coin, a long history of deceptive publishing practice well beyond the adoption of a nom de plume attests to hidden or unknowable intent, making “true intent” a meta property.

The multi-ring circus that the modern information environment has become, especially in the wake of electronic media (e.g., YouTube channels) produced by anyone with a camera and an Internet connection, is fertile ground for those easily ensnared by the intentional fallacy. Several categories of intent projected onto content creators come up repeatedly: profit motive, control of the narrative (no small advantage if one believes this blog post), setting the record straight, correcting error, grandstanding, and trolling for negative attention. These categories are not mutually exclusive. Long ago, I pointed to the phenomenon of arguing on-line and how it typically accomplishes very little, especially as comment threads lengthen and civility breaks down. These days, comments are an Internet legacy and/or anachronism that many content creators persist in offering to give the illusion of a wider discussion but in fact roundly ignore. Most blogs and channels are actually closed conversations. Maybe a Q&A follows the main presentation when held before an audience, but video channels are more often one-way broadcasts addressing an audience but not really listening. Public square discussion is pretty rare.

Some celebrate this new era of broadcasting, noting with relish how the mainstream media is losing its former stranglehold on attention. Such enthusiasm may be transparently self-serving but nonetheless rings true. A while back, I pointed to New Media Rockstars, which traffics in nerd culture entertainment media, but the term could easily be expanded to include satirical news, comedy, and conversational webcasts (also podcasts). Although some folks are rather surprised to learn that an appetite for substantive discussion and analysis exists among the public, I surmise that the shifting media landscape and disintegrated cultural narrative have bewildered a large segment of the public. The young in particular are struggling to make sense of the world, figure out what to be in life and how to function, and work out an applied philosophy that eschews more purely academic philosophy.

By way of example of new media, let me point to a trio of YouTube channels I only recently discovered. Some More News parodies traditional news broadcasts by sardonically (not quite the same as satirically) calling bullshit on how news is presented. Frequent musical cues between segments make me laugh. Unlike the mainstream media, which are difficult not to regard as propaganda arms of the government, Some More News is unapologetically liberal and indulges in black humor, which doesn’t make me laugh. Its raw anger and exasperation are actually a little terrifying. The second YouTube channel is Three Arrows, a sober, thorough debunking of news and argumentation found elsewhere in the public sphere. The speaker, who doesn’t appear onscreen, springs into action especially when accusations of current-day Nazism come up. (The current level of debate has devolved to recklessly calling nearly everyone a Nazi at some stage. Zero points scored.) Historical research often puts things into proper context, such as the magnitude of the actual Holocaust compared to some garden-variety racist running his or her mouth comparatively harmlessly. The third YouTube channel is ContraPoints, which is rather fanciful and profane but remarkably erudite considering the overall tone. Labels and categories are explained for those who may not have working definitions at the ready for every phrase or ideology. Accordingly, there is plenty of jargon. The creator also appears as a variety of different characters to embody various archetypes and play devil’s advocate.

While these channels may provide abundant information, correcting error and contextualizing better than most traditional media, it would be difficult to conclude they’re really moving the conversation forward. Indeed, one might wonder why bother preparing these videos considering how time-consuming it has to be to do research, write scripts, assemble pictorial elements, etc. I won’t succumb to the intentional fallacy and suggest I know why they bother holding these nondebates. Further, unless straight-up comedy, I wouldn’t say they’re entertaining exactly, either. Highly informative, perhaps, if one pays close attention to the frenetic online pace and/or mines for content (e.g., studying transcripts or following links). Interestingly, within a fairly short period of time, these channels are establishing their own rhetoric, sometimes useful, other times too loose to make strong impressions. It’s not unlike the development of new stylistic gestures in music or painting. What, if anything, worthwhile emerges from the scrum will be interesting to see.

If the previous blog in this series was about how some ideas and beliefs become lodged or stuck in place (fixity bias), this one is about how other ideas are notoriously mutable (flexibility bias), especially the latest, loudest thing to turn one’s head and divert attention. What makes any particular idea (or is it the person?) prone to one bias or another (see this list) is mysterious to me, but my suspicion is that a character disposition toward openness and adherence to authoritative evidence figure prominently in the case of shifting opinion. In fact, this is one of the primary problems with reason: if evidence can be deployed in favor of an idea, those who consider themselves “reasonable” and thus rely on accumulation of evidence and argumentation to sharpen their thinking are vulnerable to the latest “finding” or study demonstrating sumpinorutha. It’s the intellectual’s version of “New! Improved!”

Sam Harris exploits rationalism to argue against the existence of free will, saying that if sufficient evidence can be brought to bear, a disciplined thinker is compelled to subscribe to the conclusions of reasoned argument. Choice and personal agency (free will) are removed. I find that an odd way to frame the issue. Limitless examples of lack of choice are nonequivalent to the destruction of free will. For example, one can’t decide not to believe in gravity and fly up into the air more than a few inches. One can’t decide that time is an illusion (as theoretical physicists now instruct) and decide not to age. One can’t decide that pooping is too disgusting and just hold it all in (as some children attempt). Counter-evidence doesn’t even need to be argued because almost no one pretends to believe such nonsense. (Twisting one’s mind around to believe in the nonexistence of time, free will, or the self seems to be the special province of hyper-analytical thinkers.) Yet other types of belief/denial — many of them conspiracy theories — are indeed choices: religion, flat Earth, evolution, the Holocaust, the moon landings, 9/11 truth, who really killed JFK, etc. Lots of evidence has been mustered on different sides (multiple facets, actually) of each of these issues, and while rationalists may be compelled by a preponderance of evidence in favor of one view, others are free to fly in the face of that evidence for reasons of their own or adopt by default the dominant narrative and not worry or bother so much.

The public struggles to get a grip on truthful information, as reported in a Pew Research Center study called “Distinguishing Between Factual and Opinion Statements in the News.” Here’s the snapshot:

The main portion of the study, which measured the public’s ability to distinguish between five factual statements and five opinion statements, found that a majority of Americans correctly identified at least three of the five statements in each set. But this result is only a little better than random guesses. Far fewer Americans got all five correct, and roughly a quarter got most or all wrong.
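
A quick back-of-envelope check (my own arithmetic, not a figure from the Pew report, and assuming each statement is a pure 50/50 guess between factual and opinion) shows why “at least three of five” is such a low bar. Blind guessing clears it half the time:

$$P(\text{at least 3 of 5 correct}) = \frac{\binom{5}{3} + \binom{5}{4} + \binom{5}{5}}{2^{5}} = \frac{10 + 5 + 1}{32} = \frac{1}{2}$$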

Indiscriminate adoption by many Americans of a faulty viewpoint, or more pointedly, the propaganda and “fake news” on offer throughout the information environment, carries the implication that disciplined thinkers are less confused about truth or facts, taking instead a rational approach as the basis for belief. However, I suggest that reason suffers its own frailties not easily recognized or acknowledged. In short, we’re all confused, though perhaps not hopelessly so. For several years now, I’ve sensed the outline of a much larger epistemological crisis where quintessential Enlightenment values have come under open attack. The irony is that the wicked stepchild of science and reason — runaway technology — is at least partially responsible for this epochal conflict. It’s too big an idea to grok fully or describe in a paragraph or two, so I’ll simply point to it and move on.

My own vulnerability to flexibility bias manifests specifically in response to appeals to authority. Although well educated, a lifelong autodidact, and an independent thinker, I’m careful not to succumb to the hubris of believing I’ve got it all figgered. Indeed, it’s often said that as one gains expertise and experience in the world, the certainty of youth yields to caution precisely because the mountain of knowledge and understanding one lacks looms larger even as one accumulates wisdom. Bodies of thought become multifaceted and all arguments must be entertained. When an expert, researcher, or academic proposes something outside my wheelhouse, I’m a sitting duck: I latch onto the latest, greatest utterance as the best truth yet available. I don’t fall for it nearly so readily with journalists, but I do recognize that some put in the effort and gain specialized knowledge and context well outside the bounds of normal life, such as war reporters. Various perverse incentives deeply embedded in the institutional model of journalism, especially those related to funding, make it nearly impossible to maintain one’s integrity without becoming a pariah, so only a handful have kept my attention. John Pilger, Chris Hedges, and Matt Taibbi figure prominently.

By way of example, one of the topics that has been of supreme interest to me, though its historical remove renders it rather toothless now, is the cataclysm(s) that occurred at the conclusion of the last ice age roughly 12,000 years ago. At least three hypotheses (of which I’m aware) have been proposed to explain why glacial ice disappeared suddenly over the course of a few weeks, unleashing the Biblical Flood: Earth crust displacement, asteroidal impact(s), and coronal mass ejection(s). As with most hypotheses, the evidence is both physical and conjectural, but a sizable body of evidence and argumentation for each is available. As I became familiar with each, my head turned and I became a believer, sorta. Rather than “last one is the rotten egg,” however, the most recent one typically displaces the previous one. No doubt another hypothesis will appear to turn my head and disorient me further. With some topics, especially politics, new information piling on top of old is truly dizzying. And as with many topics I’ve written about, I simply lack the expertise to referee competing claims, so whatever beliefs I eventually adopt are permanently provisional.

Finally, my vulnerability to authoritative appeal also reacts to the calm, unflappable tones and complexity of construction of speakers such as Sam Harris, Steven Pinker, and Charles Murray. Their manner of speaking is sometimes described pejoratively as “academese,” though only Pinker has a teaching position. Murray in particular relies heavily on psychometrics, which may not be outright lying with statistics but allows him to rationalize (literally) extraordinarily taboo subjects. In contrast, it’s easy to disregard pundits and press agents foaming and fulminating over their pet narratives. Yet I also recognize that with academese, I’m being soothed more by style than by substance, a triumph of form over function. In truth, this communication style is an appeal to emotion masquerading as an appeal to authority. I still prefer it, just as I prefer a steady, explanatory style of journalism over the snarky, reinterpretive style of disquisition practiced by many popular media figures. What communicates most effectively to me and (ironically) pushes my emotional buttons also weakens my ability to discriminate and think properly.

Yet still more to come in part 5.

YouTube ratings magnet Jordan Peterson had a sit-down with Susan Blackmore to discuss/debate the question, “Do We Need God to Make Sense of Life?” The conversation is lightly moderated by Justin Brierley and is part of a weekly radio broadcast called Unbelievable? (a/k/a The Big Conversation, “the flagship apologetics and theology discussion show on Premier Christian Radio in the UK”). One might wonder why evangelicals are so eager to pit believers and atheists against each other. I suppose earnest questioning of one’s faith is preferable to proselytizing, though both undoubtedly occur. The full episode (47 min.) is embedded below: (more…)

Language acquisition in early childhood is aided by heavy doses of repetition and the memorable structure of nursery rhymes, songs, and stories that are repeated ad nauseam to eager children. Please, again! Again, again … Early in life, everything is novel, so repetition and fixity are positive attributes rather than causes for boredom. The music of one’s adolescence is also the subject of endless repetition, typically through recordings (radio and Internet play, mp3s played over headphones or earbuds, dances and dance clubs, etc.). Indeed, most of us have mental archives of songs heard over and over to the point that the standard version becomes canonical: that’s just the way the song goes. When someone covers a Beatles song, it’s recognizably the same song, yet it’s not the same and may even sound wrong somehow. (Is there any acceptable version of Love Shack besides that of the B-52’s?) Variations of familiar folk tales and folk songs, or different phrasing in The Lord’s Prayer, imprinted in memory through sheer repetition, also possess discomfiting differences, sometimes being offensive enough to cause real conflict. (Not your Abrahamic deity, mine!)

Performing musicians traverse warhorses many times in rehearsal and public performance so that, after an undetermined point, how one performs a piece just becomes how it goes, admitting few alternatives. Casual joke-tellers may improvise over an outline, but as I understand it, the pros hone and craft material over time until very little is left to chance. Anyone who has listened to old comedy recordings of Bill Cosby, Steve Martin, Richard Pryor, and others has probably learned the jokes (and timing and intonation) by heart — again through repetition. It’s strangely comforting to be able to go back to the very same performance again and again. Personally, I have a rather large catalogue of classical music recordings in my head. I continue to seek out new renditions, but often the first version I learned becomes the default version, the way something goes. Dislodging that version from its definitive status is nearly impossible, especially when it’s the very first recording of a work (like a Beatles song). This is also why live performance often fails in comparison with the studio recording.

So it goes with a wide variety of phenomena: what is first established as how something goes easily becomes canonical, dogmatic, and unquestioned. For instance, the origin of the universe in the big bang is one story of creation to which many still hold, while various religious creation myths hold sway with others. News that the big bang has been dislodged from its privileged position goes over just about as well as dismissing someone’s religion. Talking someone out of a fixed belief is hardly worth the effort because some portion of one’s identity is anchored to such beliefs. Thus, to question a cherished belief is to impeach a person’s very self.

Political correctness is the doctrine that certain ideas and positions have been worked out effectively and need (or allow) no further consideration. Just subscribe and get with the program. Don’t bother doing the mental work or examining the issue oneself; things have already been decided. In science, steady evidentiary work to break down a fixed understanding is often thankless, or thanks arrives posthumously. This is the main takeaway of Thomas Kuhn’s The Structure of Scientific Revolutions: paradigms are changed as much through attrition as through rational inquiry and accumulation of evidence.

One of the unanticipated effects of the Information and Communications Age is the tsunami of information to which people have ready access. Shaping that information into a cultural narrative (not unlike a creation myth) is either passive (one accepts the frequently shifting dominant paradigm without compunction) or active (one investigates for oneself as an attribute of the examined life, which for wise folks never really arrives at a destination, since it’s the journey that’s the point). What’s a principled rationalist to do in the face of a surfeit of alternatives available for or even demanding consideration? Indeed, with so many self-appointed authorities vying for control over cultural narratives like the editing wars on Wikipedia, how can one avoid the dizzying disorientation of gaslighting and mendacity so characteristic of the modern information environment?

Still more to come in part 4.