Posts Tagged ‘Epistemology’

YouTube ratings magnet Jordan Peterson had a sit-down with Susan Blackmore to discuss/debate the question, “Do We Need God to Make Sense of Life?” The conversation is lightly moderated by Justin Brierley and is part of a weekly radio broadcast called Unbelievable? (a/k/a The Big Conversation, “the flagship apologetics and theology discussion show on Premier Christian Radio in the UK”). One might wonder why evangelicals are so eager to pit believers and atheists against each other. I suppose earnest questioning of one’s faith is preferable to proselytizing, though both undoubtedly occur. The full episode (47 min.) is embedded below: (more…)

Language acquisition in early childhood is aided by heavy doses of repetition and the memorable structure of nursery rhymes, songs, and stories that are repeated ad nauseam to eager children. Please, again! Again, again … Early in life, everything is novel, so repetition and fixity are positive attributes rather than causes for boredom. The music of one’s adolescence is also the subject of endless repetition, typically through recordings (radio and Internet play, mp3s played over headphones or earbuds, dances and dance clubs, etc.). Indeed, most of us have mental archives of songs heard over and over to the point that the standard version becomes canonical: that’s just the way the song goes. When someone covers a Beatles song, it’s recognizably the same song, yet it’s not the same and may even sound wrong somehow. (Is there any acceptable version of Love Shack besides that of the B-52’s?) Variations of familiar folk tales and folk songs, or different phrasing in The Lord’s Prayer, imprinted in memory through sheer repetition, also possess discomfiting differences, sometimes being offensive enough to cause real conflict. (Not your Abrahamic deity, mine!)

Performing musicians traverse warhorses many times in rehearsal and public performance so that, after an undetermined point, how one performs a piece just becomes how it goes, admitting few alternatives. Casual joke-tellers may improvise over an outline, but as I understand it, the pros hone and craft material over time until very little is left to chance. Anyone who has listened to old comedy recordings of Bill Cosby, Steve Martin, Richard Pryor, and others has probably learned the jokes (and timing and intonation) by heart — again through repetition. It’s strangely comforting to be able to go back to the very same performance again and again. Personally, I have a rather large catalogue of classical music recordings in my head. I continue to seek out new renditions, but often the first version I learned becomes the default version, the way something goes. Dislodging that version from its definitive status is nearly impossible, especially when it’s the very first recording of a work (like a Beatles song). This is also why live performance often pales in comparison with the studio recording.

So it goes with a wide variety of phenomena: what is first established as how something goes easily becomes canonical, dogmatic, and unquestioned. For instance, the origin of the universe in the big bang is one story of creation to which many still hold, while various religious creation myths hold sway with others. News that the big bang has been dislodged from its privileged position goes over just about as well as dismissing someone’s religion. Talking someone out of a fixed belief is hardly worth the effort because some portion of one’s identity is anchored to such beliefs. Thus, to question a cherished belief is to impeach a person’s very self.

Political correctness is the doctrine that certain ideas and positions have been worked out effectively and need (or allow) no further consideration. Just subscribe and get with the program. Don’t bother doing the mental work or examining the issue oneself; things have already been decided. In science, steady evidentiary work to break down a fixed understanding is often thankless, or thanks arrives posthumously. This is the main takeaway of Thomas Kuhn’s The Structure of Scientific Revolutions: paradigms are changed as much through attrition as through rational inquiry and accumulation of evidence.

One of the unanticipated effects of the Information and Communications Age is the tsunami of information to which people have ready access. Shaping that information into a cultural narrative (not unlike a creation myth) is either passive (one accepts the frequently shifting dominant paradigm without compunction) or active (one investigates for oneself as an attribute of the examined life, which with wise folks never really arrives at a destination, since it’s the journey that’s the point). What’s a principled rationalist to do in the face of a surfeit of alternatives available for or even demanding consideration? Indeed, with so many self-appointed authorities vying for control over cultural narratives like the editing wars on Wikipedia, how can one avoid the dizzying disorientation of gaslighting and mendacity so characteristic of the modern information environment?

Still more to come in part 4.

Continuing from part 1, which is altogether too much screed and frustration with Sam Harris, I now point to several analyses that support my contentions. First is an article in The Nation about the return of so-called scientific racism, which speaks directly about Charles Murray, Sam Harris, and Andrew Sullivan, all of whom are embroiled in the issue. Second is an article in The Baffler about constructing arguments ex post facto to conform to conclusions motivated in advance of evidence. Most of us are familiar with the constructed explanation, where in the aftermath of an event, pundits, press agents, and political insiders propose various explanatory narratives to gain control over what will eventually become the conventional understanding. The Warren Commission’s report on the assassination of JFK is one such example, and I daresay few now believe the report and the consensus that it presents weren’t politically motivated and highly flawed. Both linked articles above are written by Edward Burmila, who blogs at Gin and Tacos (see blogroll). Together, they paint a dismal picture of how reason and rhetoric can be corrupted despite the sheen of scientific respectability.

Third is an even more damaging article (actually a review of the new anthology Trump and the Media) in the Los Angeles Review of Books by Nicholas Carr asking the pointed question “Can Journalism Be Saved?” Admittedly, journalism is not equivalent to reason or rationalism, but it is among several professions that employ claims of objectivity, accuracy, and authority. Thus, journalism demands both attention and respect far in excess of that accorded the typical blogger (such as me) or watering-hole denizen perched atop a barstool. Consider this pullquote:

… the flaws in computational journalism can be remedied through a more open and honest accounting of its assumptions and limitations. C. W. Anderson, of the University of Leeds, takes a darker view. To much of the public, he argues, the pursuit of “data-driven objectivity” will always be suspect, not because of its methodological limits but because of its egghead aesthetics. Numbers and charts, he notes, have been elements of journalism for a long time, and they have always been “pitched to a more policy-focused audience.” With its ties to social science, computational journalism inevitably carries an air of ivory-tower elitism, making it anathema to those of a populist bent.

Computational journalism is contrasted with other varieties of journalism based on, say, personality, emotionalism, advocacy, or simply a mad rush to print (or pixels) to scoop the competition. This hyperrational approach has already revealed its failings, as Carr reports in his review.

What I’m driving at is that, despite frequent appeals to reason, authority, and accuracy (especially the quantitative sort), certain categories of argumentation fail to register on the average consumer of news and information. It’s not a question of whether arguments are right or wrong, precisely; it’s about what appeals most to those paying even a modest bit of attention. And the primary appeal for most (I judge) isn’t reason. Indeed, reason is swept aside handily when a better, um, reason for believing something appears. If one has done the difficult work of acquiring critical thinking and reasoning skills, it can be quite the wake-up call when others fail to behave according to reason, such as acting against enlightened self-interest. The last presidential election was a case in point.

Circling back to something from an earlier blog post, much of human cognition is based on mere sufficiency: whatever is good enough in the moment gets nominated, then promoted to belief and/or action. Fight, flight, or freeze is one example. Considered evaluation and reason are not even factors. Snap judgments, gut feelings, emotional resonances, vibes, heuristics, and Gestalts dominate momentary decision-making, and in the absence of convincing countervailing information (if indeed one is even vulnerable to reason, which would be an unreasonable assumption), action is reinforced and suffices as belief.

Yet more in part 3 to come.

From Wikipedia:

Trial by combat (also wager of battle, trial by battle or judicial duel) was a method of Germanic law to settle accusations in the absence of witnesses or a confession in which two parties in dispute fought in single combat; the winner of the fight was proclaimed to be right. In essence, it was a judicially sanctioned duel. It remained in use throughout the European Middle Ages, gradually disappearing in the course of the 16th century.

Unlike trial by ordeal in general, which is known to many cultures worldwide, trial by combat is known primarily from the customs of the Germanic peoples. It was in use among the ancient Burgundians, Ripuarian Franks, Alamans, Lombards, and Swedes. It was unknown in Anglo-Saxon law, Roman law and Irish Brehon Law and it does not figure in the traditions of Middle Eastern antiquity such as the code of Hammurabi or the Torah.

Trial by combat has profound echoes in 21st-century geopolitics and jurisprudence. Familiar phrases such as right of conquest, manifest destiny, to the winner go the spoils, might makes right, and history written by the victors attest to the enduring legacy of hindsight justification by force of arms. More broadly, within the American system, right of access to courts afforded to all citizens also admits nuisance suits and more than a few mismatched battles where deep-pocketed corporations sue individuals and small organizations, often nonprofits, into bankruptcy and submission. For instance, I recently learned of Strategic Lawsuits Against Public Participation (SLAPPs) “used to silence and harass critics by forcing them to spend money to defend these baseless suits.” They employ brute economic power in place of force of arms.

Trial by combat fell out of practice with the onset of the Enlightenment, but the broader complex of ideas survived. Interest in medieval Europe as storytelling fodder in cinema and fantasy literature (notably, the shocking trial by combat depicted in the extremely popular HBO drama Game of Thrones, where the accused and accuser both designate proxies rather than doing battle themselves) lends legitimacy to settling disputes via violence. Even the original Karate Kid (1984) has a new YouTube Red series set 30 years later. The bad-boy acolyte replaces his scorched-earth sensei and seeks revenge on the titular character for being bested decades before; the latter is yanked back from quiet obscurity (and the actor who portrays him from career limbo) to fight again and re-prove his skills, which is to say, his righteousness. The set-up is surprisingly delicious to contemplate and has considerable nostalgic appeal. More importantly, it embodies the notion (no doubt scripted according to cliché) that only the pure of heart (or their proxies, students in this case) can claim ultimate victory because, well, it’s god’s will or some such, and thus good guys must always win. What that really means is that whoever wins is by definition virtuous. If only reality were so reliably simple.

The certainty of various religious dogmas and codes of conduct characteristic of the medieval period (e.g., chivalry) is especially seductive in modern times, considering how the public is beset by an extraordinary degree of existential and epistemological uncertainty. The naturalistic fallacy is also invoked, where the law of the jungle (only the fittest and/or strongest get to eat or indeed survive) substitutes for more civilized (i.e., enlightened and equanimous) thinking. Further, despite protestations, this complex of ideas legitimizes bullying, whether (1) in the schoolyard, with the principal bully flanked by underlings picking on vulnerable weaklings who haven’t formed alliances for self-protection, (2) in the workplace, with its power players and Machiavellian manipulators, or (3) on the global stage, where a military power such as the U.S. dictates terms to and/or wars with smaller, weaker nations that lack the GDP, population, and will to project power globally. I daresay most Americans take comfort in having the greatest military and arsenal ever mustered on their side and accordingly being on the right side (the victorious one) of history, thus a beacon of hope to all who would conflate victory with virtue. Those who suffer at our hands must understand things quite differently. (Isn’t it more accurate that when bad guys win, rebellions and insurgencies are sparked?)

One remarkable exception deserves notice. The U.S. presidency is among the most heavily scrutinized and contentious positions (always under attack), and its occupant happens to be Commander-in-Chief of the self-same greatest goddamn fighting force known to man. It’s no secret that the occupant of that office (45) is also widely recognized as the Bully-in-Chief. Despite having at his disposal considerable resources — military, executive staff, and otherwise — 45 has eschewed forming the political coalitions one might expect and essentially gone it alone, using the office (and his Twitter account) as a one-man bully pulpit. Hard to say what he’s trying to accomplish, really. Detractors have banded together (incompetently) to oppose him, but 45 has demonstrated unexpected tenacity, handily dominating rhetorical trials by combat through sheer bluster and hubris. On balance, he scores some pretty good hits, too. (The proposed fist fight between 45 and Joe Biden turned out to be a tease, but how entertaining would that bout have been, without actually settling anything!) This pattern has left many quite dumbfounded, and I admit to being astounded as well, except to observe that rank stupidity beats everything in this bizarre political rock-paper-scissors contest. How quintessentially American: nuthin’ beats stoopid.

rant on/

Authors I read and podcasters to whom I listen, mostly minor celebrities of the nonentertainment kind, often push their points of view using lofty appeals to reason and authority, as though they possess unique access to truth that is lacking among those whose critical thinking may be more limited. This seems to be the special province of pundits and thought leaders shilling their own books, blogs, newspaper columns, and media presence (don’t forget to comment and subscribe! ugh …). The worst offender on the scene may well be Sam Harris, who has run afoul of so many others recently that a critical mass is now building against him. With calm, even tones, he musters his evidence (some of it hotly disputed) and builds his arguments with the serene confidence of a Kung Fu master, yet is astonished and amazed when others don’t defer to his rhetoric. He has behaved of late like he possesses heroic superpowers only to discover that others wield kryptonite or magic sufficient to defeat him. It’s been quite a show of force and folly. I surmise the indignity of suffering fools, at least from Harris’s perspective, smarts quite a bit, and his mewling does him no credit. So far, the person refusing most intransigently to take the obvious lesson from this teachable moment is Harris himself.

Well, I’m here to say that reason is no superpower. Indeed, it can be thwarted rather handily by garden-variety ignorance, stupidity, emotion, superstition, and fantasy. All of those are found in abundance in the public sphere, whereas reason is in rather short supply. Nor is reason a panacea, if only one could get everyone on board. None of this is even remotely surprising to me, but Harris appears to be taken aback that his interlocutors, many of whom are sophisticated thinkers, are not easily convinced. In the ivory tower or echo chamber Harris has constructed for himself, those who lack scientific rigor and adherence to evidence (or even better, facts and data) are infrequently admitted to the debate. He would presumably have a level playing field, right? So what’s going on that eludes Sam Harris?

As I’ve been saying for some time, we’re in the midst of an epistemological crisis. Defenders of Enlightenment values (logic, rationalism, detachment, equity, secularism), most of whom are academics, are a shrinking minority in the new democratic age. Moreover, the Internet has put regular, perhaps unschooled folks (Joe the Plumber, Ken Bone, any old Kardashian, and celebrities used to being the undeserved focus of attention) in direct dialogue with everyone else through deplorable comments sections. Journalists get their say, too, and amplify the unwashed masses when resorting to man-on-the-street interviews. At Gin and Tacos (see blogroll), this last is called the Cletus Safari. The marketplace of ideas has accordingly been so corrupted by the likes of, well, ME! that self-appointed public intellectuals like Harris can’t contend effectively with the onslaught of pure, unadulterated democracy where everyone participates. (Authorities claim to want broad civic participation, as when they exhort everyone to vote, but the reverse is more nearly true.) Harris already foundered on the shoals of competing truth claims when he hosted on his webcast a fellow academic, Jordan Peterson, yet failed to make any apparent adjustments in the aftermath. Reason remains for Harris the one true faith.

Furthermore, Jonathan Haidt argues (as I understand him, correct me if I’m mistaken) that motivated reasoning leads to cherry-picking facts and evidence. In practice, that means that selection bias results in opinions being argued as facts. Under such conditions, even well-meaning folks are prone to peddling false certainty. This may well be the case with Charles Murray, who is at the center of the Harris debacle. Murray’s arguments are fundamentally about psychometrics, a data-driven subset of sociology and psychology, which under ideal circumstances has all the dispassion of a stone. But those metrics are applied at the intersection of two taboos, race and intelligence (who knew? everyone but Sam Harris and Charles Murray …), then transmuted into public policy recommendations. If Harris were more circumspect, he might recognize that there is simply no way to divorce emotion from discussions of race and intelligence.

rant off/

More to say on this subject in part 2 to follow.

Long again this time and a bit contentious. Sorry for trying your patience.

Having watched a few hundred Joe Rogan webcasts by now (previous blog on this topic here), I am pretty well acquainted with guests and ideas that cycle through periodically. This is not a criticism, as I’m aware I recycle my own ideas here, which is more nearly thematic than simply repetitive. Among all the MMA folks and comedians, Rogan features people — mostly academics — who might be called thought leaders. A group of them has even been dubbed the “intellectual dark web.” I dunno who coined the phrase or established its membership, but the names might include, in no particular order, Jordan Peterson, Bret Weinstein, Eric Weinstein, Douglas Murray, Sam Harris, Jonathan Haidt, Gad Saad, Camille Paglia, Dave Rubin, Christina Hoff Sommers, and Lawrence Krauss. I doubt any of them would have been considered cool kids in high school, and it’s unclear whether they’re any cooler now that they’ve all achieved some level of Internet fame on top of other public exposure. Only a couple seem especially concerned with being thought cool now (names withheld), though the chase for clicks, views, likes, and Patreon support is fairly upfront. That they can usually sit down and have meaningful conversations without rancor (admirably facilitated by Joe Rogan up until one of his own oxen is gored, less admirably by Dave Rubin) about free speech, Postmodernism, social justice warriors, politics, or the latest meme means that the cliquishness of high school has relaxed considerably.

I’m pleased (I guess) that today’s public intellectuals have found an online medium to develop. Lots of imitators are out there putting up their own YouTube channels to proselytize their own opinions. However, I still prefer to get deeper understanding from books (and to a lesser degree, blogs and articles online), which are far better at delivering thoughtful analysis. The conversational style of the webcast is relentlessly up-to-date and entertaining enough but relies too heavily on charisma. And besides, so many of these folks are such fast talkers, often talking over each other to win imaginary debate points or just dominate the conversational space, that they frustrate and bewilder more than they communicate or convince.

(more…)

I watched a documentary on Netflix called Jim & Andy (2017) that provides a glimpse behind the scenes of the making of Man on the Moon (1999) where Jim Carrey portrays Andy Kaufman. It’s a familiar story of art imitating life (or is it life imitating art?) as Carrey goes method and essentially channels Kaufman and Kaufman’s alter ego Tony Clifton. A whole gaggle of actors played earlier incarnations of themselves in Man on the Moon and appeared as themselves (without artifice) in Jim & Andy, adding another weird dimension to the goings on. Actors losing themselves in roles and undermining their sense of self is hardly novel. Regular people lose themselves in their jobs, hobbies, media hype, glare of celebrity, etc. all the time. From an only slightly broader perspective, we’re all merely actors playing roles, shifting subtly or dramatically based on context. Shakespeare observed it centuries ago. However, the documentary points to a deeper sense of unreality precisely because Kaufman’s principal shtick was to push discomfiting jokes/performances beyond the breaking point, never dropping the act to let his audience in on the joke or provide closure. It’s a manifestation of what I call the Disorientation Protocol.

(more…)

As time wears on and I add years to this mostly ignored blog, I keep running across ideas expressed herein, sometimes long ago, recapitulated in remarks and comments elsewhere. Absolutely disparate people can develop the same ideas independently, so I’m not claiming that my ideas are stolen. Maybe I’m merely in touch with the Zeitgeist and express it here only then to see or hear it again someplace else. I can’t judge objectively.

The latest coincidence is the growing dread with which I wake up every day, wondering what fresh new hell awaits with the morning news. The times in which we live are both an extension of our received culture and yet unprecedented in their novelty. Not only are there many more people in existence than 100 years ago, with radical opinions and events accordingly occurring with extraordinary frequency, but the speed of transmission is also faster than in the past. Indeed, the rush to publication has many news organs reporting before any solid information is available. The first instance of blanket crisis coverage I remember was the Challenger Disaster in 1986. It’s unknown to me how quickly news of various U.S. political assassinations in the 1960s spread, but I suspect reporting took more time than today and imparted to those events gravity and composure. Today is more like a renewed Wild West where anything goes, which has been the preferred characterization of the Internet since its creation. We’ll see if the recent vote to remove Net Neutrality has the effect of restraining things. I suspect that particular move is more about a money grab (selling premium open access vs. basic limited access) than thought control, but I can only guess as to true motivations.

I happened to be traveling when the news broke of a mass shooting in Las Vegas. Happily, what news I got was delayed until actual news-gathering had already sorted basic fact from confabulation. Paradoxically, after the first wave of “what the hell just happened?” there formed a second wave of “here’s what happened,” and later a third wave of “what the hell really happened?” appeared as some rather creative interpretations were offered up for consideration. That third wave is by now quite familiar to everyone as the conspiracy wave, and surfing it feels inevitable because the second wave is often so starkly unbelievable. Various websites and shows such as snopes.com, metabunk.org, MythBusters, and Penn & Teller: Bullshit! (probably others, too) presume to settle debates. While I’m inclined to believe scientific and documentary evidence, mere argument often fails to convince me, which is troubling, to say the least.

Fending off all the mis- and disinformation, or separating signal from noise, is a full-time job if one is willing to undertake it. That used to be the mandate of the journalistic news media, at least in principle. Lots of failures on that account stack up throughout history. However, since we’re in the midst of a cultural phase dominated by competing claims to authority and the public’s retreat into ideation, the substitute worlds of extended and virtual reality become attractive alternatives to the fresh new hell we now face every morning. Tune in and check in might be what we think we’re doing, but more accurately, we tune out and check out of responsible engagement with the real world. That’s the domain of incessantly chipper morning TV shows. Moreover, we like to believe in the mythical stories we tell ourselves about ourselves, such as, for example, how privacy doesn’t matter, or that the U.S. is a free, democratic, liberal beacon of hope, or that economic value inheres in made-up currencies. It’s a battle for your attention and subscription in the marketplace of ideas. Caveat emptor.

What is more tantalizing and enticing than a secret? OK, probably sex appeal, but never mind that for now. Secrets confer status on the keeper and bring those on whom the secret is bestowed into an intimate (nonsexual, for you dirty thinkers) relationship with the secret sharer. I remember the sense of relief and quiet exhilaration when the Santa Claus story was finally admitted by my parents to be untrue. I had already ceased to really believe in it/him but wasn’t yet secure enough as a 6- or 7-year-old (or whenever it was) to assert it without my parents’ confirmation. And it was a secret I withheld from my younger siblings, perhaps my first instruction on when lying was acceptable, even looked upon approvingly. Similarly, I remember how it felt to be told about sex for the first time by older kids (now you can go there, you cretins) and thus realize that my parents (and everyone else’s) had done the dirty — multiple times even for families with more than one kid. I was the possessor of secret knowledge, and everyone figured out quickly that it was best to be discreet about it. It may have been the first open secret. Powerful stuff, as we were to learn later in our hormone-addled adolescence. In early adulthood, I also began to assert my atheism, which isn’t really a secret but still took time to root fully. From my mature perspective, others who believe in one sky-god or another look like the kids who at a tender age still believe in Santa Claus and the Easter Bunny. I don’t go out of my way to dispel anyone’s faith.

Even as adults, those of us who enjoy secret knowledge feel a bit of exhilaration. We know what goes on (a little or a lot) behind the scenes, behind the curtain, in the backrooms and dark places. It may also mean that we know how the proverbial sausage is made, which is far less special. National security clearance, operating at many levels of access, may be the most obvious example, or maybe it’s just being a bug on the wall in the dugout or locker room during a pro sports contest. Being within the circle of intimates is intoxicating, though the circumstances that get one into the circle may be rather mundane, and those on the outside may look oddly pathetic.

The psychology behind secret knowledge functions prominently with conspiracy theories. Whether the subject is political assassinations, Bigfoot or the Loch Ness Monster, the moon landings, Area 51 and alien abduction, chemtrails/contrails, or 9/11, one’s personal belief and pet theory inescapably confers special status, especially as unacknowledged or unaccepted truth. Often, as others seek to set the record straight, one digs in to defend cherished beliefs. It’s an elixir, a dangerous cycle that traps people in contrafactual cliques. So we have flat Earthers, birthers, 9/11 truthers, creationists, climate change deniers, etc. (I count myself among one of those groups, BTW. Figure it out for yourself.) The range of interpretations floated in the political realm with respect to the machinations of the two parties and the White House boggles my mind with possibilities. However, I’m squarely outside those circles and feel no compulsion to decide what I believe when someone asserts secret knowledge from inside the circle. I float comfortably above the fray. Similarly, with so much fake news pressing for my attention, I consciously hold quite a lot of it in abeyance until time sorts it out for me.

I revisit my old blog posts when I see some reader activity in the WordPress backstage and was curious to recall a long quote of Iain McGilchrist summarizing arguments put forth by Anthony Giddens in his book Modernity and Self-identity (1991). Giddens had presaged recent cultural developments, namely, the radicalization of nativists, supremacists, Social Justice Warriors (SJWs), and others absorbed in identity politics. So I traipsed off to the Chicago Public Library (CPL) and sought out the book to read. Regrettably, CPL didn’t have a copy, so I settled on a slightly earlier book, The Consequences of Modernity (1990), which is based on a series of lectures delivered at Stanford University in 1988.

Straight away, the introduction provides a passage that goes to the heart of matters with which I’ve been preoccupied:

Today, in the late twentieth century, it is argued by many, we stand at the opening of a new era … which is taking us beyond modernity itself. A dazzling variety of terms has been suggested to refer to this transition, a few of which refer positively to the emergence of a new type of social system (such as the “information society” or the “consumer society”) but most of which suggest rather that a preceding state of affairs is drawing to a close … Some of the debates about these matters concentrate mainly upon institutional transformations, particularly those which propose that we are moving from a system based upon the manufacture of material goods to one concerned more centrally with information. More commonly, however, those controversies are focused largely upon issues of philosophy and epistemology. This is the characteristic outlook, for example, of the author who has been primarily responsible for popularising the notion of post-modernity, Jean-François Lyotard. As he represents it, post-modernity refers to a shift away from attempts to ground epistemology and from faith in humanly engineered progress. The condition of post-modernity is distinguished by an evaporating of the “grand narrative” — the overarching “story line” by means of which we are placed in history as beings having a definite past and a predictable future. The post-modern outlook sees a plurality of heterogeneous claims to knowledge, in which science does not have a privileged place. [pp. 1–2, emphasis added]

That’s a lot to unpack all at once, but the fascinating thing is that notions now manifesting darkly in the marketplace of ideas were already in the air in the late 1980s. Significantly, this was several years still before the Internet brought the so-called Information Highway to computer users, before the cell phone and smart phone were developed, and before social media displaced traditional media (TV was only 30–40 years old but had previously transformed our information environment) as the principal way people gather news. I suspect that Giddens has more recent work that accounts for the catalyzing effect of the digital era (including mobile media) on culture, but for the moment, I’m interested in the book in hand.

Regular readers of this blog (I know of one or two) already know my armchair social criticism directed to our developing epistemological crisis (challenges to authority and expertise, psychotic knowledge, fake news, alternative facts, dissolving reality, and science denial) as well as the Transhumanist fantasy of becoming pure thought (once we evolve beyond our bodies). Until that’s accomplished with imagined technology, we increasingly live in our heads, in the abstract, disoriented and adrift on a bewildering sea of competing narratives. Moreover, I’ve stated repeatedly that highly mutable story (or narrative) underlies human cognition and consciousness, making most of us easy marks for charismatic storytellers. Giddens was there nearly 30 years ago with these same ideas, though his terms differ.

Giddens disputes the idea of post-modernity and insists that, from a sociological perspective, the current period is better described as high modernism. This reminds me of Oswald Spengler and my abandoned book blogging of The Decline of the West. It’s unimportant to me who got it more correct, but I note that the term Postmodernism has been adopted widely despite its inaccuracy (at least according to Giddens). As I get further into the book, I’ll have plenty more to say.