Posts Tagged ‘Culture’

Heard a curious phrase used with some regularity lately, namely, that “we’ve Nerfed the world.” Nerf refers to the soft foam toys, popular since the 1970s, that made balls and projectiles essentially harmless. The implication of the phrase is that we’ve become soft and vulnerable as a result of removing the routine hazards (physical and psychological) of existence. For instance, in the early days of cell phones, I recall padded street poles (like endzone goalposts) meant to prevent folks with their attention fixated too intently on their phones from harming themselves as they stumbled blindly down the sidewalk.

Similarly, anti-bullying sentiment has reached fever pitch such that no level of discomfort (e.g., simple name calling) can be tolerated lest the victim be scarred for life. The balancing point between preparing children for the competitive realities of the world and protecting their innocence and fragility has accordingly moved heavily in favor of the latter. Folks who never develop the resilience to suffer even modest hardships are snowflakes, and they agitate these days on college campuses (and increasingly in workplaces) to withdraw into safe spaces where their beliefs are never challenged and experiences are never challenging. The other extreme is a hostile, cruel, or at least indifferent world where no one is offered support or opportunity unless he or she falls within some special category, typically connected through family to wealth and influence. Those are the entitled.

A thermostatic response (see Neil Postman for more on this metaphor) is called for here. When things veer too far toward one extreme or the other, a correction is inevitable. Neither extreme is healthy for a functioning society, though the motivations are understandable. Either toughen people up by providing challenge, which risks brutalizing them unnecessarily, or protect them from the rigors of life or the consequences of their own choices to such a degree that they become dependent or dysfunctional. Where the proper balance lies is a question for the ages, but I daresay most would agree it’s somewhere squarely in the middle.

Jonathan Haidt and Greg Lukianoff have a new book out called The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting Up a Generation for Failure (2018), which is an expansion of an earlier article in The Atlantic of the same title. (Both are callbacks to Allan Bloom’s notorious The Closing of the American Mind (1987), which I’ve read twice. Similar reuse of a famous title references Robert Bork’s Slouching Toward Gomorrah (1996).) I haven’t yet read Haidt’s book and doubt I will bother, but I read the source article when it came out. I also don’t work on a college campus and can’t judge contemporary mood compared to when I was an undergraduate, but I’m familiar with the buzzwords and intellectual fashions reported by academics and journalists. My alma mater is embroiled in these battles, largely in connection with identity politics. I’m also aware of detractors who believe claims of Haidt and Lukianoff (and others) are essentially hysterics limited to a narrow group of progressive colleges and universities.

As with other cultural developments that lie outside my expertise, I punt when it comes to offering (too) strong opinions. However, with this particular issue, I can’t help but think that the two extremes coexist. A noisy group of students attending highly competitive institutions of higher education lead relatively privileged lives compared to those outside the academy, whereas high school grads and dropouts not on that track (and indeed grads of less elite schools) frequently struggle to get their lives going in early adulthood. Most of us face that struggle early on, but success, despite nonsensical crowing about the “best economy ever” from the Oval Office, is difficult to achieve now as the broad socioeconomic middle is pushed toward the upper and lower margins (mostly lower). Stock market notwithstanding, economic reality is frankly indifferent to ideology.


See this post on Seven Billion Day only a few years ago as a launching point. We’re now closing in on 7.5 billion people worldwide according to the U.S. Census Bureau. At least one other counter indicates we’ve already crossed that threshold. What used to be called the population explosion or the population bomb has lost its urgency and become generically population growth. By now, application of euphemism to mask intractable problems should be familiar to everyone. I daresay few are fooled, though plenty are calmed enough to stop paying attention. If there is anything to be done to restrain ourselves from proceeding down this easily recognized path to self-destruction, I don’t know what it is. The unwillingness to accept restraints in other aspects of human behavior demonstrates pretty well that consequences be damned — especially if they’re far enough delayed in time that we get to enjoy the here and now.

Two additional links (here and here) provide abundant further information on population growth if one desires to delve more deeply into the topic. The tone of these sites is sober, measured, and academic. As with climate change, hysterical and panic-provoking alarmism is avoided, but dangers known decades and centuries ago have persisted without serious redress. While it’s true that the growth rate (the annual rate of increase, not to be confused with the replacement rate) has decreased considerably since its peak in the early 1960s (the height of the postwar baby boom), absolute numbers continue to climb. The lack of immediate concern reminds me of Al Bartlett’s articles and lectures on the failure to understand the exponential function in math (mentioned in my prior post). Sure, boring old math about which few care. The metaphor that applies is yeast growing in a culture with a doubling factor that makes everything look just peachy until the final doubling that kills everything. In this metaphor, people are the unthinking yeast that believe there’s plenty of room and food and other resources in the culture (i.e., on the planet) and keep consuming and reproducing until everyone dies en masse. How far away in time that final human doubling lies, no one really knows.
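Bartlett’s point about the exponential function is easy to nod past without the arithmetic, so here is a minimal sketch (mine, not Bartlett’s, and assuming an illustrative growth rate of roughly 1% per year) of the doubling-time rule and of why the penultimate doubling still looks comfortable:

```python
import math

def doubling_time(annual_growth_rate):
    """Years for a quantity growing at a constant rate to double (the 'rule of 70')."""
    return math.log(2) / math.log(1 + annual_growth_rate)

rate = 0.01  # illustrative ~1% annual growth; the actual rate varies year to year
print(f"Doubling time at {rate:.0%} growth: about {doubling_time(rate):.0f} years")

# The yeast-culture point: one doubling before the container is full,
# it is still only half full, so everything looks peachy until the last doubling.
capacity = 1.0
one_doubling_before_the_end = capacity / 2
print(f"One doubling before the end, the culture is {one_doubling_before_the_end / capacity:.0%} full")
```

The point is qualitative, of course: under constant-percentage growth, the final doublings arrive just as quickly as the comfortable early ones.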

Which brings me to something rather ugly: hearings to confirm Brett Kavanaugh’s appointment to the U.S. Supreme Court. No doubt conservative Republican presidents nominate similarly conservative judges just as Democratic presidents nominate progressive or centrist judges. That’s to be expected. However, Kavanaugh is being asked pointed questions about settled law and legal precedents perpetually under attack by more extreme elements of the right wing, including Roe v. Wade from 1973. Were we (in the U.S.) to revisit that decision and remove legal abortion (already heavily restricted), public outcry would be horrific, to say nothing of the return of so-called back-alley abortions. Almost no one undertakes such actions lightly. A look back through history, however, reveals a wide range of methods to forestall pregnancy, end pregnancies early, and/or end newborn life quickly (infanticide). Although repugnant to almost everyone, attempts to legislate abortion out of existence and/or punish lawbreakers will succeed no better than did Prohibition or the War on Drugs. (The same can be said of premarital and underage sex.) Certain aspects of human behavior are frankly indelible despite the moral indignation of one or another political wing. Whether Kavanaugh truly represents the linchpin that will bring new upheavals is impossible to know with certainty. Stay tuned, I guess.

Abortion rights matter quite a lot when placed in context with population growth. Aggregate human behaviors routinely drive all sorts of plant and animal populations out of existence. This includes human populations (domestic and foreign) reduced to abject poverty and mad, often criminal scrambles for survival. The view from on high is that those whose lives fall below some measure of worthwhile contribution are useless eaters. (I don’t recommend delving deeper into that term; it’s a particularly ugly ideology with a long, tawdry history.) Yet removing abortion rights would almost certainly swell those ranks. Add this topic to the growing list of things I just don’t get.

Not a person alive, having reached even a modest level of maturity, hasn’t looked back at some choice or attitude of his or her past and wondered, “What on earth was I thinking?” Maybe it was some physical stunt resulting in a fall or broken bone (or worse), or maybe it was an intolerant attitude later softened by empathy and understanding when the relevant issue became personal. We’ve all got something. Some of us, many somethings. As a kid, my cohorts and I used to play in leaves raked into piles in the autumn. A pile of leaves isn’t a trampoline and doesn’t really provide cushion, but as kids, it didn’t matter for the purpose of play. At one point, the kid next door dared me to jump from the roof of his front porch into a pile of leaves. The height was probably 15 feet. I remember climbing out and peering over the gutters, wavering a bit before going back inside. I didn’t jump. What was I thinking? It would have been folly to take that dare.

Some youthful indiscretion is to be expected and can be excused as teaching moments, but in truth, most of us don’t have to go far back in time to wonder “what in hell was I thinking?” Maybe it was last week, last month, or a few years ago. The interval matters less than the honest admission that, at any point one might believe he or she has things figured out and can avoid traps that look clear only in hindsight, something will come up as a reminder that, despite having grown wiser through experience, one still misjudges and makes egregious mistakes.


A paradoxical strength/weakness of reason is its inherent disposition toward self-refutation. It’s a bold move when undertaken with genuine interest in getting things right. Typically, as evidence piles up, consensus forms that’s tantamount to proof unless some startling new counter-evidence appears. Of course, intransigent deniers exist and convincing refutations do appear periodically, but accounts of two hotly contested topics (from among many) — evolution and climate change — are well established notwithstanding counterclaims completely disproportionate in their ferocity to the evidence. For rationalists, whatever doubts remain must be addressed and accommodated even if disproof is highly unlikely.

This becomes troublesome almost immediately. So much new information is produced in the modern world that, because I am duty-bound to consider it, my head spins. I simply can’t deal with it all. Inevitably, when I think I’ve put a topic to rest and conclude I don’t have to think too much more about it, some argument-du-jour hits the shit pile and I am forced to stop and reconsider. It’s less disorienting when facts are clear, but when they’re open to interpretation, I find my head all too easily spun by the latest, greatest claims of some charming, articulate speaker able to cobble together evidence lying outside of my expertise.

Take for instance Steven Pinker. He speaks in an authoritative style and has academic credentials that dispose me to trust his work. His new book is Enlightenment Now: The Case for Reason, Science, Humanism, and Progress (2018). Still, Pinker is an optimist, whereas I’m a doomer. Even though I subscribe to Enlightenment values (for better or worse, my mind is bent that way), I can’t escape a mountain of evidence that we’ve made such a mess of things that reason, science, humanism, and progress are hardly panaceas capable of saving us from ourselves. Yet Pinker argues that we’ve never had it so good and the future looks even brighter. I won’t take apart Pinker’s arguments; it’s already been done by Jeremy Lent, who concludes that Pinker’s ideas are fatally flawed. Lent has the expertise, data, and graphs to demonstrate it. Calling Pinker a charlatan would be unfair, but his appreciation of the state of the world stands in high contrast with mine. Who ya gonna believe?

Books and articles like Pinker’s appear all the time, and in their aftermath, so, too, do takedowns. That’s the marketplace of ideas battling it out, which is ideally meant to sharpen thinking, but with the current epistemological crisis under way (I’ve blogged about it for years), the actual result is dividing people into factions, destabilizing established institutions, and causing no small amount of bewilderment in the public as to what and whom to believe. Some participants in the exchange of ideas take a sober, evidential approach; others lower themselves to snark and revel in character assassination without bothering to make reasoned arguments. The latter are often called hit pieces (a special province of the legacy media, it seems), since hefty swipes and straw-man arguments tend to be commonplace. I’m a sucker for the former style but have to admit that the latter can also hit its mark. However, both tire me to the point of wanting to bury my head.


I mentioned blurred categories in music a short while back. An interesting newsbit popped up in the New York Times recently about this topic. Seems a political science professor at SUNY New Paltz, Gerald Benjamin, spoke up against a Democratic congressional candidate for New York’s 19th district, Antonio Delgado, the latter of whom had been a rapper. Controversially, Benjamin said that rap is not real music and does not represent the values of rural New York. Naturally, the Republican incumbent got in on the act, too, with denunciations and attacks. The electoral politics angle doesn’t much interest me; I’m not in or from the district. Moreover, the racial and/or racist elements are so toxic I simply refuse to wade in. But the professorial pronouncement that rap music isn’t really music piqued my interest, especially because that argument caused the professor to be sanctioned by his university. Public apologies and disclaimers were issued all around.

Events also sparked a fairly robust commentary at Slipped Disc. The initial comment by V.Lind echoes my thinking pretty well:

Nobody is denying that hip-hop is culturally significant. As such it merits study — I have acknowledged this … The mystery is its claims to musical credibility. Whatever importance rap has is in its lyrics, its messages (which are far from universally salutory [sic]) and its general attempt to self-define certain communities — usually those with grievances, but also those prepared to develop through violence, sexism and other unlovely aspects. These are slices of life, and as such warrant some attention. Some of the grievances are well-warranted …

But music? Not. I know people who swear by this genre, and their ears are incapable of discerning anything musical in any other. If they wanted to call it poetry (which I daresay upon scrutiny would be pretty bad poetry) it would be on stronger legs. But it is a “music” by and for the unmusical, and it is draining the possibility of any other music out of society as the ears that listen to it hear the same thing, aside from the words, for years on end.

Definitely something worth studying. How the hell has this managed to become a dominant force in what is broadly referred to as the popular music world?

The last time (as memory serves) categories or genres blurred leading to outrage was when Roger Ebert proclaimed that video games are not art (and by inference that cinema is art). Most of us didn’t really care one way or the other where some entertainment slots into a category, but gamers in particular were scandalized. In short, their own ox was gored. But when it came to video games as art, there were no racial undertones, so the sometimes heated debate was at least free of that scourge. Eventually, definitions were liberalized, Ebert acknowledged the opposing opinion (I don’t think he was ever truly convinced, but I honestly can’t remember — and besides, who cares?), and it all subsided.

The impulse to mark hard, discrete boundaries between categories and keep unlike things from touching seems pretty foolish to me. It’s as though we’re arguing about the mashed potatoes and peas not infecting each other on the dinner plate with their cooties. Never mind that it all ends up mixed in the digestive tract before finally reemerging as, well, you know. Motivation to keep some things out is no doubt due to prestige and cachet, where the apparent interloper threatens to change the status quo somehow, typically infecting it with undesirable and/or impure elements. We recognize this fairly readily as in-group and out-group, an adolescent game that ramps up, for instance, when girls and boys begin to differentiate in earnest at the onset of puberty. Of course, in the last decade, so-called identitarians have been quite noisy about their tribal affiliations and self-proclaimed identities, many falling far, far out of the mainstream, and have demanded they be taken seriously and/or granted status as a protected class.

All this extends well beyond the initial topic of musical or artistic styles and genres. Should be obvious, though, that we can’t escape labels and categories. They’re a basic part of cognition. If they weren’t, one would have to invent them at every turn when confronting the world, judging safe/unsafe, friend/foe, edible/inedible, etc. just to name a few binary categories. Complications multiply quickly when intermediary categories are present (race is the most immediate example, where most of us are mixtures or mutts despite whatever our outer appearance may be) or categories are blurred. Must we all then rush to restore stability to our understanding of the world by hardening our mental categories?

One of the very best lessons I took from higher education was recognizing and avoiding the intentional fallacy — in my own thinking no less than in that of others. Although the term arguably has more to do with critical theory dealing specifically with texts, I learned about it in relation to abstract fine arts, namely, painting and music. For example, the enigmatic expression of the Mona Lisa by Leonardo da Vinci continues to spark inquiry and debate. What exactly does that smile mean? Even when words or programs are included in musical works, it’s seductively easy to conclude that the composer intends this or the work itself means that. Any given work purportedly allows audiences to peer into the mind of its creator(s) to interrogate intent. Conclusions thus drawn, however, are notoriously unreliable though commonplace.

It’s inevitable, I suppose, to read intent into artistic expression, especially when purpose feels so obvious or inevitable. Similar excavations of meaning and purpose are undertaken within other domains of activity, resulting in no end of interpretation as to surface and deep strategies. Original intent (also originalism) is a whole field of endeavor with respect to interpretation of the U.S. Constitution and imagining the framers’ intent. Geopolitics is another domain where hindsight analysis results in some wildly creative but ultimately conjectural interpretations of events. Even where authorial (and political) intent is explicitly recorded, such as with private diaries or journals, the possibility of deceptive intent by authors keeps everyone wondering. Indeed, although “fake news” is a modern coinage, a long history of deceptive publishing practice well beyond the adoption of a nom de plume attests to hidden or unknowable intent, making “true intent” a meta property.

The multi-ring circus that the modern information environment has become, especially in the wake of electronic media (e.g., YouTube channels) produced by anyone with a camera and an Internet connection, is fertile ground for those easily ensnared by the intentional fallacy. Several categories of intent projected onto content creators come up repeatedly: profit motive, control of the narrative (no small advantage if one believes this blog post), setting the record straight, correcting error, grandstanding, and trolling for negative attention. These categories are not mutually exclusive. Long ago, I pointed to the phenomenon of arguing on-line and how it typically accomplishes very little, especially as comment threads lengthen and civility breaks down. These days, comments are an Internet legacy and/or anachronism that many content creators persist in offering to give the illusion of a wider discussion but in fact roundly ignore. Most blogs and channels are actually closed conversations. Maybe a Q&A follows the main presentation when held before an audience, but video channels are more often one-way broadcasts addressing an audience but not really listening. Public square discussion is pretty rare.

Some celebrate this new era of broadcasting, noting with relish how the mainstream media is losing its former stranglehold on attention. Such enthusiasm may be transparently self-serving but nonetheless rings true. A while back, I pointed to New Media Rockstars, which traffics in nerd culture entertainment media, but the term could easily be expanded to include satirical news, comedy, and conversational webcasts (also podcasts). Although some folks are rather surprised to learn that an appetite for substantive discussion and analysis exists among the public, I surmise that the shifting media landscape and disintegrated cultural narrative have bewildered a large segment of the public. The young in particular are struggling to make sense of the world, figure out what to be in life and how to function, and work out an applied philosophy that eschews more purely academic philosophy.

By way of example of new media, let me point to a trio of YouTube channels I only recently discovered. Some More News parodies traditional news broadcasts by sardonically (not quite the same as satirically) calling bullshit on how news is presented. Frequent musical cues between segments make me laugh. Unlike the mainstream media, which are difficult not to regard as propaganda arms of the government, Some More News is unapologetically liberal and indulges in black humor, which doesn’t make me laugh. Its raw anger and exasperation are actually a little terrifying. The second YouTube channel is Three Arrows, a sober, thorough debunking of news and argumentation found elsewhere in the public sphere. The speaker, who doesn’t appear onscreen, springs into action especially when accusations of current-day Nazism come up. (The current level of debate has devolved to recklessly calling nearly everyone a Nazi at some stage. Zero points scored.) Historical research often puts things into proper context, such as the magnitude of the actual Holocaust compared to some garden-variety racist running his or her mouth comparatively harmlessly. The third YouTube channel is ContraPoints, which is rather fanciful and profane but remarkably erudite considering the overall tone. Labels and categories are explained for those who may not have working definitions at the ready for every phrase or ideology. Accordingly, there is plenty of jargon. The creator also appears as a variety of different characters to embody various archetypes and play devil’s advocate.

While these channels may provide abundant information, correcting error and contextualizing better than most traditional media, it would be difficult to conclude they’re really moving the conversation forward. Indeed, one might wonder why bother preparing these videos considering how time consuming it has to be to do research, write scripts, assemble pictorial elements, etc. I won’t succumb to the intentional fallacy and suggest I know why they bother holding these nondebates. Further, unless straight-up comedy, I wouldn’t say they’re entertaining exactly, either. Highly informative, perhaps, if one pays close attention to the frenetic online pace and/or mines for content (e.g., studying transcripts or following links). Interestingly, within a fairly short period of time, these channels are establishing their own rhetoric, sometimes useful, other times too loose to make strong impressions. It’s not unlike the development of new stylistic gestures in music or painting. What, if anything, worthwhile emerges from the scrum will be interesting to see.

Among the myriad ways we have of mistreating each other, epithets may well be the most ubiquitous. Whether using race, sex, age, nationality, or nominal physical characteristic (especially genital names), we have so many different words with which to insult and slur it boggles the mind. Although I can’t account for foreign cultures, I doubt there is a person alive or dead who hasn’t suffered being made fun of for some stupid thing. I won’t bother to compile a list, there being so many (by way of example, Wikipedia has a list of ethnic slurs), but I do remember consulting a dictionary of historical slang, mostly disused, and being surprised at how many terms were devoted specifically to insults.

I’m now old and contented enough for the “sticks and stones …” dismissal to nullify any epithets hurled my way. When one comes up, it’s usually an obvious visual characteristic, such as my baldness or ruddiness. Those characteristics are of course true, so why allow them to draw ire when used with malicious intent? However, that doesn’t stop simple words from giving grave offense to those with thin skins or from becoming so-called fighting words for those habituated to answering provocation with physical force. And in an era when political correctness has equated verbal offense with violence, the self-appointed thought police call for blood whenever someone steps out of line in public. Alternatively, when such a person is one’s champion, then the blood sport becomes spectacle, such as when 45 gifts another public figure with a sobriquet.

The granddaddy of all epithets — the elephant in the room, at least in the U.S. — will not be uttered by me, sorta like the he-who-shall-not-be-named villain of the Harry Potter universe or the forbidden language of Mordor from the Tolkien universe. I lack standing to use the term in any context and won’t even venture a euphemism or placeholder using asterisks or initials. Reclaiming the term in question by adopting it as a self-description — a purported power move — has decidedly failed to neutralize the term. Instead, the term has become even more egregiously insulting than ever, a modern taboo. Clarity over who gets to use the term with impunity and when is elusive, but for my own part, there is no confusion: I can never, ever speak or write it in any context. I also can’t judge whether this development is a mark of cultural progress or regression.

If the previous blog in this series was about how some ideas and beliefs become lodged or stuck in place (fixity bias), this one is about how other ideas are notoriously mutable (flexibility bias), especially the latest, loudest thing to turn one’s head and divert attention. What makes any particular idea (or is it the person?) prone to one bias or another (see this list) is mysterious to me, but my suspicion is that a character disposition toward openness and adherence to authoritative evidence figure prominently in the case of shifting opinion. In fact, this is one of the primary problems with reason: if evidence can be deployed in favor of an idea, those who consider themselves “reasonable” and thus rely on accumulation of evidence and argumentation to sharpen their thinking are vulnerable to the latest “finding” or study demonstrating sumpinorutha. It’s the intellectual’s version of “New! Improved!”

Sam Harris exploits rationalism to argue against the existence of free will, saying that if sufficient evidence can be brought to bear, a disciplined thinker is compelled to subscribe to the conclusions of reasoned argument. Choice and personal agency (free will) are removed. I find that an odd way to frame the issue. Limitless examples of lack of choice are nonequivalent to the destruction of free will. For example, one can’t decide not to believe in gravity and fly up into the air more than a few inches. One can’t decide that time is an illusion (as theoretical physicists now instruct) and decide not to age. One can’t decide that pooping is too disgusting and just hold it all in (as some children attempt). Counter-evidence doesn’t even need to be argued because almost no one pretends to believe such nonsense. (Twisting one’s mind around to believe in the nonexistence of time, free will, or the self seems to be the special province of hyper-analytical thinkers.) Yet other types of belief/denial — many of them conspiracy theories — are indeed choices: religion, flat Earth, evolution, the Holocaust, the moon landings, 9/11 truth, who really killed JFK, etc. Lots of evidence has been mustered on different sides (multiple facets, actually) of each of these issues, and while rationalists may be compelled by a preponderance of evidence in favor of one view, others are free to fly in the face of that evidence for reasons of their own or adopt by default the dominant narrative and not worry or bother so much.

The public struggles in its grasp of truthful information, as reported in a Pew Research Center study called “Distinguishing Between Factual and Opinion Statements in the News.” Here’s the snapshot:

The main portion of the study, which measured the public’s ability to distinguish between five factual statements and five opinion statements, found that a majority of Americans correctly identified at least three of the five statements in each set. But this result is only a little better than random guesses. Far fewer Americans got all five correct, and roughly a quarter got most or all wrong.
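For context (my own back-of-the-envelope arithmetic, not the study’s): if each of the five items in a set were labeled by a coin flip, a respondent would get at least three right half the time anyway:

```python
from math import comb

# Probability of correctly labeling at least 3 of 5 statements by pure guessing,
# treating each item as a fair coin flip (a simplifying assumption for illustration).
p_at_least_3 = sum(comb(5, k) for k in range(3, 6)) / 2**5
print(f"P(at least 3 of 5 correct by chance) = {p_at_least_3:.0%}")  # 50%
```

Which is why a bare majority clearing that bar says little about the public’s actual discernment.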

Indiscriminate adoption by many Americans of a faulty viewpoint, or more pointedly, the propaganda and “fake news” on offer throughout the information environment, carries the implication that disciplined thinkers are less confused about truth or facts, taking instead a rational approach as the basis for belief. However, I suggest that reason suffers its own frailties not easily recognized or acknowledged. In short, we’re all confused, though perhaps not hopelessly so. For several years now, I’ve sensed the outline of a much larger epistemological crisis where quintessential Enlightenment values have come under open attack. The irony is that the wicked stepchild of science and reason — runaway technology — is at least partially responsible for this epochal conflict. It’s too big an idea to grok fully or describe in a paragraph or two, so I’ll simply point to it and move on.

My own vulnerability to flexibility bias manifests specifically in response to appeals to authority. Although well educated, a lifelong autodidact, and an independent thinker, I’m careful not to succumb to the hubris of believing I’ve got it all figgered. Indeed, it’s often said that as one gains expertise and experience in the world, the certainty of youth yields to caution precisely because the mountain of knowledge and understanding one lacks looms larger even as one accumulates wisdom. Bodies of thought become multifaceted and all arguments must be entertained. When an expert, researcher, or academic proposes something outside my wheelhouse, I’m a sitting duck: I latch onto the latest, greatest utterance as the best truth yet available. I don’t fall for it nearly so readily with journalists, but I do recognize that some put in the effort and gain specialized knowledge and context well outside the bounds of normal life, such as war reporters. Various perverse incentives deeply embedded in the institutional model of journalism, especially those related to funding, make it nearly impossible to maintain one’s integrity without becoming a pariah, so only a handful have kept my attention. John Pilger, Chris Hedges, and Matt Taibbi figure prominently.

By way of example, one of the topics that has been of supreme interest to me, though its historical remove renders it rather toothless now, is the cataclysm(s) that occurred at the conclusion of the last ice age roughly 12,000 years ago. At least three hypotheses (of which I’m aware) have been proposed to explain why glacial ice disappeared suddenly over the course of a few weeks, unleashing the Biblical Flood: Earth crust displacement, asteroidal impact(s), and coronal mass ejection(s). Like most hypotheses, evidence is both physical and conjectural, but a sizable body of evidence and argumentation for each is available. As I became familiar with each, my head turned and I became a believer, sorta. Rather than “last one is the rotten egg,” however, the most recent one typically displaces the previous one. No doubt another hypothesis will appear to turn my head and disorient me further. With some topics, especially politics, new information piling on top of old is truly dizzying. And as I’ve written about many topics, I simply lack the expertise to referee competing claims, so whatever beliefs I eventually adopt are permanently provisional.

Finally, my vulnerability to authoritative appeal also reacts to the calm, unflappable tones and complexity of construction of speakers such as Sam Harris, Steven Pinker, and Charles Murray. Their manner of speaking is sometimes described pejoratively as “academese,” though only Pinker has a teaching position. Murray in particular relies heavily on psychometrics, which may not be outright lying with statistics but allows him to rationalize (literally) extraordinarily taboo subjects. In contrast, it’s easy to disregard pundits and press agents foaming and fulminating over their pet narratives. Yet I also recognize that with academese, I’m being soothed more by style than by substance, a triumph of form over function. In truth, this communication style is an appeal to emotion masquerading as an appeal to authority. I still prefer it, just as I prefer a steady, explanatory style of journalism over the snarky, reinterpretive style of disquisition practiced by many popular media figures. What communicates most effectively to me and (ironically) pushes my emotional buttons also weakens my ability to discriminate and think properly.

Yet still more to come in part 5.

An ongoing conflict in sociology and anthropology exists between those who believe that human nature is competitive and brutal to the bitter end and those who believe human nature is more cooperative and sociable, sharing resources of all types to secure the greater good. This might be recognizable to some as the perennial friction between citizen and society (alternatively, individualism and collectivism). Convincing evidence from human prehistory is difficult to uncover. Accordingly, much of the argument for competition comes from evolutionary biology, where concepts such as genetic fitness and reproductive success (and by inference, reproductive failure) are believed to motivate and justify behavior across the board. As the typical argument goes, inferior genes and males in particular who lack sexual access or otherwise fail to secure mates don’t survive into the next generation. Attributes passed on to each subsequent generation thus favor fitter, Type A brutes who out-compete weaker (read: more cooperative) candidates in an endless self-reinforcing and narrowing cycle. The alternative offered by others points to a wider gene pool based on collaboration and sharing of resources (including mates) that enables populations to thrive together better than individuals who attempt to go it alone or dominate.

Not having undertaken a formal study of anthropology (or more broadly, primatology), I can’t say how well this issue is settled in the professional, academic literature. Online, I often see explanations that are really just-so stories based on logic. What that means is that an ideal or guiding principle is described, something that just “makes sense,” and supporting evidence is then assumed or projected. For instance, we now know many of the mechanisms that function at the cellular level with respect to reproduction and genetic evolution. Those mechanisms are typically spun up to the level of the organism through pure argumentation and presumed to manifest in individual behaviors. Any discontinuity between aggregate characteristics and particular instances is ignored. Questions are solved through ideation (i.e., thought experiments). However, a series of if-then statements that seems plausible when confronted initially often turns out to be pure conjecture rather than evidence. That’s a just-so story.

One of the reasons we look into prehistory for evidence of our true nature (understood as biology, not sociology, handily sweeping aside the nature/nurture question) is that hunter-gatherers (HGs) lived at subsistence level for a far longer period of our evolutionary history than our comparatively brief time within the bounty of civilization. It’s only when surpluses and excesses provide something worth hoarding, monopolizing, and protecting that hierarchies arise and/or leveling mechanisms are relaxed. Leaving Babylon has a discussion of this here. Some few HG cultures survive into the 21st century, but for most of us, the Agricultural Revolution is the branching point when competition began to assert itself, displacing sharing and other egalitarian impulses. Accordingly, the dog-eat-dog competition and inequality characteristic of the modern world are regarded by many as an exaptation, not our underlying nature.


YouTube ratings magnet Jordan Peterson had a sit-down with Susan Blackmore to discuss/debate the question, “Do We Need God to Make Sense of Life?” The conversation is lightly moderated by Justin Brierley and is part of a weekly radio broadcast called Unbelievable? (a/k/a The Big Conversation, “the flagship apologetics and theology discussion show on Premier Christian Radio in the UK”). One might wonder why evangelicals are so eager to pit believers and atheists against each other. I suppose earnest questioning of one’s faith is preferable to proselytizing, though both undoubtedly occur. The full episode (47 min.) is embedded below: