Posts Tagged ‘Culture’

With each successive election cycle, I become more cynical (how is that even possible?) about the candidates and their supposed earnest mission to actually serve the public interest. The last couple cycles have produced a new meme that attempts to shift blame for poor governance to the masses: the low-information voter. Ironically, considering that airwaves, magazines, books, public addresses, online venues, and even dinner conversations (such as they still exist if diners aren’t face-planted in their screens) are positively awash in political commentary and pointless debate and strategizing, there is no lack of information available. However, being buried under a deluge of information is akin to a defense attorney hiding damning discovery in an ocean of irrelevance, so I have some sympathy for voters who are thwarted in attempts to make even modestly informed decisions about political issues.

Multiply this basic relationship across many facets of ordinary life and the end result is the low-information citizen (also low-information consumer). Some parties (largely sellers of things, including ideas) possess a profusion of information, whereas useful, actionable information is hidden from the citizen/consumer by an information avalanche. For example, onerous terms of an insurance contract, debt instrument, liability waiver, or even routine license agreement are almost never read prior to signing or otherwise consenting; the acronym tl;dr (stands for “too long; didn’t read”) applies. In other situations, information is withheld entirely, such as pricing comparisons one might undertake if high-pressure sales tactics were not deployed to force potential buyers into decisions right here, right now, dammit! Or citizens are disempowered from exercising any critical judgment by erecting secrecy around a subject, national security being the all-purpose excuse for everything the government doesn’t want people to know.

Add to this the concerted effort (plain enough to see if one bothers to look) to keep the population uneducated, without options and alternatives, scrambling just to get through the day/week/month (handily blocking information gathering), and thus trapped in a condition of low information. Such downward pressure (survival pressure, one might say when considering the burgeoning homeless population) is affecting a greater portion of the population than ever. The American Dream that energized and buoyed the lives of many generations of people (including immigrants) has morphed into the American Nightmare. Weirdly, the immigrant influx has not abated but rather intensified. However, I consider most of those folks refugees (political, economic, and ecological), not immigrants.

So those are the options available to power players, where knowledge is power: (1) withhold information, (2) if information can’t be withheld, then bury it as a proverbial needle in a haystack, and (3) render a large percentage of the public unable to process and evaluate information by keeping them undereducated. Oh, here’s another: (4) produce a mountain of mis- and disinformation that bewilders everyone. This last one is arguably the same as (2) except that the intent is propaganda or psyop. One could also argue that miseducating the public (e.g., various grievance studies blown into critical race theory now being taught throughout the educational system) is the same as undereducating. Again, intent matters. Turning someone’s head and radicalizing them with a highly specialized toolkit (mostly rhetorical) for destabilizing social relations is tantamount to making them completely deranged (if not merely bewildered).

These are elements of the ongoing epistemological crisis I’ve been observing for some time now, with the side effect of a quick descent into social madness being used to justify authoritarian (read: fascist) concentration of power and rollback of individual rights and freedoms. The trending term sensemaking also applies, referring to reality checks needed to keep oneself aligned with truth, which is not the same as consensus. Groups are forming up precisely for that purpose, centered on evidentiary rigor as well as skepticism toward the obvious disinformation issuing from government agencies and journalists who shape information according to rather transparently brazen agendas. I won’t point to any particular trusted source but instead recommend everyone do their best (no passivity!) to keep their wits about them and think for themselves. Not an easy task when the information environment is so thoroughly polluted — one might even say weaponized — that it requires special workarounds to navigate effectively.

Guy McPherson used to say in his presentations that we’re all born into bondage, meaning that there is no escape from Western civilization and its imperatives, including especially participation in the money economy. The oblique reference to chattel slavery is clumsy, perhaps, but the point is nonetheless clear. For all but a very few, civilization functions like Tolkien’s One Ring, bringing everyone ineluctably under its dominion. Enlightenment cheerleaders celebrate that circumstance and the undisputed material and technological (same thing, really) bounties of the industrial age, but Counter-Enlightenment thinkers recognize reasons for profound discontent. Having blogged at intervals about the emerging Counter-Enlightenment and what’s missing from modern technocratic society, I find that my gnawing guilt by virtue of forced participation in the planet-killing enterprise of industrial civilization is growing intolerable. Skipping past the conclusion drawn by many doomers that collapse and ecocide due to unrestrained human consumption of resources (and the waste stream that follows) have already launched a mass extinction that will extirpate most species (including large mammals such as humans), let me focus instead on gross dysfunction occurring at levels falling more readily within human control.

An Empire of War

Long overdue U.S. troop withdrawal from Afghanistan has already yielded Taliban resurgence, which was a foregone conclusion at whatever point U.S. troops left (and before them, Soviets). After all, the Taliban lives there and had only to wait. Distasteful and inhumane as it may be to Westerners, a powerful faction (religious fanatics) truly wants to live under a 7th-century style of patriarchy. Considering how long the U.S. occupied the country, a new generation of wannabe patriarchs came to adulthood — an unbroken intergenerational descent. Of course, the U.S. (and others) keeps arming them. Indeed, I heard that the U.S. military is considering bombing raids to destroy the war machines left behind as positions were so swiftly abandoned. Oops, too late! This is the handiest example of how failed U.S. military escapades extending over decades net nothing of value to anyone besides weapons and ordnance manufacturers and miserable careerists within various government branches and agencies. The costs (e.g., money, lives, honor, sanity) are incalculable and spread with each country where the American Empire engages. Indeed, the military-industrial complex chooses intervention and war over peace at nearly every opportunity (though careful not to poke them bears too hard). And although the American public’s inability to affect policy (unlike the Vietnam War era) doesn’t equate with participation, the notion that it’s a government of the people deposits some of the blame on our heads anyway. My frustration is that nothing is learned and the same war crimes, er, mistakes keep being committed by maniacs who ought to know better.

Crony and Vulture Capitalism

Critics of capitalism are being proven correct far more often than are apologists and earnest capitalists. The two subcategories I most deplore are crony capitalism and vulture capitalism, both of which typically accrue to the benefit of those in no real need of financial assistance. Crony capitalism is deeply embedded within our political system and tilts the economic playing field heavily in favor of those willing to both pay for and grant favors rather than let markets sort themselves out. Vulture capitalism extracts value out of vulnerable resource pools (dead hosts, really) by attacking and often killing them off (e.g., Microsoft, Walmart, Amazon), or more charitably, absorbing them to create monopolies, often by hostile takeover at steep discounts. Distressed mortgage holders forced into short sales, default, and eviction are the contemporary example. Rationalizing predatory behavior as competition is a tactic deployed regularly.

Other historical economic systems had similarly skewed hierarchies, but none have reached quite the same heartless, absurd levels of inequality as late-stage capitalism. Pointing to competing systems and the rising tide that lifts all boats misdirects people to make ahistorical comparisons. Human psychology normally restricts one’s points of comparison to contemporaries in the same country/region. Under such narrow comparison, the rank injustice of hundred-billionaires (or even simply billionaires) existing at the same time as giant populations of political/economic/climate refugees and the unhoused (the new, glossy euphemism for homelessness) demonstrates the soul-forfeiting callousness of the top quintile and/or 1% — an ancient lesson never learned. Indeed, aspirational nonsense repackages suffering and sells it back to the underclass, which as a matter of definition will always exist but need not live as though on an entirely different planet from Richistan.

Human Development

Though I’ve never been a big fan of behaviorism, the idea that a hypercomplex stew of influences, inputs, and stimuli leads to better or worse individual human development, especially in critical childhood years but also throughout life, is pretty undeniable. As individuals aggregate into societies, the health and wellbeing of a given society is linked to the health and wellbeing of those very individuals who are understood metaphorically as the masses. Behaviorism would aim to optimize conditions (as if such a thing were possible), but because American institutions and social systems have been so completely subordinated to capitalism and its distortions, society has stumbled and fumbled from one brand of dysfunction to another, barely staying ahead of revolution or civil war (except that one time …). Indeed, as the decades have worn on from, say, the 1950s (a nearly idyllic postwar reset that looms large in the memories of today’s patrician octogenarians), it’s difficult to imagine how conditions could have deteriorated further, short of a third world war.

Look no further than the U.S. educational system, both K–12 and higher ed. As with other institutions, education has had its peaks and valleys. However, the crazy, snowballing race to the bottom witnessed in the last few decades is utterly astounding. Stick a pin in it: it’s done. Obviously, some individuals manage to get educated (some doing quite well, even) despite the minefield that must be navigated, but the exception does not prove the rule. Countries that value quality education (e.g., Finland, China, Singapore, Japan, South Korea) in deed, not just in empty words trotted out predictably by every presidential campaign, routinely trounce decidedly middling results in the U.S. and reveal that dysfunctional U.S. political systems and agencies (Federal, state, municipal) just can’t get the job done properly anymore. (Exceptions are always tony suburbs populated by high-earning and -achieving parents who create opportunities and unimpeded pathways for their kids.) Indeed, the giant babysitting project that morphs into underclass school-to-prison and school-to-military service (cannon fodder) pipelines is what education has actually become for many. The opportunity cost of failing to invest in education (or by proxy, American youth) is already having follow-on effects. The low-information voter is not a fiction, and the condition extends to every American institution that requires clarity to see through the fog machine operated by the mainstream media.

As an armchair social critic, I often struggle to reconcile how history unfolds without a plan, and similarly, how society self-organizes without a plan. Social engineering gets a bad rap for reasons: it doesn’t work (small exceptions exist) and subverts the rights and freedoms of individuals. However, the rank failure to achieve progress (in human terms, not technological terms) does not suggest stasis. By many measures, the conditions in which we live are cratering. For instance, Dr. Gabor Maté discusses the relationship of stress to addiction in a startling interview at Democracy Now! Just how bad is it for most people?

… it never used to be that children grew up in a stressed nuclear family. That wasn’t the normal basis for child development. The normal basis for child development has always been the clan, the tribe, the community, the neighborhood, the extended family. Essentially, post-industrial capitalism has completely destroyed those conditions. People no longer live in communities which are still connected to one another. People don’t work where they live. They don’t shop where they live. The kids don’t go to school, necessarily, where they live. The parents are away most of the day. For the first time in history, children are not spending most of their time around the nurturing adults in their lives. And they’re spending their lives away from the nurturing adults, which is what they need for healthy brain development.

Does that not sound like self-hobbling? A similar argument can be made about human estrangement from the natural world, considering how rural-to-urban migration (largely completed in the U.S. but accelerating in the developing world) has rendered many Americans flatly unable to cope with, say, bugs and dirt and labor (or indeed most any discomfort). Instead, we’ve trapped ourselves within a society that is, as a result of its organizing principles, slowly grinding down everyone and everything. How can any of us (at least those of us without independent wealth) choose not to participate in this wretched concatenation? Nope, we’re all guilty.

From Ran Prieur (no link, note nested reply):


I was heavily into conspiracy theory in the 90’s. There was a great paper magazine, Kenn Thomas’s Steamshovel Press, that always had thoughtful and well-researched articles exploring anomalies in the dominant narrative.

Another magazine, Jim Martin’s Flatland, was more dark and paranoid but still really smart. A more popular magazine, Paranoia, was stupid but fun.

At some point, conspiracy culture shifted to grand narratives about absolute evil. This happened at the same time that superhero movies (along with Harry Potter and Lord of the Rings) took over Hollywood. The more epic and the more black-and-white the story, the more humans are drawn to it.

This is my half-baked theory: It used to be that ordinary people would accept whatever the TV said — or before that, the church. Only a few weirdos developed the skill of looking at a broad swath of potential facts, and drawing their own pictures.

It’s like seeing shapes in the clouds. It’s not just something you do or don’t do — it’s a skill you can develop, to see more shapes more easily. And now everyone is learning it.

Through the magic of the internet, everyone is discovering that they can make reality look like whatever they want. They feel like they’re finding truth, when really they’re veering off into madness.

SamuraiBeanDog replies: Except that the real issue with the current conspiracy crisis is that people are just replacing the old TV and church sources with social media and YouTube. The masses of conspiracy culture aren’t coming up with their own realities, they’re just believing whatever shit they’re told by conspiracy influencers.

Something that’s rarely said about influencers, and propaganda in general, is that they can’t change anyone’s mind — they have to work with what people already feel good about believing.

Among those building fame and influence via podcasting on YouTube is Michael Malice. Malice is a journalist (for what organization?) and the author of several books, so he has better preparation and content than many who (like me) offer only loose opinion. His latest book (no link) is The Anarchist Handbook (2021), which appears to be a collection of essays (written by others, curated by Malice) arguing in theoretical support of anarchism (not to be confused with chaos). I say theoretical because, as a hypersocial species of animal, humans never live in significant numbers without forming tribes and societies for the mutual benefit of their members. Malice has been making the rounds discussing his book and is undoubtedly an interesting fellow with well-rehearsed arguments. Relatedly, he argues in favor of objectivism, the philosophy of Ayn Rand that has been roundly criticized and dismissed yet continues to be attractive particularly to purportedly self-made men and women (especially duped celebrities) of significant wealth and achievement.

Thus far in life, I’ve disdained reading Rand or getting too well acquainted with arguments in favor of anarchism and/or objectivism. As an armchair social critic, my bias skews toward understanding how things work (i.e., Nassim Taleb’s power laws) in actuality rather than in some crackpot theory. As I understand it, the basic argument put forward to support any variety of radical individualism is that everyone working in his or her own rational self-interest, unencumbered by the mores and restrictions of polite society, leads to the greatest (potential?) happiness and prosperity. Self-interest is not equivalent to selfishness, but even if it were, the theorized result would still be better than any alternative. A similar argument is made with respect to economics, known as the invisible hand. In both, hidden forces (often digital or natural algorithms), left alone to perform their work, enhance conditions over time. Natural selection is one such hidden force now better understood as a component of evolutionary theory. (The term theory when used in connection with evolution is an anachronism and misnomer, as the former theory has been scientifically substantiated as a power law.) One could argue as well that human society is a self-organizing entity (disastrously so upon even casual inspection) and that, because of the structure of modernity, we are all situated within a thoroughly social context. Accordingly, the notion that one can or should go it alone is a delusion because it’s flatly impossible to escape the social surround, even in aboriginal cultures, unless one is totally isolated from other humans in what little remains of the wilderness. Of course, those few hardy individuals who retreat into isolation typically bring with them the skills, training, tools, and artifacts of society. A better example might be feral children, lost in the wilderness at an early age and deprived of human society but often taken in by a nonhuman animal (and thus socialized differently).

My preferred metaphor when someone insists on total freedom and isolation away from the maddening crowd is traffic — usually automobile traffic but foot traffic as well. Both are examples of aggregate flow arising out of individual activity, like drops of rain forming into streams, rivers, and floods. When stuck in automobile congestion or jostling for position in foot traffic, it’s worthwhile to remember that you are the traffic, a useful example of synecdoche. Those who buck the flow, cut the line, or drive along the shoulder — often just to be stuck again a little farther ahead — are essentially practicing anarchists or me-firsters, whom the rest of us simply regard as assholes. Cultures differ with respect to the orderliness of queuing, but even in those places where flow is irregular and unpredictable, a high level of coordination (lost on many American drivers who can’t figger a roundabout a/k/a traffic circle) is nonetheless evident.

As I understand it, Malice equates cooperation with tyranny because people defer to competence, which leads to hierarchy, which results in power differentials, which transforms into tyranny (petty or profound). (Sorry, can’t locate the precise formulation.) Obvious benefits (e.g., self-preservation) arising out of mutual coordination (aggregation) such as in traffic flows are obfuscated by theory distilled into nicely constructed quotes. Here’s the interesting thing: Malice has lived in Brooklyn most of his life and doesn’t know how to drive! Negotiating foot traffic has a far lower threshold for serious harm than driving. He reports that relocation to Austin, TX, is imminent, and with it, the purchase of a vehicle. My suspicion is that to stay out of harm’s way, Malice will learn quickly to obey tyrannical traffic laws, cooperate with other drivers, and perhaps even resent the growing number of dangerous assholes disrupting orderly flow like the rest of us — at least until he develops enough skill and confidence to become one of those assholes. The lesson not yet learned from Malice’s overactive theoretical perspective is that in a crowded, potentially dangerous world, others must be taken into account. Repetition of this kindergarten lesson throughout human activity may not be the most pleasant thing for bullies and assholes to accept, but refusing to do so makes one a sociopath.

Continuing my book-blogging project on Orality and Literacy, I return to Ong, who provides context for the oral tradition that surrounded the two great Homeric classics: The Iliad and The Odyssey. According to Ong, it took decades for literary critics and sociologists to overcome their bias, borne out of literacy, and recognize how formulaic are the two epics. They are essentially pastiches of commonplace plots, phrases, and sayings of the time, which was a notable strength when oral delivery based on memorization was how epic poetry was transmitted. In a literate era, such clichés are to be avoided (like the plague).

Aside: my review of David Sirota’s Back to Our Future mentions the dialect he and his brother developed, filled with one-liners and catchphrases from entertainment media, especially TV and movies. The three-word (also three-syllable) form seems to be optimal: “Beam me up” (Star Trek), “Use the Force” (Star Wars), “Make my day” (Dirty Harry), “I’ll be back” (The Terminator), etc. This construction is short, punchy, and memorable. The first holder of high office in the U.S. to attempt to govern by catchphrase was probably Ronald Reagan, followed (of course) by Arnold Schwarzenegger and then Donald Trump. Mustn’t overlook that all three (and others) came to prominence via the entertainment industry rather than through earnest (Kennedyesque) public service. Trump’s numerous three-word phrases (shtick, really) lend themselves especially well to being chanted by adoring crowds at his pep rallies, swept up in groupthink, with a recognizable beat-beat-beat-(silence) structure. The rock band Queen stumbled upon this same elemental rhythm with its famous stomp-stomp-clap-(wait) from the anthem “We Will Rock You,” consciously intended for audience participation (as I understand it).

Further aside: “We Will Rock You” combines its iconic rhythm with a recitation tone sourced in antiquity. Make of that what you will.

Ong goes on to provide a discussion of the psychodynamics of orality, which I list here without substantive discussion (read for yourself):

  • orality is additive rather than subordinative
  • orality is aggregative rather than analytic
  • orality is redundant or copious
  • orality is conservative or traditionalist
  • orality is close to the human lifeworld
  • orality is agonistically toned
  • orality is empathetic and participatory rather than objectively distanced
  • orality is homeostatic
  • orality is situational rather than abstract

Of particular interest is Ong’s description of how language functions within oral cultures distinctly from literate cultures, which is the source of the bias mentioned above. To wit:

Fully literate persons can only with great difficulty imagine what a primary oral culture is like, that is, a culture with no knowledge whatsoever of writing or even the possibility of writing … In a primary oral culture, the expression ‘to look up something’ is an empty phrase … [w]ithout writing, words as such have no visual presence, even when the objects they represent are visual … [for] ‘primitive’ (oral) people … language is a mode of action and not simply a countersign of thought — oral people commonly, and probably universally, consider words to have great power. [pp. 31–32]

If this sounds conspicuously reminiscent of this previous post, well, congratulations on connecting the dots. The whole point, according to a certain perspective, is that words are capable of violence, which is (re)gaining adherents as our mental frameworks undergo continuous revision. It’s no small thing that slurs, insults, and fighting words (again) provoke offense and violent response and that mere verbal offense equates to violence. Not long ago, nasty words were reclaimed, nullified, and thus made impotent (with varying degrees of irrational usage rules). Well, now they sting again and are used as ammo to cancel (a form of administrative violence, often undertaken anonymously, bureaucratically, and with the assistance of the digital mob) anyone with improper credentials to deploy them.

Let me draw another connection. Here’s a curious quote by Walter Pater, though not well known:

All art constantly aspires towards the condition of music. For while in all other kinds of art it is possible to distinguish the matter from the form, and the understanding can always make this distinction, yet it is the constant effort of art to obliterate it.

Put another way, the separation of signifier from signified, an abstraction conditioned by literacy and rationalism (among other things), is removed (“obliterated”) by music, which connects to emotion more directly than representational art. Similarly, speech within primary oral cultures exists purely as sound and possesses an ephemeral, even evanescent (Ong’s term) quality only experienced in the flow of time. (Arguably, all of human experience takes place within the flow of time.) Music and “primitive” speech are accordingly dynamic and cannot be reduced to static snapshots, that is, fixed on a page as text or committed to a canvas or photograph as a still image (hence, the strange term still life). That’s why a three-word, three-syllable chant, or better yet, the Queen rhythm or the Wave in sports arenas (a gesture requiring everyone’s participation), can possess inherent power, especially as individuals are entrained in groupthink. Music and words-as-violence get inside us and are nearly wholly subjective, not objective — something we all experience organically in early childhood before being taught to read and write (if in fact those skills are learned beyond functional literacy). Does that mean culture is reverting to an earlier stage of development, more primitive, childlike, and irrational?

So far, this multipart blog post has trafficked in principles and generalities. Let me try now to be more specific, starting with an excerpt from Barry Lynn’s article in Harper’s Magazine titled “The Big Tech Extortion Racket” (Sept. 2020):

… around the middle of the nineteenth century, Americans began to develop technologies that could not be broken into component pieces. This was especially true of the railroad and the telegraph … Such corporations [railroad and telegraph companies] posed one overarching challenge: they charged some people more than others to get to market. They exploited their control over an essential service in order to extort money, and sometimes political favors … Americans found the answer to this problem in common law. For centuries, the owners of ferries, stagecoaches, and inns had been required to serve all customers for the same price and in the order in which they arrived. In the late nineteenth century, versions of such “common carrier” rules were applied to the new middleman corporations.

Today we rightly celebrate the Sherman Antitrust Act of 1890, which gave Americans the power to break apart private corporations. But in many respects, the Interstate Commerce Act of 1887 was the more important document. This act was based on the understanding that monopoly networks like the railroad and the telegraph could be used to influence the actions of people who depend on them, and hence their power must be carefully restricted …

For a century and a half, Americans used common carrier policies to ensure the rule of law in activities that depended on privately held monopolies … regulations freed Americans to take full advantage of every important network technology introduced during these years, including telephones, water and electrical services, energy pipelines, and even large, logistics-powered retailers. Citizens did not have to worry that the men who controlled the technologies involved would exploit their middleman position to steal other people’s business or disrupt balances of power.

I appreciate that Barry Lynn brings up the Interstate Commerce Act. If this legal doctrine appeared in the net neutrality debate a few years ago, it must have escaped my notice. While Internet Service Providers (ISPs) enable network access and connectivity, those utilities have not yet exhibited let’s-be-evil characteristics. Similarly, phone companies (including cell phones) and public libraries may well be eavesdropping and/or monitoring activities of the citizenry, but the real action lies elsewhere, namely, on social media networks and with online retailers. Evil is arguably concentrated in the FANG (or FAANG) corporations but has now grown to be ubiquitous in all social networks (e.g., Twitter) operating as common carriers (Zoom? Slack?) and across academe, nearly all of which have succumbed to moral panic. Sad to observe, they are correctly interpreting demands to censor and sanitize others’ no-longer-free speech appearing on their networks or within their realms. How much deeper it goes toward shaping politics and social engineering is quasi-conspiratorial and impossible for me to assess.

Much as I would prefer to believe that individuals possess the good sense to shift their activities away from social networks or turn their attention from discomfiting information sources, that does not appear to be the case. Demands for trigger warnings and safe spaces commonplace a few years ago on college campuses have instead morphed into censorious removal, deplatforming, and cancellation from the entire public sphere. Those are wrong responses in free societies, but modern institutions and technologies have gotten out of hand and outstripped the limits of normal human cognition. In short, we’re a society gone mad. So rather than accept responsibility to sort out information overflow oneself, many are demanding that others do it for them, and evil private corporations are complying (after a fashion). Moreover, calls for creation of an Orwellian Ministry of Truth, rebranded as a Truth Commission and Reality Czar, could hardly be any more chillingly and fascistically bizarre. People really need someone to brainwash them, er, decide for them what is real? Has anyone at the New York Times actually read Orwell’s dystopian novel 1984 and taken to heart its lessons?

Considering the acceleration of practically everything in the late-modern world (postmodern refers to something quite different), which makes planning one’s higher education somewhat fraught if the subject matter studied is rendered flatly out-of-date or moribund by the time of either graduation or entry into the workforce, I’ve heard it recommended that expertise in any particular subject area may be less important than developing expertise in at least one subject that takes a systems approach. That system might be language and communications, mathematics (or any other hard science), history, economics and finance, business administration, computer coding, law and governance, etc. So long as a rigorous understanding of procedures and rules is developed, a structuralist mindset can be repeated and transferred into other subject areas. Be careful, however, not to conflate this approach with a liberal arts education, which is sometimes described as learning how to learn and is widely applicable across disciplines. The liberal arts have fallen distinctly out of favor in the highly technological and technocratic world, which cares little for human values resistant to quantification. Problem is, Western societies in particular are based on liberal democratic institutions now straining due to their sclerotic old age. And because a liberal arts education is scarcely undertaken anymore, civics and citizenship are no longer taught. Even the study of English has now been corrupted (postmodern does apply here) to the point that the basic liberal arts skill of critical thinking is being lost through attrition. Nowhere is that more abundantly clear than in bristling debate over free speech and censorship.

Aside. Although society tinkers and refines itself (sometimes declines) over time, a great body of cultural inheritance informs how things are done properly within an ideology or system. When tinkering and refinement become outright intransigence and defiance of an established order, it’s commonplace to hear the objection “but that’s not how _______ works.” For instance, debate over climate science or the utility of vaccines often has one party proclaiming “trust [or believe] the science.” However, that’s not how science works (i.e., through unquestioning trust or belief). The scientific method properly understood includes verification, falsification, and revision when results and assertions fail to establish reasonable certainty (not the same as consensus). Similarly, critical thinking includes a robust falsification check before “facts” can be accepted at face value. So-called “critical studies” (a/k/a grievance studies), like religious faith, typically position bald assertions beyond the reach of falsification. Well, sorry, that’s not how critical thinking works.

Being older and educated before critical studies were fully legitimized (or gave rise to things as risible as feminist glaciology), I have always understood free speech and other rights to be absolutes that cannot be sliced and diced into bits. That way lies casuistry, where law founders frequently. Thus, if one wishes, say, to trample or burn the U.S. flag in protest, no law can be passed or constitutional amendment enacted to carve out an exception disallowing that instance of dissenting free speech. A lesser example is kneeling silently rather than participating in singing the national anthem before a sporting event. Though offensive to certain individuals’ sensibilities, silencing speech is far worse according to liberal democratic values. Whatever our ideological or political differences are, we cannot work them out when one party has the power to place topics out of bounds or remove others from discussion entirely. The point at which spirited debate crosses over into inciting violence or fomenting insurrection is a large gray area, which is the subject of the second impeachment of 45. Civil law covers such contingencies, so abridging free speech, deplatforming, and adopting the formulation “language is violence” are highly improper responses under the liberal form of government codified in the U.S. Constitution, which includes the Bill of Rights (originally omitted but quickly added to articulate those rights fully).

Liberal democratic ideology arose in mercantile, substantially agrarian Western societies before scientific, industrial, and capitalist revolutions built a full head of steam, so to speak. Considering just how much America has developed since the Colonial Period, it’s no surprise society has outgrown its own founding documents. More pointedly, the intellectual commons was a much smaller environment, often restricted to a soapbox in the town square and the availability of books, periodicals, and broadsides. Today, the public square has moved online to a bewildering array of social media platforms that enables publication of one’s ideas well beyond the sound of one’s voice over a crowd or the bottleneck of a publisher’s printing press. It’s an entirely new development, and civil law has not kept pace. Whether Internet communications are regulated like the airwaves or nationalized like the U.S. military, it’s clear that the Wild West uber-democratic approach (where anyone can basically say anything) has failed. Demands for regulation (restrictions on free speech) are being taken seriously and acted upon by the private corporations that run social media platforms. During this interim phase, it’s easy for me, as a subscriber to liberal democratic values, to insist reflexively on free speech absolutism. The apparent mood of the public lies elsewhere.

I have observed various instances of magical thinking in mainstream culture, especially here, which I find problematical. Although it’s not my ambition to disabuse anyone of magical thinking, which extends far beyond, say, religious thought, I was somewhat taken aback at the suggestion found in the comic at this link (not embedded). For those not familiar with Questionable Content (one of two online comics I read regularly), the comic presents an extended cast of characters, mostly in their early 20s, living in a contemporary New England college town. Those characters are supplemented by a few older parents and lots of AIs (in robot bodies). The AIs are not particularly futuristic but are simply accepted as a normal (if curious) part of the world of the comic. Major story arcs involve characters and AIs (the AIs are characters, I suppose) in the process of discovering and establishing themselves as they (the humans, anyway) transition into early adulthood. There are no great political themes or intrusions into life in a college town. Rather, the comic is largely about acceptance of difference. Often, that means washing away meaningful difference in the name of banal tolerance. Real existential struggle is almost entirely absent.

In the linked comic, a new character comes along and offers advice to an established character struggling with sexual attractions and orientation. The dialogue includes this exchange:

Character A: If tarot or astrology or religion halps you make sense of the world and your place in it, then why not use them?
Character B: But they’re not real. [emphasis in original]
Character A: It doesn’t matter, if you use them constructively!

There it is in a nutshell: believe whatever you want if it, um, halps. I’ve always felt that being wrong (i.e., using unreal or make-believe things) was a sufficient injunction against anchoring oneself to notions widely known to be false. Besides, isn’t it often remarked that the biggest fool is one who fools himself? (Fiction as a combination of entertainment and building a worldview is quite normative, but it’s understood as fiction, or to a lesser degree, as life imitating art and its inverse. Exceptions abound, which are regarded as psychopathy.) The instruction in that dialogue (part object lesson, part lesson in cognition) is not that it’s OK to make mistakes but that knowingly believing something false has worthwhile advantages.

Surveying examples where promulgating false beliefs has constructive and destructive effects is too large a project. Well short of that, nasty categories include fraud, gaslighting, and propaganda, which are criminal in many cases and ought to be in most others (looking at you, MSM! — or not, since I neither trust nor watch). One familiar benevolent category is expressed in the phrase fake it til you make it, often recommended to overcome a lack of confidence. Of course, a swindle is also known as a confidence game (or by its diminutive, a con), so beware overconfidence when asked by another to pay for something (e.g., tarot or astrology readings), take risks, or accept an ideology without question.

As philosophy, willful adoption of falsity for its supposed benefits is half-baked. Though impossible to quantify, my suspicion is that instances of positive outcomes are overbalanced by negative ones. Maybe living in a constructed reality or self-reinforcing fantasy is what people want. The comic discussed is certainly in line with that approach. However, while we dither and delude ourselves with happy, aspirational stories based on silliness, the actual world around us, including all the human institutions that used to serve us but no longer do, falls to tatters. Is it better to go through life, and eventually to one’s grave, refusing to see that reality? Should childlike wonder and innocence be retained in spite of what is easily observable just by poking one’s head up and dismissing comforting lies? Decide for yourself.

Something in an online discussion brought me back to my days as a Boy Scout. (No, not that, with your nasty, nasty assumptions.) It was one of the first merit badges I earned: Citizenship in the Community (link to PDF). I can’t remember any of the content anymore (haven’t yet consulted the PDF), and indeed, looking back with the advantage of several decades of hindsight, I have a hard time imagining any of the (morality? ethics?) lessons learned back then having had much durable impact despite remembering an emerging confidence and awareness (a commonplace delusion of youth) of my position within the community. Still, I appreciate having had many Boy Scout character-building experiences, which led to simple and enduring understandings of ideals such as honor, duty, preparedness, service, forbearance, shouldering hardships, and perhaps most of all, accepting responsibility for others, particularly those younger and weaker. (I’m not claiming to be any sort of paragon of virtue. Cynicism and misanthropy may have wrecked that aspiration.) I never served in the military, but I surmise others learn similar lessons slightly later in life when more readily absorbed and not so easily forgotten. In the past decade plus, some may seek these lessons through participation in endurance sports or martial arts (if not distorted by bad instruction like in Cobra Kai), though the focus outward (i.e., toward community and mutual reliance) may not be as strong.

The subject came up in a discussion of participants in small-scale democracy, something I’ve always known is messy, unrewarding, thankless, and sometimes costly yet still necessary to be a good citizen contributing to one’s community. Many adults get their first taste of local democratic groups (read: self-governing) through parent groups like the Parent-Teacher Association (PTA). Or maybe it’s a performing arts organization, homeowners’ association, church council, social work hotline, self-help group, or cooperative. Doesn’t matter which. (Political activism and organizing might be something quite different. Hard to say.) Groups run on the good will and dedication of volunteered time and skills for the benefit of members of the community. As with any population, there are always free riders: those who contribute nothing but enjoy and/or extract benefits. In fact, if everyone were integrally involved, organizational complexity would become unmanageable. If activities of such groups seem like a piece of cake or vaguely utopian, just join one and see how different character types behave. Lotta dead wood in such organizations. Moreover, power mongers and self-aggrandizers often take over small-scale democracies and run them like private fiefdoms. Or difficult policy and finance discussions divide otherwise like-minded groups into antagonists. As I said, it’s a decidedly messy undertaking.

Members of the community outside of the executive group (typically a board of directors) also have legitimate interests. Maybe community members attend meetings to keep informed or weigh in online with unconstructive complaints and criticisms (or even mockery and trolling) but then refuse to contribute anything worthwhile. Indeed, boards often have difficulty recruiting new officers or participants because no one wants to take on responsibility and face potential criticism directed at them. I’ve also seen boards settle into the same few folks year after year, whose opinions and leadership grow stale and calcify.

Writ large, leadership skills learned through citizenship in the community rise to the equivalents of Boy Scout merit badges Citizenship in the Nation and Citizenship in the World (no links but searchable). Skills deployed at those strata would arguably require even greater wherewithal and wisdom, with stakes potentially being much higher. Regrettably, having just passed through an election cycle and change of leadership in the U.S., my dour assessment is that leadership has failed miserably on multiple issues. The two most significant involve how we fail to organize society for the benefit of all, namely, economic equality and resource sustainability. Once market forces came to bear on social organization and corporate entities grew too large to be rooted in community service anymore, greed and corruption destroyed high-minded ideals. More self-aggrandizers and careerists than ever (no names, fill in the blanks, they’re all famous — or infamous) rose to the tops of organizations and administrations, especially politics, news media, and the punditry. Their logical antidotes are routinely and ruthlessly disenfranchised and/or ignored. The lasting results are financial inequality run amok and unsustainable resource addictions (energy mostly) that are toxifying the environment and reducing the landscape to ruin and uninhabitability. (Perpetual war is a third institutional failure that could be halted almost immediately if moral clarity were somehow to appear.) It’s all out there, plain to see, yet continues to mount because of execrable leadership. Some argue it’s really a problem with human nature, a kind of original stain on our souls that can never be erased and so should be forgiven or at least understood (and rationalized away) within a large context. I’m not yet ready to excuse national and world leaders. Their culpability is criminal.

I have a memory of John Oliver dressing down a room full of entertainment journalists (ugh …) asking him questions following his Emmy win a few years ago. The first few had failed to offer even perfunctory congratulations for his award but instead leapt straight into questions. After his demand that everyone observe basic courtesy by at least acknowledging the reason for their attention being focused on him, each dutifully offered their compliments, which Oliver accepted graciously, and a question and answer ensued. It was a worthy reminder (something I mistakenly believed superfluous when I was much younger) that we have a sophisticated set of manners developed over time to which we should all subscribe. Behaving otherwise (i.e., skipping straight to matters at hand) is boorish, clownish, rude, and unsophisticated. Thus, routine exchanges at the beginnings of most interviews intended for broadcast go something to the effect of “Thanks for appearing on the show” or “Nice to meet you” followed by “Pleased to be here” or “My pleasure.” It’s part of a formal frame, the introduction or prologue, bearing no significant content but needful for hosts and guests to acknowledge each other.

In the course of viewing many podcasts, often conducted by relative unknowns who nonetheless manage to attract someone of distinction to interview, I notice a tendency to geek out and succumb to effusive fandom. Even a little bit of that has the unfortunate effect of establishing an uneasy tension because the fan often becomes unhinged in the presence of the celebrity. Even when there is no latent threat of something going really wrong, the fanboi sometimes goes to such an extreme heaping praise and adulation on the interview subject that nothing else worthwhile occurs. Instead, one witnesses only the fanboi’s self-debasement. It makes me squirm watching someone figuratively fellating a celebrity (apologies for my coarseness, but that’s really what springs to mind), and those on the receiving end often look just as uncomfortable. There’s simply no good response to the gushing, screaming, fainting, delirious equivalent of a 15-year-old Beatles freak (from back in the day) failing to hold it together and being caught embarrassingly in flagrante delicto.

Like others, I admire some people for their extraordinary accomplishments, but I never describe myself as a fan. Rather, objects of my admiration fall uniformly in the category of heroes, er, people one shouldn’t scrutinize too closely lest their flaws be uncovered. Further, those times I’ve been in the presence of celebrities are usually the occasion of some discomfort precisely because celebrities’ fame invokes a false sense of intimacy (one might say oversharing) because details of their lives are in full public view. A balanced interaction is impossible because I know quite a bit about them whereas they know nothing about me, and topics gravitate toward the reasons for their celebrity. Most of us average folks feel compelled to acknowledge the films, trophies, recordings, awards, etc. that form their accomplishments, no matter how out of date. I’ve never been in the circumstance where a famous person, recognizing that I don’t recognize him or her (or don’t kowtow as expected), plays the celebrity card: “Don’t you know who I am?”

An interrelated effect is when someone has way too much money, that fortune clouding all interactions because it transforms the person into a target for those currying favor or otherwise on the make. Scammers, conmen, golddiggers, sycophants, etc. appear to extract wealth, and the dynamic breeds mutual distrust and wariness even in routine transactions. Chalk it up as another corrupting aspect of inequality run amok, this time affecting wannabes as well. In light of this, I suppose it’s understandable that rich, famous people are most comfortable among those similarly rich and famous, and thus immune to envy and fandom (but not always). Everyone else is alienated. Weird sort of gilded cage to live in — not one that I admire.