Archive for the ‘Culture’ Category

“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signalling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently, albeit temporarily, live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations to funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coinages. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are not effective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. It used to be said that the Internet is forever, so wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) can be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control, specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established, and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

Among those building fame and influence via podcasting on YouTube is Michael Malice. Malice is a journalist (for what organization?) and the author of several books, so he has better preparation and content than many who (like me) offer only loose opinion. His latest book (no link) is The Anarchist Handbook (2021), which appears to be a collection of essays (written by others, curated by Malice) arguing in theoretical support of anarchism (not to be confused with chaos). I say theoretical because, as a hypersocial species of animal, humans never live in significant numbers without forming tribes and societies for the mutual benefit of their members. Malice has been making the rounds discussing his book and is undoubtedly an interesting fellow with well-rehearsed arguments. Relatedly, he argues in favor of objectivism, the philosophy of Ayn Rand that has been roundly criticized and dismissed yet continues to be attractive especially to purportedly self-made men and women (especially duped celebrities) of significant wealth and achievement.

Thus far in life, I’ve disdained reading Rand or getting too well acquainted with arguments in favor of anarchism and/or objectivism. As an armchair culture critic, my bias skews toward understanding how things work (i.e., Nassim Taleb’s power laws) in actuality rather than in some crackpot theory. As I understand it, the basic argument put forward to support any variety of radical individualism is that everyone working in his or her own rational self-interest, unencumbered by the mores and restrictions of polite society, leads to the greatest (potential?) happiness and prosperity. Self-interest is not equivalent to selfishness, but even if it were, the theorized result would still be better than any alternative. A similar argument is made with respect to economics, known as the invisible hand. In both, hidden forces (often digital or natural algorithms), left alone to perform their work, enhance conditions over time. Natural selection is one such hidden force now better understood as a component of evolutionary theory. (The term theory when used in connection with evolution is an anachronism and misnomer, as the former theory has been scientifically substantiated as a power law.) One could argue as well that human society is a self-organizing entity (disastrously so upon even casual inspection) and that, because of the structure of modernity, we are all situated within a thoroughly social context. (Guy McPherson used to quip that we’re all born into captivity precisely because there is no escape.) Accordingly, the notion that one can or should go it alone is a delusion because it’s flatly impossible to escape the social surround, even in aboriginal cultures, unless one is totally isolated from other humans in what little remains of the wilderness. Of course, those few hardy individuals who retreat into isolation typically bring with them the skills, training, tools, and artifacts of society. A better example might be feral children, lost in the wilderness at an early age and deprived of human society but often taken in by a nonhuman animal (and thus socialized differently).

My preferred metaphor when someone insists on total freedom and isolation away from the madding crowd is traffic — usually automobile traffic but foot traffic as well. Both are examples of aggregate flow arising out of individual activity, like drops of rain forming into streams, rivers, and floods. When stuck in automobile congestion or jostling for position in foot traffic, it’s worthwhile to remember that you are the traffic, a useful example of synecdoche. Those who buck the flow, cut the line, or drive along the shoulder — often just to be stuck again a little farther ahead — are essentially practicing anarchists or me-firsters, whom the rest of us simply regard as assholes. Cultures differ with respect to the orderliness of queuing, but even in those places where flow is irregular and unpredictable, a high level of coordination (lost on many American drivers who can’t figger a roundabout a/k/a traffic circle) is nonetheless evident.

As I understand it, Malice equates cooperation with tyranny because people defer to competence, which leads to hierarchy, which results in power differentials, which transforms into tyranny (petty or profound). (Sorry, can’t locate the precise formulation.) Obvious benefits (e.g., self-preservation) arising out of mutual coordination (aggregation) such as in traffic flows are obfuscated by theory distilled into nicely constructed quotes. Here’s the interesting thing: Malice has lived in Brooklyn most of his life and doesn’t know how to drive! Negotiating foot traffic has a far lower threshold for serious harm than driving. He reports that relocation to Austin, TX, is imminent, and with it, the purchase of a vehicle. My suspicion is that to stay out of harm’s way, Malice will learn quickly to obey tyrannical traffic laws, cooperate with other drivers, and perhaps even resent the growing number of dangerous assholes disrupting orderly flow like the rest of us — at least until he develops enough skill and confidence to become one of those assholes. The lesson not yet learned from Malice’s overactive theoretical perspective is that in a crowded, potentially dangerous world, others must be taken into account. Repetition of this kindergarten lesson throughout human activity may not be the most pleasant thing for bullies and assholes to accept, but refusing to do so makes one a sociopath.

In my neighborhood of Chicago, it’s commonplace to see vehicles driving on the road with a giant Puerto Rican flag flying from a pole wedged in each of the rear windows. Often, one of the two flags has its traditional colors (red, white, and blue) changed to black and white — a symbol of resistance. Puerto Rican politics is a complicated nest of issues I don’t know enough about to say more. However, the zeal of my neighbors is notable. Indeed, as I visited a local farmer’s market last weekend, I couldn’t help but notice quite a welcome diversity on display and folks entirely untroubled by the presence of others who didn’t look just like them (tattoos, unnatural hair colors, long beards and shaved heads, nonstandard attire and accoutrements, etc.). I’m actually pleased to see a level of comfort and freedom to present oneself in such manner as one wishes, and not just because of the buzz phrase “diversity and inclusion.” So go ahead: fly your freak flag high! (This same value applies to viewpoint diversity.)

In contrast, when I venture to some far-flung suburb for sundry activities now that lockdowns and restrictions have been lifted, I encounter mostly white, middle-aged, middle-class suburbanites who admittedly look just like me. It’s unclear that folks in those locales are xenophobic in any way, having withdrawn from city life in all its messiness for a cozy, upscale, crime-free subdivision indistinguishable from the next one over. Maybe that’s an artifact of mid-20th-century white flight, where uniformity of presentation and opinion is the norm. Still, it feels a little weird. (Since the 1980s, some rather well-put-together people have returned to the city center, but that usually requires a king-sized income to purchase a luxury condo in some 50-plus-storey tower. After last summer’s BLM riots, that influx turned again to outflux.) One might guess that, as a visible minority within city confines, I would be more comfortable among my own cohort elsewhere, but that’s not the case. I rather like rubbing elbows with others of diverse backgrounds and plurality of perspectives.

I’ve also grown especially weary of critical race theory being shoved in my face at every turn, as though race is (or should be) the primary lens through which all human relations must be filtered. Such slavish categorization, dropping everyone into giant, ill-fitting voting blocs, is the hallmark of ideologues unable to break out of the pseudo-intellectual silos they created for themselves and seek to impose on others. Yet I haven’t joined the growing backlash and instead feel increasingly ill at ease in social situations that appear (on the surface at least) to be too white bread. Shows, perhaps, how notions of race that were irrelevant for most of my life have now crept in and invaded my conscience. Rather than solving or resolving longstanding issues, relentless focus on race instead spreads resentment and discomfort. The melting pot isn’t boiling, but summer is not yet over.

Continuing from part 1.

So here’s the dilemma: knowing a little bit about media theory and how the medium shapes the message, I’m spectacularly unconvinced that the cheerleaders are correct and that an entirely new mediascape (a word I thought maybe I had just made up, but alas, no) promises to correct the flaws of the older, inherited mediascape. It’s clearly not journalists leading the charge. Rather, comedians, gadflies, and a few academics (behaving as public intellectuals) command disproportionate attention among the digital chattering classes as regular folks seek entertainment and stimulation superior to the modal TikTok video. No doubt a significant number of news junkies still dote on their favorite journalists, but almost no journalist has escaped self-imposed limitations of the chosen media to offer serious reporting. Rather, they offer “commentary” and half-assed observations on human nature (much like comedians who believe themselves especially insightful — armchair social critics like me probably fit that bill, too). If the sheer count of aggregate followers and subscribers across social media platforms is any indication (it isn’t …), athletes, musicians (mostly teenyboppers and former pop tarts, as I call them), and the irritatingly ubiquitous Kardashian/Jenner clan are the most influential, especially among Millennials and Gen Z, whose tastes skew toward the frivolous. Good luck getting insightful analysis out of those folks. Maybe in time they’ll mature into thoughtful, engaged citizens. After all, Kim Kardashian apparently completed a law degree (but has yet to pass the bar). Don’t quite know what to think of her three failed marriages (so far). Actually, I try not to.

I’ve heard arguments that the public is voting with its attention and financial support for new media and increasingly disregarding the so-called prestige media (no such thing anymore, though legacy media is still acceptable). That may well be, but it seems vaguely ungrateful for established journalists and comedians, having enjoyed the opportunity to apprentice under seasoned professionals, to take acquired skills to emerging platforms. Good information gathering and shaping — even for jokes — doesn’t happen in a vacuum, and responsible journalism in particular can’t simply be repackaging information gathered by others (i.e., Reuters, the Associated Press, and Al Jazeera) with the aforementioned “commentary.” A frequent reason cited for jumping ship is the desire to escape editorial control and institutional attempts to distort the news itself according to some corporate agenda or ideology. Just maybe new platforms have made that possible in a serious way. However, the related desire to take a larger portion of the financial reward for one’s own work (typically as celebrities seeking to extend their 15 minutes of fame — ugh) is a surefire way to introduce subtle, new biases and distortions. The plethora of metrics available online, for instance, allows content creators to see what “hits” or goes viral, inviting service to public interest that is decidedly less than wholesome (like so much rubbernecking).

It’s also curious that, despite all the talk about engaging with one’s audience, new media is mired in broadcast mode, meaning that most content is presented to be read or heard or viewed with minimal or no audience participation. It’s all telling, and because comments sections quickly run off the rails, successful media personalities ignore them wholesale. One weird feature some have adopted during livestreams is to display viewer donations accompanied by brief comments and questions, the donation being a means of separating and promoting one’s question to the top of an otherwise undifferentiated heap. To my knowledge, none has yet tried the established talk radio gambit of taking live telephone calls, giving the public a chance to make a few (unpurchased) remarks before the host resumes control. Though I’ve never been invited (an invitation is required) and would likely decline to participate, the Clubhouse smartphone app appears to offer regular folks a venue to discuss and debate topics of the day. However, reports on the platform dynamics suggest that the number of eager participants quickly rises to an impossible number for realistic group discussion (the classroom, or better yet, graduate seminar establishes better limitations). A workable moderation mechanism has yet to emerge. Instead, participants must “raise their hand” to be called upon to speak (i.e., be unmuted) and can be kicked out of the “room” arbitrarily if the moderator(s) so decide. This is decidedly not how conversation flows face-to-face.

What strikes me is that while different broadcast modes target and/or capture different demographics, they all still organize content around the same principle: purporting to have obtained information and expertise to be shared with or taught to audiences. Whether subject matter is news, science, psychology, comedy, politics, etc., they have something ostensibly worth telling you (and me), hopefully while enhancing fame, fortune, and influence. So it frankly doesn’t matter that much whether the package is a 3-minute news segment, a brief celebrity interview on a late night talk show, an article published in print or online, a blog post, a YouTube video of varying duration, a private subscription to a Discord server, a Subreddit, or an Instagram or Twitter feed; they are all lures for one’s attention. Long-form conversations hosted by Jordan Peterson, Joe Rogan, and Lex Fridman break out of self-imposed time limitations of the typical news segment and flow more naturally, but they also meander and get seriously overlong for anyone but long-haul truckers. (How many times have I tuned out partway into Paul VanderKlay’s podcast commentary or given up on Matt Taibbi’s Substack (tl;dr)? Yeah, lost count.) Yet these folks enthusiastically embrace the shifting mediascape. The digital communications era is already mature enough that several generations of platforms have come and gone as well-developed media are eventually coopted or turned commercial and innovators drive out weaker competitors. Remember MySpace, Google Plus, or America Online? The list of defunct social media is actually quite long. Because public attention is a perpetually moving target, I’m confident that those now enjoying their moment in the sun will face new challenges until it all eventually goes away amidst societal collapse. What then?

Coming back to this topic after some time (pt. 1 here). My intention was to expand upon demands for compliance, and unsurprisingly, relevant tidbits continuously pop up in the news. The dystopia American society is building for itself doesn’t disappoint — not that anyone is hoping for such a development (one would guess). It’s merely that certain influential elements of society reliably move toward consolidation of power and credulous citizens predictably forfeit their freedom and autonomy with little or no hesitation. The two main examples to discuss are Black Lives Matter (BLM) and the response to the global pandemic, which have occurred simultaneously but are not particularly related.

The BLM movement began in summer 2013 but boiled over in summer 2020 on the heels of the George Floyd killing, with protests spilling over into straightforward looting, mayhem, and lawlessness. That fit of high emotional pique found many protesters accosting random strangers in public and demanding a raised fist in support of the movement, which was always ideologically disorganized but became irrational and power-hungry as Wokedom discovered its ability to submit others to its will. In response, many businesses erected what I’ve heard called don’t-hurt-me walls in apparent support of BLM and celebration of black culture so that windows would not be smashed and stores ransacked. Roving protests in numerous cities demanded shows of support, though with what exactly was never clear, from anyone encountered. Ultimately, protests morphed into a sort of protection racket, and agitators learned to enjoy making others acquiesce to arbitrary demands. Many schools and corporations now conduct mandatory training to, among other things, identify unconscious bias, which has the distinct aroma of original sin that can never be assuaged or forgiven. It’s entirely understandable that many individuals, under considerable pressure to conform as moral panic seized the country, play along to keep the peace or keep their jobs. Backlash is building, of course.

The much larger example affecting everyone, nationwide and globally, is the response to the pandemic. Although quarantines have been used in the past to limit regional outbreaks of infectious disease, the global lockdown of business and travel was something entirely new. Despite a lack of evidence of efficacy, the precautionary principle prevailed and nearly everyone was forced into home sequestration and later, after an embarrassingly stupid scandal (in the U.S.), made to don masks when venturing out in public. As waves of viral infection and death rolled across the globe, political leaders learned to enjoy making citizens acquiesce to capricious and often contradictory demands. Like BLM, a loose consensus emerged about the “correct” way to handle the needs of the moment, but the science and demographics of the virus produced widely variant interpretations of such correctness. A truly coordinated national response in the U.S. never coalesced, and hindsight has judged the whole morass a fundamentally botched job of maintaining public health in most countries.

But political leaders weren’t done demanding compliance. An entirely novel vaccine protocol was rushed into production after emergency use authorization was obtained and indemnification (against what?) was granted to the pharma companies that developed competing vaccines. Whether this historical moment will turn out to be something akin to the thalidomide scandal remains to be seen, but at the very least, the citizenry is being driven heavily toward participation in a global medical experiment. Some states even offer million-dollar lotteries to incentivize individuals to comply and take the jab. Open discussion of risks associated with the new vaccines has been largely off limits, and a two-tier society is already emerging: the vaccinated and the unclean (which is ironic, since many of the unclean have never been sick).

Worse yet (and like the don’t-hurt-me walls), many organizations are adopting as-yet-unproven protocols and requiring vaccination for participants in their activities (e.g., schools, sports, concerts) or simply to keep one’s job. The mask mandate was a tolerable discomfort (though not without many principled refusals), but forcing others to be experimental test subjects is well beyond the pale. Considering how the narrative continues to evolve and transform, thoughtful individuals trying to evaluate competing truth claims for themselves are unable to get clear, authoritative answers. Indeed, it’s hard to imagine a situation where authorities in politics, medicine, science, and journalism could have worked so assiduously to undermine their own credibility. Predictably, heads (or boards of directors) of many organizations are learning to enjoy the newly discovered power to transform their organizations into petty fiefdoms and demand compliance from individuals — usually under the claim of public safety (“for the children” being unavailable this time). Considering how little efficacy has yet been truly demonstrated with any of the various regimes erected to contain or stall the pandemic, the notion that precautions undertaken have been worth giving injudicious authority to people up and down various power hierarchies to compel individuals remains just that: a notion.

Tyrants and bullies never seem to tire of watching others do the submission dance. In the next round, be ready to hop on one leg and/or bark like a dog when someone flexes on you. Land of the free and home of the brave no longer.

Addendum

The CDC just announced an emergency meeting to be held (virtually) June 18 to investigate reports (800+ via the Vaccine Adverse Event Reporting System (VAERS), which almost no one had heard of only a month ago) of heart inflammation in adolescents following vaccination against the covid virus. Significant underreporting is anticipated following the circular logic that since authorities declared the vaccines safe prematurely (without standard scientific evidence to support such a statement), the effects cannot be due to the vaccine. What will be the effect of over 140 million people having been assured that vaccination is entirely safe, having taken the jab, and then discovering “wait! maybe not so much …”? Will the complete erosion of trust in what we’re told by officialdom and its mouthpieces in journalism spark widespread, organized, grassroots defiance once the bedrock truth is laid bare? Should it?

The famous lyric goes “haters gonna hate.” That reflexive structure is equivalent to the meaningless phrase “It is what it is.” Subtexts attach to these phrases, and they take on lives of their own, after a fashion, with everyone pretending to know precisely what is intended and meant. That was the lesson, by the way, of the phrase “Stupid is as stupid does,” made up precisely to confound bullies who were making fun of someone of apparently limited capacity. In light of these commonplace rhetorical injunctions to actual thought, it is unsurprising that practitioners of various endeavors would be revealed as cheerleaders and self-promoters (sometimes rabidly so) for their own passion projects. With most activities, however, one can’t XX about XX, as in sport about sports, music about music, or cook about cooking. If one plays sports, makes music, or cooks, exemplary results are identifiable easily enough, but promotion on behalf of those results, typically after the fact but sometimes in the midst of the activity (i.e., sports commentary), takes place within the context of language. The two major exceptions I can identify are (1) politicking about politics and (2) writing about writing, both heavily laden with speech. (A third example, which I won’t explore, might be celebrating celebrities. Ugh.)

Of the first example I have little to say except that it’s so miserable, ugly, and venal that only politicians, policy wonks, political junkies, and campaign strategists (now full-time political strategists considering campaigns never end) derive much joy or energy from the reflexive trap. The rest of us prefer to think as little as possible about the entirely corrupt nature of political institutions and the associated players. The second example, however, is arguably an inborn feature of writing that still commands attention. Writers writing about writing might be typically understood as fiction writers revealing their processes. A recent example is J.K. Rowling, who leapt from obscurity to international fame in one bound and now offers writing tips (mainly plotting) to aspirants. An older example is Mark Twain, whose recommendation to ward off verbosity is something I practice (sometimes with limited success). Writers writing about writing now extends to journalists, whose self-reflection never seems to wear thin as the famous ones become brands unto themselves (perhaps even newsworthy in their own right). Training attention on themselves (“Look mom, no hands!”) is rather jejune, but again, commonplace. It’s also worth observing that journalists journaling about journalism, especially those who reveal how the proverbial sausage is made (e.g., Matt Taibbi and his book Hate Inc.: Why Today’s Media Makes Us Despise One Another (2019)), are essentially self-cannibalizing (much like celebrities).

What strikes me lately is how many writers, journalists, and commentators (probably includes bloggers like me — bloggers blogging about blogging) have become cheerleaders for the media in which they work, which is especially true of those who have abandoned legacy media in favor of newer platforms to connect with readerships and/or audiences. Extolling the benefits of the blog is already passé, but the shift over to podcasting and YouTube/TikTok channels, accompanied by testimonials about the new medium’s great attributes, has passed beyond tiresome now that so many are doing it. Print journalists are also jumping ship from legacy publications, mostly newspapers and magazines, to digital publishing platforms such as Medium, Revue, and Substack. Some create independent newsletters. Broadcast journalists are especially keen on YouTube. A fair bit of incestuous crossover occurs as well, as media figures interview each other endlessly. Despite having restricted my media diet due to basic distrust of the legacy media in particular, I still award a lot of attention to a few outlets I’ve determined deserve my attention and are sometimes even trustworthy. Or sometimes, they’re just entertaining. I still tune in the stray episode of someone I find infuriating just to check in and reinforce my decision not to return more frequently.

Stopping here and breaking this post into parts because the remainder of the draft was already growing overlong. More to come in part 2.

Let’s Be Evil, pt. 05

Posted: May 12, 2021 in Culture, History, Outrage, Politics, War

Does this miserable joke meme [image not shown] inform the following image [image not shown]?

Time moves on yet the story remains stubbornly the same. Like the United States before it (and others elsewhere), Israel is carrying out an extermination campaign — with the aid of the U.S. empire. There’s something uniquely despicable about being unrepentant winners in the unabated practice of colonialism.

Continuing my book-blogging project on Orality and Literacy, Ong provides context for the oral tradition that surrounded the two great Homeric classics: The Iliad and The Odyssey. According to Ong, it took decades for literary critics and sociologists to overcome their bias, borne out of literacy, and recognize how formulaic are the two epics. They are essentially pastiches of commonplace plots, phrases, and sayings of the time, which was a notable strength when oral delivery based on memorization was how epic poetry was transmitted. In a literate era, such clichés are to be avoided (like the plague).

Aside: my review of David Sirota’s Back to Our Future mentions the dialect he and his brother developed, filled with one-liners and catchphrases from entertainment media, especially TV and movies. The three-word (also three-syllable) form seems to be optimal: “Beam me up” (Star Trek), “Use the Force” (Star Wars), “Make my day” (Dirty Harry), “I’ll be back” (The Terminator), etc. This construction is short, punchy, and memorable. The first holder of high office in the U.S. to attempt to govern by catchphrase was probably Ronald Reagan, followed (of course) by Arnold Schwarzenegger and then Donald Trump. Mustn’t overlook that all three (and others) came to prominence via the entertainment industry rather than through earnest (Kennedyesque) public service. Trump’s numerous three-word phrases (shtick, really) lend themselves especially well to being chanted by adoring crowds at his pep rallies, swept up in groupthink, with a recognizable beat-beat-beat-(silence) structure. The rock band Queen stumbled upon this same elemental rhythm with its famous stomp-stomp-clap-(wait) from the anthem “We Will Rock You,” consciously intended for audience participation (as I understand it).

Further aside: “We Will Rock You” combines its iconic rhythm with a recitation tone sourced in antiquity. Make of that what you will.

Ong goes on to provide a discussion of the psychodynamics of orality, which I list here without substantive discussion (read for yourself):

  • orality is additive rather than subordinative
  • orality is aggregative rather than analytic
  • orality is redundant or copious
  • orality is conservative or traditionalist
  • orality is close to the human lifeworld
  • orality is agonistically toned
  • orality is empathetic and participatory rather than objectively distanced
  • orality is homeostatic
  • orality is situational rather than abstract

Of particular interest is Ong’s description of how language functions within oral cultures distinctly from literate cultures, which is the source of the bias mentioned above. To wit:

Fully literate persons can only with great difficulty imagine what a primary oral culture is like, that is, a culture with no knowledge whatsoever of writing or even the possibility of writing … In a primary oral culture, the expression ‘to look up something’ is an empty phrase … [w]ithout writing, words as such have no visual presence, even when the objects they represent are visual … [for] ‘primitive’ (oral) people … language is a mode of action and not simply a countersign of thought — oral people commonly, and probably universally, consider words to have great power. [pp. 31–32]

If this sounds conspicuously reminiscent of this previous post, well, congratulations on connecting the dots. The whole point, according to a certain perspective, is that words are capable of violence, a view that is (re)gaining adherents as our mental frameworks undergo continuous revision. It’s no small thing that slurs, insults, and fighting words (again) provoke offense and violent response and that mere verbal offense equates to violence. Not long ago, nasty words were reclaimed, nullified, and thus made impotent (subject to varyingly irrational rules of usage). Well, now they sting again and are used as ammo to cancel (a form of administrative violence, often undertaken anonymously, bureaucratically, and with the assistance of the digital mob) anyone with improper credentials to deploy them.

Let me draw another connection. Here’s a curious, though not well-known, quote by Walter Pater:

All art constantly aspires towards the condition of music. For while in all other kinds of art it is possible to distinguish the matter from the form, and the understanding can always make this distinction, yet it is the constant effort of art to obliterate it.

Put another way, the separation of signifier from signified, an abstraction conditioned by literacy and rationalism (among other things), is removed (“obliterated”) by music, which connects to emotion more directly than representational art. Similarly, speech within primary oral cultures exists purely as sound and possesses an ephemeral, even evanescent (Ong’s term) quality only experienced in the flow of time. (Arguably, all of human experience takes place within the flow of time.) Music and “primitive” speech are accordingly dynamic and cannot be reduced to static snapshots, that is, fixed on a page as text or committed to a canvas or photograph as a still image (hence, the strange term still life). That’s why a three-word, three-syllable chant, or better yet, the Queen rhythm or the Wave in sports arenas (a gesture requiring the participation of everyone), can possess inherent power, especially as individuals are entrained in groupthink. Music and words-as-violence get inside us and are nearly wholly subjective, not objective — something we all experience organically in early childhood before being taught to read and write (if in fact those skills are learned beyond functional literacy). Does that mean culture is reverting to an earlier stage of development, more primitive, childlike, and irrational?

For more than a decade, I’ve had in the back of my mind a blog post called “The Power of Naming” to remark that bestowing a name gives something power, substance, and in a sense, reality. That post never really came together, but its inverse did. Anyway, here’s a renewed attempt.

The period of language acquisition in early childhood is suffused with learning the names of things, most of which is passive. Names of animals (associated closely with sounds they make) are often a special focus using picture books. The kitty, doggie, and horsie eventually become the cat, dog, and horse. Similarly, the moo-cow and the tweety-bird shorten to cow and bird (though songbird may be an acceptable holdover). Words in the abstract are signifiers of the actual things, aided by the text symbols learned in literate cultures to reinforce mere categories instead of examples grounded in reality. Multiply the names of things several hundred thousand times into adulthood and indeed throughout life and one can develop a formidable vocabulary supporting expressive and nuanced thought and speech. Do you know the differences between acute, right, obtuse, straight, and reflex angles? Does it matter? Does your knowledge of barware inform when to use a flute, coupe, snifter, shot (or shooter or caballito), nosing glass (or Glencairn), tumbler, tankard, goblet, sling, and Stein? I’d say you’ve missed something by never having drunk dark beer (Ger.: Schwarzbier) from a frosted schooner. All these varieties developed for reasons that remain invisible to someone content to drink everything from the venerable red Solo cup. Funnily enough, the red Solo cup now comes in different versions, fooling precisely no one.

Returning to book blogging, Walter Ong (in Orality and Literacy) has curious comparisons between primarily oral cultures and literate cultures. For example:

Oral people commonly think of names (one kind of words) as conveying power over things. Explanations of Adam’s naming of the animals in Genesis 2:20 usually call condescending attention to this presumably quaint archaic belief. Such a belief is in fact far less quaint than it seems to unreflective chirographic and typographic folk. First of all, names do give human beings power over what they name: without learning a vast store of names, one is simply powerless to understand, for example, chemistry and to practice chemical engineering. And so with all other intellectual knowledge. Secondly, chirographic and typographic folk tend to think of names as labels, written or printed tags imaginatively affixed to an object named. Oral folk have no sense of a name as a tag, for they have no idea of a name as something that can be seen. Written or printed representations of words can be labels; real, spoken words cannot be. [p. 33]

This gets at something that has been developing over the past few decades, namely, that as otherwise literate (or functionally literate) people gather more and more information through electronic media (screens that serve broadcast and cable TV, YouTube videos, prerecorded news for streaming, podcasts, and most importantly, audiobooks — all of which speak content to listeners), the spoken word (re)gains primacy and the printed word fades into disuse. Electronic media may produce a hybrid of orality/literacy, but words are no longer silent, internal, and abstract. Indeed, words — all by themselves — are understood as being capable of violence. Gone are the days of “sticks and stones ….” Now, fighting words incite and insults sting again.

Not so long ago, it was possible to provoke a duel with an insult or gesture, such as a glove across the face. Among some people, defense of honor never really disappeared (though dueling did). History has taken a strange turn, however. Proposed legislation to criminalize deadnaming (presumably to protect a small but growing number of transgender and nonbinary people who have redefined their gender identity and accordingly adopted different names) recognizes the violence of words but then tries to transmute the offense into an abstract criminal law. It’s deeply mixed up, and I don’t have the patience to sort it out.

More to say in later blog posts, but I’ll raise the Counter-Enlightenment once more to say that the nature of modern consciousness is shifting somewhat radically in response to stimuli and pressures that grew out of an information environment (roughly 70 years old now, but transformed even more fundamentally in the last 25 years) that is substantially discontinuous from centuries-old traditions. Those traditions displaced even older traditions inherited from antiquity. Such is the way of the world, I suppose, and with the benefit of Walter Ong’s insights, my appreciation of the outlines is taking better shape.

From Alan Jacobs’s Breaking Bread with the Dead (2020):

The German sociologist Gerd-Günter Voss outlined the development, over many centuries, of three forms of the “conduct of life.” The first is the traditional: in this model your life takes the forms that the lives of people in your culture and class have always taken, at least for as long as anyone remembers. The key values in the traditional conduct of life are “security and regularity.” The second model is the strategic: people who follow this model have clear goals in mind (first, to get into an elite university; later, to become a radiologist or own their own company or retire at fifty) and form a detailed strategic plan to achieve those goals. But, Voss suggests, those two models, while still present in various parts of the world, are increasingly being displaced by a third model for the conduct of life: the situational.

The situational model has arisen in recent social orders that are unprecedentedly dynamic and fluid. People are less likely to plan to be radiologists when they hear that radiologists may be replaced by computers. They are less likely to plan to own a company when whatever business they’re inclined toward may not exist in a decade … they are less likely to plan to have children … They might not even want to plan to have dinner with a friend a week from Friday …

… the situational conduct of life is … a way of coping with social acceleration. But it’s also, or threatens to be, an abandonment of serious reflection on what makes life good. You end up just managing the moment … The feeling of being at a “frenetic standstill” is highly characteristic of the depressed person.