Posts Tagged ‘Insult to Injury’

For more than a decade, I’ve had in the back of my mind a blog post called “The Power of Naming,” meant to remark that bestowing a name gives something power, substance, and, in a sense, reality. That post never really came together, but its inverse did. Anyway, here’s a renewed attempt.

The period of language acquisition in early childhood is suffused with learning the names of things, most of which is passive. Names of animals (associated closely with the sounds they make) are often a special focus, using picture books. The kitty, doggie, and horsie eventually become the cat, dog, and horse. Similarly, the moo-cow and the tweety-bird shorten to cow and bird (though songbird may be an acceptable holdover). Words in the abstract are signifiers of actual things, and the text symbols learned in literate cultures reinforce mere categories rather than examples grounded in reality. Multiply the names of things several hundred thousand times into adulthood and indeed throughout life, and one can develop a formidable vocabulary supporting expressive and nuanced thought and speech. Do you know the differences between acute, right, obtuse, straight, and reflex angles? Does it matter? Does your knowledge of barware inform when to use a flute, coupe, snifter, shot glass (or shooter or caballito), nosing glass (or Glencairn), tumbler, tankard, goblet, sling, and Stein? I’d say you’ve missed something by never having drunk dark beer (Ger.: Schwarzbier) from a frosted schooner. All these varieties developed for reasons that remain invisible to someone content to drink everything from the venerable red Solo cup. Funnily enough, the red Solo cup now comes in different versions, fooling precisely no one.

Returning to book blogging, Walter Ong (in Orality and Literacy) draws curious comparisons between primarily oral cultures and literate cultures. For example:

Oral people commonly think of names (one kind of words) as conveying power over things. Explanations of Adam’s naming of the animals in Genesis 2:20 usually call condescending attention to this presumably quaint archaic belief. Such a belief is in fact far less quaint than it seems to unreflective chirographic and typographic folk. First of all, names do give human beings power over what they name: without learning a vast store of names, one is simply powerless to understand, for example, chemistry and to practice chemical engineering. And so with all other intellectual knowledge. Secondly, chirographic and typographic folk tend to think of names as labels, written or printed tags imaginatively affixed to an object named. Oral folk have no sense of a name as a tag, for they have no idea of a name as something that can be seen. Written or printed representations of words can be labels; real, spoken words cannot be. [p. 33]

This gets at something that has been developing over the past few decades, namely, that as otherwise literate (or functionally literate) people gather more and more information through electronic media (screens that serve broadcast and cable TV, YouTube videos, prerecorded news for streaming, podcasts, and, most importantly, audiobooks — all of which speak content to listeners), the spoken word (re)gains primacy and the printed word fades into disuse. Electronic media may produce a hybrid of orality/literacy, but words are no longer silent, internal, and abstract. Indeed, words — all by themselves — are understood as being capable of violence. Gone are the days when “sticks and stones ….” Now, fighting words incite and insults sting again.

Not so long ago, it was possible to provoke a duel with an insult or gesture, such as a glove across the face. Among some people, defense of honor never really disappeared (though dueling did). History has taken a strange turn, however. Proposed legislation to criminalize deadnaming (presumably to protect a small but growing number of transgender and nonbinary people who have redefined their gender identity and accordingly adopted different names) recognizes the violence of words but then tries to transmute the offense into an abstract criminal law. It’s deeply mixed up, and I don’t have the patience to sort it out.

More to say in later blog posts, but I’ll raise the Counter-Enlightenment once more to say that the nature of modern consciousness is shifting somewhat radically in response to stimuli and pressures that grew out of an information environment (roughly 70 years old now but transformed even more fundamentally in the last 25 years) that is substantially discontinuous from centuries-old traditions. Those traditions displaced even older traditions inherited from antiquity. Such is the way of the world, I suppose, and with the benefit of Walter Ong’s insights, my appreciation of the outlines is taking better shape.

I simply can’t keep up with all the reading, viewing, and listening in my queue. Waking hours are too few, and concentration dissipates long before sleep overtakes. Accordingly, it’s much easier to settle into couch-potato mode and watch some mindless drivel, such as the Netflix hit Bridgerton, binged in two sittings. (Unlike cinema critics, I’m not bothered especially by continuity errors, plot holes, clunky dialogue, weak character motivation, gaps of logic, or the glossy decadence of the fictional worlds. I am bothered by the Kafka trap sprung on anyone who notices casting decisions that defy time and place — an ill-advised but now commonplace historical revisionism like editing Mark Twain.) As a result, blog posts are less frequent than they might otherwise be as I pronounce upon American (or more broadly, Western) culture, trying vainly to absorb it as a continuously moving target. Calls to mind the phrase Après moi, le déluge, except that there is no need to wait. A deluge of entertainment, news, analysis, punditry, and trolling has buried everyone already. So rather than the more careful consideration I prefer to post, here are some hot takes.

The Irregular Aphorist. Caitlin Johnstone offers many trenchant observations in the form of aphorisms (some of which I’ve quoted before), all gathered under the subtitle Notes From The Edge Of The Narrative Matrix. The modifier irregular only means that aphorisms are a regular but not constant feature. Her site doesn’t have a tag to that effect but probably ought to. Here’s one in particular that caught my attention:

Everything our species has tried has led us to a dying world and a society that is stark raving mad, so nobody is in any position to tell you that you are wrong.

Twin truths here are (1) the dying world and (2) societal madness, both of which I’ve been describing for some time. Glad when others recognize them, too.

Piling on. Though few still are willing to admit it, nonpharmaceutical interventions (NPIs, e.g., distancing, masks, and lockdowns) to stall or reduce the spread of the virus failed to achieve their objectives according to this study. Instead, NPIs piled on suffering no one could forestall. I read somewhere (no link) that the world is approaching half of the total, cumulative deaths/infections that were predicted had nothing been done to impede the pandemic running its course. Adding in deaths of despair (numbers not entirely up to date), we’re using the wrong tools to fight the wrong battle. Of course, interventions opened up giant opportunities for power grabs and vulture capitalism, so the cynic in me shrugs and wonders half aloud, “what did you expect, really?”

Growth of the Managerial Bureaucracy. A blog called Easily Distracted by Timothy Burke (never on my blogroll) publishes only a few times per year, but his analysis is terrific — at least when it doesn’t wind up being overlong and inconclusive. Since a student debt jubilee is back in the news (plenty of arguments pro and con), unintended consequences are anticipated in this quote:

When you set out to create elaborate tiers that segregate the deserving poor from the comfortable middle-class and the truly wealthy, you create a system that requires a massive bureaucracy to administer and a process that forces people into petitionary humiliation in order to verify their eligibility. You create byzantine cutoff points that become business opportunities for predatory rentiers.

Something similar may well be occurring with stimulus checks being issued pro rata (has anyone actually gotten one?), but at least we’re spared any petitionary humiliations. We get whatever the algorithms (byzantine cutoff points) dictate. How those funds will be gamed and attached is not yet clear. Stay alert.

No Defense of Free Speech. Alan Jacobs often recommends deleting, unsubscribing, and/or ignoring social media accounts (after his own long love-hate relationship with them), considering how they have become wholly toxic to a balanced psyche as well as principal enablers of surveillance capitalism and narrative control. However, in an article about the manorial elite, he’s completely lost the plot: absolutism is required in defense of free speech. It’s not sufficient to be blasé or even relieved when 45 is kicked off Twitter permanently or when multiple parties conspire to kill Parler. Establishing your own turf beyond the reach of Silicon Valley censors is a nice idea but frankly impractical. Isn’t that what whoever ran Parler (or posted there) must have thought? And besides, fencing off the digital commons these very entities created has catapulted them into the unenviable position of undemocratic, unelected wielders of monopolistic power and co-conspirators to boot. That’s what needs to be curtailed, not free speech.

The Taxonomic Apocalypse. Although drawn from fiction and thus largely hypothetical, a new book (coming late 2021) by Adam Roberts called It’s the End of the World: But What Are We Really Afraid Of? surveys doomsday stories and categorizes different versions of how it all ends. Alan Jacobs (yeah, him again — must have an advance copy of the manuscript) recommends it as “a delightful and provocative little book” but fails to grok two things: (1) these stories are rehearsals-cum-preparations for the real thing, and (2) the real thing really is bearing down on us implacably and so is no longer a mere hypothetical to contemplate and categorize for shits and grins. Despite acceptance of the eventualities that await all of us, reading Roberts’ taxonomy is not something I would expect to find delightful. Skip.

Narrative Collapse. Ran Prieur (no link) sometimes makes statements revealing an unexpected god’s-eye view:

[45] is a mean rich kid who figured out that if he does a good Archie Bunker impression, every lost soul with an authoritarian father will think he’s the messiah. We’re lucky that he cares only about himself, instead of having some crazy utopian agenda. But the power, and the agency, is with the disaffected citizens of a declining empire, tasting barbarism.

This is all about people wanting to be part of a group that’s part of a story. Lately, some of the big group-stories have been dying: sky father religion, American supremacy, the conquest of nature, the virtue of wealth-seeking. In their place, young and clumsy group-stories struggle and rise.

Collapse of certain fundamental stories that animate our thinking is at the core of The Spiral Staircase (see About Brutus at top), though it’s often couched in terms of consciousness in transition. Getting through the transition (only temporarily, see previous item in list) probably means completion of the Counter-Enlightenment historical arc, which necessarily includes further descent into barbarism.

Hail Mary for Individualism. I always take special notice when someone cites Allan Bloom. Alan Jacobs (um, yeah, he’s prolific and I’m using his ideas again — sue me) cites Bloom to argue that individualism or the sovereign self, a product of the Enlightenment, is already dead. No doubt, the thought-world described so ably by Bloom no longer exists, but individualism has not yet died out by attrition or been fully dissolved in nonduality. Many of us born before the advent of the Internet retain selfhood and authenticity not yet coopted by or incorporated into mass mind. Moreover, ongoing struggles over identity (e.g., gender, sexual orientation, and race, which are often used improperly to define the self) result from an inchoate sense that individualism is eroding precipitously, not that it’s already passé. Defiant attempts to (re)establish an authentic self (contravening all logic and becoming critical theory of one sort or another) in the face of this loss may well be a last-ditch effort to save the self, but it’s failing.

I’ve never before gone straight back with a redux treatment of a blog post. More typically, it takes more than a year before I revisit a given topic, sometimes several years. This time, supplemental information came immediately, though I’ve delayed writing about it. To wit, a Danish study published November 18, 2020, in the Annals of Internal Medicine indicates our face mask precautions against the coronavirus may be ineffective:

Our results suggest that the recommendation to wear a surgical mask when outside the home among others did not reduce, at conventional levels of statistical significance, the incidence of SARS-CoV-2 infection in mask wearers in a setting where social distancing and other public health measures were in effect, mask recommendations were not among those measures, and community use of masks was uncommon. Yet, the findings were inconclusive and cannot definitively exclude a 46% reduction to a 23% increase in infection of mask wearers in such a setting. It is important to emphasize that this trial did not address the effects of masks as source control or as protection in settings where social distancing and other public health measures are not in effect.

The important phrase there is “did not reduce, at conventional levels of statistical significance,” which is followed by the caveat that the study addressed only part of the question and its findings were inconclusive. To say a result is statistically insignificant means the observed effect is small enough relative to the margin of error that it could plausibly be due to chance; here, the quoted range runs from a 46% reduction all the way to a 23% increase, which includes no effect at all. A fair bit of commentary follows the published study, which I have not reviewed.

We’re largely resorting to conventional wisdom with respect to mask wearing. Most businesses and public venues (if open at all) have adopted the mask mandate out of conformity and despite wildly conflicting reports of their utility. Compared to locking down all nonessential social and economic activity, however, I remain resigned to their adoption even though I’m suspicious (as any cynic or skeptic should be) that they don’t work — at least not after the virus is running loose. There is, however, another component worth considering, namely, the need to be seen doing something, not nothing, to address the pandemic. Some rather bluntly call that virtue signalling, such as the pathologist at this link.

In the week since publication of the Danish study and the pathologist’s opinion (note the entirely misleading title), there has been a deluge of additional information, editorials, and protests (no more links, sorry) calling into question recommendations from health organizations and responses by politicians. Principled and unprincipled dissent has been underway since May 2020 and grows with each month the hardship persists. Of particular note is the Supreme Court’s 5-4 decision against New York Gov. Andrew Cuomo’s mandate that religious services be restricted to no more than 10 people in red zones and no more than 25 in orange zones. Score one for the Bill of Rights being upheld even in a time of crisis.

I’ve mentioned the precautionary principle several times, most notably here. Little of our approach to precautions has changed in the two years since that blog post. At the same time, climate change and Mother Nature batter us aggressively. Eventualities remain predictable. Different precautions are being undertaken with respect to the pandemic currently gripping the planet. Arguably, the pandemic is either a subset of Mother Nature’s fury or, if the virus was created in a lab, a self-inflicted wound. Proper pandemic precautions have been confounded by undermining of authority, misinformation, lack of coordination, and politically biased narratives. I’m as confused as the next poor sap. However, low-cost precautions such as wearing masks are entirely acceptable, notwithstanding refusals of many Americans to cooperate after authorities muddied the question of their effectiveness so completely. More significant precautions such as lockdowns and business shutdowns have morphed into received wisdom among government bodies yet are questioned widely as being a cure worse than the disease, not to mention administrative overreach (conspiratorial conjecture withheld).

Now comes evidence published in the New England Journal of Medicine on November 11, 2020, that costly isolation is flatly ineffective at stemming infection rates. Here are the results and conclusions from the abstract of the published study:

Results
A total of 1848 recruits volunteered to participate in the study; within 2 days after arrival on campus, 16 (0.9%) tested positive for SARS-CoV-2, 15 of whom were asymptomatic. An additional 35 participants (1.9%) tested positive on day 7 or on day 14. Five of the 51 participants (9.8%) who tested positive at any time had symptoms in the week before a positive qPCR test. Of the recruits who declined to participate in the study, 26 (1.7%) of the 1554 recruits with available qPCR results tested positive on day 14. No SARS-CoV-2 infections were identified through clinical qPCR testing performed as a result of daily symptom monitoring. Analysis of 36 SARS-CoV-2 genomes obtained from 32 participants revealed six transmission clusters among 18 participants. Epidemiologic analysis supported multiple local transmission events, including transmission between roommates and among recruits within the same platoon.
Conclusions
Among Marine Corps recruits, approximately 2% who had previously had negative results for SARS-CoV-2 at the beginning of supervised quarantine, and less than 2% of recruits with unknown previous status, tested positive by day 14. Most recruits who tested positive were asymptomatic, and no infections were detected through daily symptom monitoring. Transmission clusters occurred within platoons.

So an initial 0.9% tested positive, then an additional 1.9%. This total 2.8% compares to 1.7% in the control group (tested but not isolated as part of the study). Perhaps the experimental and control groups are a bit small (1848 and 1554, respectively), and it’s not clear why the experimental group infection rate is higher than that of the control group, but the evidence points to the uselessness of trying to limit the spread of the virus by quarantining and/or isolation. Once the virus is present in a population, it spreads despite precautions.
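
As a sanity check on the arithmetic above, here is a minimal sketch (standard-library Python, my own construction rather than anything from the study) that recomputes the two infection rates from the counts quoted in the abstract and attaches rough 95% confidence intervals, one way to see how much wiggle room sample sizes of 1848 and 1554 leave.

```python
# Back-of-the-envelope check of the percentages quoted above, using only the
# counts given in the abstract. The Wald interval is a crude choice, adequate
# for illustration only.
from math import sqrt

def rate_with_ci(positives, total, z=1.96):
    """Return the point estimate and a rough 95% Wald interval for a proportion."""
    p = positives / total
    margin = z * sqrt(p * (1 - p) / total)
    return p, max(p - margin, 0.0), p + margin

# Supervised-quarantine (study) group: 16 positive on arrival + 35 more by day 14.
study = rate_with_ci(16 + 35, 1848)
# Recruits who declined to participate but had qPCR results available.
declined = rate_with_ci(26, 1554)

for label, (p, lo, hi) in [("study group", study), ("declined group", declined)]:
    print(f"{label:15s} {p:.1%} positive (rough 95% CI {lo:.1%} to {hi:.1%})")
```

The intervals come out to roughly 2.0% to 3.5% versus 1.0% to 2.3%, overlapping only slightly, which is consistent with the caveat that the groups are a bit small.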

A mantra is circulating that we should “trust the science.” Are these results to be trusted? Can we call off all the lockdowns and closures? It’s been at least eight months that the virus has been raging throughout the U.S. Although there might be some instances of isolated populations with no infection, the wider population has by now been exposed. Moreover, some individuals who self-isolated effectively may not have been exposed, but in all likelihood, most of us have been. Accordingly, renewed lockdowns, school and business closures, and destruction of entire industries are a pretense of control we never really had. Their costs are enormous and ongoing. A stay-at-home order (advisory, if you prefer) just went into effect for the City of Chicago on November 16, 2020. My anecdotal observation is that most Chicagoans are ignoring it and going about their business much as they did during the summer and fall months. It’s nothing like the ghost town effect of March and April 2020. I daresay they may well be correct to reject the received wisdom of our civic leaders.

/rant on

Had a rather dark thought, which recurs but then fades out of awareness and memory until conditions reassert it. Simply put, it’s that the mover-shaker-decision-maker sociopath types in government, corporations, and elsewhere (I refuse to use the term influencer) are typically well protected (primarily by virtue of immense wealth) from threats regular folks face and are accordingly only too willing to sit idly by, scarcely lifting a finger in aid or assistance, and watch dispassionately as others scramble and scrape in response to the buffeting torrents of history. The classic example (even if not wholly accurate) of patrician, disdainful lack of empathy toward others’ plight is Marie Antoinette’s famous remark: “Let them eat cake.” Citing an 18th-century monarch indicates that such tone-deaf sentiment has been around for a long time.

Let me put it another way, since many of our problems are of our own creation. Our styles of social organization and their concomitant institutions are so overloaded with internal conflict and corruption, which we refuse to eradicate, that it’s as though we continuously tempt fate like fools playing Russian roulette. If we were truly a unified nation, maybe we’d wise up and adopt a different organizational model. But we don’t shoulder risk or enjoy reward evenly. Rather, the disenfranchised and most vulnerable among us, determined in a variety of ways but forming a substantial majority, have revolvers to their heads with a single bullet in one of five or six chambers, while the least vulnerable (the notorious 1%) have, in effect, thousands or millions of chambers and an exceedingly remote chance of firing the one with the bullet. Thus, vulnerability roulette.

In the midst of an epochal pandemic and financial crisis, who gets sacrificed like so much cannon fodder while others retreat onto their ocean-going yachts or into their boltholes to isolate from the rabble? Everyone knows it’s always the bottom rungs of the socioeconomic ladder who unjustly suffer the worst, a distinctly raw deal unlikely ever to change. The middle rungs are also suffering now as contraction affects more and more formerly enfranchised groups. Meanwhile, those at the top use crises as opportunities for further plunder. In an article in Rolling Stone, independent journalist Matt Taibbi, who covered the 2008 financial collapse, observes that our fearless leaders (fearless because they secure themselves before and above all else) again made whole the wealthiest few at the considerable expense of the rest:

The $2.3 trillion CARES Act, the Donald Trump-led rescue package signed into law on March 27th, is a radical rethink of American capitalism. It retains all the cruelties of the free market for those who live and work in the real world, but turns the paper economy into a state protectorate, surrounded by a kind of Trumpian Money Wall that is designed to keep the investor class safe from fear of loss.

This financial economy is a fantasy casino, where the winnings are real but free chips cover the losses. For a rarefied segment of society, failure is being written out of the capitalist bargain.

Why is this a “radical rethink”? We’ve seen identical behaviors before: privatization of profit, indemnification of loss, looting of the treasury, and refusal to prosecute exploitation, torture, and crimes against humanity. Referring specifically to financialization, this is what the phrase “too big to fail” means in a nutshell, and we’ve been down this stretch of road repeatedly.

Naturally, the investor class isn’t ordered back to work at slaughterhouses and groceries to brave the epidemic. Low-wage laborers are. Interestingly, well-compensated healthcare workers are also on the vulnerability roulette firing line — part of their professional oaths and duties — but that industry is straining under pressure from its inability to maintain profitability during the pandemic. Many healthcare workers are being sacrificed, too. Then there are tens of millions newly unemployed and uninsured being told that the roulette must continue into further months of quarantine, the equivalent of adding bullets to the chambers until their destruction is assured. The pittance of support for those folks (relief checks delayed or missing w/o explanation or recourse, plus unemployment insurance if one qualifies, meaning not having already been forced into the gig economy) does little to stave off catastrophe.

Others around the Web have examined the details of several rounds of bailout legislation and found them unjust in the extreme. Many of the provisions actually heap insult and further injury upon injury. Steps that could have been taken, and in some instances were undertaken in past crises (such as during the Great Depression), don’t even rate consideration. Those safeguards might include debt cancellation, universal basic income (perhaps temporary), government-supported healthcare for all, and reemployment through New Deal-style programs. Instead, the masses are largely left to fend for themselves, much like the failed Federal response to Hurricane Katrina.

Some of this is no doubt ideological. A professional class of ruling elites are the only ones to be entrusted with guiding the ship of state, or so goes the political philosophy. But in our capitalist system, government has been purposefully hamstrung and hollowed out to the point of dysfunction precisely so that private enterprise can step in. And when magical market forces fail to stem the slide into oblivion, “Welp, sorry, th-th-that’s all folks,” say the supposed elite. “Nothing we can do to ease your suffering! Our attentions turn instead to ourselves, the courtiers and sycophants surrounding us, and the institutions that enable our perfidy. Now go fuck off somewhere and die, troubling us no more.”

/rant off

I’m aware of at least two authors who describe American character in less than glowing terms: Alexis de Tocqueville and Morris Berman. Tocqueville’s book Democracy in America (two vols., 1835 and 1840) is among the most cited, least read (by 21st-century Americans, anyway) books about America. (I admit to not having read it.) Berman’s American trilogy (titles unnamed, all of which I’ve read) is better known by contemporary Americans (those who read, anyway) and is unflinching in its denunciation of, among other things, our prideful stupidity. Undoubtedly, others have taken a crack at describing American character.

American identity, OTOH, if such a thing even exists, is somewhat more elusive for a variety of reasons. For instance, Americans lack the formative centuries or millennia of history Europeans and Asians have at their disposal. Moreover, Americans (except for Native Americans — multiple synonyms and euphemisms available) are immigrants (or their descendants) drawn from all around the globe. Accordingly, we lack a coherent unifying narrative about who we are. The American melting pot may come closest but is insufficient in its generality. National identity may well be fraying in other societies as each loses its distinctiveness over time. Nonetheless, two influential factors in the formation of a loose American identity are negative identity (defining oneself as against others, e.g., adolescent rebellion, rather fitting for a young nation) and borrowed identity (better known as cultural appropriation). The latter has been among the chief complaints of social justice warriors.

I had at least two further ideas for this third part of a series, but frankly, given the precipitous turn of events over the past month or so, nothing feels appropriate to write about just yet other than the global pandemic that has staggered society, now reeling as we are forced apart from one another and the way of life to which we are adapted is suddenly ripped out from beneath us. As the voiceover at the beginning of one of the Lord of the Rings movies intones rather soberly, “The world … has changed ….” That was my assessment here, though I was really thinking of the post-truth public sphere.

Many are already admitting that we will never be able to go back to what once was, that what broke will stay forever broken. And while the eventual response may be interpreted in sweet-lemon style as a reform opportunity or a beckoning call to greatness, I daresay a far more likely result is that mass death, sickness, and ruin will create a critical mass of desperate people not so willing to stay hunkered down waiting for the extended crisis to pass. Indeed, the bunker mentality already imprinted on our minds as we cringe before the next in a series of injurious blows can’t be expected to endure. Folks will take to the streets with all their stockpiled guns and ammo, seeking something, anything to do, rather than dying quietly, meekly, alone, at home. The metaphor of being pummeled into submission or to death is probably incorrect. Right now, we’re still only partway up one of those parabolic curves that ultimately points skyward. Alternatively, it’s a crescendo of pain that overwhelms until nothing functions anymore.

If surviving historians are able to piece together the story some time hence, one possibility will be to observe that the abundance we sorta enjoyed during two centuries of cheap energy did not develop into anything resembling an enlightened style of social organization that could be sustained or indeed even prepare us adequately for inevitable black swan events. Such discontinuities are entirely predictable by virtue of their inevitability, though precise timing is a fool’s errand. Natural disasters are the obvious example, and despite organizations and agencies scattered throughout all levels of government, we’re found flat-footed nearly every time disaster strikes. This global pandemic is no different, nor is the collapse of industrial civilization or runaway climate change. The current crisis is the first major kick in the teeth that may well cascade domino-style into full-on collapse.

As the crisis deepens, top leaders are often found to be worthless. Where is Pence, appointed more than a month ago to coordinate a coronavirus task force? It’s quite unlike a major political figure to do his or her work quietly and competently without media in tow. Even incompetence gets coverage, but Pence is nowhere to be seen. Must be self-quarantining. Some leaders are even worse than worthless; they actively add to the misery. Mainstream media may also have finally gotten hip to the idea that hanging on every insipid word uttered by that gaping chasm of stupidity that is our president is no longer a ratings bonanza to be tolerated in exchange for fruitless fact-checking missions. I fantasize about press events where correspondents heckle and laugh the fatuous gasbag (or his apologists) off the podium. Regrettably, there seems to be no bottom to the humiliation he can withstand so long as attention stays riveted on him. Perhaps the better response to his noisome nonsense would be stone silence — crickets.

One of the victims of cancel culture, coming to my attention only days ago, is Kate Smith (1907–1986), a singer of American popular song. Though Smith had a singing career spanning five decades, she is best remembered for her version(s) of Irving Berlin’s God Bless America, which justifiably became a bit of Americana. The decades of Smith’s peak activity were the 1930s and 40s.

/rant on

I dunno what goes through people’s heads, performing purity rituals or character excavation on folks long dead. The controversy stems from Smith having a couple other songs in her discography: That’s Why Darkies Were Born (1931) and Pickaninny Heaven from the movie Hello, Everybody! (1933). Hate to break it to anyone still living under a rock, but these dates are not far removed from minstrelsy, blackface, and The Birth of a Nation (1915) — a time when typical Americans referred to blacks with a variety of terms we now consider slurs. Such references were still used during the American civil rights movement (1960s) and are in use among some virulent white supremacists even today. I don’t know the full context of Kate Smith having sung those songs, but I suspect I don’t need to. In that era, popular entertainment had few of the sensibilities regarding race we now have (culture may have moved on, but it’s hard to say with a straight face it’s evolved or progressed humanely), and uttering commonly used terms back then was not automatic evidence of any sort of snarling racism.

I remember having heard my grandparents, nearly exact contemporaries of Kate Smith, referring to blacks (the term I grew up with, still acceptable I think) with other terms we no longer consider acceptable. It shocked me, but to them, that’s simply what blacks were called (the term(s) they grew up with). Absolutely nothing in my grandparents’ character or behavior indicated a nasty, racist intent. I suspect the same was true of Kate Smith in the 1930s.

Back when I was a librarian, I also saw plenty of sheet music published before 1920 or so with the term darkie (or darkey) in the title. See for example this. The Library of Congress still uses the subject headings “negro spirituals” (is there another kind?) and “negro songs” to refer to various subgenres of American folk song that includes slave songs, work songs, spirituals, minstrel music, protest songs, etc. Maybe we should cancel the Library of Congress. Some published music titles from back then even call them coon songs. That last one is totally unacceptable today, but it’s frankly part of our history, and like changing character names in Mark Twain’s Huckleberry Finn, sanitizing the past does not make it go away or any less discomfiting. But if you wanna bury your head in the sand, go ahead, ostrich.

Also, if some person or entity ever did some questionably racist, sexist, or malign thing (even something short of abominable) situated contextually in the past, does that mean he, she, or it must be cancelled irrevocably? If that be the case, then I guess we gotta cancel composer Richard Wagner, one of the most notorious anti-Semites of the 19th century. Also, stop watching Pixar, Marvel, and Star Wars films (among others), because remember that time when Walt Disney Studios (now Walt Disney Company) made a racist musical film, Song of the South (1946)? Disney’s tainted legacy (extending well beyond that one movie) is at least as awful as, say, Kevin Spacey’s, and we’re certainly not about to rehabilitate him.

/rant off

A potpourri of recent newsbits and developments. Sorry, no links or support provided. If you haven’t already heard of most of these, you must be living under a rock. On a moment’s consideration, that may not be such a bad place to dwell.

/rant on

I just made up the word of the title, but anyone could guess its origin easily. Many of today’s political and thought leaders (not quite the same thing; politics doesn’t require much thought), as well as American institutions, are busy creating outrageously preposterous legacies for themselves. Doomers like me doubt anyone will be around to recall in a few decades. For instance, the mainstream media (MSM) garners well-deserved rebuke, often attacking each other in the form of one of the memes of the day: a circular firing squad. Its brazen attempts at thought-control (different thrusts at different media organs) and pathetic abandonment of mission to inform the public with integrity have hollowed it out. No amount of rebranding at the New York Times (or elsewhere) will overcome the fact that the public has largely moved on, swapping superhero fiction for the ubiquitous fictions spun by the MSM and politicians. The RussiaGate debacle may be the worst example, but the MSM’s failures extend well beyond that. The U.S. stock market wobbles madly around its recent all-time high, refusing to admit its value has been severely overhyped and inflated through quantitative easing, cheap credit (an artificial monetary value not unlike cryptocurrencies or fiat currency created out of nothing besides social consensus), and corporate buybacks. The next crash (already well overdue) is like the last hurricane: we might get lucky and it will miss us this season, but eventually our lottery number will come up like those 100-year floods now occurring every few years or decades.

Public and higher education systems continue to creak along, producing a glut of dropouts and graduates ill-suited to do anything but the simplest of jobs requiring no critical thought, little training, and no actual knowledge or expertise. Robots and software will replace them anyway. Civility and empathy are cratering: most everyone is ready and willing to flip the bird, blame others, air their dirty laundry in public, and indulge in casual violence or even mayhem following only modest provocation. Who hasn’t fantasized just a little bit about acting out wildly, pointlessly like the mass killers blackening the calendar? It’s now de rigueur. Thus, the meme infiltrates and corrupts vulnerable minds regularly. Systemic failure of the U.S. healthcare and prison systems — which ought to be public institutions but are, like education, increasingly operated for profit to exploit public resources — continues to be exceptional among developed nations, as does the U.S. military and its bloated budget.

Gaffe-prone Democratic presidential candidate Joe Biden cemented his reputation as a goof years ago yet continues to build upon it. One might think that at his age enough would have been enough, but the allure of the highest office in the land is just too great, so he guilelessly applies for the job and the indulgence of the American public. Of course, the real prize-winner is 45, whose constant stream of idiocy and vitriol sends an entire nation scrambling daily to digest their Twitter feeds and make sense of things. Who knows (certainly I don’t) how serious was his remark that he wanted to buy Greenland? It makes a certain sense that a former real-estate developer would offhandedly recommend an entirely new land grab. After all, American history is based on colonialism and expansionism. No matter that the particular land in question is not for sale (didn’t matter for most of our history, either). Of course, everyone leapt into the news cycle with analysis or mockery, only the second of which was appropriate. Even more recent goofiness was 45’s apparent inability to read a map resulting in the suggestion that Hurricane Dorian might strike Alabama. Just as with the Greenland remark, PR flacks went to work to manage and reconfigure public memory, revising storm maps for after-the-fact justification. Has anyone in the media commented that such blatant historical revisionism is the stuff of authoritarian leaders (monarchs, despots, and tyrants) whose underlings and functionaries, fearing loss of livelihood if not indeed life, provide cover for mistakes that really ought to lead to simple admission of error and apology? Nope, just add more goofs to the heaping pile of preposterity.

Of course, the U.S. is hardly alone in these matters. Japan and Russia are busily managing perception of their respective ongoing nuclear disasters, including a new one in Russia that has barely broken through our collective ennui. Having followed the U.S. and others into industrialization and financialization of its economy, China is running up against the same well-known ecological despoliation and limits to growth and is now circling the drain with us. The added spectacle of a trade war with the petulant president in the U.S. distracts everyone from coming scarcity. England has its own clownish supreme leader, at least for now, trying to manage an intractable but binding issue: Brexit. (Does every head of state need a weirdo hairdo?) As with climate change, there is no solution, no matter how steadfastly one hopes and wishes a solution into existence, so whatever eventually happens will throw the region into chaos. Folks shooting each other for food and fresh water in the Bahamas post-Hurricane Dorian is a harbinger of violent hair-triggers in the U.S. poised to fire at anything that moves when true existential threats finally materialize. Thus, our collective human legacy is absurd and self-destroying. No more muddling through.

/rant off

The Judaeo-Christian dictum “go forth, be fruitful, and multiply” (Genesis 1:28, translations vary) was taken to heart not only by Jews and Christians but by people everywhere resources allowed. Prior to the modern era, human population remained in check because, among other things, high rates of infant and child mortality, pandemics, and famine were commonplace. Now that modern medicine, hygiene, and health deliver far more children into adulthood (and thus their breeding years) and our fossil fuel energy binge allows us to overproduce and overreproduce, population has spiked. While some herald human flourishing (mere quantity, not quality) as an unmitigated good, our massive human population raises the question: what to do with all the extra people? The American answer is already known: if they’re not productive citizens (read: labor for someone else’s profit), lock ’em up (ironically transforming them into profit centers using tax monies) or simply abandon them to live (and shit) on the streets of San Francisco or some other temperate, coastal city. If they’re foreigners competing for the same resources we (Americans) want for ourselves, well, just kill ’em (a different sort of disposal).

Those observations are really quite enough, ugly and obvious as they are. However, history isn’t yet done with us. Futurists warn that conditions will only worsen (well, duh!) as technological unemployment (robots and software soon to perform even more tasks that used to be handled by people paid money for their effort and expertise) causes more and more people to be tossed aside in venal pursuit of profit. Optimists and cheerleaders for the new technological utopia (read: dystopia) frequently offer as cold comfort that people with newfound time on their hands are free to become entrepreneurial or pursue creative endeavors. Never mind that basic needs (e.g., housing, food, clothing, and healthcare) must come first. The one thing that’s partially correct about the canard that everyone can magically transform themselves into small business owners or content creators is that we have become a nation of idlers fixated on entertainments of many varieties. That’s a real bottomless well. Some percentage (unknown by me) actually produces the content (TV shows, movies, music, books, blogs, journalism, YouTube channels, podcasts, social media feeds, video games, sports teams and competitions, etc.), all competing for attention, and those people are often rewarded handsomely if the medium produces giant subscription numbers and revenues. Most of it is just digital exhaust. I also judge that most of us are merely part of the audience or have failed to go viral (read: hit it big) if indeed we have anything on offer in the public sphere. Of course, disposable time and income drives the whole entertainment complex. Doubtful folks living in burgeoning American tent cities contribute anything to that economic sector.

It’s sometimes said that a society can be measured by how it treats its weakest members. The European social contract (much derided in the U.S.) takes that notion seriously and supports the down-and-out. The American social contract typically blames those who are weak, often through no fault of their own (e.g., medical bankruptcy), and kicks them when they’re down. Consider just one common measure of a person: intelligence. Though there are many measures of intelligence, the standard is IQ, which is computational, linguistic, and abstract. It’s taboo to dwell too much on differences, especially when mapped onto race, gender, or nationality, so I won’t go there. However, the standard, conservative distribution places most people in an average range between 90 and 110. A wider average range between 81 (low average) and 119 (high average) captures even more people before a small percentage of outliers are found at the extremes. Of course, almost everyone thinks him- or herself squarely in the upper half. As one descends deeper into the lower half, IQ deficits have been found to render a person unsuitable for most types of gainful employment, and some people are flatly unsuitable for any employment at all. What to do with those people? With U.S. population now just under 330 million, the lower half is roughly 165 million people! How many of those “useless eaters” are abandoned to their fates is hard to know, but it’s a far bigger number and problem than the ridiculous, unhelpful advice “learn to code” would suggest. The cruelty of the American social contract is plain to see.
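
For concreteness, here is a minimal sketch (standard-library Python, assuming the conventional IQ scaling of mean 100 and standard deviation 15, which the paragraph above doesn’t spell out) showing what share of a normal distribution falls inside the two average bands named above and what half of 330 million comes to.

```python
# Rough check of the distribution claims above, assuming IQ scores are normally
# distributed with mean 100 and standard deviation 15 (the conventional scaling).
from math import erf, sqrt

def normal_cdf(x, mean=100.0, sd=15.0):
    """Cumulative probability of a normal distribution evaluated at x."""
    return 0.5 * (1.0 + erf((x - mean) / (sd * sqrt(2.0))))

def share_between(lo, hi):
    """Fraction of the population falling between two IQ scores."""
    return normal_cdf(hi) - normal_cdf(lo)

print(f"Between 90 and 110: {share_between(90, 110):.0%}")   # roughly half
print(f"Between 81 and 119: {share_between(81, 119):.0%}")   # roughly four fifths

us_population = 330_000_000
print(f"Lower half of the distribution: about {us_population // 2:,} people")
```

Under those assumptions, the narrow band captures about half the population, the wider band just under 80%, and half of 330 million is, as noted, 165 million people.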