Archive for the ‘Nomenclature’ Category

/rant on

Since deleting from my blogroll all doom links and turning my attention elsewhere, the lurking dread of looming collapse (all sorts) has been at low ebb at The Spiral Staircase. Despite many indicators of imminent collapse likewise purged from front-page and top-of-the-broadcast news, evidence continues to mount while citizens contend with other issues, some political and geopolitical, others day-to-day tribulations stemming from politics, economics, and the ongoing pandemic. For instance, I only just recently learned that the Intergovernmental Panel on Climate Change (IPCC — oh yeah … them) issued AR6 last month, the sixth periodic Assessment Report (maybe instead call it the State of the Planet Report, on the model of the State of the Union Address). It’s long, dense reading (the full report is nearly 4,000 pp., whereas the summary for policymakers is a mere 42 pp.) and subject to nearly continuous revision and error correction. The conclusion? Climate change is widespread, rapid, and intensifying. And although it’s true that mundane daily activities occupy center stage in the lives of average folks, there is simply no bigger story or concern for government leaders (I choke on that term) and journalists (that one, too) than climate change because it represents (oh, I dunno …) the collapse of industrial civilization and the early phase of mass extinction. Thus, all politics, geopolitics, economic warfare, class struggle, Wokeism, authoritarian seizure of power, and propaganda filling the minds of people at all levels as well as the institutions they serve amount to a serious misallocation of attention and effort. I will admit, though, that it’s both exhausting and by now futile to worry too much about collapse. Maybe that’s why the climate emergency (the new, improved term) is relegated to background noise easily tuned out.

It’s not just background noise, though, unlike the foreknowledge that death awaits decades from now if one is fortunate enough to persist into one’s 70s or beyond. No, it’s here now, outside (literally and figuratively), knocking on the door. Turn off your screens and pay attention! (Ironically, everyone now gets the lion’s share of information from screens, not print. So sue me.) Why am I returning to this yet again? Maybe I’ve been reviewing too many dystopian films and novels. Better answer is that those charged with managing and administering states and nations are failing so miserably. It’s not even clear that they’re trying, so pardon me, but I’m rather incensed. It’s not that there aren’t plenty of knowledgeable experts compiling data, writing scientific reports, publishing books, and offering not solutions exactly but at least better ways to manage our affairs. Among those experts, the inability to reverse the climate emergency is well enough understood though not widely acknowledged. (See Macro-Futilism on my blogroll for at least one truth teller who absolutely gets it.) Instead, some lame version of the same dire warning issues again and again: if action isn’t taken now (NOW, dammit!), it will be too late and all will be lost. The collective response is not, however, to pull back, rein in, or even prepare for something less awful than the worst imaginable hard landing where absolutely no one survives despite the existence of boltholes and underground bunkers. Instead, it’s a nearly gleeful acceleration toward doom, like a gambler happily forking over his last twenty at the blackjack table before losing and chucking himself off the top of the casino parking structure. Finally free (if fleetingly)!

Will festering public frustration over deteriorating social conditions tip over into outright revolt, revolution, civil war, and/or regime change? Doesn’t have to be just one. Why is the U.S. still developing and stockpiling armaments, maintaining hundreds of U.S. military bases abroad, and fighting costly, pointless wars of empire (defeat in withdrawal from Afghanistan notwithstanding)? Will destruction of purchasing power of the U.S. dollar continue to manifest as inflation of food and energy costs? Is the U.S. Dept. of Agriculture actually doing anything to secure food systems, or does it merely prepare reports like the AR6 that no one reads or acts upon? Will fragile supply lines be allowed to fail entirely, sparking desperation and unrest in the streets far worse than summer 2020? Famine is how some believe collapse will trigger a megadeath pulse, but I wouldn’t count out chaotic violence among the citizenry, probably exacerbated and escalated as regimes attempt (unsuccessfully) to restore social order. Are any meaningful steps being taken to stop sucking from the fossil fuel teat and return to small-scale agrarian social organization, establishing degrowth and effectively returning to the land (repatriation is my preferred term) instead of going under it? Greenwashing doesn’t count. This headline (“We Live In A World Without Consequences Where Everyone Is Corrupt”) demonstrates pretty well that garbage economics are what pass for governance, primarily preoccupied with maintaining the capitalist leviathan that has captured everything (capture ought to be the trending word of 2021 but sadly isn’t). Under such constraint, aged institutions are flatly unable to accomplish or even address their missions anymore.
And this headline (“Polls Show That The American People Are Extremely Angry – And They Are About To Get Even Angrier”) promises that things are about to get much, much worse — for the obvious reason that more and more people are at the ends of their ropes while the privileged few attend the Met Gala, virtue signal with their butts, and behave as though society isn’t in fact cracking up. (Omitted the obvious-but-erroneous follow-on “before they get better.”) Soon enough, we’ll get to truth-test Heinlein’s misunderstood aphorism “… an armed society is a polite society.”

Those who prophesy dates or deadlines for collapse have often been slightly embarrassed (but relieved) that collapse didn’t arrive on schedule. Against all odds, human history keeps trudging further into borrowed time, kicking cans down roads, blowing bubbles, spinning false narratives, insisting that all this is fine, and otherwise living in make-believe land. Civilization has not quite yet reached the end of all things, but developments over the last couple months feel ever more keenly like the equivalent of Frodo and Sam sitting atop Mount Doom, just outside the Cracks of Doom (a/k/a Sammath Naur), except that humanity is not on a noble, sacrificial mission to unmake the One Ring, whatever that might represent outside of fiction (for Tolkien, probably industrial machines capable of planetary destruction, either slowly and steadily or all at once; for 21st-century armchair social critics like me, capitalism). All former certainties, guarantees, sureties, continuities, and warranties are slipping away despite the current administration’s assurances that the status quo will be maintained. Or maybe it’s merely the transition of summer into fall, presaging the annual dormancy of winter looking suspiciously this year like the great dying. Whatever. From this moment on and in a fit of exuberant pique, I’m willing to call the contest: humanity is now decidedly on the down slope. The true end of history approaches, as no one will be left to tell the tale. When, precisely, the runaway train finally careens over the cliff remains unknown though entirely foreseeable. The concentration of goofy references, clichés, and catchphrases above — usually the mark of sophomoric writing — inspires me to indulge (further) in gallows humor. Consider these metaphors (some mixed) suggesting that time is running out:

  • the show’s not over til it’s over, but the credits are rolling
  • the chickens are coming home to roost
  • the canary in the coal mine is gasping its last breath
  • the fat lady is singing her swan song
  • borrowed time is nearly up
  • time to join the great majority (I see dead people …)
  • the West fades into the west
  • kiss your babies goodnight and kiss your ass goodbye

/rant off

Stolen ruthlessly from this comment:

“Briefing” — complex issue, often with a singular perception, summarized by emotionally-charged language

“Humanitarian Crisis” — warfare that represents a profit loss

“safety” — population control

“God bless our troops” — the pawns are in play

“we” — everybody lacking self-awareness, definition or firm ideology

“necessary” — power might be lost

“mistaken or untrue” — informed

“authorized” — dictated by unknown/unrecognized/unelected individuals

“analysis” — distortion and/or deception

“attack” — criticism

“schools” — training centers

“vaccine” — experiment

“economy” — Wall Street

“spokesperson” — sock-puppet with seniority

“Congressman/woman” — sock-puppet lacking curiosity

“budget” — waste benefiting entrenched interests

“studies” — financed, subjective conclusions supporting a narrative

“healthcare” — problematic life-extending effort

“entitlement” — potentially problematic (and often expensive) promise

“President” — ignoramus who looks good on camera

“Vice-President” — moron who looks bad in public

“country” — products of failing/failed education system, including universities and graduate schools

“jobs” — human capital

“workers” — human capital that trade time for diminishing return called money

“money” — a sleight of hand devaluation of labor/time

“election” — popularized deception indicating the perception of change

“change” — shift in media coverage

“media coverage” — public distraction

“misinformation” — inconvenient truth

“information” — allowable opinion

“Republican” — warmonger with stock-options

“Democrat” — warmonger with stock-options who smiles a lot

“Trump” — disrupter

“Biden” — unconscious

“Harris” — unsettling laugh-machine

“press” — public assault on intellect

“TV news” — public assault on intellect and wallet

“wallets” — formerly representative of individual wealth, utility & empowerment

“voting” — self-righteous optimism bordering on delusion

“military service” — economically disadvantaged and uninformed job-seeker

“military officer” — obedient manager

“American dream” — outdated, empty promise of opportunity/improvement/evolution

“terrorist” — foreign patriot engaging in violence

“domestic terrorist” — former believer of outdated, empty promise of opportunity/improvement/evolution

“education” — structured perception

“science” — bureaucratic control

“internet” — threat

“faith” — relatively rigid ideology encouraging independent collaboration on collective issues

“religion” — dangerous introspection discouraging/disallowing government control

“Bernie Sanders” — disruption

“Peoples Party” — collective disruption

“Libertarian” — individual disruption expressed locally

“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signalling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but to human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently, albeit temporarily, live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations to funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coin. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are not effective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. It used to be said that the Internet is forever, so wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) can be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterance. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

For more than a decade, I’ve had in the back of my mind a blog post called “The Power of Naming” to remark that bestowing a name gives something power, substance, and in a sense, reality. That post never really came together, but its inverse did. Anyway, here’s a renewed attempt.

The period of language acquisition in early childhood is suffused with learning the names of things, most of which is passive. Names of animals (associated closely with sounds they make) are often a special focus using picture books. The kitty, doggie, and horsie eventually become the cat, dog, and horse. Similarly, the moo-cow and the tweety-bird shorten to cow and bird (though songbird may be an acceptable holdover). Words in the abstract are signifiers of the actual things, aided by the text symbols learned in literate cultures to reinforce mere categories instead of examples grounded in reality. Multiply the names of things several hundred thousand times into adulthood and indeed throughout life and one can develop a formidable vocabulary supporting expressive and nuanced thought and speech. Do you know the differences between acute, right, obtuse, straight, and reflex angles? Does it matter? Does your knowledge of barware inform when to use a flute, coupe, snifter, shot (or shooter or caballito), nosing glass (or Glencairn), tumbler, tankard, goblet, sling, and Stein? I’d say you’ve missed something by never having drunk dark beer (Ger.: Schwarzbier) from a frosted schooner. All these varieties developed for reasons that remain invisible to someone content to drink everything from the venerable red Solo cup. Funnily enough, the red Solo cup now comes in different versions, fooling precisely no one.

Returning to book blogging, Walter Ong (in Orality and Literacy) has curious comparisons between primarily oral cultures and literate cultures. For example:

Oral people commonly think of names (one kind of words) as conveying power over things. Explanations of Adam’s naming of the animals in Genesis 2:20 usually call condescending attention to this presumably quaint archaic belief. Such a belief is in fact far less quaint than it seems to unreflective chirographic and typographic folk. First of all, names do give human beings power over what they name: without learning a vast store of names, one is simply powerless to understand, for example, chemistry and to practice chemical engineering. And so with all other intellectual knowledge. Secondly, chirographic and typographic folk tend to think of names as labels, written or printed tags imaginatively affixed to an object named. Oral folk have no sense of a name as a tag, for they have no idea of a name as something that can be seen. Written or printed representations of words can be labels; real, spoken words cannot be. [p. 33]

This gets at something that has been developing over the past few decades, namely, that as otherwise literate (or functionally literate) people gather more and more information through electronic media (screens that serve broadcast and cable TV, YouTube videos, prerecorded news for streaming, and podcasts, and most importantly, audiobooks — all of which speak content to listeners), the spoken word (re)gains primacy and the printed word fades into disuse. Electronic media may produce a hybrid of orality/literacy, but words are no longer silent, internal, and abstract. Indeed, words — all by themselves — are understood as being capable of violence. Gone are the days when “sticks and stones ….” Now, fighting words incite and insults sting again.

Not so long ago, it was possible to provoke a duel with an insult or gesture, such as a glove across the face. Among some people, defense of honor never really disappeared (though dueling did). History has taken a strange turn, however. Proposed legislation to criminalize deadnaming (presumably to protect a small but growing number of transgender and nonbinary people who have redefined their gender identity and accordingly adopted different names) recognizes the violence of words but then tries to transmute the offense into an abstract criminal law. It’s deeply mixed up, and I don’t have the patience to sort it out.

More to say in later blog posts, but I’ll raise the Counter-Enlightenment once more to say that the nature of modern consciousness is shifting somewhat radically in response to stimuli and pressures that grew out of an information environment, roughly 70 years old now but transformed even more fundamentally in the last 25 years, that is substantially discontinuous from centuries-old traditions. Those traditions displaced even older traditions inherited from antiquity. Such is the way of the world, I suppose, and with the benefit of Walter Ong’s insights, my appreciation of the outlines is taking better shape.

Wanted to provide an update to the previous post in my book-blogging project on Walter Ong’s Orality and Literacy to correct something that wasn’t clear to me at first. The term chirographic refers to writing, but I conflated writing more generally with literacy. Ong actually distinguishes chirographic (writing) from typographic (type or print) and includes another category: electronic media.

Jack Goody … has convincingly shown how shifts hitherto labeled as shifts from magic to science, or from the so-called ‘prelogical’ to the more and more ‘rational’ state of consciousness, or from Lévi-Strauss’s ‘savage’ mind to domesticated thought, can be more economically and cogently explained as shifts from orality to various stages of literacy … Marshall McLuhan’s … cardinal gnomic saying, ‘The medium is the message’, registered his acute awareness of the importance of the shift from orality through literacy and print to electronic media. [pp. 28–29]

So the book’s primary contrast is between orality and literacy, but literacy has a sequence of historical developments: chirographic, typographic, and electronic media. These stages are not used interchangeably by Ong. Indeed, they exist simultaneously in the modern world and all contribute to overall literacy while each possesses unique characteristics. For instance, reading from handwriting (printing or cursive, the latter far less widely used now except for signatures) is different from reading from print on paper or on the screen. Further, writing by hand, typing on a typewriter, typing into a word-processor, and composing text on a smartphone each has its effects on mental processes and outputs. Ong also mentions remnants of orality that have not yet been fully extinguished. So the exact mindset or style of consciousness derived from orality vs. literacy is neither fixed nor established universally but contains aspects from each category and subcategory.

Ong also takes a swing at Julian Jaynes. Considering that Jaynes’ book The Origin of Consciousness in the Breakdown of the Bicameral Mind (1977) (see this overview) was published only seven years prior to Orality and Literacy (1982), the impact of Jaynes’ thesis must have still been felt quite strongly (as it is now among some thinkers). Yet Ong disposes of Jaynes rather parsimoniously, stating

… if attention to sophisticated orality-literacy contrasts is growing in some circles, it is still relatively rare in many fields where it could be helpful. For example, the early and late stages of consciousness which Julian Jaynes (1977) describes and relates to neuro-physiological changes in the bicameral mind would also appear to lend themselves largely to much simpler and more verifiable descriptions in terms of a shift from orality to literacy. [p. 29]

In light of the details above, it’s probably not accurate to say (as I did before) that we are returning to orality from literacy. Rather, the synthesis of characteristics is shifting, as it always has, in relation to new stimuli and media. Since the advent of cinema and TV — the first screens, now supplemented by the computer and smartphone — the way humans consume information is undergoing yet another shift. Or perhaps it’s better to conclude that it’s always been shifting, not unlike how we have always been and are still evolving, though the timescales are usually too slow to observe without specialized training and analysis. Shifts in consciousness arguably occur far more quickly than biological evolution, and the rate at which new superstimuli are introduced into the information environment suggests radical discontinuity with even the recent past — something that used to be called the generation gap.

I’ve always wondered what media theorists such as McLuhan (d. 1980), Neil Postman (d. 2003), and now Ong (d. 2003) would make of the 21st century had they lived long enough to witness what has been happening, with 2014–2015 being the significant inflection point according to Jonathan Haidt. (No doubt there are other media theorists working on this issue who have not risen to my attention.) Numerous other analyses point instead to the early 20th century as the era when industrial civilization harnessed fossil fuels and turned the mechanisms and technologies of innovators decidedly against humanity. Pick your branching point.

Caveat: this post is uncharacteristically long and perhaps a bit disjointed. Or perhaps an emerging blogging style is being forged. Be forewarned.

Sam Harris has been the subject of or mentioned in numerous previous blog posts. His podcast Making Sense (formerly, Waking Up), partially behind a paywall but generously offered for free (no questions asked) to those claiming financial hardship, used to be among those I tuned into regularly. Like the Joe Rogan Experience (soon moving to Spotify — does that mean its disappearance from YouTube?), the diversity of guests and reliable intellectual stimulation have been attractive. Calling his podcast Making Sense aligns with my earnest concern over actually making sense of things as the world spins out of control and our epistemological crisis deepens. Yet Harris has been a controversial figure since coming to prominence as a militant atheist. I really want to like what Harris offers, but regrettably, he has lost (most of) my attention. Others reaching the same conclusion have written or vlogged their reasons, e.g., “Why I’m no longer a fan of ….” Do a search.

Having already ranted over specific issues Harris has raised, let me instead register three general complaints. First, once a subject is open for discussion, it’s flogged to death, often without reaching any sort of conclusion, or frankly, helping to make sense. For instance, Harris’ solo discussion (no link) regarding facets of the killing of George Floyd in May 2020, which event sparked still unabated civil unrest, did more to confuse than clarify. It was as though Harris were trying the court case by himself, without a judge, jury, or opposing counsel. My second complaint is that Harris’ verbosity, while impressive in many respects, leads to interviews marred by long-winded, one-sided speeches where the thread is hopelessly lost, blocking an interlocutor from tracking and responding effectively. Whether Harris intends to bury others under an avalanche of argument or does so uncontrollably doesn’t matter. It’s still a Gish gallop. Third is his over-emphasis on hypotheticals and thought experiments. Extrapolation is a useful but limited rhetorical technique, as is distillation. However, treating prospective events as certainties is tantamount to building arguments on poor foundations, namely, abstractions. Much as I admire Harris’ ambition to carve out a space within the public sphere to get paid for thinking and discussing topics of significant political and philosophical currency, he frustrates me enough that I rarely tune in anymore.


Caveat: rather overlong for me, but I got rolling …

One of the better articles I’ve read about the pandemic is this one by Robert Skidelsky at Project Syndicate (a publication I had never heard of before). It reads as only slightly conspiratorial, purporting to reveal the true motivation for lockdowns and social distancing, namely, so-called herd immunity. If that’s the case, it’s basically a silent admission that no cure, vaccine, or inoculation is forthcoming and the spread of the virus can only be managed modestly until it has essentially raced through the population. Of course, the virus cannot be allowed to simply run its course unimpeded, but available impediments are limited. “Flattening the curve,” or distributing the infection and death rates over time, is the only attainable strategy and objective.

Wedding mathematical and biological insights, as well as the law of mass action in chemistry, into an epidemic model may seem obvious now, but it was novel roughly a century ago. We’re also now inclined, if scientifically oriented and informed, to understand the problem and the management of its potential solutions in terms of engineering rather than medicine (or maybe in terms of triage and palliation). Global response has also made the pandemic into a political issue as governments obfuscate and conceal true motivations behind their handling (bumbling in the U.S.) of the pandemic. Curiously, the article also mentions financial contagion, which is shaping up to be worse in both severity and duration than the viral pandemic itself.
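For the curious, the century-old mass-action epidemic model alluded to above (the SIR framework) can be sketched in a few lines, and it makes “flattening the curve” concrete: a lower contact rate yields a later, lower peak of simultaneous infections. The parameter values below are purely illustrative, not fitted to any real outbreak.

```python
# Minimal SIR epidemic model with Euler integration. S, I, R are
# fractions of the population: susceptible, infected, recovered.
# The beta * S * I term is the mass-action contact assumption.

def sir_peak_infected(beta, gamma=0.1, days=400, dt=1.0):
    """Simulate the epidemic and return the peak infected fraction."""
    s, i, r = 0.999, 0.001, 0.0  # nearly everyone starts susceptible
    peak = i
    for _ in range(int(days / dt)):
        new_infections = beta * s * i * dt  # mass-action contact term
        recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - recoveries
        r += recoveries
        peak = max(peak, i)
    return peak

# Distancing lowers the contact rate beta; the same epidemic then
# unfolds over a longer period with a lower peak — a flattened curve.
print(f"peak infected, unmitigated (beta=0.4): {sir_peak_infected(0.4):.0%}")
print(f"peak infected, mitigated (beta=0.2):   {sir_peak_infected(0.2):.0%}")
```

The engineering framing mentioned above falls out directly: the policy lever is `beta`, and the objective is keeping the peak under hospital capacity rather than curing anyone.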


/rant on

Had a rather dark thought, which recurs but then fades out of awareness and memory until conditions reassert it. Simply put, it’s that the mover-shaker-decision-maker sociopath types in government, corporations, and elsewhere (I refuse to use the term influencer) are typically well protected (primarily by virtue of immense wealth) from threats regular folks face and are accordingly only too willing to sit idly by, scarcely lifting a finger in aid or assistance, and watch dispassionately as others scramble and scrape in response to the buffeting torrents of history. The famous example (even if not wholly accurate) of patrician, disdainful lack of empathy toward others’ plight is Marie Antoinette’s remark: “Let them eat cake.” Citing an 18th-century monarch indicates that such tone-deaf sentiment has been around for a long time.

Let me put it another way, since many of our problems are of our own creation. Our styles of social organization and their concomitant institutions are so overloaded with internal conflict and corruption, which we refuse to eradicate, that it’s as though we continuously tempt fate like fools playing Russian roulette. If we were truly a unified nation, maybe we’d wise up and adopt a different organizational model. But we don’t shoulder risk or enjoy reward evenly. Rather, the disenfranchised and most vulnerable among us, determined in a variety of ways but forming a substantial majority, have revolvers to their heads with a single bullet in one of five or six chambers while the least vulnerable (the notorious 1%) have, in effect, thousands or millions of chambers and an exceedingly remote chance of firing the one with the bullet. Thus, vulnerability roulette.
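The metaphor above rewards a little arithmetic. Under repeated spins, the gap between a six-chamber revolver and a million-chamber one compounds brutally; the chamber counts here are the post’s own hypothetical figures, not data.

```python
# Toy model of "vulnerability roulette": probability of surviving
# repeated independent spins of a revolver holding one bullet in
# `chambers` chambers.

def survival_probability(chambers, spins):
    """P(never firing the bullet) after `spins` independent spins."""
    p_safe_per_spin = 1 - 1 / chambers
    return p_safe_per_spin ** spins

# The vulnerable majority vs. the insulated few, spun monthly for a year.
print(f"6 chambers, 12 spins:         {survival_probability(6, 12):.1%}")
print(f"1,000,000 chambers, 12 spins: {survival_probability(1_000_000, 12):.4%}")
```

A year of monthly spins leaves the six-chamber player with only about a one-in-nine chance of surviving, while the million-chamber player’s risk remains vanishingly small, which is roughly the post’s point about who actually bears the crisis.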

In the midst of an epochal pandemic and financial crisis, who gets sacrificed like so much cannon fodder while others retreat onto their ocean-going yachts or into their boltholes to isolate from the rabble? Everyone knows it’s always the bottom rungs of the socioeconomic ladder who unjustly suffer the worst, a distinctly raw deal unlikely ever to change. The middle rungs are also suffering now as contraction affects more and more formerly enfranchised groups. Meanwhile, those at the top use crises as opportunities for further plunder. In an article in Rolling Stone, independent journalist Matt Taibbi, who covered the 2008 financial collapse, observes that our fearless leaders (fearless because they secure themselves before and above all else) again made whole the wealthiest few at the considerable expense of the rest:

The $2.3 trillion CARES Act, the Donald Trump-led rescue package signed into law on March 27th, is a radical rethink of American capitalism. It retains all the cruelties of the free market for those who live and work in the real world, but turns the paper economy into a state protectorate, surrounded by a kind of Trumpian Money Wall that is designed to keep the investor class safe from fear of loss.

This financial economy is a fantasy casino, where the winnings are real but free chips cover the losses. For a rarefied segment of society, failure is being written out of the capitalist bargain.

Why is this a “radical rethink”? We’ve seen identical behaviors before: privatization of profit, indemnification of loss, looting of the treasury, and refusal to prosecute exploitation, torture, and crimes against humanity. Referring specifically to financialization, this is what the phrase “too big to fail” means in a nutshell, and we’ve been down this stretch of road repeatedly.

Naturally, the investor class isn’t ordered back to work at slaughterhouses and groceries to brave the epidemic. Low-wage laborers are. Interestingly, well-compensated healthcare workers are also on the vulnerability roulette firing line — part of their professional oaths and duties — but that industry is straining under pressure from its inability to maintain profitability during the pandemic. Many healthcare workers are being sacrificed, too. Then there are the tens of millions newly unemployed and uninsured being told that the roulette must continue into further months of quarantine, the equivalent of adding bullets to the chambers until their destruction is assured. The pittance of support for those folks (relief checks delayed or missing without explanation or recourse, plus unemployment insurance for those who qualify, meaning those not already forced into the gig economy) does little to stave off catastrophe.

Others around the Web have examined the details of several rounds of bailout legislation and found them unjust in the extreme. Many of the provisions actually heap insult and further injury upon injury. Steps that could have been taken, and in some instances were undertaken in past crises (such as during the Great Depression), don’t even rate consideration. Those safeguards might include debt cancellation, universal basic income (perhaps temporary), government-supported healthcare for all, and reemployment through New Deal-style programs. Instead, the masses are largely left to fend for themselves, much like the failed Federal response to Hurricane Katrina.

Some of this is no doubt ideological. Only a professional class of ruling elites can be entrusted with guiding the ship of state, or so goes the political philosophy. But in our capitalist system, government has been purposefully hamstrung and hollowed out to the point of dysfunction precisely so that private enterprise can step in. And when magical market forces fail to stem the slide into oblivion, “Welp, sorry, th-th-that’s all folks,” say the supposed elite. “Nothing we can do to ease your suffering! Our attentions turn instead to ourselves, the courtiers and sycophants surrounding us, and the institutions that enable our perfidy. Now go fuck off somewhere and die, troubling us no more.”

/rant off

A complex of interrelated findings about how consciousness handles the focus of perception has been making the rounds. Folks are recognizing the limited time each of us has to deal with everything pressing upon us for attention and are adopting the notion of the bandwidth of consciousness: the limited amount of perception / memory / thought one can access or hold at the forefront of attention compared to the much larger amount occurring continuously outside of awareness (or figuratively, under the hood). Similarly, the myriad ways attention is diverted by advertisers and social media (to name just two examples) to channel consumer behaviors or increase time-on-device metrics have become commonplace topics of discussion. I’ve used the terms information environment, media ecology, and attention economy in past posts on this broad topic.

Among the most important observations is how the modern infosphere has become saturated with content, much of it entirely pointless (when not actively disorienting or destructive), and how many of us willingly tune into it without interruption via handheld screens and earbuds. It’s a steady flow of stimulation (overstimulation, frankly) that is the new normal for those born and/or bred to the screen (media addicts). Its absence or interruption is discomfiting (like a toddler’s separation anxiety). However, mental processing of information overflow is tantamount to drinking from a fire hose: only a modest fraction of the volume rushing nonstop can be swallowed. Promoters of meditation and presencing, whether implied or manifest, also recognize that human cognition requires time and repose to process and consolidate experience, transforming it into useful knowledge and long-term memory. More stimulation piled on top simply spills over, like a faucet filling the bathtub faster than the drain can empty it, water slopping onto the floor like digital exhaust. Too bad that the sales pitch of these promoters is typically getting more done, because dontcha know, more is better even when recommending less.

Quanta Magazine has a pair of articles (first and second) by the same author (Jordana Cepelewicz) describing how the spotlight metaphor for attention captures only part of how cognition works. Many presume that the mind normally directs awareness or attention to whatever the self prioritizes — a top-down executive function. However, as any loud noise, erratic movement, or sharp pain demonstrates, some stimuli are promoted to awareness by virtue of their individual character — a bottom-up reflex. The fuller explanation is that neuroscientists are busy researching brain circuits and structures that prune, filter, or gate the bulk of incoming stimuli so that attention can be focused on the most important bits. For instance, the article mentions how visual perception circuits process categories of small and large differently, partly to separate figure from ground. Indeed, for cognition to work at all, a plethora of inhibitory functions enables focus on a relatively narrow subset of stimuli selected from the larger set of available stimuli.

These discussions about cognition (including philosophical arguments about (1) human agency vs. no free will or (2) whether humans exist within reality or are merely simulations running inside some computer or inscrutable artificial intelligence) so often get lost in the weeds. They read like distinctions without differences. No doubt these are interesting subjects to contemplate, but at the same time, they’re sorta banal — fodder for scientists and eggheads that most average folks dismiss out of hand. In fact, selective and inhibitory mechanisms are found elsewhere in human physiology, such as opposing pairs of muscles that move limbs to and fro or appetite stimulants / depressants (alternatively, activators and deactivators) operating in tandem. Moreover, interactions are often not binary (on or off) but continuously variable. For my earlier post on this subject, see this.

/rant on

Yet another journalist has unburdened herself (unbidden story of personal discovery masquerading as news) of her addiction to digital media and her steps to free herself from the compulsion to be always logged onto the onslaught of useless information hurled at everyone nonstop. Other breaking news offered by our intrepid late-to-the-story reporter: water is wet, sunburn stings, and the Earth is dying (actually, we humans are actively killing it for profit). Freeing oneself from the screen is variously called digital detoxification (detox for short), digital minimalism, digital disengagement, digital decoupling, and digital decluttering (really ought to be called digital denunciation) and means limiting the duration of exposure to digital media and/or deleting one’s social media accounts entirely. Naturally, there are apps (counters, timers, locks) for that. Although the article offers advice for how to disentangle from screen addictions of the duh! variety (um, just hit the power switch), the hidden-in-plain-sight objective is really how to reengage after breaking one’s compulsions but this time asserting control over the infernal devices that have taken over life. It’s a love-hate style of technophilia, chock-full of illusions that would embarrass even children. Because the article is nominally journalism, the author surveys books, articles, software, media platforms, refuseniks, gurus, and opinions galore. So she’s partially informed but still hasn’t demonstrated a basic grasp of media theory, the attention economy, or surveillance capitalism, all of which relate directly. Perhaps she should bring those investigative journalism skills to bear on Jaron Lanier, one of the more trenchant critics of living online.

I rant because the embedded assumption is that anything, everything occurring online is what truly matters — even though online media didn’t yet exist as recently as thirty years ago — and that one must (must, I say! c’mon, keep up!) always be paying attention to matter in turn or suffer from FOMO. Arguments in favor of needing to be online for information and news gathering are weak and ahistorical. No doubt the twisted and manipulated results of Google searches, sometimes contentious Wikipedia entries, and various dehumanizing, self-as-brand social media platforms are crutches we all now use — some waaaay, way more than others — but they’re nowhere close to the only or best way to absorb knowledge or stay in touch with family and friends. Career networking in the gig economy might require some basic level of connection but shouldn’t need to be the all-encompassing, soul-destroying work that maintaining an active public persona has become.

Thus, everyone is chasing likes and follows and retweets and reblogs and other analytics as evidence of somehow being relevant on the sea of ephemera floating around us like so much disused, discarded plastic in those infamous garbage gyres. (I don’t bother to chase and wouldn’t know how to drive traffic anyway. Screw all those solicitations for search-engine optimization. Paying for clicks is for chumps, though lots apparently do it to enhance (read: lie about) their analytics.) One’s online profile is accordingly a mirror of or even a substitute for the self — a facsimile self. Lost somewhere in my backblog (searched, couldn’t find it) is a post referencing several technophiles positively celebrating the bogus extension of the self accomplished by developing and burnishing an online profile. It’s the domain of celebrities, fame whores, narcissists, and sociopaths, not to mention a few criminals. Oh, and speaking of criminals, recent news is that OJ Simpson just opened a Twitter account, ostensibly to reform his disastrous public image, but he’s fundamentally outta touch with how deeply icky, distasteful, and disgusting it feels to others for him to be participating once again in the public sphere. Disgraced celebrities (criminals, really) negatively associated with the Me-Too Movement (is there really such a movement or was it merely a passing hashtag?) have mostly crawled under their respective multimillion-dollar rocks and not been heard from again. Those few who have tried to reemerge are typically met with revulsion and hostility (plus some inevitable star-fuckers with short memories). Hard to say when, if at all, forgiveness and rejoining society become appropriate.

/rant off