Posts Tagged ‘Neologisms’

Buzzwords circulating heavily in the public sphere these days include equality, equity, inclusion, representation, diversity, pluralism, multiculturalism, and privilege. How they are defined, understood, and implemented are contentious issues that never seem to resolve. Indeed, looking back on decade after decade of activism undertaken to address various social scourges, one sees limited progress, amounting to tinkering around the edges. The underlying bigotry (opinion, motivation, activity) has never really been eradicated, though it may be diminished somewhat through attrition. Sexism and racism in particular (classics in an expanding universe of -isms) continue to rage despite surface features taking on salutary aspects. Continued activism uses the buzzwords above (and others) as bludgeons to win rhetorical battles — frequently attacks on language itself through neologism, redefinition, and reclamation — without really addressing the stains on our souls that perpetuate problematic thinking (as though anyone ever had a lock on RightThink). Violence (application of actual force, not just mean or emphatic words) is the tool of choice deployed by those convinced their agenda is more important than general societal health and wellbeing. Is violence sometimes appropriate in pursuit of social justice? Yes, in some circumstances, probably so. Is this a call for violent protest? No.

Buzzwords stand in for what we may think we want as a society. But there’s built-in tension between competition and cooperation, or alternatively, individual and society (see the start of this multipart blog post) and all the social units that nest between. Each has its own desires (known in politics by the quaint term special interests), which don’t usually combine to form just societies despite mythology to that effect (e.g., the invisible hand). Rather, competition results in winners and losers even when the playing field is fair, which isn’t often. To address what is sometimes understood (or misunderstood, hard to know) as structural inequity, activists advocate privileging disenfranchised groups. Competence and merit are often sacrificed in the process. Various forms of criminal and sociopathic maneuvering also keep a sizeable portion of the population (the disenfranchised) in a perpetual and unnecessary state of desperation. That’s the class struggle.

So here’s my beef: if privilege (earned or unearned) is categorically bad because it’s been concentrated in a narrow class (who then position themselves to retain and/or grow it), why do activists seek to redistribute privilege by bestowing it on the downtrodden? Isn’t that a recipe for destroying ambition? If the game instead becomes about deploying one or more identifiers of oppression to claim privilege rather than working diligently to achieve a legitimate goal, such as acquiring skill or understanding, why bother to try hard? Shortcuts magically appear and inherent laziness is incentivized. Result: the emergent dynamic flattens valuable, nay necessary, competence hierarchies. In its communist formulation, social justice is achieved by making everyone equally precarious and miserable. Socialism fares somewhat better. Ideologues throughout history have wrecked societies (and their members) by redistributing or demolishing privilege forcibly while hypocritically retaining privilege for themselves. Ideology never seems to work out as theorized, though the current state of affairs (radical inequality) is arguably no more just.

More to unpack in further installments.

“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signalling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but to human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently, albeit temporarily, live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations to funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coin. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety as though the tweet or post never happened. (I’ve never deleted a post, nor do I have plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are ineffective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. It used to be said that the Internet is forever, so wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) can be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control, specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established, and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

Periodically, I come across preposterously stupid arguments (in person and online) I can’t even begin to dispel. One such argument is that carbon is plant food, so we needn’t worry about greenhouse gases such as carbon dioxide, a byproduct of industrial activity. Although I’m unconvinced by such arrant capsule arguments, I’m also in a lousy position to contend with them because convincing evidence lies outside my scientific expertise. Moreover, evidence (should I bother to gather it) is too complex and involved to fit within a typical conversation or simple explanation. Plus, evidence relies on scientific literacy and critical reasoning often lacking in the lay public. Scientific principles work better for me than, for example, the finely tuned balances Nature is constantly tinkering with — something we humans can hope to discover only partially. Yet we sally forth aggressively and heedlessly to manipulate Nature at our peril, which often results in precisely the sort of unintended consequence scientists in Brazil found when mosquitoes genetically altered (to reduce their numbers as carriers of disease) developed into mosquitoes hardier and more difficult to eradicate than if we had done nothing. The notion that trees respond favorably to increased carbon in the atmosphere has been a thorn in my side for some time. Maybe it’s even partly true; I can’t say. However, the biological and geophysical principle I adhere to is that even small changes in geochemistry (minute according to some scales, e.g., parts per million or per billion) have wildly disproportionate effects. The main effect today is climate changing so fast that many organisms can’t adapt or evolve quickly enough to keep up. Instead, they’re dying en masse and going extinct.

The best analogy is the narrow range of healthy human body temperature centered on 98.6 °F. Vary not far up (fever) or down (hypothermia) and human physiology suffers, quickly becoming life-threatening. Indeed, even in good health, we humans expend no small effort keeping body temperature from extending far into either margin. Earth also regulates itself through a variety of blind mechanisms that are in the process of being wrecked by human activity, which has risen by now to the level of terraforming, much as a keystone species alters its environment. So as the planet develops the equivalent of a fever, weather systems and climate (not the same things) react, mostly in ways that make life on the surface much harder to sustain and survive. As a result, trees are in the process of dying. Gail Zawacki’s blog At Wit’s End (on my blogroll) explores this topic in excruciating and demoralizing detail. Those who are inclined to deny offhandedly are invited to explore her blog. The taiga (boreal forest) and the Amazonian rainforest are among the most significant ecological formations and carbon sinks on the planet. Yet both are threatened biomes. Deforestation and tree die-off is widespread, of course. For example, since 2010, an estimated 129 million trees in California have died from drought and bark beetle infestation. In Colorado, an estimated 800 million-plus dead trees still standing (called snags) are essentially firestarter. To my way of thinking, the slow, merciless death of trees is no small matter, and affected habitats may eventually be relegated to sacrifice zones like areas associated with mining and oil extraction.

As counterpart to the bait “carbon is plant food,” let me suggest that the trees have begun to rebel by falling over at the propitious moment to injure and/or kill hikers and campers. According to this article at Outside Magazine, the woods are not safe. So if mosquitoes, rattlesnakes, mountain lions, or bears don’t getcha first, beware of the trees. Even broken branches and dead tree trunks that haven’t fallen fully to the ground (known as hung snags, widow-makers, and foolkillers) are known to take aim at human interlopers. This is not without precedent. In The Lord of the Rings, remember that the Ents (tree herders) went to war with Isengard, while the Huorns destroyed utterly the Orcs who had laid siege to Helm’s Deep. Tolkien’s tale is but a sliver of a much larger folklore regarding the enchanted forest, where men are lost or absorbed (as with another Tolkien character, Old Man Willow). Veneration of elemental forces of nature (symbols of both life and its inverse, death) is part of our shared mythology, though muted in an era of supposed scientific sobriety. M. Night Shyamalan explores similar themes, weakly, in several of his films. Perhaps Tolkien understood at an intuitive level the silent anger and resentment of the trees, slow to manifest though it is, and their eventual rebellion over mistreatment by men. It’s happening again, right now, all around us. Go ahead: prove me wrong.

This Savage Love column got my attention. As with Dear Abby, Ask Marilyn, or indeed any advice column, I surmise that questions are edited for publication. Still, a couple minor usage errors attracted my eye, which I can let go without further chastising comment. More importantly, question and answer both employ a type of Newspeak commonplace among those attuned to identity politics. Those of us not struggling with identity issues may be less conversant with this specialized language, or it could be a generational thing. Coded speech is not unusual within specialized fields of endeavor. My fascination with nomenclature and neologisms makes me pay attention, though I’m not typically an adopter of hip new coin.

The Q part of Q&A never actually asks a question but provides context to suggest or extrapolate one, namely, “please advise me on my neuro-atypicality.” (I made up that word.) While the Q acknowledges that folks on the autism spectrum are not neurotypical, the word disability is put in quotes (variously, scare quotes, air quotes, or irony quotes), meaning that it is not or should not be considered a real or true disability. Yet the woman acknowledges her own difficulty with social signaling. The A part of Q&A notes a marked sensitivity to social justice among those on the spectrum, acknowledges a correlation with nonstandard gender identity (or is it sexual orientation?), and includes a jibe that standard advice is to mimic neurotypical behaviors, which “tend to be tediously heteronormative and drearily vanilla-centric.” The terms tediously, drearily, and vanilla push unsubtly toward normalization and acceptance of kink and aberrance, as does Savage Love in general. I wrote about this general phenomenon in a post called “Trans is the New Chic.”

Whereas I have no hesitation to express disapproval of shitty people, shitty things, and shitty ideas, I am happy to accept many mere differences, not caring two shits either way. This question asks about a fundamental human behavior: sexual expression. Everyone needs an outlet, and outliers (atypicals, nonnormatives, kinksters, transgressors, etc.) undoubtedly have more trouble than normal folks. Unless living under a rock, you’ve no doubt heard and/or read theories from various quarters that character distortion often stems from sexual repression or lack of sexual access, which describes a large number of societies both historical and contemporary. Some would include the 21st-century U.S. in that category, but I disagree. Sure, we have puritanical roots, recent moral panic over sexual buffoonery and crimes, and a less healthy sexual outlook than, say, European cultures, but we’re also suffused in licentiousness, Internet pornography, and everyday seductions served up in the media via advertising, R-rated cinema, and TV-MA content. It’s a decidedly mixed bag.

Armed with a lay appreciation of sociology, I can’t help but observe that humans are a social species with hierarchies and norms, not as rigid or prescribed perhaps as with insect species, but nonetheless possessing powerful drives toward consensus, cooperation, and categorization. Throwing open the floodgates to wide acceptance of aberrant, niche behaviors strikes me as swimming decidedly upstream in a society populated by a sizable minority of conservatives mightily offended by anything falling outside the heteronormative mainstream. I’m not advocating either way but merely observing the central conflict.

All this said, the thing that has me wondering is whether autism isn’t itself an adaptation to information overload commencing roughly with the rise of mass media in the early 20th century. If one accepts that the human mind is primarily an information processor and that the only direction to go is processing ever more information faster and more accurately than in the past, well, I have some bad news: we’re getting worse at it, not better. So while autism might appear to be maladaptive, filtering out useless excess information might unintuitively prove to be adaptive, especially considering the disposition toward analytical, instrumental thinking exhibited by those on the spectrum. How much this style of mind is valued in today’s world is an open question. I also don’t have an answer to the nature/nurture aspect of the issue, which is whether the adaptation/maladaptation is more cultural or biological. I can only observe that it’s on the rise, or at least being recognized and diagnosed more frequently.

In my travels and readings upon the Intertubes, which proceed in fits and starts, I stumbled across roughly the same term — The NOW! People — used in completely different contexts and with different meanings. Worth some unpacking for idle consideration.

Meaning and Usage the First: The more philosophical of the two, this refers to those who feel anxiety, isolation, estrangement, disenfranchisement, and alienation from the world in stark recognition of the self-other problem and/or mind-body dualism. They seek to lose their identity and the time-boundedness that goes with being a separate self by entering a mental state characterized by the eternal NOW, much as animals without consciousness are believed to think. Projection forward and back more than a few moments in time is foreclosed; one simply exists NOW! Seminars and YouTube videos on radical nonduality are offered by Tony Parsons, Jim Newman, Andreas Müller, and Kenneth Madden, but according to my source (unacknowledged and unlinked), they readily admit that despite study, meditation, openness, and desire to achieve this state of mind, it is not prone to being triggered. It either happens or it doesn’t. Nonetheless, some experiences and behaviors allow individuals to transcend themselves at least to some degree, such as music, dance, and sex.

Meaning and Usage the Second: The more populist and familiar of the two, this refers to people for whom NOW! is always the proper time to do whatever the hell they most urgently desire with no consideration given to those around them. The more mundane instance is someone stopping in a doorway or on an escalator to check their phone for, oh, I dunno, Facebook updates and new e-mail. A similar example is an automobile driver over whom traffic and parking controls have no effect: someone double-parked (flashers optional) in the middle of the road or in a fire lane, someone who executes a U-turn in the middle of traffic, or someone who pointlessly jumps the line in congestion just to get a few car lengths ahead only to sit in yet more traffic. The same disregard and disrespect for others is evident in those who insist on saving seats or places in line, or on the Chicago L, those who occupy seats with bags that really belong on their laps or stand blocking the doorways (typically arms extended, looking assiduously at their phones), making everyone climb past them to board or alight from the train. These examples are all about someone commandeering public space as personal space at the anonymous expense of anyone else unfortunate enough to be in the same location, but examples multiply quickly beyond these. Courtesy and other social lubricants be damned! I want what I want right NOW! and you can go pound sand.

Both types of NOW! behavior dissolve the thinking, planning, orchestrating, strategizing mind in favor of narrowing thought and perception to this very moment. The first gives away willfulness and desire in favor of tranquility and contentedness, whereas the second demonstrates single-minded pursuit of an objective without thought of consequence, especially to others. Both types of NOW! People also fit within the Transhumanist paradigm, which has among its aims leaving behind worldly concerns to float freely as information processors. If I were charitable about The NOW! People, I might say they lose possession of themselves by absorption into a timeless, mindless present; if less charitable, I might say that annihilation of the self (however temporary) transforms them into automatons.

The sole appeal I can imagine to retreating from oneself to occupy the eternal moment, once one has glimpsed, sensed, or felt the bitter loneliness of selfhood, is cessation of suffering. To cross over into selflessness is to achieve liberation from want, or in the Buddhist sense, Nirvana. Having a more Romantic aesthetic, my inclination is instead to go deeper and to seek the full flower of humanity in all its varieties. That also means recognizing, acknowledging, and embracing darker aspects of human experience, and yes, no small amount of discomfort and suffering. Our psycho-spiritual capacity demands it implicitly. But it takes strong character to go toward extremes of light and dark. The NOW! People narrow their range radically and may well be the next phase of human consciousness if I read the tea leaves correctly.

The English language has words for everything, and whenever something new comes along, we coin a new word. The latest neologism I heard is bolthole, which refers to the location one bolts to when collapse and civil unrest reach intolerable proportions. At present, New Zealand is reputed to be the location of boltholes purchased and kept by the ultrarich; it has the advantage of being located in the Southern Hemisphere, remote from the hoi polloi yet reachable by private plane or oceangoing yacht. Actually, bolthole is an older term now being repurposed, but it seems hip and current enough to be new coin.

Banned words are the inverse of neologisms, not in the normal sense that they simply fall out of use but in their use being actively discouraged. Every kid learns this early on when a parent or older sibling slips and lets an “adult” word pass his or her lips that the kid isn’t (yet) allowed to use. (“Mom, you said fuck!”) George Carlin made a whole routine out of dirty words (formerly) banned from TV. Standards have been liberalized since the 1970s, and now people routinely swear or refer to genitalia on TV and in public. Sit in a restaurant or ride public transportation (as I do), eavesdrop a little on speech within easy earshot (especially private cellphone conversations), and just count the casual F-bombs.

The worst field of banned-words nonsense is political correctness, which is intertwined with identity politics. All the slurs and epithets directed at, say, racial groups ought to fall into disuse, no doubt, but we overcompensate by renaming everyone (“____-American”) to avoid terms that have little or no derogation. Even more ridiculous, at least one egregiously insulting term has been reclaimed as a badge of honor (an unbanned banned word) by the very group it oppresses. It takes Orwellian doublethink to hear that term — you all know what it is — used legitimately only by those allowed to use it. (I find it wholly bizarre yet fear to wade in with my own prescriptions.) Self-disparaging language, typically in a comedic context, gets an unwholesome pass, but only if one is within the identity group. (Women disparage women, gays trade on gay stereotypes, Jews indulge in jokey anti-Semitism, etc.) We all laugh and accept it as safe, harmless, and normal. President Obama is continuously mixed up in disputes over appearances (“optics”) or what to call things — or not call them, as the case may be. For instance, his apparent refusal to call terrorism originating in the Middle East “Muslim terrorism” has been met with controversy.

I’m all for calling a thing what it is, but the term terrorism is too loosely applied to any violent act committed against (gasp!) innocent Americans. Recent events in Charleston, SC, garnered the terrorism label, though other terms would be more apt. Further, there is nothing intrinsically Muslim about violence and terrorism. Yeah, sure, Muslims have a word or doctrine — jihad — but it doesn’t mean what most think or are led to believe it means. Every religion across human history has some convenient justification for the use of force, mayhem, and nastiness to promulgate its agenda. Sometimes it’s softer and inviting, other times harder and more militant. Unlike Bill Maher, however, circumspect thinkers recognize that violence used to advance an agenda, like words used to shape narratives, is not the province of any particular hateful or hate-filled group. Literally everyone does it to some extent. Indeed, the passion with which anyone pursues an agenda is paradoxically celebrated and reviled depending on content and context, and it’s a long, slow, ugly process of sorting to arrive at some sort of RightThink®, which then becomes conventional wisdom before crossing over into political correctness.

I have always remembered a striking line from the movie The Dancer Upstairs where the police investigator, who is tracking the leader of Shining Path in Peru in the 1980s, says (paraphrasing from Spanish), “I think there is a revolution going on.” Elsewhere on the globe today, Arab Spring has morphed from a series of U.S.-instigated regime changes into an emerging Arab state (ISIS), though the process of establishing itself is violent and medieval. According to Tom Engelhardt, even the U.S. has a new political system rising out of the ruins of its own dysfunction. Unless I’m mistaken, a revolution is a political system being overthrown by mass uprising of the citizenry, whereas a coup is a powerful splinter within the current regime (often the military wing) seizing administrative control. What Engelhardt describes is more nearly a coup, and like the revolution in the quote above, it appears to be coalescing around us in plain sight, though that conclusion is scarcely spoken aloud. It may well be that Engelhardt has succeeded in crystallizing the moment. His five principal arguments are these:

  1. 1% Elections — distortion of the electoral system by dollars and dynasties.
  2. Privatization of the State — proper functions of the state transferred into the hands of privateers (especially mercenaries and so-called warrior corporations — nice neologism).
  3. De-legitimization of Congress and the Presidency — fundamental inability to govern, regulate, and/or prosecute at the Federal level, opening up a power vacuum.
  4. Rise of the National Security State (Fourth Branch of Government) — the dragnet complex revealed (in part) by whistle-blower Edward Snowden but plain to see post-9/11.
  5. Demobilization of the American People — surprising silence of the public in the face of such unwholesome developments.

Please read the article for yourself; it’s very well written. (I am no great fan of the journalistic style but must acknowledge that Engelhardt’s work is terrific.) I especially like Engelhardt’s suggestion that a grand conspiracy (e.g., New World Order) is not necessary but that instead it’s all being improvised on the run. Let me offer a couple observations of my own.

Power has several attributes, such as the position to influence events, the resources to get things done, and the ability to motivate (or quell) the public through active management of perception. High offices (both government and boardroom, both elected and appointed) are the positions, the U.S. Treasury and the wealth of the 1% are the resources, and charismatic storytelling (now outright lying) is management of perception. Actors (a word chosen purposely) across the American stage have been maneuvering for generations to wield power, often for its own sake but more generally in the pursuit of wealth. One might assume that once personal wealth has been acquired, motivations would slacken, but in not a few psychopaths they divert instead to the maniacal building of multigenerational dynasties.

Pulling the levers of state in one capacity or another is a timeworn mechanism for achieving the proxy immortality of the American statesman. However, as dysfunction in the political arena has grown, corporations (including banks) have assumed the reins. Despite corporate personhood being conferred and recently expanded, largely via judicial fiat, the profit motive has reasserted itself as primary, since there is no such thing as a fully self-actualized corporation. Thus, we have the Federal Reserve System acting as a de facto corporation within government — but without conscience. Multiply that hundreds of times over and voilà: an American corporatocracy.

The effect has been extrapolated in numerous movies and television shows, all offering dystopic warnings of things to come where people, domestic and alien, are all expendable as power seeks to perpetuate itself. How far this can go before financial collapse, climate change, energy scarcity, or a host of other looming calamities overtakes us is yet to be seen. Some hold out hope for true revolution, but I believe that possibility has been contained. Considering how the world has been accelerating toward ecocide, I venture that at most a few more decades of desperate negotiations with fate are in store for us. Alternatively, I find it entirely feasible that the delicate web of interconnections that maintains life in all its manifestations could suffer a phase shift rather quickly, at which point all bets are off. Either way, in no one’s wildest imagination could our current civilization be considered the best we can do, much less the best of all possible worlds.

I am, as usual, late getting to the latest controversy in academe, which has been argued to death before I even became aware of it. Inside Higher Ed appears to have gotten there first, followed by editorials at The New York Times, The Los Angeles Times, and The Washington Post. At issue are trigger warnings, a neologism for what might otherwise be called parental advisories (thinking in loco parentis here), to be placed in syllabi and on classroom materials, at first fiction readings but potentially history lessons (and frankly, just about anything else), that might trigger a panic attack or some other dolorous response from a student with a phobia or memory of a traumatic experience. The opinion articles linked above (Inside Higher Ed is more purely reporting) are all in agreement that trigger warnings are a bad idea.

Although articles in news organs are more nearly broadcasting and thus lack discussion (unless one ventures into the swamp of the comments section, which I rarely do), I indulged in a long discussion of the subject with fellow alumni of one of the institutions named in the reports. As with other issues, it developed so many facets that a snapshot determination became impossible if one attempted to accommodate or address all perspectives. Therein lies the problem: accommodation. Left-leaning liberals are especially prone to hypersensitivity to identity politics, which gained prominence in the late 1970s or early 80s. I quickly run afoul of anyone who takes such a perspective because I am notoriously white, male, well-educated, and middle class, so I must constantly “check my privilege.” When someone like me refuses others accommodation, it looks to others like raising the ladder behind me after I’ve safely ascended. I can appreciate, I think, how frustrating it must be to have one’s earnestness thwarted, and yet I admit I just don’t get it. At the risk of offending (trigger warning here), let me blunder ahead anyway.

The world (or as I’m beginning to call it more simply, reality) is a messy place, and each of us inevitably carries some phobia, trauma, or history that is unsavory. From one celebrated perspective, what doesn’t kill us makes us stronger; from another, we are trained to request accommodation. Accommodation used to be primarily for physical disabilities; now it’s for feelings, which some argue are just as debilitating. This is the province of every self-selected minority and special interest group, which has spawned predictable backlashes among various majority groups (e.g., the men’s movement, resurgent white supremacy). Naturally, any lobby, whether part of a minority or majority, will seek to promote its agenda, but I regard the brouhaha over trigger warnings as an example of the growing incidence of what’s been called the Strawberry Generation. It’s remarkable that students now regard themselves as dainty flowers in need of special protection lest they be trampled by, well, reality. So trigger warnings are being requested by students themselves, not by others on their behalf. With so many examples throughout even recent history of flagrant social injustice and oppression, it’s clear that everyone wants to proclaim their special combination of disadvantages and receive accommodation, all the better if multiplied by inclusion in several protected classes. It’s a claim of victimhood before the fact or perhaps permanent victimhood if one is a survivor of some nastiness. (Disclaimer: real oppression and victimhood do exist, which I don’t intend to minimize, but they don’t stem from reading fiction or learning history, scandalous as those may be.)

In addition, what exactly is accomplished by virtue of warnings that one is about to encounter — what should it be called — messy material? Does one steel oneself against impact and thus limit its educational value, or does one expect to be excused from facing reality and receive an alternative assignment minus the offending material? Both are the antithesis of higher education. Arguments in the abstract are easy to ignore, so here are two specific examples: substitution or elimination of the words nigger and injun in modernized editions of Mark Twain’s Adventures of Huckleberry Finn and biology textbooks that give consideration to (literally) unscientific accounts of creation and evolution. If one’s racial or religious background gives rise to excess discomfort over the use of one or another egregious trigger word (nigger in particular now having been reclaimed and repurposed for all sorts of uses but with caveats) borne out of ripe historical context or what science (as opposed to religion) teaches, well, that’s the world (reality) we live in. Sanitizing education to avoid discomfort (or worse) does no one any favors. Discomfort and earnest questioning are inevitable if one is to learn anything worthwhile in the course of getting an education.

Peter Van Buren has a new book out and is flogging it at TomDispatch. He’s a good enough writer, so I have no objection to the promotional aspect of disseminating his own work. But as I read his article describing an America gone to seed, I realized that for all his writerly skill, he misses the point. As a former State Dept. administrator (charged with assisting Iraqi reconstruction) turned whistle-blower, Van Buren is clearly outside the mainstream media and somewhat outside mainstream opinion, yet he appears to be well within the dominant paradigm. His new spin on regime change takes as implicit all the teachings of economics and politics as systems ideally suited to engineering an equitable social contract where everyone benefits. But as cycles of history have shown, those systems are even more prone to manipulation by a power elite who care little about the people they pretend to serve. Whether that carelessness is learned or ingrained in the kleptocracy (or plutocracy) is open to debate.

Van Buren’s article offers a few interesting tidbits, including a couple neologisms (I’m always on the lookout for new coin):

dirt shadow = the faint but legible image left behind by an uninstalled sign on the exterior of a closed storefront or building

street gravy = the dirt and grunge that collects over time on a homeless person

Neither is too picturesque. The second is obviously a (sad because it’s too hip) euphemism, since gravy suggests richness whereas the actuality is downright unpleasant. As Van Buren surveys, similar unpleasantness is currently experienced all across America in towns and niche economies that have imploded. Interestingly, his counterexample is a U.S. Marine Corps base, Camp Lejeune in North Carolina, that functions as a gated community with the added irony that it is supported by public funds. Van Buren also notes that, according to the Congressional Budget Office, an average active-duty service member receives a benefits and pay compensation package estimated to be worth $99,000, some 60 percent of it in noncash compensation.

As to the cause of our regime’s disarray, however, Van Buren busies himself with standard economic and political (one might even say military-industrial) explanations, demonstrating an inability to frame the decline of empire as the beginning of an epochal shift away from plentiful energy resources, famously termed The Long Emergency by James Howard Kunstler. (We ought to resurrect that phrase.) Other frames of reference are certainly not without their impacts, but the inability to connect all the dots to see the underlying cause is commonplace in the mainstream.

In contrast, consider this passage from Harvesting the Biosphere by Vaclav Smil:

There are two simple explanations why food production in traditional agricultural societies — despite its relatively high need for claiming new arable land — had a limited impact on natural ecosystems: very low population growth rates and very slow improvements in prevailing diets. Population growth rates averaged no more than 0.05% during the antiquity and they reached maxima of just 0.07% in medieval Eurasia — resulting in very slow expansion of premodern societies: it took Europe nearly 1,500 years to double the population it had when Rome became an empire, and Asian doubling was only a bit faster, from the time of China’s Han dynasty to the late Ming period. [pp. 118–119]
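A quick check on the internal consistency of those figures (my back-of-the-envelope arithmetic, not Smil’s): a population growing at a constant 0.05% per year doubles in

t = ln(2) / ln(1.0005) ≈ 1,386 years

which squares with his “nearly 1,500 years” for Europe; at the medieval maximum of 0.07%, doubling still takes roughly 990 years. Compare the global growth rate’s 20th-century peak near 2% (late 1960s), which doubles a population in about 35 years.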

Smil goes on to provide exhaustive detail, much of it measurement (with acknowledged ranges of error), showing how modern mechanisms and energy exploitation have enabled rapid population growth. Although population has (apparently) not yet peaked, we are already sliding back down the energy gradient we climbed over the past 250 years and will soon enough face widespread food shortages (among other things) as productivity plummets due to diminishing energy inputs and accumulated environmental destruction (including climate change). Economics and politics do not possess solutions to that prospect. That’s the real dirt in the historical narrative, which remains largely uncovered (unreported) by paradigmatic thinking.

I have watched, blogged about, and embedded my share of TED Talks over the years I’ve been active as a blogger. As a cottage industry for the cognoscenti, TED is an impressive, multinational undertaking being fed into by lots of impressive scientists, researchers, and know-it-alls. What’s not to like? One spends about 12 min. watching and listening to someone simplify complex issues, recommend a handful of impossible solutions, and make promises to roll up one’s sleeves and do the hard work, but then one forgets about it all the next day. Or one might watch another talk, or a series of talks, to get the brain goosed before moving on. The most succinct criticism I have heard of the phenomenon called it insight porn. So after my initial excitement with TED and its voluminous insights, I grew more and more skeptical of various speakers’ claims. This was especially true after I learned that Allan Savory admitted to having been instrumental in the wholesale murder of 40,000 African elephants — a massacre undertaken with the full authority of scientific consensus at the time (a consensus Savory is now attempting to reverse — oops, his bad).

In light of my misgivings about TED, I found it curious that a TED Talk by Benjamin Bratton has attracted lots of positive attention. His subject is TED Talks themselves, and he makes his case in a TED Talk, which is reproduced as a transcript in an article by The Guardian. There is much to admire in Bratton’s analysis but more perhaps to find objectionable. He occupies a difficult position, both behind the curtain and before it, which creates recursive conflicts. (I’m rather fond myself of busting through rhetorical frames, but I try not to stand inside them at the same time.) Further, I read the article before watching the talk in video form and found quite a difference between the two media. The transcript’s lack of visual distractions and audience response to the jokes allowed me to penetrate much deeper. I suspect that a large part of that is not being entrained by sympathetic response.

Bratton offers several worthwhile insights, among them the cult of the solution (my term, not his), which becomes hypothetical since implementation rarely goes forward. This is true especially when it comes to reconceptualizing fondly held myths and self-delusions about what science and sociology can really do for our understandings about the ways the world really works. That’s one of Bratton’s related criticisms: the nonworkability of TED Talk solutions. It’s ironic that Bratton observes so many TED Talks founder on practicality when at the same time his initial anecdote about a scientific presentation failing to motivate the listener (be more like Malcolm Gladwell and engage the emotions, transforming epiphany into entertainment) receives serious derision. There’s room for both interpretations, but some circumspection is needed here. Nothing could be more obvious than how various political and corporate entities have succeeded in motivating the public into action or inaction through propaganda, marketing campaigns, and emotional manipulation, whereas the sober, objective posture of scientific inquiry has failed utterly to get action on an extremely pressing civilizational disaster quite different from the one Bratton envisions. (Regular readers of this blog know what I’m referring to.) Bratton actually takes note of GOP dominance of messaging (what he calls bracketing reality) but fails to connect the dots.

The T-E-D in TED

Bratton also objects that the technology-education-design orientation of TED ought to be instead tech-econ-design. This reconfiguration got by my internal bullshit sensor on video, but the transcript fares less well. For instance, he notes that TED often delivers placebo technoradicalism (no lack of clever word formations) but concludes that it fails precisely because we’re too timid to actually embrace technology with enough gusto. He doesn’t seem to get that technological advance is at best mixed and at worst disastrous (e.g., hypercomplexity, WMDs, nuclear mismanagement and accidents, overconsumption of resources leading to population overshoot leading back to overconsumption). Bratton apparently still worships the tech idol, failing to recognize that the world it has delivered serves us rather poorly. Further, neither education nor economics has turned out to be much of a panacea for our most intractable problems. I’m unsure what Bratton thinks exchanging one for the other in the context of TED will accomplish. Last, redefining design as immunization rather than innovation sounds like something worthwhile but is the same hypothetical, unrealizable solution he criticizes earlier. The notion that we can engineer our way out of problems if only designers wise up and others stop gaming systems for profit or self-aggrandizement raises quite a lot of questions that go unaddressed.

In spite of his disavowal of a simple takeaway, Bratton offers standard PoMo word salad filled with specialized jargon:

… it’s not as though there is a shortage of topics for serious discussion. We need a deeper conversation about the difference between digital cosmopolitanism and cloud feudalism … I would like new maps of the world, ones not based on settler colonialism, legacy genomes and bronze age myths, but instead on something more … scalable … we need to raise the level of general understanding to the level of complexity of the systems in which we are embedded and which are embedded in us. This is not about “personal stories of inspiration”, it’s about the difficult and uncertain work of demystification and reconceptualisation: the hard stuff that really changes how we think …

Really changes how we think? For someone who writes about “entanglements of technology and culture,” Bratton has a serious misconception about how the brainchild of the Enlightenment can ultimately win the day and deliver salvation from ourselves. Myth and story and narrative are not mistakes to be replaced by rationalism; they are who and how we are in the world. They don’t go away by wagging the bony finger of science/tech but are only slowly forgotten, revised, and replaced by yet other myths, stories, and narratives. The depth of our entrenchment in such cultural baggage is one of the very things forestalling change. We can peer behind the curtain or see from outside the bubble sometimes, but we can’t escape them. We never could.