Archive for the ‘Mental Health’ Category

Ours is an era when individuals are encouraged to explore, amplify, and parade various attributes of their identities out in public, typically via social media. For those just coming of age and/or recently having entered adulthood, because identity is not yet fully formed, defining oneself is more nearly a demand. When identity is further complicated by unusual levels of celebrity, wealth, beauty, and athleticism (lots of overlap there), defining oneself is often an act of rebellion against the perceived demands of an insatiable public. Accordingly, it was unsurprising to me at least to learn of several well-known people unhappy with their lives and the burdens upon them.

Regular folks can’t truly relate to the glitterati, who are often held up as aspirational models. For example, many of us look upon the discomforts of Prince Harry and Meghan Markle with a combination of perverse fascination and crocodile tears. They were undoubtedly trapped in a strange, gilded prison before repudiating the duties expected of them as “senior royals,” attempting an impossible retreat to normalcy outside of England. It should be obvious that they will continue to be hounded while public interest in them persists. Similarly, Presley Gerber made news, fell out of the news, and then got back into the news as a result of his growing collection of tattoos. Were he simply some anonymous fellow, few would care. However, he has famous parents and had already launched a modeling career before his face tattoo announced his sense of being “misunderstood.” Pretty bold move. With all the presumed resources and opportunities at his disposal, many have wondered in comments and elsewhere whether another, better declaration of self might have been preferred.

Let me give these three the benefit of the doubt. Although they all have numerous enviable attributes, the accident of birth (or in Markle’s case, decision to marry) landed them in exceptional circumstances. The percentage of celebrities who crack under the pressure of unrelenting attention and proceed to run off the rails is significant. Remaining grounded is no doubt easier if one attains celebrity (or absurdly immense wealth) after, say, the age of 25 or even later. (On some level, we’ve all lost essential groundedness in reality, but that’s another blog post.) Those who are children of celebrities or who become child prodigies may not all be consigned to character distortion or a life irrevocably out of balance, but it’s at least so commonplace that the dangerous potential should be recognized and embraced only with wariness. I’ve heard of programs designed to help professional athletes who become sudden multimillionaires (and thus targets of golddiggers and scammers) make the transition. Good for them that structured support is available. Yet another way average folks can’t relate: we have to work things out for ourselves.

Here’s the example I don’t get: Taylor Swift. She was the subject of a Netflix biography called Miss Americana (2020) that paints her as, well, misunderstood. Thing is, Swift is a runaway success story, raking in money, fans, awards, attention, and, inevitably, detractors. That success is something she earnestly desired and struggled to achieve only to learn that the glossy, popstar image sold especially but nonexclusively to 14-year-old girls comes with a lot of heavy baggage. How can the tragic lives of so many musicians launched into superstardom from the late 1950s onward have escaped Swift’s awareness in our media-saturated world? Naming names is sorta tacky, so I demur, but there are lots of them. Swift obtained her heart’s desire, found her songwriting and political voice, maintains a high public profile, and shows no lack of productivity. Sure, it’s a life out of balance, not remotely normal the way most noncelebrities would understand. However, she signed up for it willingly (if naïvely) and by all accounts perpetuates it. She created her own distinctive gilded prison. I don’t envy her, nor do I particularly feel sorry for her, as the Netflix show appears to instruct.

In the introduction to an article at TomDispatch about anticipated resumption of professional sports currently on hiatus like much of the rest of human activity (economic and otherwise), Tom Engelhardt recalls that to his childhood self, professional sports meant so much and yet so little (alternatively, everything and nothing). This charming aspect of the innocence of childhood continues into adulthood, whether as spectator or participant, as leisure and freedom from threat allow. The article goes on to offer conjecture regarding the effect of reopening professional sports on the fall presidential election. Ugh! Horserace politics never go out of season. I reject such purely hypothetical analyses, which isn’t the same as not caring about the election. Maybe I’ll wade in after a Democratic nominee is chosen to say that third-party candidates may well have a much larger role to play this time round because we’re again being offered flatly unacceptable options within the two-party single-party system. Until then, phooey on campaign season!

Still, Engelhardt’s remark put me in mind of a blog post I considered fully nine years ago but never got around to writing, namely, how music functions as meaningless abstraction. Pick your passion, I suppose: sports, music (any genre), literature, painting, poetry, dance, cinema and TV, fashion, fitness, nature, house pets, house plants, etc. Inspiration and devotion come in lots of forms, few of which are essential (primary or ontological needs on Maslow’s Hierarchy) yet remain fundamental to who we are and what we want out of life. Accordingly, when one’s passion is stripped away, being left grasping and rootless is quite common. That’s not equivalent to losing a job or loved one (those losses are afflicting many people right now, too), but our shared experience these days with no bars, no restaurants, no sports, no concerts, no school, and no church all adds up to no society. We’re atomized, unable to connect and socialize meaningfully, digital substitutes notwithstanding. If a spectator, maybe one goes in search of replacements, which is awfully cold comfort. If a participant, one’s identity is wrapped up in such endeavors; resulting loss of meaning and/or purpose can be devastating.

It would be easy to over-analyze and over-intellectualize what meaningless abstraction means. It’s a trap, so I’ll do my best not to over-indulge. Still, it’s worth observing that as passions are habituated and internalized, their mode of appreciation is transferred from the senses (or sensorium) to the mind or head (as observed here). Coarseness and ugliness are then easily digested, rationalized, and embraced instead of being repulsive as they should be. There’s the paradox: as we grow more “sophisticated” (scare quotes intentional), we also invert and become more base. How else to explain tolerance of increasingly brazen dysfunction, corruption, servitude (e.g., debt), and gaslighting? It also explains the attraction to entertainments such as combat sports (and thug sports such as football and hockey), violent films, professional wrestling (more theater than sport), and online trolling. An instinctual blood lust that accompanies being predators, if not expressed more directly in war, torture, crime, and self-destruction, is sublimated into entertainment. Maybe that’s an escape valve so pressures don’t build up any worse, but that possibility strikes me as rather weak considering just how much damage has already been done.

The first time I wrote on this title was here. I’m pretty satisfied with that 11-year-old blog post. Only recently, I copped to use of reframing to either zoom in on detail or zoom out to context, a familiar rhetorical device. Here I’m zooming out again to the god’s eye view of things.

The launching point for me is James Howard Kunstler’s recent blog post explaining and apologizing for his generation’s principal error: financialization of the U.S. economy. In that post, he identifies characteristics in grandparents and parents of boomers as each responds and adapts to difficulties of the most self-destructive century in human history. Things destroyed include more than just lives, livelihoods, and the biosphere. After several centuries of rising expectations and faith in progress (or simply religious faith), perhaps the most telling destruction is morale, first in the reckless waste of WWI (the first mechanized war), then repeatedly in serial economic and political catastrophes and wars that litter the historical record right up to today. So it’s unsurprising (but not excusable) that boomers, seeing in unavoidable long-term destruction our powerlessness to master ourselves or in fact much of anything — despite the paradox of developing and obtaining more power at every opportunity — embarked on a project to gather to themselves as much short-term wealth and power as possible because, well, why the fuck not? Kunstler’s blog post is good, and he admits that although the masters-of-the-universe financial wizards who converted the economy into a rigged casino/carnival game for their own benefit are all boomers, not all boomers are responsible except in the passive sense that we (includes me, though I’m just as powerless as the next) have allowed it to transpire without the necessary corrective: revolt.

Zooming out, however, I’m reminded of Jared Diamond’s assessment that the greatest mistake humans ever committed was the Agricultural Revolution 10–13 millennia ago. That context might be too wide, so let me restrict to the last 500 years. One theory propounded by Morris Berman in his book Why America Failed (2011) is that after the discovery of the New World, the cohort most involved in colonizing North America was those most desperate and thus inclined to accept largely unknown risks. To them, the lack of ontological security and contingent nature of their own lives were undeniable truths that in turn drove distortion of the human psyche. Thus, American history and character are full of abominations hardly compensated for by parallel glories. Are boomers, or more generally Americans, really any worse than others throughout history? Probably not. Too many counter-examples to cite.

The current endgame phase of history is difficult to assess as we experience it. However, a curious theory came to my attention that fits well with my observation of a fundamental epistemological crisis that has made human cognition into a hall of mirrors. (See also here and here, and I admit cognition may have always been a self-deception.) In a recent Joe Rogan podcast, Eric Weinstein, who comes across as equally brilliant and disturbed (admitting that not much may separate those two categories), opines that humans can handle only 3–4 layers of deception before collapsing into disorientation. It’s probably a feature, not a bug, and many have learned to exploit it. The example Weinstein discusses (derivative of others’ analyses, I think) is professional wrestling. Fans and critics knew for a very long time that wrestling looks fake, yet until the late 1980s, wrestlers and promoters held fast to the façade that wrestling matches are real sporting competitions rather than being “sports entertainments.” Once the jig was up, it turned out that fans didn’t really care; it was real enough for them. Now we’ve come full circle with arguments (and the term kayfabe) that although matches are staged and outcomes known in advance, the wrestling itself is absolutely for real. So we encounter a paradox where what we’re told and shown is real, except that it isn’t, except that it sorta is, ultimately finding that it’s turtles all the way down. Enthusiastic, even rabid, embrace of the unreality of things is now a prime feature of the way we conduct ourselves.

Professional wrestling was not the first organization or endeavor to offer this style of mind-bending unreality. Deception and disinformation (e.g., magic shows, fortune-telling, con jobs, psyops) have been around forever. However, wrestling may well have perfected the style for entertainment purposes, which has in turn infiltrated nearly all aspects of modern life, not least of which are economics and politics. Thus, we have crypto- and fiat currencies based on nothing, where money can be materialized out of thin air to save itself from worthlessness, at least until that jig is up, too. We also have twin sham candidates for this fall’s U.S. presidential election, both clearly unfit for the job for different reasons. And in straightforward fictional entertainment, we have a strong revival of magical Medievalism, complete with mythical creatures, spells, and blades of fortune. As with economics and politics, we know it’s all a complex of brazen lies and gaslighting, but it’s nonetheless so tantalizing that its entertainment value outstrips and sidelines any calls to fidelity or integrity. Spectacle and fakery are frankly more interesting, more fun, more satisfying. Which brings me to my favorite Joe Bageant quote:

We have embraced the machinery of our undoing as recreation.

The crisis consists precisely in the fact that the old is dying and the new
cannot be born; in this interregnum a great variety of morbid symptoms appear.

Antonio Gramsci

As a kid, I was confused when during some TV drama I heard the phrase “The king is dead; long live the king!” I was interpreting events too literally: the king had just died, so how could his subjects proclaim for him long life? Only when age awarded me greater sophistication (probably not wisdom, though) did I then realize that the phrase connotes the end of one era and the start of another. Old regent dies; new regent assumes power. We’re in the midst of such a transition from one era to the next, though it isn’t marked clearly by the death of a leader. Indeed, when I launched this blog in 2006, that was what I sensed and said so plainly in the About Brutus link at top, which hasn’t changed since then except to correct my embarrassing typos. I initially thought the transition would be about an emerging style of consciousness. Only slightly later, I fell down the rabbit hole regarding climate change (an anthropogenic, nonlinear, extinction-level process). I still believe my intuitions and/or conclusions on both subjects, but I’ve since realized that consciousness was always a moving target and climate change could unfold slowly enough to allow other fundamental shifts to occur alongside. No promises, though. We could also expire rather suddenly if things go awry quickly and unexpectedly. At this point, however, and in a pique of overconfidence, I’m willing to offer that another big transition has finally come into focus despite its being underway as I write. Let me explain. In his book America: The Farewell Tour (2018), Chris Hedges writes this:

Presently, 42 percent of the U.S. public believes in creationism … [and] nearly a third of the population, 94 million people, consider themselves evangelical. Those who remain in a reality-based universe do not take seriously the huge segment of the public, mostly white and working-class, who because of economic distress have primal yearnings for vengeance, new glory, and moral renewal and are easily seduced by magical thinking … The rational, secular forces, those that speak in the language of fact and reason, are hated and feared, for they seek to pull believers back into “the culture of death” that nearly destroyed them. The magical belief system, as it was for impoverished German workers who flocked to the Nazi Party, is an emotional life raft. It is all that supports them. [pp. 50–51]

That’s where we are now, retreating into magical thinking we supposedly left behind in the wake of the Enlightenment. Call it the Counter-Enlightenment (or Un-Enlightenment). We’re on this track for a variety of reasons but primarily because the bounties of the closing Age of Abundance have been gobbled up by a few plutocrats. Most of the rest of the population, formerly living frankly precarious lives (thus, the precariat), have now become decidedly unnecessary (thus, the unnecessariat). The masses know that they have been poorly served by their own social, political, and cultural institutions, which have been systematically hijacked and diverted into service of the obscenely, absurdly rich.

Three developments occurring right now, this week, indicate that we’re not just entering an era of magical thinking (and severely diminishing returns) but that we’ve lost our shit, gone off the deep end, and sought escape valves to release intolerable pressures. It’s the same madness of crowds writ large — something that periodically overtakes whole societies, as noted above by Chris Hedges. Those developments are (1) the U.S. stock market (and those worldwide?) seesawing wildly on every piece of news, (2) deranged political narratives and brazenly corrupt machinations that attempt to, among other things, install the preferred Democratic presidential candidate to defeat 45, and (3) widespread panic over the Covid-19 virus. Disproportionate response to the virus is already shutting down entire cities and regions even though the growing epidemic so far in the U.S. has killed fewer people than, say, traffic accidents. Which will wreak the worst mayhem is a matter of pointless conjecture since the seriousness of the historical discontinuity will require hindsight to assess. Meanwhile, the king is dead. Long live the king!

A potpourri of recent newsbits and developments. Sorry, no links or support provided. If you haven’t already heard of most of these, you must be living under a rock. On a moment’s consideration, that may not be such a bad place to dwell.

rant on/

I just made up the word of the title, but anyone could guess its origin easily. Many of today’s political and thought leaders (not quite the same thing; politics doesn’t require much thought), as well as American institutions, are busy creating outrageously preposterous legacies for themselves. Doomers like me doubt anyone will be around to recall them in a few decades. For instance, the mainstream media (MSM) garners well-deserved rebuke, often attacking each other in the form of one of the memes of the day: a circular firing squad. Its brazen attempts at thought-control (different thrusts at different media organs) and pathetic abandonment of its mission to inform the public with integrity have hollowed it out. No amount of rebranding at the New York Times (or elsewhere) will overcome the fact that the public has largely moved on, swapping the ubiquitous fictions spun by the MSM and politicians for superhero fiction. The RussiaGate debacle may be the worst example, but the MSM’s failures extend well beyond that. The U.S. stock market wobbles madly around its recent all-time high, refusing to admit its value has been severely overhyped and inflated through quantitative easing, cheap credit (an artificial monetary value not unlike cryptocurrencies or fiat currency created out of nothing besides social consensus), and corporate buybacks. The next crash (already well overdue) is like the last hurricane: we might get lucky and it will miss us this season, but eventually our lottery number will come up like those 100-year floods now occurring every few years or decades.

Public and higher education systems continue to creak along, producing a glut of dropouts and graduates ill-suited to do anything but the simplest of jobs requiring no critical thought, little training, and no actual knowledge or expertise. Robots and software will replace them anyway. Civility and empathy are cratering: most everyone is ready and willing to flip the bird, blame others, air their dirty laundry in public, and indulge in casual violence or even mayhem following only modest provocation. Who hasn’t fantasized just a little bit about acting out wildly, pointlessly like the mass killers blackening the calendar? It’s now de rigueur. Thus, the meme infiltrates and corrupts vulnerable minds regularly. Systemic failure of the U.S. healthcare and prison systems — which ought to be public institutions but are, like education, increasingly operated for profit to exploit public resources — continues to be exceptional among developed nations, as does the U.S. military and its bloated budget.

Gaffe-prone Democratic presidential candidate Joe Biden cemented his reputation as a goof years ago yet continues to build upon it. One might think that at his age enough would have been enough, but the allure of the highest office in the land is just too great, so he guilelessly applies for the job and the indulgence of the American public. Of course, the real prize-winner is 45, whose constant stream of idiocy and vitriol sends an entire nation scrambling daily to digest their Twitter feeds and make sense of things. Who knows (certainly I don’t) how serious was his remark that he wanted to buy Greenland? It makes a certain sense that a former real-estate developer would offhandedly recommend an entirely new land grab. After all, American history is based on colonialism and expansionism. No matter that the particular land in question is not for sale (didn’t matter for most of our history, either). Of course, everyone leapt into the news cycle with analysis or mockery, only the second of which was appropriate. Even more recent goofiness was 45’s apparent inability to read a map resulting in the suggestion that Hurricane Dorian might strike Alabama. Just as with the Greenland remark, PR flacks went to work to manage and reconfigure public memory, revising storm maps for after-the-fact justification. Has anyone in the media commented that such blatant historical revisionism is the stuff of authoritarian leaders (monarchs, despots, and tyrants) whose underlings and functionaries, fearing loss of livelihood if not indeed life, provide cover for mistakes that really ought to lead to simple admission of error and apology? Nope, just add more goofs to the heaping pile of preposterity.

Of course, the U.S. is hardly alone in these matters. Japan and Russia are busily managing perception of their respective ongoing nuclear disasters, including a new one in Russia that has barely broken through our collective ennui. Having followed the U.S. and others into industrialization and financialization of its economy, China is running up against the same well-known ecological despoliation and limits to growth and is now circling the drain with us. The added spectacle of a trade war with the petulant president in the U.S. distracts everyone from coming scarcity. England has its own clownish supreme leader, at least for now, trying to manage an intractable but binding issue: Brexit. (Does every head of state need a weirdo hairdo?) Like climate change, there is no solution, no matter how steadfastly one hopes and wishes one into existence, so whatever eventually happens will throw the region into chaos. Folks shooting each other for food and fresh water in the Bahamas post-Hurricane Dorian is a harbinger of violent hair-triggers in the U.S. poised to fire at anything that moves when true existential threats finally materialize. Thus, our collective human legacy is absurd and self-destroying. No more muddling through.

/rant off

Decades ago, I read Douglas Adams’ Hitchhiker’s Guide to the Galaxy trilogy. Lots of inventive things in those books have stayed with me despite not having revisited them. For instance, I found the SEP (Somebody-Else’s-Problem) Field and the infinite improbability drive tantalizing concepts even though they’re jokes. Another that resonates more as I age is disorientation felt (according to Adams) because of dislocation more than 500 light-years away from home, namely, the planet of one’s origin. When I was younger, my wanderlust led me to venture out into the world (as opposed to the galaxy), though I never gave much thought to the stabilizing effect of the modest town in which I grew up before moving to a more typical American suburb and then to various cities, growing more anonymous with each step. Although I haven’t lived in that town for 25+ years, I pass through periodically and admit it still feels like home. Since moving away, it’s been swallowed up in suburban sprawl and isn’t really the same place anymore.

Reading chapter 4 of Pankaj Mishra’s The Age of Anger brought back to me the idea of being rooted in a particular place and its culture, and more significantly, how those roots can be severed even without leaving. The main cause appears to be cultural and economic infiltration by foreign elements, which has occurred many places through mere demographic drift and in others by design or force (i.e., colonialism and globalization). How to characterize the current waves of political, economic, and climate refugees inundating Europe and the smaller migration of Central Americans (and others) into the U.S. is a good question. I admit to being a little blasé about it: like water, people gonna go where they gonna go. Sovereign states can attempt to manage immigration somewhat, but stopgap administration ultimately fails, at least in open societies. In the meantime, the intractable issue has made many Americans paranoid and irrational while our civil institutions have become decidedly inhumane in their mistreatment of refugees. The not-so-hidden migration is Chinese people into Africa. Only the last of these migrations gives off the stink of neocolonialism, but they all suggest decades of inflamed racial tension to come if not open race wars.

Mishra cites numerous authors and political leaders/revolutionaries in chapter 4 who understand and observe that modernizing and Westernizing countries, especially those attempting to catch up, produce psychic turmoil in their populations because of abandonment and transformation of their unique, local identities as they move, for instance, from predominantly agrarian social organization to urbanization in search of opportunity and in the process imitate and adopt inappropriate Western models. Mishra quotes a 1951 United Nations document discussing the costs of supposed progress:

There is a sense in which rapid economic progress is impossible without painful adjustments. Ancient philosophies have to be scrapped; old social institutions have to disintegrate; bonds of caste, creed and race have to burst; and large numbers of persons who cannot keep up with progress have to have their expectations of a comfortable life frustrated. [p. 118]

Thus, men were “uprooted from rural habitats and condemned to live in the big city,” which is a reenactment of the same transformation the West underwent previously. Another insightful passage comes from the final page of Westoxification (1962) or Weststruckness (English transliteration varies) by the Iranian novelist Jalal Al-e-Ahmad:

And now I, not as an Easterner, but as one like the first Muslims, who expected to see the Resurrection on the Plain of Judgment in their lifetimes, see that Albert Camus, Eugene Ionesco, Ingmar Bergman, and many other artists, all of them from the West, are proclaiming this same resurrection. All regard the end of human affairs with despair. Sartre’s Erostratus fires a revolver at the people in the street blindfolded; Nabokov’s protagonist drives his car into the crowd; and the stranger, Meursault, kills someone in reaction to a bad case of sunburn. These fictional endings all represent where humanity is ending up in reality, a humanity that, if it does not care to be crushed under the machine, must go about in a rhinoceros’s skin. [pp. 122–123]

It’s unclear that the resurrection referenced above is the Christian one. Nonetheless, how sobering is it to recognize that random, anonymous victims of nihilistic violence depicted in storytelling have their analogues in today’s victims of mass killings? A direct line of causality from the severed roots of place to violent incidents cannot be drawn clearly, but the loss of a clear, stabilizing sense of self, formerly situated within a community now suffering substantial losses of historical continuity and tradition, is certainly an ingredient.

More to come in pt. 2.

As I reread what I wrote 2.5 years ago in my first blog on this topic, I surmise that the only update needed to my initial assessment is a growing pile of events that demonstrate my thesis: our corrupted information environment is too taxing on human cognition, with the result that a small but growing segment of society gets radicalized (wound up like a spring) and relatively random individuals inevitably pop, typically in a self-annihilating gush of violence. News reports bear this out periodically, as one lone-wolf kook after another takes it upon himself (are there any examples of females doing this?) to shoot or blow up some target, typically chosen irrationally or randomly though for symbolic effect. More journalists and bloggers are taking note of this activity and evolving or resurrecting nomenclature to describe it.

The earliest example I’ve found offering nomenclature for this phenomenon is a blog with a single post from 2011 (oddly, no follow-up) describing so-called stochastic terrorism. Other terms include syntactic violence, semantic violence, and epistemic violence, but they all revolve around the same point. Whether on the sending or receiving end of communications, some individuals are particularly adept at or sensitive to dog whistles that over time activate and exacerbate tendencies toward radical ideology and violence. Wired has a brief article from a few days ago discussing stochastic terrorism as jargon, which is basically what I’m doing here. Admittedly, the last of these terms, epistemic violence (alternative: epistemological violence), ranges farther afield from the end effect I’m calling wind-up toys. For instance, this article discussing structural violence is much more academic in character than when I blogged on the same term (one of a handful of “greatest hits” for this blog that return search-engine hits with some regularity). Indeed, just about any of my themes and topics can be given a dry, academic treatment. That’s not my approach (I gather opinions differ on this account, but I insist that real academic work is fundamentally different from my armchair cultural criticism), but it’s entirely valid despite being a bit remote for most readers. One can easily get lost down the rabbit hole of analysis.

If indeed it’s mere words and rhetoric that transform otherwise normal people into criminals and mass murderers, then I suppose I can understand the distorted logic of the far Left that equates words and rhetoric themselves with violence, followed by the demand that they be provided with warnings and safe spaces lest they be triggered by what they hear, read, or learn. As I understand it, the fear is not so much that vulnerable, credulous folks will be magically turned into automatons wound up and set loose in public to enact violent agendas but instead that virulent ideas and knowledge (including many awful truths of history) might cause discomfort and psychological collapse akin to what happens when targets of hate speech and death threats are reduced, say, to quivering agoraphobia. Desire for protection from harm is thus understandable. The problem with such logic, though, is that protections immediately run afoul of free speech, a hallowed but misunderstood American institution that preempts quite a few restrictions many would have placed on the public sphere. Protections also stall learning and truth-seeking straight out of the gate. And besides, preemption of preemption doesn’t work.

In information theory, the notion of a caustic idea taking hold of an unwilling person and having its wicked way with him or her is what’s called a mind virus or meme. The viral metaphor accounts for the infectious nature of ideas as they propagate through the culture. For instance, every once in a while, a charismatic cult emerges and inducts new members, a suicide cluster appears, or suburban housewives develop wildly disproportionate phobias about Muslims or immigrants (or worse, Muslim immigrants!) poised at their doorsteps with intentions of rape and murder. Inflaming these phobias, often done by pundits and politicians, is precisely the point of semantic violence. Everyone is targeted but only a few are affected to the extreme of acting out violently. Milder but still invalid responses include the usual bigotries: nationalism, racism, sexism, and all forms of tribalism, “othering,” or xenophobia that seek to insulate oneself safely among like folks.

Extending the viral metaphor, to protect oneself from infectious ideas requires exposure, not insulation. Think of it as a healthy immune system built up gradually, typically early in life, through slow, steady exposure to harm. The alternative is hiding oneself away from germs and disease, which has the ironic result of weakening the immune system. For instance, I learned recently that peanut allergies can be overcome by gradual exposure — a desensitization process — but are exacerbated by removal of peanuts from one’s environment and/or diet. This is what folks mean when they say the answer to hate speech is yet more (free) speech. The nasty stuff can’t be dealt with properly when it’s quarantined, hidden away, suppressed, or criminalized. Maybe there are exceptions. Science fiction entertains those dangers with some regularity, where minds are shunted aside to become hosts for invaders of some sort. That might be overstating the danger somewhat, but violent eruptions may lend it some credence.

A year ago, I wrote about charges of cultural appropriation being levied upon fiction writers, as though fiction can now only be some watered-down memoir lest some author have the temerity to conjure a character based on someone other than him- or herself. Specifically, I linked to an opinion piece by Lionel Shriver in the NY Times describing having been sanctioned for writing characters based on ideas, identities, and backgrounds other than her own. Shriver has a new article in Prospect Magazine that provides an update, perhaps too soon to survey the scene accurately since the target is still moving, but nonetheless curious with respect to the relatively recent appearance of call-out culture and outrage engines. In her article, Shriver notes that offense and umbrage are now given equal footing with bodily harm and emotional scarring:

Time was that children were taught to turn aside tormentors with the cry, “Sticks and stones may break my bones, but words will never hurt me!” While you can indeed feel injured because Bobby called you fat, the law has traditionally maintained a sharp distinction between bodily and emotional harm. Even libel law requires a demonstration of palpable damage to reputation, which might impact your livelihood, rather than mere testimony that a passage in a book made you cry.

She also points out that an imagined “right not to be offended” is now frequently invoked, even though there is no possibility of avoiding offense if one is actually conscious in the world. For just one rather mundane example, the extraordinary genocidal violence of 20th-century history, once machines and mechanisms (now called WMDs) were applied to warfare (and dare I say it: statecraft), ought to be highly offensive to any humanitarian. That history cannot be erased, though I suppose it can be denied, revised, buried, and/or lost to living memory. Students or others who insist they be excused from being triggered by knowledge of awful events are proverbial ostriches burying their heads in the sand.

As variations of this behavior multiply and gain social approval, the Thought Police are busily mustering against all offense — real, perceived, or wholly imagined — and waging a broad-spectrum sanitation campaign. Shriver believes this could well spell the end of fiction as publishers morph into censors and authors self-censor in an attempt to pass through the SJW gauntlet. Here’s my counter-argument:

rant on/

I feel mightily offended — OFFENDED I say! — at the arrant stupidity of SJWs whose heads are full of straw (and strawmen), who are so clearly confused about what is even possible within the dictates and strictures of, well, reality, and who have accordingly retreated into cocoons of ideation from which others are scourged for failure to adhere to some bizarre, muddleheaded notion of equity. How dare you compel me to think prescribed thoughts emanating from your thought bubble, you damn bullies? I have my own thoughts and feelings deserving of support, maybe even more than yours considering your obvious naïveté about how the world works. Why aren’t you laboring to promote mine but instead clamoring to infect everyone with yours? Why is my writing so resoundingly ignored while you prance upon the stage demanding my attention? You are an affront to my values and sensibilities and can stuff your false piety and pretend virtue where the sun don’t shine. Go ahead and be offended; this is meant to offend. If it’s gonna be you or me who’s transgressed precisely because all sides of issues can’t be satisfied simultaneously, then on this issue, I vote for you to be in the hot seat.

rant off/

This Savage Love column got my attention. As with Dear Abby, Ask Marilyn, or indeed any advice column, I surmise that questions are edited for publication. Still, a couple of minor usage errors caught my eye, which I can let go without further chastising comment. More importantly, question and answer both employ a type of Newspeak commonplace among those attuned to identity politics. Those of us not struggling with identity issues may be less conversant with this specialized language, or it could be a generational thing. Coded speech is not unusual within specialized fields of endeavor. My fascination with nomenclature and neologisms makes me pay attention, though I’m not typically an adopter of hip new coin.

The Q part of Q&A never actually asks a question but provides context to suggest or extrapolate one, namely, “please advise me on my neuro-atypicality.” (I made up that word.) While the Q acknowledges that folks on the autism spectrum are not neurotypical, the word disability is put in quotes (variously, scare quotes, air quotes, or irony quotes), meaning that it is not or should not be considered a real or true disability. Yet the woman acknowledges her own difficulty with social signaling. The A part of Q&A notes a marked sensitivity to social justice among those on the spectrum, acknowledges a correlation with nonstandard gender identity (or is it sexual orientation?), and includes a jibe that standard advice is to mimic neurotypical behaviors, which “tend to be tediously heteronormative and drearily vanilla-centric.” The terms tediously, drearily, and vanilla push unsubtly toward normalization and acceptance of kink and aberrance, as does Savage Love in general. I wrote about this general phenomenon in a post called “Trans is the New Chic.”

Whereas I have no hesitation to express disapproval of shitty people, shitty things, and shitty ideas, I am happy to accept many mere differences, not caring two shits either way. This question asks about a fundamental human behavior: sexual expression. Everyone needs an outlet, and outliers (atypicals, nonnormatives, kinksters, transgressors, etc.) undoubtedly have more trouble than normal folks. Unless living under a rock, you’ve no doubt heard and/or read theories from various quarters that character distortion often stems from sexual repression or lack of sexual access, which describes a large number of societies, historical and contemporary. Some would include the 21st-century U.S. in that category, but I disagree. Sure, we have puritanical roots, recent moral panic over sexual buffoonery and crimes, and a less healthy sexual outlook than, say, European cultures, but we’re also suffused in licentiousness, Internet pornography, and everyday seductions served up in the media via advertising, R-rated cinema, and TV-MA content. It’s a decidedly mixed bag.

Armed with a lay appreciation of sociology, I can’t help but observe that humans are a social species with hierarchies and norms, perhaps not as rigid or prescribed as with insect species, but nonetheless possessing powerful drives toward consensus, cooperation, and categorization. Throwing open the floodgates to wide acceptance of aberrant, niche behaviors strikes me as swimming decidedly upstream in a society populated by a sizable minority of conservatives mightily offended by anything falling outside the heteronormative mainstream. I’m not advocating either way but merely observing the central conflict.

All this said, the thing that has me wondering is whether autism isn’t itself an adaptation to information overload commencing roughly with the rise of mass media in the early 20th century. If one expects that the human mind is primarily an information processor and the only direction is to process ever more information faster and more accurately than in the past, well, I have some bad news: we’re getting worse at it, not better. So while autism might appear to be maladaptive, filtering out useless excess information might unintuitively prove to be adaptive, especially considering the disposition toward analytical, instrumental thinking exhibited by those on the spectrum. How much this style of mind is valued in today’s world is an open question. I also don’t have an answer to the nature/nurture aspect of the issue, which is whether the adaptation/maladaptation is more cultural or biological. I can only observe that it’s on the rise, or at least being recognized and diagnosed more frequently.

I watched a documentary on Netflix called Jim & Andy (2017) that provides a glimpse behind the scenes of the making of Man on the Moon (1999), in which Jim Carrey portrays Andy Kaufman. It’s a familiar story of art imitating life (or is it life imitating art?) as Carrey goes method and essentially channels Kaufman and Kaufman’s alter ego Tony Clifton. A whole gaggle of actors played earlier incarnations of themselves in Man on the Moon and appeared as themselves (without artifice) in Jim & Andy, adding another weird dimension to the goings-on. Actors losing themselves in roles and undermining their sense of self is hardly novel. Regular people lose themselves in their jobs, hobbies, media hype, the glare of celebrity, etc. all the time. From an only slightly broader perspective, we’re all merely actors playing roles, shifting subtly or dramatically based on context. Shakespeare observed it centuries ago. However, the documentary points to a deeper sense of unreality precisely because Kaufman’s principal shtick was to push discomfiting jokes/performances beyond the breaking point, never dropping the act to let his audience in on the joke or provide closure. It’s a manifestation of what I call the Disorientation Protocol.
