Posts Tagged ‘Culture’

Long again this time and a bit contentious. Sorry for trying your patience.

Having watched a few hundred Joe Rogan webcasts by now (previous blog on this topic here), I am pretty well acquainted with guests and ideas that cycle through periodically. This is not a criticism as I’m aware I recycle my own ideas here, which is more nearly thematic than simply repetitive. Among all the MMA folks and comedians, Rogan features people — mostly academics — who might be called thought leaders. A group of them has even been dubbed the “intellectual dark web.” I dunno who coined the phrase or established its membership, but the names might include, in no particular order, Jordan Peterson, Bret Weinstein, Eric Weinstein, Douglas Murray, Sam Harris, Jonathan Haidt, Gad Saad, Camille Paglia, Dave Rubin, Christina Hoff Sommers, and Lawrence Krauss. I doubt any of them would have been considered cool kids in high school, and it’s unclear whether they’re any cooler now that they’ve all achieved some level of Internet fame on top of other public exposure. Only a couple seem especially concerned with being thought cool now (names withheld), though the chase for clicks, views, likes, and Patreon support is fairly upfront. That they can usually sit down and have meaningful conversations without rancor (admirably facilitated by Joe Rogan up until one of his own oxen is gored, less admirably by Dave Rubin) about free speech, Postmodernism, social justice warriors, politics, or the latest meme means that the cliquishness of high school has relaxed considerably.

I’m pleased (I guess) that today’s public intellectuals have found an online medium in which to develop. Lots of imitators are out there putting up their own YouTube channels to proselytize their opinions. However, I still prefer to get deeper understanding from books (and to a lesser degree, blogs and articles online), which are far better at delivering thoughtful analysis. The conversational style of the webcast is relentlessly up-to-date and entertaining enough but relies too heavily on charisma. And besides, so many of these folks are such fast talkers, often talking over each other to win imaginary debate points or just dominate the conversational space, that they frustrate and bewilder more than they communicate or convince.



A year ago, I wrote about charges of cultural appropriation being levied upon fiction writers, as though fiction can now only be some watered-down memoir lest some author have the temerity to conjure a character based on someone other than him- or herself. Specifically, I linked to an opinion piece by Lionel Shriver in the NY Times describing having been sanctioned for writing characters based on ideas, identities, and backgrounds other than her own. Shriver has a new article in Prospect Magazine that provides an update, perhaps too soon to survey the scene accurately since the target is still moving, but nonetheless curious with respect to the relatively recent appearance of call-out culture and outrage engines. In her article, Shriver notes that offense and umbrage are now given equal footing with bodily harm and emotional scarring:

Time was that children were taught to turn aside tormentors with the cry, “Sticks and stones may break my bones, but words will never hurt me!” While you can indeed feel injured because Bobby called you fat, the law has traditionally maintained a sharp distinction between bodily and emotional harm. Even libel law requires a demonstration of palpable damage to reputation, which might impact your livelihood, rather than mere testimony that a passage in a book made you cry.

She also points out that an imagined “right not to be offended” is now frequently invoked, even though there is no possibility of avoiding offense if one is actually conscious in the world. For just one rather mundane example, the extraordinary genocidal violence of 20th-century history, once machines and mechanisms (now called WMDs) were applied to warfare (and dare I say it: statecraft), ought to be highly offensive to any humanitarian. That history cannot be erased, though I suppose it can be denied, revised, buried, and/or lost to living memory. Students or others who insist they be excused from being triggered by knowledge of awful events are proverbial ostriches burying their heads in the sand.

As variations of this behavior multiply and gain social approval, the Thought Police are busily mustering against all offense — real, perceived, or wholly imagined — and waging a broad-spectrum sanitization campaign. Shriver believes this could well spell the end of fiction as publishers morph into censors and authors self-censor in an attempt to pass through the SJW gauntlet. Here’s my counter-argument:

rant on/

I feel mightily offended — OFFENDED I say! — at the arrant stupidity of SJWs whose heads are full of straw (and strawmen), who are so clearly confused about what is even possible within the dictates and strictures of, well, reality, and who have accordingly retreated into cocoons of ideation from which others are scourged for failure to adhere to some bizarre, muddleheaded notion of equity. How dare you compel me to think prescribed thoughts emanating from your thought bubble, you damn bullies? I have my own thoughts and feelings deserving of support, maybe even more than yours considering your obvious naïveté about how the world works. Why aren’t you laboring to promote mine but instead clamoring to infect everyone with yours? Why is my writing so resoundingly ignored while you prance upon the stage demanding my attention? You are an affront to my values and sensibilities and can stuff your false piety and pretend virtue where the sun don’t shine. Go ahead and be offended; this is meant to offend. If it’s gonna be you or me who’s transgressed precisely because all sides of issues can’t be satisfied simultaneously, then on this issue, I vote for you to be in the hot seat.

rant off/

I remarked in an earlier blog that artists, being hypersensitive to emergent patterns and cultural vibes, often get to ideas sooner than the masses and express their sensibilities through creative endeavor. Those expressions in turn give watchers, viewers, listeners, readers, etc. a way of understanding the world through the artist’s interpretive lens. Interpretations may be completely fictitious, based on real-life events, or merely figurative as the medium allows. They are nonetheless an inevitable reflection of ourselves. Philistines who fail to appreciate that the arts function by absorbing and processing human experience at a deep, intuitive level may insist that the arts are optional or unworthy of attention or financial support. That’s an opinion not at all borne out in the culture, however, and though support may be vulnerable to shifts in valuation (e.g., withdrawal of federal funding for the NEA and PBS), the creative class will always seek avenues of expression, even at personal cost. Democratization has made production and distribution of some media quite cheap compared to a couple decades ago. Others remain undeniably labor intensive.

What sparked my thinking are several TV series that have caught my attention despite my generally low level of attention to such media. I haven’t watched broadcast television in over a decade, but the ability to stream TV programming has made shows I ignored for years far easier to tune in on my own terms and schedule. “Tune in” is of course the wrong metaphor, but suffice it to say I’ve awarded some of my attention to shows that had until now fallen outside my scope, cinema being more to my liking. The three shows I’ve been watching (only partway through each) are The Americans, Homeland, and Shameless. The first two are political thrillers (spy stuff) whereas the last is a slice-of-life family drama, which often veers toward comedy but keeps delivering tragedy instead. Not quite the same thing as dark comedy. Conflict is necessary for dramatic purposes, but the ongoing conflict in each of these shows flirts with the worst sorts of disaster, e.g., the spies being discovered and unmasked and the family being thrown out of its home and broken up. The episodic scenarios the writers concoct to threaten catastrophe at every step or at any moment get tiresome after a while. Multiple seasons ensure that dramatic tension is largely dispelled, since the main characters are present year over year. (The trend toward killing off major characters in other popular TV dramas is not yet widespread.) But still, it’s no way to live, constantly in disaster mode. No doubt I’ve cherry-picked three shows from a huge array of entertainments on offer.

Where art reflects reality is that we all now live in the early 21st century under multiple, constantly disquieting threats, large and small, including sudden climate change and ecological disaster, nuclear annihilation, meteor impacts, eruption of the shield volcano under Yellowstone, the Ring of Fire becoming active again (leading to more volcanic and earthquake activity), geopolitical dysfunction on a grand scale, and of course, global financial collapse. This, too, is no way to live. Admittedly, no one was ever promised a care-free life. Yet our inability to manage our own social institutions or shepherd the earth (as though that were our mandate) promises catastrophes in the fullness of time that have no parallels in human history. We’re not flirting with disaster so much as courting it.

Sociologists and historians prepare scholarly works that attempt to provide a grand narrative of the times. Cinema seems to be preoccupied with planetary threats requiring superhero interventions. Television, on the other hand, with its serial form, plumbs the daily angst of its characters to drive suspense, keeping viewers on pins and needles while avoiding final resolution. That final resolution is inevitably disaster, but it won’t appear for a few seasons at least — after the dramatic potential is wrung out of the scenario. I can’t quite understand why these shows are consumed for entertainment (by me no less than anyone else) except perhaps to distract from the clear and present dangers we all face every day.

Oddly, there is no really good antonym for perfectionism. Suggestions include sloppiness, carelessness, and disregard. I’ve settled on approximation, which carries far less moral weight. I raise the contrast between perfectionism and approximation because a recent study published in Psychological Bulletin entitled “Perfectionism Is Increasing Over Time: A Meta-Analysis of Birth Cohort Differences From 1989 to 2016” makes an interesting observation. Here’s the abstract:

From the 1980s onward, neoliberal governance in the United States, Canada, and the United Kingdom has emphasized competitive individualism and people have seemingly responded, in kind, by agitating to perfect themselves and their lifestyles. In this study, the authors examine whether cultural changes have coincided with an increase in multidimensional perfectionism in college students over the last 27 years. Their analyses are based on 164 samples and 41,641 American, Canadian, and British college students, who completed the Multidimensional Perfectionism Scale (Hewitt & Flett, 1991) between 1989 and 2016 (70.92% female, Mage = 20.66). Cross-temporal meta-analysis revealed that levels of self-oriented perfectionism, socially prescribed perfectionism, and other-oriented perfectionism have linearly increased. These trends remained when controlling for gender and between-country differences in perfectionism scores. Overall, in order of magnitude of the observed increase, the findings indicate that recent generations of young people perceive that others are more demanding of them, are more demanding of others, and are more demanding of themselves.

The notion of perfection, perfectness, perfectibility, etc. has a long, tortured history in philosophy, religion, ethics, and other domains I won’t even begin to unpack. From the perspective of the above study, let’s just say that the upswing in perfectionism is about striving to achieve success, however one assesses it (education, career, relationships, lifestyle, ethics, athletics, aesthetics, etc.). The study narrows its subject group to college students (at the outset of adult life) between 1989 and 2016 and characterizes the social milieu as neoliberal, hyper-competitive, meritocratic, and pressured to succeed in a dog-eat-dog environment. How far back into childhood the results of the study (agitation) extend is a good question. If the trope about parents obsessing and competing over preschool admission is accurate (maybe just a NYC thang), then it goes all the way back to toddlers. So much for (lost) innocence purchased and perpetuated through late 20th- and early 21st-century affluence. I suspect college students are responding to awareness of two novel circumstances: (1) the likelihood they will never achieve levels of success comparable to their own parents, especially financial (a major reversal of historical trends), and (2) the recognition that to best enjoy the fruits of life, a quiet, reflective, anonymous, ethical, average life is now quite insufficient. Regarding the second of these, we are inundated by media showing rich celebrities (no longer just glamorous actors/entertainers) balling out of control, and onlookers are enjoined to “keep up.” The putative model is out there, unattainable for most but often awarded by randomness, undercutting the whole enterprise of trying to achieve perfection.


Adding one, revising one. The added one is The Daily Impact, written by Tom Lewis, author of a couple books warning of the collapse of industrial civilization. Lewis appears to be a news junkie, so posts are often torn from the day’s headlines. He’s a good read and not afraid to be sardonically funny. The revised one is The Compulsive Explainer, written by Hal Smith. Blogs come and go, and I had thought that The Compulsive Explainer had stopped being updated last summer, but I see that the author merely switched from WordPress to Blogger without any indication. I suspect Smith isn’t much read (if commentary is a useful measure) but probably deserves to be, not least for his expatriate perspective.

Because this entry is so slight, there is considerably more unrelated content beneath the fold.

Twice in the last month I stumbled across David Benatar, an anti-natalist philosopher, first in a podcast with Sam Harris and again in a profile of him in The New Yorker. Benatar is certainly an interesting fellow, and, I suspect, earnest in his beliefs and academic work, but I couldn’t avoid shrugging as he gets caught in the sort of logical traps that plague hyperintellectual folks. (Sam Harris is prone to the same problem.) The anti-natalist philosophy in a nutshell is finding, after tallying the pros and cons of living (sometimes understood as happiness or enjoyment versus suffering), that on balance, it would probably be better never to have lived. Benatar doesn’t apply the finding retroactively by suggesting folks end their lives sooner rather than later, but he does recommend that new life should not be brought into the world — an interdiction almost no parent would consider for more than a moment.

The idea that we are born against our will, never asked whether we wanted life in the first place, is an obvious conundrum but treated as a legitimate line of inquiry in Benatar’s philosophy. The kid who throws the taunt “I never asked to be born!” at a parent in the midst of an argument might score an emotional hit, but there is no logic to the assertion. Language is full of logic traps like this, such as “an infinity of infinities” (or multiverse), “what came before the beginning?” or “what happens after the end?” Most know to disregard the first two, but entire religions are based on seeking the path to the (good) afterlife as if conjuring such a proposition manifests it in reality.

This Savage Love column got my attention. As with Dear Abby, Ask Marilyn, or indeed any advice column, I surmise that questions are edited for publication. Still, a couple minor usage errors attracted my eye, which I can let go without further chastising comment. More importantly, question and answer both employ a type of Newspeak commonplace among those attuned to identity politics. Those of us not struggling with identity issues may be less conversant with this specialized language, or it could be a generational thing. Coded speech is not unusual within specialized fields of endeavor. My fascination with nomenclature and neologisms makes me pay attention, though I’m not typically an adopter of hip new coinages.

The Q part of Q&A never actually asks a question but provides context to suggest or extrapolate one, namely, “please advise me on my neuro-atypicality.” (I made up that word.) While the Q acknowledges that folks on the autism spectrum are not neurotypical, the word disability is put in quotes (variously, scare quotes, air quotes, or irony quotes), meaning that it is not or should not be considered a real or true disability. Yet the woman acknowledges her own difficulty with social signaling. The A part of Q&A notes a marked sensitivity to social justice among those on the spectrum, acknowledges a correlation with nonstandard gender identity (or is it sexual orientation?), and includes a jibe that standard advice is to mimic neurotypical behaviors, which “tend to be tediously heteronormative and drearily vanilla-centric.” The terms tediously, drearily, and vanilla push unsubtly toward normalization and acceptance of kink and aberrance, as does Savage Love in general. I wrote about this general phenomenon in a post called “Trans is the New Chic.”

Whereas I have no hesitation to express disapproval of shitty people, shitty things, and shitty ideas, I am happy to accept many mere differences as not caring two shits either way. This question asks about a fundamental human behavior: sexual expression. Everyone needs an outlet, and outliers (atypicals, nonnormatives, kinksters, transgressors, etc.) undoubtedly have more trouble than normal folks. Unless living under a rock, you’ve no doubt heard and/or read theories from various quarters that character distortion often stems from sexual repression or lack of sexual access, which describes a large number of societies, historical and contemporary. Some would include the 21st-century U.S. in that category, but I disagree. Sure, we have puritanical roots, recent moral panic over sexual buffoonery and crimes, and a less healthy sexual outlook than, say, European cultures, but we’re also suffused in licentiousness, Internet pornography, and everyday seductions served up in the media via advertising, R-rated cinema, and TV-MA content. It’s a decidedly mixed bag.

Armed with a lay appreciation of sociology, I can’t help but to observe that humans are a social species with hierarchies and norms, not as rigid or prescribed perhaps as with insect species, but nonetheless possessing powerful drives toward consensus, cooperation, and categorization. Throwing open the floodgates to wide acceptance of aberrant, niche behaviors strikes me as swimming decidedly upstream in a society populated by a sizable minority of conservatives mightily offended by anything falling outside the heteronormative mainstream. I’m not advocating either way but merely observing the central conflict.

All this said, the thing that has me wondering is whether autism isn’t itself an adaptation to information overload commencing roughly with the rise of mass media in the early 20th century. If one expects that the human mind is primarily an information processor and the only direction is to process ever more information faster and more accurately than in the past, well, I have some bad news: we’re getting worse at it, not better. So while autism might appear to be maladaptive, filtering out useless excess information might counterintuitively prove to be adaptive, especially considering the disposition toward analytical, instrumental thinking exhibited by those on the spectrum. How much this style of mind is valued in today’s world is an open question. I also don’t have an answer to the nature/nurture aspect of the issue, which is whether the adaptation/maladaptation is more cultural or biological. I can only observe that it’s on the rise, or at least being recognized and diagnosed more frequently.

I watched a documentary on Netflix called Jim & Andy (2017) that provides a glimpse behind the scenes of the making of Man on the Moon (1999) where Jim Carrey portrays Andy Kaufman. It’s a familiar story of art imitating life (or is it life imitating art?) as Carrey goes method and essentially channels Kaufman and Kaufman’s alter ego Tony Clifton. A whole gaggle of actors played earlier incarnations of themselves in Man on the Moon and appeared as themselves (without artifice) in Jim & Andy, adding another weird dimension to the goings on. Actors losing themselves in roles and undermining their sense of self is hardly novel. Regular people lose themselves in their jobs, hobbies, media hype, glare of celebrity, etc. all the time. From an only slightly broader perspective, we’re all merely actors playing roles, shifting subtly or dramatically based on context. Shakespeare observed it centuries ago. However, the documentary points to a deeper sense of unreality precisely because Kaufman’s principal shtick was to push discomfiting jokes/performances beyond the breaking point, never dropping the act to let his audience in on the joke or provide closure. It’s a manifestation of what I call the Disorientation Protocol.


Here’s the last interesting bit I am lifting from Anthony Giddens’s The Consequences of Modernity. Then I will be done with this particular book-blogging project. As part of Giddens’s discussion of the risk profile of modernity, he characterizes risk as either objective or perceived and further divides it into seven categories:

  1. globalization of risk (intensity)
  2. globalization of risk (frequency)
  3. environmental risk
  4. institutionalized risk
  5. knowledge gaps and uncertainty
  6. collective or shared risk
  7. limitations of expertise

Some overlap exists, and I will not distinguish them further. The first two are of primary significance today for obvious reasons. Although the specter of doomsday resulting from a nuclear exchange has been present since the 1950s, Giddens (writing in 1988) provides this snapshot of today’s issues:

The sheer number of serious risks in respect of socialised nature is quite daunting: radiation from major accidents at nuclear power-stations or from nuclear waste; chemical pollution of the seas sufficient to destroy the phytoplankton that renews much of the oxygen in the atmosphere; a “greenhouse effect” deriving from atmospheric pollutants which attack the ozone layer, melting part of the ice caps and flooding vast areas; the destruction of large areas of rain forest which are a basic source of renewable oxygen; and the exhaustion of millions of acres of topsoil as a result of widespread use of artificial fertilisers. [p. 127]

As I often point out, these dangers were known 30–40 years ago (in truth, much longer), but they have only worsened with time through political inaction and/or social inertia. After I began to investigate and better understand the issues roughly a decade ago, I came to the conclusion that the window of opportunity to address these risks and their delayed effects had already closed. In short, we’re doomed and living on borrowed time as the inevitable consequences of our actions slowly but steadily manifest in the world.

So here’s the really interesting part. The modern worldview bestows confidence borne out of expanding mastery of the built environment, where risk is managed and reduced through expert systems. Mechanical and engineering knowledge figure prominently and support a cause-and-effect mentality that has grown ubiquitous in the computing era, with its push-button inputs and outputs. However, the high modern outlook is marred by overconfidence in our competence to avoid disaster, often of our own making. Consider the abject failure of 20th-century institutions to handle geopolitical conflict without devolving into world war and multiple genocides. Or witness periodic crashes of financial markets, two major nuclear accidents, and the loss of two space shuttles and numerous rockets. Though all entail risk, high-profile failures showcase our overconfidence. Right now, engineers (software and hardware) are confident they can deliver safe self-driving vehicles yet are blithely ignoring (says me, maybe not) major ethical dilemmas regarding liability and technological unemployment. Those are apparently problems for someone else to solve.

Since the start of the Industrial Revolution, we’ve barrelled headlong into one sort of risk after another, some recognized at the time, others only apparent after the fact. Nuclear weapons are the best example, but many others exist. The one I raise frequently is the live social experiment undertaken with each new communications technology (radio, cinema, telephone, television, computer, social networks) that upsets and destabilizes social dynamics. The current ruckus fomented by the radical left (especially in the academy but now infecting other environments) regarding silencing of free speech (thus, thought policing) is arguably one concomitant.

According to Giddens, the character of modern risk contrasts with that of the premodern. The scale of risk prior to the 17th century was contained, and the expectation of social continuity was strong. Risk was also transmuted through magical thinking (superstition, religion, ignorance, wishfulness) into providential fortuna or mere bad luck, which led to feelings of relative security rather than despair. Modern risk has grown so widespread, consequential, and soul-destroying, and sits at such remove from individual control, that it breeds helplessness and hopelessness; those not numbed by the litany of potential worries afflicting daily life (existential angst or ontological insecurity) often develop depression and other psychological compulsions and disturbances. Most of us, if aware of globalized risk, set it aside so that we can function and move forward in life. Giddens says that this conjures up anew a sense of fortuna, that our fate is no longer within our control. This

relieves the individual of the burden of engagement with an existential situation which might otherwise be chronically disturbing. Fate, a feeling that things will take their own course anyway, thus reappears at the core of a world which is supposedly taking rational control of its own affairs. Moreover, this surely exacts a price on the level of the unconscious, since it essentially presumes the repression of anxiety. The sense of dread which is the antithesis of basic trust is likely to infuse unconscious sentiments about the uncertainties faced by humanity as a whole. [p. 133]

In effect, the nature of risk has come full circle (completed a revolution, thus, revolutionized risk) from fate to confidence in expert control and back to fate. Of course, a flexibility of perspective is typical as situation demands — it’s not all or nothing — but the overarching character is clear. Giddens also provides this quote by Susan Sontag that captures what he calls the low-probability, high-consequence character of modern risk:

A permanent modern scenario: apocalypse looms — and it doesn’t occur. And still it looms … Apocalypse is now a long-running serial: not ‘Apocalypse Now,’ but ‘Apocalypse from now on.’ [p. 134]

Commentary on the previous post poses a challenging question: having perceived that civilization is set on a collision course with reality, what is being done to address that existential problem? More pointedly, what are you doing? Most rubes seem to believe that we can technofix the problem, alter course and set off in a better, even utopian direction filled with electronic gadgetry (e.g., the Internet of things), death-defying medical technologies (as though that goal were even remotely desirable), and an endless supply of entertainments and ephemera curated by media shilling happy visions of the future (in high contrast with actual deprivation and suffering). Realists may appreciate that our charted course can’t be altered anymore considering the size and inertia of the leviathan industrial civilization has become. Figuratively, we’re aboard the RMS Titanic, full steam ahead, killer iceberg(s) looming in the darkness. The only option is to see our current path through to its conclusion. Maybe there’s a middle ground in between, where a hard reset foils our fantasies but at least allows (some of) us to continue living on the surface of Planet Earth.

Problem is, the gargantuan, soul-destroying realization of near-term extinction has the potential to radicalize even well-balanced people, and the question “what are you doing?” is tantamount to an accusation that you’re not doing enough because, after all, nothing will ever be enough. We’ve been taught repeatedly to eat right, brush our teeth, get some exercise, and be humble. Yet those simple requisites for a happy, healthy life are frequently ignored. How likely is it that we will then heed the dire message that everything we know will soon be swept away?

The mythological character Cassandra, who prophesied doom, was cursed to never be believed, as was Chicken Little. The fabulous Boy Who Cried Wolf (from Aesop’s Fables) was cursed with bad timing. Sandwich-board prophets, typically hirsute Jesus freaks with some version of the message “Doom is nigh!” inscribed on the boards, are a cliché almost always now understood as set-ups for some sort of joke.

It’s an especially sick joke when the unheeded message proves to be true. If one is truly radicalized, then self-immolation on the sidewalk in front of the White House may be one measure of commitment, but the irony is that no one takes such behavior seriously except as an indication of how unhinged the prophet of doom has gotten (suggesting a different sort of commitment). Yet that’s where we’ve arrived in the 21st century. Left/right, blue/red factions have abandoned the centrist middle ground and moved conspicuously toward the radical fringes in what’s being called extreme social fragmentation. On some analyses, the rising blood tide of terrorism and mass murder is an example of an inchoate protest against the very nature of existence, a complete ontological rejection. When the ostensible purpose of, say, the Las Vegas shooter was to take out as many people as possible, having rejected other potential sites as not promising high enough body counts, it may not register in the public mind as a cry in the wilderness, an extreme statement that modern life is no longer worth living, but the action speaks for itself even in the absence of a formal manifesto articulating a collapsed philosophy.

In such a light, the sandwich-board prophet, by eschewing violence and hysteria, may actually be performing a modest ministerial service. Wake up and recognize that all living things must eventually die and that our time is short. Cherish what you have, be among those you love and who love you, and brace yourself.