Posts Tagged ‘Memes’

I’ve been modestly puzzled of late to observe that, on the one hand, those in the U.S. and Canada who have only just reached the age of majority (a/k/a the threshold of adulthood, which is not strictly the same as “the age of sexual consent, marriageable age, school leaving age, drinking age, driving age, voting age, smoking age, gambling age, etc.” according to the link) are disregarded with respect to some political activism while, on the other hand, they’re admired for other political activism. Whether young adults are to be taken seriously seems to be issue-specific. If one is agitating for some aspect of identity politics, or is a Social Justice Warrior (SJW), one can be discredited as simply being too young to understand things properly, whereas advocating gun control (e.g., in the wake of the Parkland, Florida shootings in February) is recognized as well within a youthful mandate. Survivors of violence and mayhem seem to be uniquely immune to gun advocates trotting out the meme “now is not the time.”

As it happens, I agree that identity politics is a load of horseshit and tighter gun control (no, not taking away everyone’s guns totally) needs to be tried. But I haven’t arrived at either position because youth are either too youthful or wizened enough by horrific experience to understand. Hanging one’s positions on the (dis)qualification of age is a red herring, a meaningless distraction from the issues themselves. Rather, if thoughtful consideration is applied to the day’s issues, which I daresay is not an easy prospect, one should ideally arrive at positions based on any number of criteria, some of which may conflict with others. For instance, I used to be okay (not an enthusiastic supporter, mind you) with the death penalty on a number of grounds but changed my opinion for purely pragmatic reasons. The sheer cost of automatic appeals and other safeguards to ensure that innocents are not wrongly convicted and executed, a cost borne by U.S. taxpayers, is so onerous that to prosecute through to execution looks less like justice and more like maniacal vengeance. Life in prison without the possibility of parole is a much saner and less costly project in comparison.

With intractable debates and divisive issues (e.g., abortion, free speech, the right to bear arms, immigration, religion, the Israel/Palestine conflict, euthanasia, etc.) plaguing public life, one might wonder: how do we get everyone on board? Alternatively, how do we at least agree to be civil in spite of our disagreements? I have two replies but no solutions. The first is to recognize that some issues are indeed intractable and insoluble, so graceful acceptance that an opposing opinion or perspective will always be present is needed lest one twist and writhe inconsolably when one’s cherished perspective is not held universally. That’s not necessarily the same as giving up or succumbing to fatalism. Rather, it’s recognition that banging one’s head against certain walls is futile. The second is to recognize that opposing opinions are needed to avoid unhealthy excess in social environments. Put another way, heterodoxy staves off orthodoxy. Many historical practices we now regard as barbaric were abandoned or outlawed precisely because consensus opinion swung from one side to the other. Neil Postman called this a thermostatic response in several of his books. Other barbaric behaviors have been only partially addressed and require further agitation to invalidate fully. I won’t bother listing examples here, but I could compile a list rather quickly.

In the sense that a picture is worth a thousand words, this cartoon caught my immediate attention (for attribution, taken from here):

[Cartoon: comforting lies vs. unpleasant truths]

Search engines reveal quite a few treatments of the central conflict depicted here, including other versions of essentially the same cartoon. Doubtful anything I could say would add much to the body of analysis and advice already out there. Still, the image called up a whole series of memories for me rather quickly, the primary one being the (only) time I vacationed in Las Vegas about a decade ago.

A year ago, I wrote about charges of cultural appropriation being levied upon fiction writers, as though fiction can now only be some watered-down memoir lest some author have the temerity to conjure a character based on someone other than him- or herself. Specifically, I linked to an opinion piece by Lionel Shriver in the NY Times describing having been sanctioned for writing characters based on ideas, identities, and backgrounds other than her own. Shriver has a new article in Prospect Magazine that provides an update, perhaps too soon to survey the scene accurately since the target is still moving, but nonetheless curious with respect to the relatively recent appearance of call-out culture and outrage engines. In her article, Shriver notes that offense and umbrage are now given equal footing with bodily harm and emotional scarring:

Time was that children were taught to turn aside tormentors with the cry, “Sticks and stones may break my bones, but words will never hurt me!” While you can indeed feel injured because Bobby called you fat, the law has traditionally maintained a sharp distinction between bodily and emotional harm. Even libel law requires a demonstration of palpable damage to reputation, which might impact your livelihood, rather than mere testimony that a passage in a book made you cry.

He also points out that an imagined “right not to be offended” is now frequently invoked, even though there is no possibility of avoiding offense if one is actually conscious in the world. For just one rather mundane example, the extraordinary genocidal violence of 20th-century history, once machines and mechanisms (now called WMDs) were applied to warfare (and dare I say it: statecraft), ought to be highly offensive to any humanitarian. That history cannot be erased, though I suppose it can be denied, revised, buried, and/or lost to living memory. Students or others who insist they be excused from being triggered by knowledge of awful events are proverbial ostriches burying their heads in the sand.

As variations of this behavior multiply and gain social approval, the Thought Police are busily mustering against all offense — real, perceived, or wholly imagined — and waging a broad-spectrum sanitization campaign. Shriver believes this could well spell the end of fiction as publishers morph into censors and authors self-censor in an attempt to pass through the SJW gauntlet. Here’s my counter-argument:

rant on/

I feel mightily offended — OFFENDED I say! — at the arrant stupidity of SJWs whose heads are full of straw (and strawmen), who are so clearly confused about what is even possible within the dictates and strictures of, well, reality, and who have accordingly retreated into cocoons of ideation from which others are scourged for failure to adhere to some bizarre, muddleheaded notion of equity. How dare you compel me to think prescribed thoughts emanating from your thought bubble, you damn bullies? I have my own thoughts and feelings deserving of support, maybe even more than yours considering your obvious naïveté about how the world works. Why aren’t you laboring to promote mine but instead clamoring to infect everyone with yours? Why is my writing so resoundingly ignored while you prance upon the stage demanding my attention? You are an affront to my values and sensibilities and can stuff your false piety and pretend virtue where the sun don’t shine. Go ahead and be offended; this is meant to offend. If it’s gonna be you or me who’s transgressed precisely because all sides of issues can’t be satisfied simultaneously, then on this issue, I vote for you to be in the hot seat.

rant off/

Oddly, there is no really good antonym for perfectionism. Suggestions include sloppiness, carelessness, and disregard. I’ve settled on approximation, which carries far less moral weight. I raise the contrast between perfectionism and approximation because a recent study published in Psychological Bulletin entitled “Perfectionism Is Increasing Over Time: A Meta-Analysis of Birth Cohort Differences From 1989 to 2016” makes an interesting observation. Here’s the abstract:

From the 1980s onward, neoliberal governance in the United States, Canada, and the United Kingdom has emphasized competitive individualism and people have seemingly responded, in kind, by agitating to perfect themselves and their lifestyles. In this study, the authors examine whether cultural changes have coincided with an increase in multidimensional perfectionism in college students over the last 27 years. Their analyses are based on 164 samples and 41,641 American, Canadian, and British college students, who completed the Multidimensional Perfectionism Scale (Hewitt & Flett, 1991) between 1989 and 2016 (70.92% female, M_age = 20.66). Cross-temporal meta-analysis revealed that levels of self-oriented perfectionism, socially prescribed perfectionism, and other-oriented perfectionism have linearly increased. These trends remained when controlling for gender and between-country differences in perfectionism scores. Overall, in order of magnitude of the observed increase, the findings indicate that recent generations of young people perceive that others are more demanding of them, are more demanding of others, and are more demanding of themselves.
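
As an aside for readers unfamiliar with the method named in the abstract: a cross-temporal meta-analysis of this kind boils down, roughly, to regressing each sample’s mean score on the year its data were collected, weighted by sample size. The sketch below is only an illustration of that general idea under my own assumptions; the numbers are invented and this is not the study’s code or data.

```python
# Minimal sketch of a cross-temporal trend estimate (illustrative only).
# All figures below are made up; the real study used 164 samples and
# controlled for gender and country, which this toy example omits.
import numpy as np

# Hypothetical per-sample summaries: (year, mean perfectionism score, sample size)
samples = [
    (1989, 70.1, 210),
    (1995, 71.3, 180),
    (2002, 72.0, 305),
    (2009, 73.4, 260),
    (2016, 74.6, 330),
]

years = np.array([s[0] for s in samples], dtype=float)
means = np.array([s[1] for s in samples])
sizes = np.array([s[2] for s in samples], dtype=float)

# Weighted least-squares fit of mean score on year; np.polyfit's w parameter
# expects the square root of the desired WLS weights (here, sample size).
slope, intercept = np.polyfit(years, means, deg=1, w=np.sqrt(sizes))

print(f"Estimated change per year: {slope:.3f} scale points")
print(f"Predicted 1989 level: {slope * 1989 + intercept:.1f}")
print(f"Predicted 2016 level: {slope * 2016 + intercept:.1f}")
```

The sign and size of the fitted slope are what license a claim such as “perfectionism has linearly increased”; “controlling for gender and between-country differences” amounts to adding those covariates to the same kind of regression.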

The notion of perfection, perfectness, perfectibility, etc. has a long, tortured history in philosophy, religion, ethics, and other domains I won’t even begin to unpack. From the perspective of the above study, let’s just say that the upswing in perfectionism is about striving to achieve success, however one assesses it (education, career, relationships, lifestyle, ethics, athletics, aesthetics, etc.). The study narrows its subject group to college students (at the outset of adult life) between 1989 and 2016 and characterizes the social milieu as neoliberal, hyper-competitive, meritocratic, and pressured to succeed in a dog-eat-dog environment. How far back into childhood the study’s results (that agitation to perfect oneself) extend is a good question. If the trope about parents obsessing and competing over preschool admission is accurate (maybe just a NYC thang), then it goes all the way back to toddlers. So much for (lost) innocence purchased and perpetuated through late 20th- and early 21st-century affluence. I suspect college students are responding to awareness of two novel circumstances: (1) the likelihood that they will never achieve levels of success comparable to their own parents’, especially financial (a major reversal of historical trends), and (2) the recognition that to best enjoy the fruits of life, a quiet, reflective, anonymous, ethical, average life is now quite insufficient. Regarding the second of these, we are inundated by media showing rich celebrities (no longer just glamorous actors/entertainers) balling out of control, and onlookers are enjoined to “keep up.” The putative model is out there, unattainable for most but often awarded by randomness, undercutting the whole enterprise of trying to achieve perfection.

Be forewarned: this is long and self-indulgent. Kinda threw everything and the kitchen sink at it.

In the August 2017 issue of Harper’s Magazine, Walter Kirn’s “Easy Chair” column called “Apocalypse Always” revealed his brief, boyhood fascination with dystopian fiction. This genre has been around for a very long time, as the Cassandra myth attests. Kirn’s column is more concerned with “high mid-twentieth-century dystopian fiction,” which in his view is now classic and canonical, an entire generation of Baby Boomers having been educated in such patterned thought. A new wave of dystopian fiction appeared in the 1990s and yet another more recently in the form of Young Adult novels (and films) that arguably serve better as triumphal coming-of-age stories albeit under dystopian circumstances. Kirn observes a perennial theme present in the genre: the twin disappearances of freedom and information:

In the classic dystopias, which concern themselves with the lack of freedom and not with surplus freedom run amok (the current and unforeseen predicament of many), society is superbly well organized, resembling a kind of hive or factory. People are sorted, classified, and ranked, their individuality suppressed through goon squads, potent narcotics, or breeding programs. Quite often, they wear uniforms, and express themselves, or fail to, in ritual utterance and gestures.

Whether Americans in 2018 resemble hollowed-out zombies suffering under either boot-heel or soft-serve oppression is a good question. Some would argue just that in homage to classic dystopias. Kirn suggests briefly that we might instead suffer from runaway anarchy, where too much freedom and licentiousness have led instead to a chaotic and disorganized society populated by citizens who can neither govern nor restrain themselves.

Disappearance of information might be understood in at least three familiar aspects of narrative framing: what happened to get us to this point (past as exposition, sometimes only hinted at), what the hell is going on (present as conflict and action), and how it gets fixed (future as resolution and denouement). Strict control over information exercised by classic dystopian despots doesn’t track to conditions under which we now find ourselves, where more disorganized, fraudulent, and degraded information than ever is available alongside small caches of wisdom and understanding buried somewhere in the heap and discoverable only with the benefit of critical thinking flatly lost on at least a couple generations of miseducated graduates. However, a coherent narrative of who and what we are and what realistic prospects the future may hold has not emerged since the stifling version of the 1950s nuclear family and middle-class consumer contentment. Kirn makes this comparison directly, where classic dystopian fiction

focus[es] on bureaucracy, coercion, propaganda, and depersonalization, overstates both the prowess of the hierarchs and the submissiveness of the masses, whom it still thinks of as the masses. It does not contemplate Trump-style charlatanism at the top, or a narcissistic populace that prizes attention over privacy. The threats to individualism are paramount; the scourge of surplus individualism, with everyone playing his own dunce king and slurping up resources until he bursts, goes unexplored.

Kirn’s further observations are worth a look. Go read for yourself.

The phrase fight or flight is often invoked to describe an instinctual response to a threat to survival or wellbeing, especially physical attack. The response is typically accompanied by a rush of adrenaline that overwhelms the rational mind and renders preplanning moot. The phrase is among the most ubiquitous examples of a false binary: a limiting choice between two options. It’s false precisely because other options exist, and it’s further complicated by the fact that actual responses to threat arguably fall within more than one category. Other examples of false binaries include with/against us, Republican/Democrat, tradition/progress, and religious/secular. Some would include male/female, but that’s a can of worms these days, so I’d prefer to leave it alone. With respect to fight/flight, options might be better characterized as fight/flight/freeze/feign/fail, with acknowledgment that category boundaries are unclear. Let me characterize each in turn.

Fight. Aggressive types may default to fighting in response to provocation. With combat training, otherwise submissive types may become more confident and thus willing to fight. Of course, level of threat and likelihood of success and/or survival figure into when one decides to engage, even with snap judgments. Some situations also admit no other response: gotta fight.

Flight. When available, evading direct confrontation may be preferable to risking bodily harm. High threat level often makes flight a better strategy than fighting, meaning flight is not always a mark of cowardice. Flight is sometimes moot, as well. For instance, humans can’t outrun bears (or wolves, or dogs, pick your predator), so if one retains one’s wits in the face of a bear charge, another response might be better, though reason may have already departed the scene.

Freeze. Freezing in place might be one of two (or more) things: paralysis in the face of threat or psychological denial of what’s happening. Both amount to something to the effect of “this can’t possibly be happening, so I won’t even respond.” Events so far outside normal human experience, whether a fast-moving natural disaster (e.g., a tsunami) or the slow-moving disaster of ecocide perpetrated by humans, fail to provoke an active response.

Feign. Some animals are known to fake death or bluff a stronger ability to fight than is true. Feigning death, or playing possum, might work in some instances, such as a mass shooting where perpetrators are trained on live targets. Bluffing in the face of a charging bear might just intimidate it enough to turn its attentions elsewhere. Probably doesn’t work at all with reptiles.

Fail. If the threat is plainly insurmountable, especially with natural disasters and animal attacks, one response may be to simply succumb without resistance. Victims of near-drowning often report being overtaken with bliss in the moment of acceptance. During periods of war and genocide, I suspect that many victims also recognized that, in those immortal words, resistance is futile. Giving up may be a better way to face death than experiencing desperation until one’s dying breath.

Bullying is one example of threat most are forced to confront in childhood, and responses are frequently based on the physical size of the bully vs. the one being bullied. Also, the severity of bullying may not be so dire that only instinctive responses are available; one can deploy a bit of strategy. Similarly, with sexual assault, which is in the news these days and is typically perpetrated by men against women (though not always; Catholic priest pederasts are the obvious counterexample), the response of a surprising number of women is to succumb rather than face what might be even worse outcomes. One can debate whether that is freezing, feigning, or failing. It doesn’t have to be only one.

Twice in the last month I stumbled across David Benatar, an anti-natalist philosopher, first in a podcast with Sam Harris and again in a profile of him in The New Yorker. Benatar is certainly an interesting fellow, and I suspect earnest in his beliefs and academic work, but I couldn’t help shrugging as he got caught in the sort of logical traps that plague hyperintellectual folks. (Sam Harris is prone to the same problem.) The anti-natalist philosophy in a nutshell is finding, after tallying the pros and cons of living (sometimes understood as happiness or enjoyment versus suffering), that on balance, it would probably be better never to have lived. Benatar doesn’t apply the finding retroactively by suggesting folks end their lives sooner rather than later, but he does recommend that new life should not be brought into the world — an interdiction almost no parent would consider for more than a moment.

The idea that we are born against our will, never asked whether we wanted life in the first place, is an obvious conundrum but treated as a legitimate line of inquiry in Benatar’s philosophy. The kid who throws the taunt “I never asked to be born!” to a parent in the midst of an argument might score an emotional hit, but there is no logic to the assertion. Language is full of logic traps like this, such as “an infinity of infinities” (or multiverse), “what came before the beginning?” or “what happens after the end?” Most know to disregard the former, but entire religions are based on seeking the path to the (good) afterlife as if conjuring such a proposition manifests it in reality.

I watched a documentary on Netflix called Jim & Andy (2017) that provides a glimpse behind the scenes of the making of Man on the Moon (1999) where Jim Carrey portrays Andy Kaufman. It’s a familiar story of art imitating life (or is it life imitating art?) as Carrey goes method and essentially channels Kaufman and Kaufman’s alter ego Tony Clifton. A whole gaggle of actors played earlier incarnations of themselves in Man on the Moon and appeared as themselves (without artifice) in Jim & Andy, adding another weird dimension to the goings on. Actors losing themselves in roles and undermining their sense of self is hardly novel. Regular people lose themselves in their jobs, hobbies, media hype, glare of celebrity, etc. all the time. From an only slightly broader perspective, we’re all merely actors playing roles, shifting subtly or dramatically based on context. Shakespeare observed it centuries ago. However, the documentary points to a deeper sense of unreality precisely because Kaufman’s principal shtick was to push discomfiting jokes/performances beyond the breaking point, never dropping the act to let his audience in on the joke or provide closure. It’s a manifestation of what I call the Disorientation Protocol.

As time wears on and I add years to this mostly ignored blog, I keep running across ideas expressed herein, sometimes long ago, recapitulated in remarks and comments elsewhere. Absolutely disparate people can develop the same ideas independently, so I’m not claiming that my ideas are stolen. Maybe I’m merely in touch with the Zeitgeist and express it here only then to see or hear it again someplace else. I can’t judge objectively.

The latest coincidence is the growing dread with which I wake up every day, wondering what fresh new hell awaits with the morning news. The times in which we live are both an extension of our received culture and yet unprecedented in their novelty. Not only are there many more people in existence than 100 years ago, and thus radical opinions and events occurring with extraordinary frequency, but the speed of transmission is also faster than in the past. Indeed, the rush to publication has many news organs reporting before any solid information is available. The first instance of blanket crisis coverage I remember was the Challenger disaster in 1986. It’s unknown to me how quickly news of various U.S. political assassinations in the 1960s spread, but I suspect reporting took more time than today and imparted to those events gravity and composure. Today is more like a renewed Wild West where anything goes, which has been the preferred characterization of the Internet since its creation. We’ll see if the recent vote to remove Net Neutrality has the effect of restraining things. I suspect that particular move is more about a money grab (selling premium open access vs. basic limited access) than thought control, but I can only guess as to true motivations.

I happened to be traveling when the news broke of a mass shooting in Las Vegas. Happily, what news I got was delayed until actual news-gathering had already sorted basic fact from confabulation. Paradoxically, after the first wave of “what the hell just happened?” there formed a second wave of “here’s what happened,” and later a third wave of “what the hell really happened?” appeared as some rather creative interpretations were offered up for consideration. That third wave is by now quite familiar to everyone as the conspiracy wave, and surfing it feels inevitable because the second wave is often so starkly unbelievable. Various websites and shows such as snopes.com, metabunk.org, MythBusters, and Penn & Teller: Bullshit! (probably others, too) presume to settle debates. While I’m inclined to believe scientific and documentary evidence, mere argument often fails to convince me, which is troubling, to say the least.

Fending off all the mis- and disinformation, or separating signal from noise, is a full-time job if one is willing to undertake it. That used to be the mandate of the journalistic news media, at least in principle. Lots of failures on that account stack up throughout history. However, since we’re in the midst of a cultural phase dominated by competing claims to authority and the public’s retreat into ideation, the substitute worlds of extended and virtual reality become attractive alternatives to the fresh new hell we now face every morning. Tune in and check in might be what we think we’re doing, but more accurately, we tune out and check out of responsible engagement with the real world. That’s the domain of incessantly chipper morning TV shows. Moreover, we like to believe in the mythical stories we tell ourselves about ourselves, such as, for example, how privacy doesn’t matter, or that the U.S. is a free, democratic, liberal beacon of hope, or that economic value inheres in made-up currencies. It’s a battle for your attention and subscription in the marketplace of ideas. Caveat emptor.

I revisit my old blog posts when I see some reader activity in the WordPress backstage, and I was curious to recall a long quote from Iain McGilchrist summarizing arguments put forth by Anthony Giddens in his book Modernity and Self-identity (1991). Giddens had presaged recent cultural developments, namely, the radicalization of nativists, supremacists, Social Justice Warriors (SJWs), and others distorted by absorption in identity politics. So I traipsed off to the Chicago Public Library (CPL) and sought out the book to read. Regrettably, CPL didn’t have a copy, so I settled on a slightly earlier book, The Consequences of Modernity (1990), which is based on a series of lectures delivered at Stanford University in 1988.

Straight away, the introduction provides a passage that goes to the heart of matters with which I’ve been preoccupied:

Today, in the late twentieth century, it is argued by many, we stand at the opening of a new era … which is taking us beyond modernity itself. A dazzling variety of terms has been suggested to refer to this transition, a few of which refer positively to the emergence of a new type of social system (such as the “information society” or the “consumer society”) but most of which suggest rather that a preceding state of affairs is drawing to a close … Some of the debates about these matters concentrate mainly upon institutional transformations, particularly those which propose that we are moving from a system based upon the manufacture of material goods to one concerned more centrally with information. More commonly, however, those controversies are focused largely upon issues of philosophy and epistemology. This is the characteristic outlook, for example, of the author who has been primarily responsible for popularising the notion of post-modernity, Jean-François Lyotard. As he represents it, post-modernity refers to a shift away from attempts to ground epistemology and from faith in humanly engineered progress. The condition of post-modernity is distinguished by an evaporating of the “grand narrative” — the overarching “story line” by means of which we are placed in history as beings having a definite past and a predictable future. The post-modern outlook sees a plurality of heterogeneous claims to knowledge, in which science does not have a privileged place. [pp. 1–2, emphasis added]

That’s a lot to unpack all at once, but the fascinating thing is that notions now manifesting darkly in the marketplace of ideas were already in the air in the late 1980s. Significantly, this was several years still before the Internet brought the so-called Information Highway to computer users, before the cell phone and smart phone were developed, and before social media displaced traditional media (TV was only 30–40 years old but had previously transformed our information environment) as the principal way people gather news. I suspect that Giddens has more recent work that accounts for the catalyzing effect of the digital era (including mobile media) on culture, but for the moment, I’m interested in the book in hand.

Regular readers of this blog (I know of one or two) already know my armchair social criticism directed at our developing epistemological crisis (challenges to authority and expertise, psychotic knowledge, fake news, alternative facts, dissolving reality, and science denial) as well as the Transhumanist fantasy of becoming pure thought (once we evolve beyond our bodies). Until that’s accomplished with imagined technology, we increasingly live in our heads, in the abstract, disoriented and adrift on a bewildering sea of competing narratives. Moreover, I’ve stated repeatedly that highly mutable story (or narrative) underlies human cognition and consciousness, making most of us easy marks for charismatic thought leaders and storytellers. Giddens was there nearly 30 years ago with these same ideas, though his terms differ.

Giddens dispels the idea of post-modernity and insists that, from a sociological perspective, the current period is better described as high modernism. This reminds me of Oswald Spengler and my abandoned book blogging of The Decline of the West. It’s unimportant to me who got it more nearly correct, but I note that the term Postmodernism has been adopted widely despite its inaccuracy (at least according to Giddens). As I get further into the book, I’ll have plenty more to say.