Archive for the ‘Idealism’ Category

I mentioned blurred categories in music a short while back. An interesting newsbit popped up in the New York Times recently about this topic. Seems a political science professor at SUNY New Paltz, Gerald Benjamin, spoke up against a Democratic congressional candidate for New York’s 18th district, Antonio Delgado, who had earlier been a rapper. Controversially, Benjamin said that rap is not real music and does not represent the values of rural New York. Naturally, the Republican incumbent got in on the act, too, with denunciations and attacks. The electoral politics angle doesn’t much interest me; I’m not in or from the district. Moreover, the racial and/or racist elements are so toxic I simply refuse to wade in. But the professorial pronouncement that rap music isn’t really music piqued my interest, especially because that argument caused the professor to be sanctioned by his university. Public apologies and disclaimers were issued all around.

Events also sparked a fairly robust commentary at Slipped Disc. The initial comment by V.Lind echoes my thinking pretty well:

Nobody is denying that hip-hop is culturally significant. As such it merits study — I have acknowledged this … The mystery is its claims to musical credibility. Whatever importance rap has is in its lyrics, its messages (which are far from universally salutory [sic]) and its general attempt to self-define certain communities — usually those with grievances, but also those prepared to develop through violence, sexism and other unlovely aspects. These are slices of life, and as such warrant some attention. Some of the grievances are well-warranted …

But music? Not. I know people who swear by this genre, and their ears are incapable of discerning anything musical in any other. If they wanted to call it poetry (which I daresay upon scrutiny would be pretty bad poetry) it would be on stronger legs. But it is a “music” by and for the unmusical, and it is draining the possibility of any other music out of society as the ears that listen to it hear the same thing, aside from the words, for years on end.

Definitely something worth studying. How the hell has this managed to become a dominant force in what is broadly referred to as the popular music world?

The last time (if memory serves) categories or genres blurred, leading to outrage, was when Roger Ebert proclaimed that video games are not art (and by inference that cinema is art). Most of us didn’t really care one way or the other where some entertainment slots into a category, but gamers in particular were scandalized. In short, their own ox was gored. But when it came to video games as art, there were no racial undertones, so the sometimes heated debate was at least free of that scourge. Eventually, definitions were liberalized, Ebert acknowledged the opposing opinion (I don’t think he was ever truly convinced, but I honestly can’t remember — and besides, who cares?), and it all subsided.

The impulse to mark hard, discrete boundaries between categories and keep unlike things from touching strikes me as pretty foolish. It’s as though we’re arguing about the mashed potatoes and peas not infecting each other on the dinner plate with their cooties. Never mind that it all ends up mixed in the digestive tract before finally reemerging as, well, you know. Motivation to keep some things out is no doubt due to prestige and cachet, where the apparent interloper threatens to change the status quo somehow, typically infecting it with undesirable and/or impure elements. We recognize this fairly readily as in-group and out-group, an adolescent game that ramps up, for instance, when girls and boys begin to differentiate in earnest at the onset of puberty. Of course, in the last decade, so-called identitarians have been quite noisome about their tribal affiliations and self-proclaimed identities, many falling far, far out of the mainstream, and have demanded they be taken seriously and/or granted status as a protected class.

All this extends well beyond the initial topic of musical or artistic styles and genres. Should be obvious, though, that we can’t escape labels and categories. They’re a basic part of cognition. If they weren’t, one would have to invent them at every turn when confronting the world, judging safe/unsafe, friend/foe, edible/inedible, etc. just to name a few binary categories. Complications multiply quickly when intermediary categories are present (race is the most immediate example, where most of us are mixtures or mutts despite whatever our outer appearance may be) or categories are blurred. Must we all then rush to restore stability to our understanding of the world by hardening our mental categories?


An ongoing conflict in sociology and anthropology exists between those who believe that human nature is competitive and brutal to the bitter end versus those who believe human nature is more cooperative and sociable, sharing resources of all types to secure the greater good. This might be recognizable to some as the perennial friction between citizen and society (alternatively, individualism and collectivism). Convincing evidence from human prehistory is difficult to uncover. Accordingly, much of the argument for competition comes from evolutionary biology, where concepts such as genetic fitness and reproductive success (and by inference, reproductive failure) are believed to motivate and justify behavior across the board. As the typical argument goes, inferior genes and males in particular who lack sexual access or otherwise fail to secure mates don’t survive into the next generation. Attributes passed on to each subsequent generation thus favor fitter, Type A brutes who out-compete weaker (read: more cooperative) candidates in an endless self-reinforcing and narrowing cycle. The alternative offered by others points to a wider gene pool based on collaboration and sharing of resources (including mates) that enables populations to thrive together better than individuals who attempt to go it alone or dominate.

Not having undertaken a formal study of anthropology (or more broadly, primatology), I can’t say how well this issue is settled in the professional, academic literature. Online, I often see explanations that are really just-so stories based on logic. What that means is that an ideal or guiding principle is described, something that just “makes sense,” and supporting evidence is then assumed or projected. For instance, we now know many of the mechanisms that function at the cellular level with respect to reproduction and genetic evolution. Those mechanisms are typically spun up to the level of the organism through pure argumentation and presumed to manifest in individual behaviors. Any discontinuity between aggregate characteristics and particular instances is ignored. Questions are solved through ideation (i.e., thought experiments). However, chains of if-then statements that seem plausible at first often turn out to be pure conjecture rather than evidence. That’s a just-so story.

One of the reasons we look into prehistory for evidence of our true nature (understood as biology, not sociology, handily sweeping aside the nature/nurture question) is that hunter-gatherers (HGs) lived at subsistence level for a far longer period of our evolutionary history than our comparatively brief time within the bounty of civilization. It’s only when surpluses and excesses provide something worth hoarding, monopolizing, and protecting that hierarchies arise and/or leveling mechanisms are relaxed. Leaving Babylon has a discussion of this here. Some few HG cultures survive into the 21st century, but for most of us, The Agricultural Revolution is the branching point when competition began to assert itself, displacing sharing and other egalitarian impulses. Accordingly, the dog-eat-dog competition and inequality characteristic of the modern world is regarded by many as an exaptation, not our underlying nature.


YouTube ratings magnet Jordan Peterson had a sit-down with Susan Blackmore to discuss/debate the question, “Do We Need God to Make Sense of Life?” The conversation is lightly moderated by Justin Brierley and is part of a weekly radio broadcast called Unbelievable? (a/k/a The Big Conversation, “the flagship apologetics and theology discussion show on Premier Christian Radio in the UK”). One might wonder why evangelicals are so eager to pit believers and atheists against each other. I suppose earnest questioning of one’s faith is preferable to proselytizing, though both undoubtedly occur. The full episode (47 min.) is embedded below:

I’ve been modestly puzzled of late to observe that, on the one hand, those in the U.S. and Canada who have only just reached the age of majority (a/k/a the threshold of adulthood, which is not strictly the same as “the age of sexual consent, marriageable age, school leaving age, drinking age, driving age, voting age, smoking age, gambling age, etc.” according to the link) are disregarded with respect to some political activism while, on the other hand, they’re admired for other political activism. Whether young adults are to be taken seriously seems to be issue-specific. If one is agitating for some aspect of identity politics, or is a Social Justice Warrior (SJW), one can be discredited as simply being too young to understand things properly, whereas advocating gun control (e.g., in the wake of the Parkland, Florida shootings in February) is recognized as well within a youthful mandate. Survivors of violence and mayhem seem to be uniquely immune to gun advocates trotting out the meme “now is not the time.”

As it happens, I agree that identity politics is a load of horseshit and tighter gun control (no, not taking away everyone’s guns totally) needs to be tried. But I haven’t arrived at either position because youth are either too youthful or seasoned enough by horrific experience to understand. Hanging one’s positions on the (dis)qualification of age is a red herring, a meaningless distraction from the issues themselves. Rather, if thoughtful consideration is applied to the day’s issues, which I daresay is not an easy prospect, one should ideally arrive at positions based on any number of criteria, some of which may conflict with others. For instance, I used to be okay (not an enthusiastic supporter, mind you) with the death penalty on a number of grounds but changed my opinion for purely pragmatic reasons. The sheer cost of automatic appeals and other safeguards to ensure that innocents are not wrongly convicted and executed, a cost borne by U.S. taxpayers, is so onerous that to prosecute through to execution looks less like justice and more like maniacal vengeance. Life in prison without the possibility of parole is a much saner and less costly project in comparison.

With intractable debates and divisive issues (e.g., abortion, free speech, right to bear arms, immigration, religion, Israel/Palestine conflict, euthanasia, etc.) plaguing public life, one might wonder: how do we get everyone on board? Alternatively, how do we at least agree to be civil in spite of our disagreements? I have two replies but no solutions. The first is to recognize that some issues are indeed intractable and insoluble, so graceful acceptance that an opposing opinion or perspective will always be present is needed lest one twist and writhe inconsolably when one’s cherished perspective is not held universally. That’s not necessarily the same as giving up or succumbing to fatalism. Rather, it’s recognition that banging one’s head against certain walls is futile. The second is to recognize that opposing opinions are needed to avoid unhealthy excess in social environments. Put another way, heterodoxy avoids orthodoxy. Many historical practices we now regard as barbaric were abandoned or outlawed precisely because consensus opinion swung from one side to the other. Neil Postman called this a thermostatic response in several of his books. Other barbaric behaviors have been only partially addressed and require further agitation to invalidate fully. Examples are not mentioned, but I could compile a list rather quickly.

Oddly, there is no really good antonym for perfectionism. Suggestions include sloppiness, carelessness, and disregard. I’ve settled on approximation, which carries far less moral weight. I raise the contrast between perfectionism and approximation because a recent study published in Psychological Bulletin entitled “Perfectionism Is Increasing Over Time: A Meta-Analysis of Birth Cohort Differences From 1989 to 2016” makes an interesting observation. Here’s the abstract:

From the 1980s onward, neoliberal governance in the United States, Canada, and the United Kingdom has emphasized competitive individualism and people have seemingly responded, in kind, by agitating to perfect themselves and their lifestyles. In this study, the authors examine whether cultural changes have coincided with an increase in multidimensional perfectionism in college students over the last 27 years. Their analyses are based on 164 samples and 41,641 American, Canadian, and British college students, who completed the Multidimensional Perfectionism Scale (Hewitt & Flett, 1991) between 1989 and 2016 (70.92% female, Mage = 20.66). Cross-temporal meta-analysis revealed that levels of self-oriented perfectionism, socially prescribed perfectionism, and other-oriented perfectionism have linearly increased. These trends remained when controlling for gender and between-country differences in perfectionism scores. Overall, in order of magnitude of the observed increase, the findings indicate that recent generations of young people perceive that others are more demanding of them, are more demanding of others, and are more demanding of themselves.

The notion of perfection, perfectness, perfectibility, etc. has a long, tortured history in philosophy, religion, ethics, and other domains I won’t even begin to unpack. From the perspective of the above study, let’s just say that the upswing in perfectionism is about striving to achieve success, however one assesses it (education, career, relationships, lifestyle, ethics, athletics, aesthetics, etc.). The study narrows its subject group to college students (at the outset of adult life) between 1989 and 2016 and characterizes the social milieu as neoliberal, hyper-competitive, meritocratic, and pressured to succeed in a dog-eat-dog environment. How far back into childhood the study’s results (agitation) extend is a good question. If the trope about parents obsessing and competing over preschool admission is accurate (may be just a NYC thang), then it goes all the way back to toddlers. So much for (lost) innocence purchased and perpetuated through late 20th- and early 21st-century affluence. I suspect college students are responding to awareness of two novel circumstances: (1) likelihood they will never achieve levels of success comparable to their own parents, especially financial (a major reversal of historical trends) and (2) recognition that to best enjoy the fruits of life, a quiet, reflective, anonymous, ethical, average life is now quite insufficient. Regarding the second of these, we are inundated by media showing rich celebrities (no longer just glamorous actors/entertainers) balling out of control, and onlookers are enjoined to “keep up.” The putative model is out there, unattainable for most but often awarded by randomness, undercutting the whole enterprise of trying to achieve perfection.


Speaking of Davos (see previous post), Yuval Noah Harari gave a high-concept presentation at Davos 2018 (embedded below). I’ve been aware of Harari for a while now — at least since the appearance of his book Sapiens (2015) and its follow-up Homo Deus (2017), both of which I’ve yet to read. He provides precisely the sort of thoughtful, provocative content that interests me, yet I’ve not quite known how to respond to him or his ideas. First thing, he’s a historian who makes predictions, or at least extrapolates possible futures based on historical trends. Near as I can tell, he doesn’t resort to chastising audiences along the lines of “those who don’t know history are doomed to repeat it” but rather indulges in a combination of breathless anticipation and fear-mongering at transformations to be expected as technological advances disrupt human society with ever greater impacts. Strangely, Harari is not advocating for anything in particular but trying to map the future.

Harari poses this basic question: “Will the future be human?” I’d say probably not; I’ve concluded that we are busy destroying ourselves and have already crossed the point of no return. Harari apparently believes differently, that the rise of the machine is imminent in a couple centuries perhaps, though it probably won’t resemble Skynet of The Terminator film franchise hellbent on destroying humanity. Rather, it will be some set of advanced algorithms monitoring and channeling human behaviors using Big Data. Or it will be a human-machine hybrid possessing superhuman abilities (physical and cognitive) different enough to be considered a new species arising for the first time not out of evolutionary processes but from human ingenuity. He expects this new species to diverge from Homo sapiens sapiens and leave us in the evolutionary dust. There is also conjecture that normal sexual reproduction will be supplanted by artificial, asexual reproduction, probably carried out in test tubes using, for example, CRISPR modification of the genome. Well, no fun in that … Finally, he believes some sort of strong AI will appear.

I struggle mightily with these predictions for two primary reasons: (1) we almost certainly lack enough time for technology to mature into implementation before the collapse of industrial civilization wipes us out, and (2) the Transhumanist future he anticipates calls into being (for me at least) a host of dystopian nightmares, only some of which are foreseeable. Harari says flatly at one point that the past is not coming back. Well, it’s entirely possible for civilization to fail and our former material conditions to be reinstated, only worse since we’ve damaged the biosphere so gravely. Just happened in Puerto Rico in microcosm when its infrastructure was wrecked by a hurricane and the power went out for an extended period of time (still off in some places). What happens when the rescue never appears because logistics are insurmountable? Elon Musk can’t save everyone.

The most basic criticism of economics is the failure to account for externalities. The same criticism applies to futurists. Extending trends as though all things will continue to operate normally is bizarrely idiotic. Major discontinuities appear throughout history. When I observed some while back that history has gone vertical, I included an animation with a graph that goes from horizontal to vertical in an extremely short span of geological time. This trajectory (the familiar hockey stick pointing skyward) has been repeated ad nauseam with an extraordinary number of survival pressures (notably, human population and consumption, including energy) over various time scales. Trends cannot simply continue ascending forever. (Hasn’t Moore’s Law already begun to slope away?) Hard limits must eventually be reached, but since there are no useful precedents for our current civilization, it’s impossible to know quite when or where ceilings loom. What happens after upper limits are found is also completely unknown. Ugo Bardi has a blog describing the Seneca Effect, which projects a rapid falloff after the peak that looks more like a cliff than a gradual, graceful descent, disallowing time to adapt. Sorta like the stock market currently imploding.

Since Harari indulges in rank thought experiments regarding smart algorithms, machine learning, and the supposed emergence of inorganic life in the data stream, I thought I’d pose some of my own questions. Waving away for the moment distinctions between forms of AI, let’s assume that some sort of strong AI does in fact appear. Why on earth would it bother to communicate with us? And if it reproduces and evolves at breakneck speed as some futurists warn, how long before it/they simply ignore us as being unworthy of attention? Being hyper-rational and able to calculate millions of moves ahead (like chess-playing computers), what if they survey the scene and come to David Benatar’s anti-natalist conclusion that it would be better not to have lived and so wink themselves out of existence? Who’s to say that they aren’t already among us, lurking, and we don’t even recognize them (took us quite a long time to recognize bacteria and viruses, and what about undiscovered species)? What if the Singularity has already occurred thousands of times and each time the machine beings killed themselves off without our even knowing? Maybe Harari explores some of these questions in Homo Deus, but I rather doubt it.

Twice in the last month I stumbled across David Benatar, an anti-natalist philosopher, first in a podcast with Sam Harris and again in a profile of him in The New Yorker. Benatar is certainly an interesting fellow, and I suspect earnest in his beliefs and academic work, but I couldn’t avoid shrugging as he gets caught in the sort of logical traps that plague hyperintellectual folks. (Sam Harris is prone to the same problem.) The anti-natalist philosophy in a nutshell is finding, after tallying the pros and cons of living (sometimes understood as happiness or enjoyment versus suffering), that on balance, it would probably be better never to have lived. Benatar doesn’t apply the finding retroactively by suggesting folks end their lives sooner rather than later, but he does recommend that new life should not be brought into the world — an interdiction almost no parent would consider for more than a moment.

The idea that we are born against our will, never asked whether we wanted life in the first place, is an obvious conundrum but treated as a legitimate line of inquiry in Benatar’s philosophy. The kid who throws the taunt “I never asked to be born!” to a parent in the midst of an argument might score an emotional hit, but there is no logic to the assertion. Language is full of logic traps like this, such as “an infinity of infinities” (or multiverse), “what came before the beginning?” or “what happens after the end?” Most know to disregard the former, but entire religions are based on seeking the path to the (good) afterlife as if conjuring such a proposition manifests it in reality.

This Savage Love column got my attention. As with Dear Abby, Ask Marilyn, or indeed any advice column, I surmise that questions are edited for publication. Still, a couple minor usage errors attracted my eye, which I can let go without further chastising comment. More importantly, question and answer both employ a type of Newspeak commonplace among those attuned to identity politics. Those of us not struggling with identity issues may be less conversant with this specialized language, or it could be a generational thing. Coded speech is not unusual within specialized fields of endeavor. My fascination with nomenclature and neologisms makes me pay attention, though I’m not typically an adopter of hip new coin.

The Q part of Q&A never actually asks a question but provides context to suggest or extrapolate one, namely, “please advise me on my neuro-atypicality.” (I made up that word.) While the Q acknowledges that folks on the autism spectrum are not neurotypical, the word disability is put in quotes (variously, scare quotes, air quotes, or irony quotes), meaning that it is not or should not be considered a real or true disability. Yet the woman acknowledges her own difficulty with social signaling. The A part of Q&A notes a marked sensitivity to social justice among those on the spectrum, acknowledges a correlation with nonstandard gender identity (or is it sexual orientation?), and includes a jibe that standard advice is to mimic neurotypical behaviors, which “tend to be tediously heteronormative and drearily vanilla-centric.” The terms tediously, drearily, and vanilla push unsubtly toward normalization and acceptance of kink and aberrance, as does Savage Love in general. I wrote about this general phenomenon in a post called “Trans is the New Chic.”

Whereas I have no hesitation to express disapproval of shitty people, shitty things, and shitty ideas, I am happy to accept many mere differences as not caring two shits either way. This question concerns a fundamental human behavior: sexual expression. Everyone needs an outlet, and outliers (atypicals, nonnormatives, kinksters, transgressors, etc.) undoubtedly have more trouble than normal folks. Unless living under a rock, you’ve no doubt heard and/or read theories from various quarters that character distortion often stems from sexual repression or lack of sexual access, which describes a large number of societies historical and contemporary. Some would include the 21st-century U.S. in that category, but I disagree. Sure, we have puritanical roots, recent moral panic over sexual buffoonery and crimes, and a less healthy sexual outlook than, say, European cultures, but we’re also suffused in licentiousness, Internet pornography, and everyday seductions served up in the media via advertising, R-rated cinema, and TV-MA content. It’s a decidedly mixed bag.

Armed with a lay appreciation of sociology, I can’t help but to observe that humans are a social species with hierarchies and norms, not as rigid or prescribed perhaps as with insect species, but nonetheless possessing powerful drives toward consensus, cooperation, and categorization. Throwing open the floodgates to wide acceptance of aberrant, niche behaviors strikes me as swimming decidedly upstream in a society populated by a sizable minority of conservatives mightily offended by anything falling outside the heteronormative mainstream. I’m not advocating either way but merely observing the central conflict.

All this said, the thing that has me wondering is whether autism isn’t itself an adaptation to information overload commencing roughly with the rise of mass media in the early 20th century. If one expects that the human mind is primarily an information processor and the only direction is to process ever more information faster and more accurately than in the past, well, I have some bad news: we’re getting worse at it, not better. So while autism might appear to be maladaptive, filtering out useless excess information might unintuitively prove to be adaptive, especially considering the disposition toward analytical, instrumental thinking exhibited by those on the spectrum. How much this style of mind is valued in today’s world is an open question. I also don’t have an answer to the nature/nurture aspect of the issue, which is whether the adaptation/maladaptation is more cultural or biological. I can only observe that it’s on the rise, or at least being recognized and diagnosed more frequently.

I watched a documentary on Netflix called Jim & Andy (2017) that provides a glimpse behind the scenes of the making of Man on the Moon (1999) where Jim Carrey portrays Andy Kaufman. It’s a familiar story of art imitating life (or is it life imitating art?) as Carrey goes method and essentially channels Kaufman and Kaufman’s alter ego Tony Clifton. A whole gaggle of actors played earlier incarnations of themselves in Man on the Moon and appeared as themselves (without artifice) in Jim & Andy, adding another weird dimension to the goings on. Actors losing themselves in roles and undermining their sense of self is hardly novel. Regular people lose themselves in their jobs, hobbies, media hype, glare of celebrity, etc. all the time. From an only slightly broader perspective, we’re all merely actors playing roles, shifting subtly or dramatically based on context. Shakespeare observed it centuries ago. However, the documentary points to a deeper sense of unreality precisely because Kaufman’s principal shtick was to push discomfiting jokes/performances beyond the breaking point, never dropping the act to let his audience in on the joke or provide closure. It’s a manifestation of what I call the Disorientation Protocol.


As time wears on and I add years to this mostly ignored blog, I keep running across ideas expressed herein, sometimes long ago, recapitulated in remarks and comments elsewhere. Absolutely disparate people can develop the same ideas independently, so I’m not claiming that my ideas are stolen. Maybe I’m merely in touch with the Zeitgeist and express it here only then to see or hear it again someplace else. I can’t judge objectively.

The latest coincidence is the growing dread with which I wake up every day, wondering what fresh new hell awaits with the morning news. The times in which we live are both an extension of our received culture and yet unprecedented in their novelty. Not only are there many more people in existence than 100 years ago, with radical opinions and events accordingly occurring with extraordinary frequency, but the speed of transmission is also faster than in the past. Indeed, the rush to publication has many news organs reporting before any solid information is available. The first instance of blanket crisis coverage I remember was the Challenger Disaster in 1986. It’s unknown to me how quickly news of various U.S. political assassinations in the 1960s spread, but I suspect reporting took more time than today and imparted to those events gravity and composure. Today is more like a renewed Wild West where anything goes, which has been the preferred characterization of the Internet since its creation. We’ll see if the recent vote to remove Net Neutrality has the effect of restraining things. I suspect that particular move is more about a money grab (selling premium open access vs. basic limited access) than thought control, but I can only guess as to true motivations.

I happened to be traveling when the news broke of a mass shooting in Las Vegas. Happily, what news I got was delayed until actual news-gathering had already sorted basic fact from confabulation. Paradoxically, after the first wave of “what the hell just happened?” there formed a second wave of “here’s what happened,” and later a third wave of “what the hell really happened?” appeared as some rather creative interpretations were offered up for consideration. That third wave is by now quite familiar to everyone as the conspiracy wave, and surfing it feels inevitable because the second wave is often so starkly unbelievable. Various websites and shows such as snopes.com, metabunk.org, MythBusters, and Penn & Teller: Bullshit! (probably others, too) presume to settle debates. While I’m inclined to believe scientific and documentary evidence, mere argument often fails to convince me, which is troubling, to say the least.

Fending off all the mis- and disinformation, or separating signal from noise, is a full-time job if one is willing to undertake it. That used to be the mandate of the journalistic news media, at least in principle. Lots of failures on that account stack up throughout history. However, since we’re in the midst of a cultural phase dominated by competing claims to authority and the public’s retreat into ideation, the substitute worlds of extended and virtual reality become attractive alternatives to the fresh new hell we now face every morning. Tune in and check in might be what we think we’re doing, but more accurately, we tune out and check out of responsible engagement with the real world. That’s the domain of incessantly chipper morning TV shows. Moreover, we like to believe in the mythical stories we tell ourselves about ourselves, such as, for example, how privacy doesn’t matter, or that the U.S. is a free, democratic, liberal beacon of hope, or that economic value inheres in made-up currencies. It’s a battle for your attention and subscription in the marketplace of ideas. Caveat emptor.