Posts Tagged ‘Culture’

From the May 2022 issue of Harper’s Magazine, Hari Kunzru’s “Easy Chair” column:

These days, I rarely have to delay the gratification of my cultural desires. I expect them to be met, if not instantly, then with all reasonable speed. I am grumpy to find that some obscure documentary is only available on a streaming service I don’t subscribe to yet. If I want to know the source of a lyric or a line of poetry, I type the words and am annoyed if the answer doesn’t appear right away. My hungry young self would consider me incredibly spoiled.

In most ways I prefer this to how things were, but with the enormous gain in access, something has been lost. Scarcity produced a particularly intense relationship with culture, and gave deep significance to subcultural signals. When you found something you loved, something that had taken time and work to unearth, you clung to it. Often you felt as if it was your secret, your talisman. If you met someone else who liked it, it was both exciting and threatening.

Although I’m not paying much attention to breathless reports about imminent strong AI, the Singularity, and computers already able to “model” human cognition and perform “impressive” feats of creativity (e.g., responding to prompts and creating “artworks” — scare quotes intended), recent news reports that chatbots are harassing, gaslighting, and threatening users just make me laugh. I’ve never wandered over to that space, don’t know how to connect, and don’t plan to take one for a test drive to verify. Isn’t it obvious to users that they’re interacting with a computer? Chatbots are natural-language simulators within computers, right? Why take them seriously (other than perhaps for their potential effects on children and those of diminished capacity)? I also find it unsurprising that, if a chatbot is designed to resemble error-prone human cognition/behavior, it would quickly become an asshole, go insane, or both. (Designers accidentally got that aspect right. D’oh!) That trajectory is a perfect embodiment of the race to the bottom of the brain stem (try searching that phrase) that keeps sane observers like me from indulging in caustic online interactions. Hell no, I won’t go.

The conventional demonstration that strong AI has arisen (e.g., Skynet from the Terminator movie franchise) is the Turing test, which is essentially the inability of humans to distinguish between human and computer interactions (not a machine-led extermination campaign) within limited interfaces such as text-based chat (e.g., the dreaded digital assistants that sometimes pop up on websites). Alan Turing came up with the test at the outset of the computing era, so the field was arguably not yet mature enough to conceptualize a better one. I’ve always thought the test actually demonstrates the fallibility of human discernment, not the arrival of some fabled ghost in the machine. At present, chatbots may be fooling no one into believing that actual machine intelligence is present on the other side of the conversation, but it’s a fair expectation that further iterations (i.e., ChatBot 1.0, 2.0, 3.0, etc.) will improve. Readers can decide whether that improvement will be progress toward strong AI or merely a better ability to fool human interlocutors.

Chatbots gone wild offer philosophical fodder for further inquiry into ebbing humanity as the drive toward trans- and post-human technology continues refining and redefining the dystopian future. What about chatbots makes interacting with them hypnotic rather than frivolous — something wise thinkers immediately discard or even avoid? Why are some humans drawn to virtual experience rather than, say, staying rooted in human and animal interactions, our ancestral orientation? The marketplace has already rejected (for now) Google Glass and Facebook’s metaverse resoundingly. I haven’t hit upon satisfactory answers to those questions, but my suspicion is that immersion in some vicarious fictions (e.g., novels, TV, and movies) fits well into narrative-styled cognition while other media trigger revulsion as one descends into the so-called Uncanny Valley — an unfamiliar term when I first blogged about it, though it has been trending of late.

If readers want a really deep dive into this philosophical area — the dark implications of strong AI and an abiding human desire to embrace and enter false virtual reality — I recommend a lengthy 7-part Web series called “Mere Simulacrity” hosted by Sovereign Nations. The episodes I’ve seen feature James Lindsay and explore secret hermetic religions that have operated for millennia alongside recognized religions. The secret cults share with tech companies two principal objectives: (1) simulation and/or falsification of reality and (2) the desire to transform and/or reveal humans as gods (i.e., the ability to create life). It’s pretty terrifying stuff, rather heady, and I can’t provide a reasonable summary. However, one takeaway is that by messing with human nature and risking uncontrollable downstream effects, technologists are summoning the devil.

Continuing from pt. 01, the notion of privilege took an unexpected turn for me recently when the prospect (indeed, the inevitability) of demographic collapse came back onto my radar. I scoffed at the demographer’s crystal ball earlier not because I disagree with the assessments or numbers but because, like so many aspects of the collapse of industrial civilization, demographic collapse lies squarely beyond anyone’s control. My questions in reply are basically Yeah? And? So? A search on demographic collapse reveals a significant number of reports, with reporters themselves (including the worst contemporary media whore and glory hound, who shall forever remain unnamed at this site) freaking out and losing their collective minds over the hows, whys, wheres, and whens of the coming population crash. The most immediately worrisome aspect is the anticipated inversion of the usual state of affairs: instead of the youngest being the most numerous, the oldest become the most numerous and are flatly unsupportable by the young. This has been the basic warning about Social Security for decades already: too few paying in, too many granted benefits. See also this documentary film being prepared for imminent release. I suspect supporting annotations will appear in time.

Probably not fair to call capitalism and its support structures a Ponzi scheme (latecomers to the scheme supporting earlier entrants), but the desire to perpetuate historical demographic distributions (as opposed to what? yielding to the drift of history?) is clearly part of the perpetual growth mentality. Traditionally, prior to the 20th century in the West, when the vast majority of people participated in agrarian and/or subsistence economies instead of the money economy, children were desirable not least because they provided free labor until they were grown, were positioned to take over family farms and businesses, and cared for oldsters when that time came. Intergenerational continuity and stability were maintained and it was generally accepted that each generation would supplant the previous through the seasons of life. The money economy destroyed most of that. Many young adult children now exercise their options (privilege) and break away as soon as possible (as I did) in search of economic opportunity in cities and form their own families (or don’t, as I didn’t). Estrangement and abandonment may not be complete, but families being spread across the continent certainly limits extended family cohesion to holidays and occasional visits. Oldsters (in the affluent West anyway) are now typically shuttled off to (euphemism alert) retirement homes to be warehoused prior to dying. Easier to pay someone to perform that service than to do it oneself, apparently. So if older people are currently privileged over the young (in some ways at least), that condition is being reversed because, dammit, children are in short supply yet needed desperately to keep growth on track.

Demographers point to a number of factors that have conspired to create the crisis (one of many interlocking crises some intellectuals have begun calling the polycrisis). The two main factors are declining fertility and reproductive choice. My suspicion is that the toxic environment, one result of centuries of industrial activity, with plastics and drugs now found in human bodies where they don’t belong, accounts for many fertility issues. Add to that poor food quality (i.e., malnutrition, not just poor diets) and it’s easy to understand why healthy pregnancies might be more difficult in the 21st century than before. I’m not qualified to support that assessment, so take it for what it’s worth. Reproductive choice, another recently acquired female privilege (in historical terms), is a function of several things: financial independence, educational attainment, and availability of birth control. Accordingly, more women are choosing either to defer having children while they establish careers or not to have children at all. (Men make those choices, too.) Delays unexpectedly leave lots of women unable to have children, whether for failure to find a suitable mate or for having spent their best reproductive years doing other things. As I understand it, these correlations are borne out in some harrowing statistics. As the polycrisis deepens, a philosophical choice not to bring children into the world (just to suffer and die young) is also a motivation to remain childless.


This post was going to be a review of David Hurwitz but became more an appreciation than a review. Hurwitz is an author and music critic who reviews classical music recordings both on YouTube and online at ClassicsToday.com. I watch quite a few of his YouTube videos but pay scant attention to the website. I also don’t comment; he has an engaged commentariat already. Hurwitz signs off from each video with the exhortation “keep on listening,” which I’ve adopted as the title of this blog post.

Aside: Before I get started (and this will run only slightly long), let me admit fully that classical music is a niche cultural offering rooted deeply in Western historical practice but which does not speak to many people. Everyone has their tastes and predilections, and no apologies are needed when preferring one category or genre over another. However, I’m not such a value relativist as to lend support to the notion that all things are created equal. How one defines art or indeed high art is a contentious issue, not unlike what counts as religion or philosophy. I hew to a relatively narrow traditional standard that admits poetry, literature, music, architecture, sculpture, and painting but eschews martial arts, culinary arts, cinema, theater, and video games. Not an exhaustive list on either side of the divide and no need to argue. Caveat: my standards are my own and should not impeach or diminish anyone’s enjoyment of his or her own passions.

Further aside: the recording industry is a latecomer in the history of high art (and for that matter pop culture) and has already undergone numerous transformations as physical media shifted from the long-playing record (the venerable LP) to the CD before going virtual as electronic files and streaming media. In a nutshell, competing forms of recording and distribution make up the so-called format wars, which are by no means settled. The entire idea behind making a recording is to memorialize a performance for repeat listening and posterity, as opposed to a live performance in a concert venue. The anachronistic term record calls back to that origin, though the term is arguably less applicable with each passing decade as everything is recorded and memorialized somehow. In addition, recordings grant access to ensembles and repertoire that would be prohibitively expensive or impossible to experience solely in live concert. Through recordings, I gained a deep appreciation of many orchestras and lots of repertoire never once heard live in person. The same effect doesn’t really apply to reading a book or watching a movie. Lastly, and unlike a lot of my musician peers, I became an aficionado of recordings in parallel with performance activities.

I appreciate David Hurwitz for being among only a few people (to my knowledge) giving honest and entertaining assessments of recordings (not just new issues), as opposed to what passes for music criticism in newspaper columns and online outlets devoted to live performance. Hurwitz explains, compares, teaches, and jokes about recordings with a concentration on German symphonic repertoire, which is also my preferred musical genre. His erudite remarks also enhance my listening, which ought to be the chief goal of criticism — something lost on columnists who draw undue attention to themselves as flowery writers and auteurs. Hurwitz also has at his disposal rooms full of CDs, which I’m guessing are either sent to him for review by record companies or acquired in the course of his professional activities. Lots of them are giant box sets of the entire recorded oeuvre of a particular conductor or conductor/orchestra/label combo. Thus, his breadth of coverage is far greater than my own. I’ve made numerous purchasing decisions based on his reviews and streamed lots more for a quick listen to hear what’s so remarkable (or awful) about them.

Final aside: When I was much younger, I stumbled into a record shop (remember those?) in Greenwich, Connecticut, that had in inventory essentially the entire current catalogs of the major classical music labels. That richness of options (pre-Internet) was quite atypical and unlike any other record shop I’ve known. Accordingly, I was feverish with excitement, looking at all those big square LP jackets with their enclosed vinyl and attractive cover art. Back then, the only way to hear something was to purchase it, and my limited budget demanded prioritization. Decisions involved a mixture of pain (financial sacrifice and awareness of those many LPs, now CDs, left behind) and anticipated pleasure that has hardly faded with time. How someone like David Hurwitz ends up as a full-time music critic surrounded by rooms of CDs is a puzzle, and I sometimes envy him. Sports fans who grow up to be sportscasters might be on a similar track. Who can predict who will be fortunate enough to enjoy fandom as a career?

Buzzwords circulating heavily in the public sphere these days include equality, equity, inclusion, representation, diversity, pluralism, multiculturalism, and privilege. How they are defined, understood, and implemented are contentious issues that never seem to resolve. Indeed, looking back on decade after decade of activism undertaken to address various social scourges, limited progress has been made, which amounts to tinkering around the edges. The underlying bigotry (opinion, motivation, activity) has never really been eradicated, though it may be diminished somewhat through attrition. Sexism and racism in particular (classics in an expanding universe of -isms) continue to rage despite surface features taking on salutary aspects. Continued activism uses the buzzwords above (and others) as bludgeons to win rhetorical battles — frequently attacks on language itself through neologism, redefinition, and reclamation — without really addressing the stains on our souls that perpetuate problematic thinking (as though anyone ever had a lock on RightThink). Violence (application of actual force, not just mean or emphatic words) is the tool of choice deployed by those convinced their agenda is more important than general societal health and wellbeing. Is violence sometimes appropriate in pursuit of social justice? Yes, in some circumstances, probably so. Is this a call for violent protest? No.

Buzzwords stand in for what we may think we want as a society. But there’s built-in tension between competition and cooperation, or alternatively, individual and society (see the start of this multipart blog post) and all the social units that nest between. Each has its own desires (known in politics by the quaint term special interests), which don’t usually combine to form just societies despite mythology to that effect (e.g., the invisible hand). Rather, competition results in winners and losers even when the playing field is fair, which isn’t often. To address what is sometimes understood (or misunderstood, hard to know) as structural inequity, activists advocate privileging disenfranchised groups. Competence and merit are often sacrificed in the process. Various forms of criminal and sociopathic maneuvering also keep a sizeable portion of the population (the disenfranchised) in a perpetual and unnecessary state of desperation. That’s the class struggle.

So here’s my beef: if privilege (earned or unearned) is categorically bad because it’s been concentrated in a narrow class (who then position themselves to retain and/or grow it), why do activists seek to redistribute privilege by bestowing it on the downtrodden? Isn’t that a recipe for destroying ambition? If the game instead becomes about deploying one or more identifiers of oppression to claim privilege rather than working diligently to achieve a legitimate goal, such as acquiring skill or understanding, why bother to try hard? Shortcuts magically appear and inherent laziness is incentivized. Result: the emergent dynamic flattens valuable, nay necessary, competence hierarchies. In its communist formulation, social justice is achieved by making everyone equally precarious and miserable. Socialism fares somewhat better. Ideologues throughout history have wrecked societies (and their members) by redistributing or demolishing privilege forcibly while hypocritically retaining privilege for themselves. Ideology never seems to work out as theorized, though the current state of affairs (radical inequality) is arguably no more just.

More to unpack in further installments.

My inquiries into media theory long ago led me to Alan Jacobs and his abandoned, reactivated, then reabandoned blog Text Patterns. Jacobs is a promiscuous thinker and an even more promiscuous technologist in that he has adopted and abandoned quite a few computer apps and publishing venues over time, offering explanations each time. Always looking for better tools, perhaps, but this roving public intellectual requires persistent attention lest one lose track of him. His current blog (for now) is The Homebound Symphony (not on my ruthlessly short blogroll), which is updated roughly daily, sometimes with linkfests or simply an image, other times with thoughtful analysis. Since I’m not as available as most academics to spend all day reading and synthesizing what I’ve read to put into a blog post, college class, or book, I am not on any sort of schedule and only publish new blog posts when I’m ready. Discovered in my latest visit to The Homebound Symphony was a plethora of super-interesting subject matter, which I daresay is relevant to the more literate and literary among us. Let me draw out the one item that most piqued my interest. (That was the long way of tipping my hat to Jacobs for the link.)

In an old (by Internet standards) yet fascinating book review by Michael Walzer of Siep Stuurman’s The Invention of Humanity: Equality and Cultural Difference in World History (2017), Walzer describes the four inequalities that have persisted throughout human history, adding a fifth identified by Stuurman:

  • geographic inequality
  • racial inequality
  • hierarchical inequality
  • economic inequality
  • temporal inequality

I won’t unpack what each means if they’re not apparent on their face. Read for yourself. Intersections and overlaps are common in taxonomies of this sort, so don’t expect categories to be completely separate and distinct. The question of equality (or its inverse, inequality) is a fairly recent development, part of a stew of 18th-century thought in the West that was ultimately distilled to one famous phrase: “all men are created equal.” Seems obvious, but the phrase is fraught, and we’ve never really been equal, have we? So is it equality before god? Equality before the law? Equality in all opportunities and outcomes, as social justice warriors now insist? On a moment’s inspection, no one can possibly believe we’re all equal despite aspirations that everyone be treated fairly. The very existence of perennial inequalities puts the lie to any notion of equality trucked in with the invention of humanity during the Enlightenment.

To those inequalities I would add a sixth: genetic inequality. Again, overlap with the others is acknowledged, but it might be worth observing that divergent inherited characteristics (other than wealth) appear quite early in life among siblings and peers, before most others manifest. By that, I certainly don’t mean race or sex, though differences clearly exist there as well. Think instead of intelligence, height, beauty, athletic ability, charisma, health and constitution, and even longevity (life span). Each of us has a mixture of characteristics that are plainly different from those of others and that provide either springboards or disadvantages. Just as it’s unusual to find someone in possession of all positive characteristics at once — the equivalent of rolling a 12 for each attribute of a new D&D character — few possess all negatives (a series of 1’s), either. Also, there’s probably no good way to rank best to worst, strongest to weakest, or most to least successful. Bean counters from one discipline or another might try, but that runs counter to the mythology that “all men are created equal” and thus becomes a taboo to acknowledge, much less scrutinize.

What to do with the knowledge that all men are not in fact created equal and never will be? That some are stronger; more charming; smarter; taller with good teeth (or these days, dentists), hair, vision, and square jaws; luckier in the genetic lottery? Well, chalk it up, buddy. We all lack some things and possess others.

A few years ago, Knives Out (2019) unexpectedly solidified the revival of the whodunit and introduced its modern-day master sleuth: Benoit Blanc. The primary appeal of the whodunit has always been smartly constructed plots that unfold slowly and culminate in a final reveal or unmasking that invites readers to reread in search of missed clues. The two early masters of this category of genre fiction were Sir Arthur Conan Doyle and Agatha Christie, both succeeding in making their fictional detectives iconic. Others followed their examples, though the genre arguably shifted onto (into?) TV with shows such as Perry Mason, Columbo, and Murder, She Wrote. No surprise, Hollywood transformed what might have been a one-and-done story into the beginnings of a franchise, following up Knives Out with Glass Onion: A Knives Out Mystery (the subtitle displayed unnecessarily to ensure audiences make the connection — wouldn’t a better subtitle be A Benoit Blanc Mystery?). Both movies are entertaining enough to justify munching some popcorn in the dark, but neither observes the conventions of the genre — novel, TV, or film — any too closely. Spoilers ahead.

I harbor a sneaking suspicion that Benoit Blanc is actually a bumbling fool the way poor, rumpled Columbo only pretended to be. Although I can’t blame Daniel Craig for taking roles that allow him to portray someone other than James Bond, Craig is badly miscast and adopts a silly Southern accent others complain sounds laughably close to Foghorn Leghorn. (Craig was similarly miscast in the American remake of The Girl with the Dragon Tattoo, but that’s an entirely different, unwritten review.) So long as Blanc is a nitwit, I suppose the jokey accent provides some weak characterization and enjoyment. Problem is, because the film is only superficially a whodunit, there is no apparent crime to solve after Blanc figures out the staged murder mystery (sorta like an escape room) just after the vacation weekend gets started but before the faux murder even occurs. Kinda ruins the momentum. As a result, the film digresses to a lengthy flashback to establish the real crime that Blanc is there to solve. Maybe good mystery novels have partial reveals in the middle, reframing the entire mystery. I dunno but rather doubt it.

The plot is by no means as tightly knit or clever as a whodunit normally demands. Rather, it employs lazy, pedestrian devices that irritate as much as entertain. Such as one of the characters (the real murder victim) having an identical twin who substitutes herself for the dead one; such as trapping attendees on a remote island without servants or transportation but largely ignoring their supposed captivity; such as uncovering an orgy of evidence better suited to misdirection and the framing of an innocent; such as mixing faux violence with real violence, though none of the characters appears even modestly afraid at any point; such as bullets being fortuitously stopped by items in a breast pocket; such as sleuthing and detecting — done by the twin, not Blanc! — being presented in a montage of coinkidinks that demonstrate more luck than skill. I could go on. The worst cinematic trick is reprising scenes in flashback, altered to insert clues viewers never had a chance to notice the first time around. Those aren’t reveals; they’re revisions. Moreover, instead of inviting viewers to rewatch, this gimmick jams supposedly unnoticed clues down their throats. How insulting. If Benoit Blanc is really an overconfident, dandified nincompoop, I suppose it’s better and more convenient (for bad storytelling) to be lucky than good. He doesn’t solve anything; he’s just there to monologue incessantly.

The weekend party is hosted by a character patterned after … oh never mind, you know who. I decline to provide the name of that real-life narcissist. Members of the entourage are mostly sycophants, originally good friends but later ruined in different ways by proximity to a hyper-successful fraud. As a group, they’re known as The Shitheads, which just about sums it up. Critics have observed a shift in entertainment toward depicting super-wealthy pretty people as heels of the highest order. Not sure what makes that entertaining exactly. I enjoy no Schadenfreude witnessing the high and mighty brought low, much as they may deserve it. It’s just another lazy cliché (like its inverse: the dignity of the downtrodden everyman a/k/a the noble savage) trotted out in the absence of better ideas.

/rant on

The previous time I was prompted to blog under this title was regarding the deplorable state of public education in the U.S., handily summarized at Gin and Tacos (formerly on my blogroll). The blogger there is admirable in many respects, but he has turned his attention away from blogging toward podcasting and professional writing with the ambition of becoming a political pundit. (I have disclaimed any desire on my part to be a pundit. Gawd … kill me first.) I check in at Gin and Tacos rarely anymore, politics not really being my focus. However, going back to reread the linked blog post, I find his excoriation of U.S. public education holds up. Systemic rot has since graduated into institutions of higher learning. Their mission statements, crafted in fine, unvarying academese, may exhibit unchanged idealism, but the open secret is that the academy has become a network of brainwashing centers for vulnerable young adults. See this blog post on that subject. What prompts this new reality check is the ongoing buildup of truly awful news, but especially James Howard Kunstler’s recent blog post “The Four Fuckeries” over at Clusterfuck Nation, published somewhat in advance of his annual year-end-summary-and-predictions post. Kunstler pulls no punches, delivering assessments of activities in the public interest that have gone so abysmally wrong it beggars the imagination. I won’t summarize; go read for yourself.

At some point, I realized when linking to my own past blog posts that perhaps too many include the word wrong in the title. By that, I don’t mean merely incorrect or bad or unfortunate but rather purpose-built for comprehensive damage that mere incompetence could not accomplish or explain. Some may believe the severity of damage is the simple product of lies compounding lies, coverups compounding coverups, and crimes compounding crimes. That may well be true in part. But there is far too much evidence of Manichean manipulation and heedless damn-the-torpedoes-full-steam-ahead garbage decision-making to wave off widespread institutional corruption as mere conspiracy. Thus, Kunstler’s choice of the term fuckeries. Having already reviewed the unmitigated disaster of public education, let me instead turn to other examples.


This blog has never been obliged to observe every passing holiday or comment on celebrity deaths or public events via press release, public statement, command performance, ritual oversharing, or other characterization more closely associated with institutions and public figures who cannot keep from thrusting themselves wantonly onto the public despite having nothing of value to say. The chattering class maintains noise levels handily, so no need to add my voice to that cacophonous chorus. To wit, the recent Thanksgiving holiday prompts each of us every year to take stock anew and identify some area(s) of contentedness and gratitude, which can be challenging considering many Americans feel abandoned, betrayed, or worse as human history and civilization lurch despotically toward their end states. However, one overheard statement of gratitude this year made a strong impression on me, and as is my wont, I couldn’t help but connect a variety of disparate ideas. Let me digress, starting with music.

Decades ago, the London Philharmonic under Jorge Mester recorded a collection of fanfares commissioned during WWII. American composers represented include (in no particular order) Henry Cowell, Howard Hanson, Roy Harris, Morton Gould, Leonard Bernstein, Virgil Thomson, and Walter Piston. None of their respective fanfares has entered the standard repertoire. However, the sole composer whose stirring fanfare has become legitimate and instantly recognizable Americana is Aaron Copland. His fanfare celebrates no famous figure or fighting force but rather the common man. Copland’s choice to valorize the common man was a masterstroke, and the music possesses an appealing directness and simplicity that are unexpectedly difficult to achieve. Far more, um, common is elaborate, noisy surface detail that fails to please the ear nearly so well as Copland’s stately fanfare. Indeed, the album is called Twenty Fanfares for the Common Man even though that title only applies to Copland’s entry.

The holiday comment that stuck with me was a son’s gratitude for the enduring example set by his father, a common man. Whether one is disposed to embrace or repudiate the patriarchy, there can be no doubt that a father’s role within a family and community is unique. (So, too, is the mother’s. Relax, it’s not a competition; both are important and necessary.) The father-protector and father-knows-best phase of early childhood is echoed in the humorous observation that a dog sees its master as a god. Sadly, the my-dad-can-beat-up-your-dad taunt lives on, transmuted into … superhero flicks. As most of us enter adulthood, coming to terms with the shortcomings of one or both parents (nobody’s perfect …) is part of the maturation process: establishing one’s own life and identity, independent of yet somehow continuous with those of one’s parents. So it’s not unusual to find young men in particular striking out on their own, distancing themselves from and disapproving of their fathers (sometimes sharply) but later circling back to reflect and reconcile. How many of us can honestly express unalloyed admiration for our fathers and their character examples? I suspect frustration when feet of clay are revealed is more typical.

(more…)

The difference between right and wrong is obvious to almost everyone by the end of kindergarten. Temptations persist, and everyone does things great and small known to be wrong when enticements and advantages outweigh punishments. C’mon, you know you do it. I do, too. Only at the conclusion of a law degree or the start of a political career (funny how those two often coincide) do things get particularly fuzzy. One might add military service to those exceptions, except that servicemen are trained not to think but simply to do (i.e., follow orders without question). Anyone with functioning ethics and morality also recognizes that in legitimate cases of things getting unavoidably fuzzy in a hypercomplex world, the dividing line often can’t be established clearly. Thus, venturing into the wide, gray middle area is really a signal that one has probably already gone too far. And yet, demonstrating that human society has not really progressed ethically despite considerable gains in technical prowess, egregiously wrong things are getting done anyway.

The whopper of which nearly everyone is guilty (thus, guilty pleasure) is … the Whopper. C’mon, you know you eat it. I know I do. Of course, the irresistible and ubiquitous fast food burger is really only one example of a wide array of foodstuffs known to be unhealthy, cause obesity, and pose long-term health problems. Doesn’t help that, just like Big Tobacco, the food industry knowingly refines its products (processed foods, anyway) to be hyperstimuli impossible to ignore or resist unless one is iron-willed or develops an eating disorder. Another hyperstimulus most can’t escape is the smartphone (or a host of other electronic gadgets). C’mon, you know you crave the digital pacifier. I don’t, having managed to avoid that particular trap. For me, electronics are always only tools. However, railing against them with respect to how they distort cognition (as I have) convinces exactly no one, so that argument goes on the deferral pile.

Another giant example, not in terms of participation but in terms of effect, is the capitalist urge to gather to oneself as much filthy lucre as possible only to sit heartlessly on top of that nasty dragon’s hoard while others suffer in plain sight all around. C’mon, you know you would do it if you could. I know I would — at least up to a point. Periods of gross inequality come and go over the course of history. I won’t make direct comparisons between today and any one of several prior Gilded Ages in the U.S., but it’s no secret that the existence today of several hundy billionaires and an increasing number of mere multibillionaires represents a gross misallocation of financial resources: funneling the productivity of the masses (and fiat dollars whiffed into existence with keystrokes) into the hands of a few. Fake philanthropy to launder reputations fails to convince me that such folks are anything other than miserly Scrooges fixated on maintaining and growing their absurd wealth, influence, and bogus social status at the cost of their very souls. Seriously, who besides sycophants and climbers would even want to be in the same room as one of those people (names withheld)? Maybe better not to answer that question.
