Archive for the ‘Culture’ Category

Although I’m not paying much attention to breathless reports about imminent strong AI, the Singularity, and computers already able to “model” human cognition and perform “impressive” feats of creativity (e.g., responding to prompts and creating “artworks” — scare quotes intended), recent news reports that chatbots are harassing, gaslighting, and threatening users just make me laugh. I’ve never wandered over to that space, don’t know how to connect, and don’t plan to test-drive one for verification. Isn’t it obvious to users that they’re interacting with a computer? Chatbots are natural-language simulators within computers, right? Why take them seriously (other than perhaps their potential effects on children and those of diminished capacity)? I also find it unsurprising that, if a chatbot is designed to resemble error-prone human cognition/behavior, it would quickly become an asshole, go insane, or both. (Designers accidentally got that aspect right. D’oh!) That trajectory is a perfect embodiment of the race to the bottom of the brain stem (try searching that phrase) that keeps sane observers like me from indulging in caustic online interactions. Hell no, I won’t go.

The conventional demonstration that strong AI has arisen (e.g., Skynet from the Terminator movie franchise) is the Turing test, which is essentially the inability of humans to distinguish between human and computer interactions (not a machine-led extermination campaign) within limited interfaces such as text-based chat (e.g., the dreaded digital assistant that sometimes pops up on websites). Alan Turing came up with the test at the outset of the computing era, so the field was arguably not yet mature enough to conceptualize a better test. I’ve always thought the test actually demonstrates the fallibility of human discernment, not the arrival of some fabled ghost in the machine. At present, chatbots may be fooling no one into believing that actual machine intelligence is present on the other side of the conversation, but it’s a fair expectation that further iterations (i.e., ChatBot 1.0, 2.0, 3.0, etc.) will improve. Readers can decide whether that improvement will be progress toward strong AI or merely better ability to fool human interlocutors.

Chatbots gone wild offer philosophical fodder for further inquiry into ebbing humanity as the drive toward trans- and post-human technology continues refining and redefining the dystopian future. What about chatbots makes interacting with them hypnotic rather than frivolous — something wise thinkers immediately discard or even avoid? Why are some humans drawn to virtual experience rather than, say, staying rooted in human and animal interactions, our ancestral orientation? The marketplace has already (for now) resoundingly rejected Google Glass and Facebook’s metaverse. I haven’t hit upon satisfactory answers to those questions, but my suspicion is that immersion in some vicarious fictions (e.g., novels, TV, and movies) fits well into narrative-styled cognition while other media trigger revulsion as one descends into the so-called Uncanny Valley — an unfamiliar term when I first blogged about it, though it has been trending of late.

If readers want a really deep dive into this philosophical area — the dark implications of strong AI and an abiding human desire to embrace and enter false virtual reality — I recommend a lengthy 7-part Web series called “Mere Simulacrity” hosted by Sovereign Nations. The episodes I’ve seen feature James Lindsay and explore secret hermetic religions that have operated for millennia alongside recognized religions. The secret cults share with tech companies two principal objectives: (1) simulation and/or falsification of reality and (2) desire to transform and/or reveal humans as gods (i.e., ability to create life). It’s pretty terrifying stuff, rather heady, and I can’t provide a reasonable summary. However, one takeaway is that by messing with human nature and risking uncontrollable downstream effects, technologists are summoning the devil.

Continuing from pt. 01, the notion of privilege took an unexpected turn for me recently when the prospective inevitability of demographic collapse came back onto my radar. I scoffed at the demographer’s crystal ball earlier not because I disagree with the assessments or numbers but because, like so many aspects of the collapse of industrial civilization, demographic collapse lies squarely beyond anyone’s control. My questions in reply are basically Yeah? And? So? A search on demographic collapse reveals a significant number of reports, with reporters themselves (including the worst contemporary media whore and glory hound who shall forever remain unnamed at this site) freaking out and losing their collective minds over the hows, whys, wheres, and whens of the coming population crash. The most immediately worrisome aspect is the anticipated inversion of the age pyramid: from the youngest being the most numerous (the usual state of affairs) to the oldest being the most numerous and flatly unsupportable by the young. This has been the basic warning about Social Security for decades already: too few paying in, too many granted benefits. See also this documentary film being prepared for imminent release. I suspect supporting annotations will appear in time.
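Since the Social Security warning is ultimately just arithmetic, here is a minimal back-of-the-envelope sketch of the pay-as-you-go logic. All numbers are invented purely for illustration, not drawn from any actuarial source:

```python
# Toy pay-as-you-go model: each generation's workers fund the previous
# generation's benefits, so shrinking cohorts raise the per-worker burden.
BENEFIT = 20_000  # hypothetical annual benefit per retiree

def payg_burden(cohort_ratios, first_cohort=100.0):
    """cohort_ratios: each generation's size relative to its parents'."""
    size = first_cohort
    for gen, ratio in enumerate(cohort_ratios, start=1):
        retirees, size = size, size * ratio
        tax = retirees / size * BENEFIT  # each worker's share of the bill
        print(f"gen {gen}: {size / retirees:.2f} workers per retiree, "
              f"${tax:,.0f} per worker per year")

# Stylized history: a couple of growing generations (a boom),
# then fertility falls below replacement.
payg_burden([1.5, 1.3, 0.9, 0.7])
```

Once the ratio drops below 1.0, the per-worker burden more than doubles within two generations: the Social Security worry in miniature.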

Probably not fair to call capitalism and its support structures a Ponzi scheme (latecomers to the scheme supporting earlier entrants), but the desire to perpetuate historical demographic distributions (as opposed to what? yielding to the drift of history?) is clearly part of the perpetual growth mentality. Traditionally, prior to the 20th century in the West, when the vast majority of people participated in agrarian and/or subsistence economies instead of the money economy, children were desirable not least because they provided free labor until they were grown, were positioned to take over family farms and businesses, and cared for oldsters when that time came. Intergenerational continuity and stability were maintained, and it was generally accepted that each generation would supplant the previous through the seasons of life. The money economy destroyed most of that. Many young adult children now exercise their options (privilege) and break away as soon as possible (as I did) in search of economic opportunity in cities and form their own families (or don’t, as I didn’t). Estrangement and abandonment may not be complete, but families being spread across the continent certainly limits extended family cohesion to holidays and occasional visits. Oldsters (in the affluent West, anyway) are now typically shuttled off to (euphemism alert) retirement homes to be warehoused prior to dying. Easier to pay someone to perform that service than to do it oneself, apparently. So if older people are currently privileged over the young (in some ways at least), that condition is being reversed because, dammit, children are in short supply yet needed desperately to keep growth on track.

Demographers point to a number of factors that have conspired to create the crisis (one of many interlocking crises some intellectuals have begun calling the polycrisis). The two main factors are declining fertility and reproductive choice. My suspicion is that the toxic environment, one result of centuries of industrial activity with plastics and drugs now found in human bodies where they don’t belong, accounts for many fertility issues. Add to that poor food quality (i.e., malnutrition, not just poor diets) and it’s easy to understand why healthy pregnancies might be more difficult in the 21st century than before. I’m not qualified to support that assessment, so take it for what it’s worth. Reproductive choice, another recently acquired female privilege (in historical terms), is a function of several things: financial independence, educational attainment, and availability of birth control. Accordingly, more women are choosing either to defer having children while they establish careers or not to have children at all. (Men make those choices, too.) Delays unexpectedly leave lots of women unable to have children, whether for failure to find a suitable mate or for having spent their best reproductive years doing other things. As I understand it, these correlations are borne out in some harrowing statistics. As the polycrisis deepens, a philosophical choice not to bring children into the world (just to suffer and die young) is also a motivation to remain childless.


This post was going to be a review of David Hurwitz but became more an appreciation than a review. Hurwitz is an author and music critic who reviews classical music recordings both on YouTube and online at ClassicsToday.com. I watch quite a few of his YouTube videos but pay scant attention to the website. I also don’t comment; he has an engaged commentariat already. Hurwitz signs off from each video with the exhortation “keep on listening,” which I’ve adopted as the title of this blog post.

Aside: Before I get started (and this will run only slightly long), let me admit fully that classical music is a niche cultural offering rooted deeply in Western historical practice, one that does not speak to many people. Everyone has their tastes and predilections, and no apologies are needed when preferring one category or genre over another. However, I’m not such a value relativist as to lend support to the notion that all things are created equal. How one defines art or indeed high art is a contentious issue, not unlike what counts as religion or philosophy. I hew to a relatively narrow traditional standard that admits poetry, literature, music, architecture, sculpture, and painting but eschews martial arts, culinary arts, cinema, theater, and video games. Not an exhaustive list on either side of the divide and no need to argue. Caveat: my standards are my own and should not impeach or diminish anyone’s enjoyment of his or her own passions.

Further aside: The recording industry is a latecomer in the history of high art (and for that matter pop culture) and has already undergone numerous transformations as physical media shifted from the long-playing record (the venerable LP) to CD before going virtual as electronic files and streaming media. In a nutshell, competing forms of recording and distribution make up the so-called format wars, which are by no means settled. The entire idea behind making a recording is to memorialize a performance for repeat listening and posterity, as opposed to experiencing it live in a concert venue. The anachronistic term record calls back to that origin, though the term is arguably less applicable with each passing decade as everything is recorded and memorialized somehow. In addition, recordings grant access to ensembles and repertoire that would be prohibitively expensive or impossible to experience solely in live concert. Through recordings, I gained a deep appreciation of many orchestras and lots of repertoire never once heard live in person. The same effect doesn’t really apply to reading a book or watching a movie. Lastly, and unlike a lot of my musician peers, I became an aficionado of recordings in parallel with performance activities.

I appreciate David Hurwitz for being among only a few people (to my knowledge) giving honest and entertaining assessments of recordings (not just new issues), as opposed to what passes for music criticism columns in newspapers and online devoted to live performance. Hurwitz explains, compares, teaches, and jokes about recordings with concentration on German symphonic repertoire, which is also my preferred musical genre. His erudite remarks also enhance my listening, which ought to be the chief goal of criticism — something lost on columnists who draw undue attention to themselves as flowery writers and auteurs. Hurwitz also has at his disposal rooms full of CDs, which I’m guessing are either sent to him for review by the record companies or otherwise acquired in the course of his professional activities. Lots of them are giant box sets of the entire recorded oeuvre of a particular conductor or conductor/orchestra/label combo. Thus, his breadth of coverage is far greater than my own. I’ve made numerous purchasing decisions based on his reviews and streamed lots more for a quick listen to hear what’s so remarkable (or awful) about them.

Final aside: When I was much younger, I stumbled into a record shop (remember those?) in Greenwich, Connecticut, that had in inventory essentially the entire current catalogs of the major classical music labels. That richness of options (pre-Internet) was quite atypical and unlike any other record shop I’ve known. Accordingly, I was feverish with excitement, looking at all those big square LP jackets with their enclosed vinyl and attractive cover art. Back then, the only way to hear something was to purchase it, and my limited budget demanded prioritization. Decisions involved a mixture of pain (financial sacrifice and awareness of those many LPs, now CDs, left behind) and anticipated pleasure that has hardly faded with time. How someone like David Hurwitz ends up as a full-time music critic surrounded by rooms of CDs is a puzzle, and I sometimes envy him. Sports fans who grow up to be sportscasters might follow a similar track. Who can predict who will be fortunate enough to enjoy fandom as a career?

Buzzwords circulating heavily in the public sphere these days include equality, equity, inclusion, representation, diversity, pluralism, multiculturalism, and privilege. How they are defined, understood, and implemented are contentious issues that never seem to resolve. Indeed, looking back on decade after decade of activism undertaken to address various social scourges, limited progress has been made, which amounts to tinkering around the edges. The underlying bigotry (opinion, motivation, activity) has never really been eradicated, though it may be diminished somewhat through attrition. Sexism and racism in particular (classics in an expanding universe of -isms) continue to rage despite surface features taking on salutary aspects. Continued activism uses the buzzwords above (and others) as bludgeons to win rhetorical battles — frequently attacks on language itself through neologism, redefinition, and reclamation — without really addressing the stains on our souls that perpetuate problematic thinking (as though anyone ever had a lock on RightThink). Violence (application of actual force, not just mean or emphatic words) is the tool of choice deployed by those convinced their agenda is more important than general societal health and wellbeing. Is violence sometimes appropriate in pursuit of social justice? Yes, in some circumstances, probably so. Is this a call for violent protest? No.

Buzzwords stand in for what we may think we want as a society. But there’s built-in tension between competition and cooperation, or alternatively, individual and society (see the start of this multipart blog post) and all the social units that nest between. Each has its own desires (known in politics by the quaint term special interests), which don’t usually combine to form just societies despite mythology to that effect (e.g., the invisible hand). Rather, competition results in winners and losers even when the playing field is fair, which isn’t often. To address what is sometimes understood (or misunderstood, hard to know) as structural inequity, activists advocate privileging disenfranchised groups. Competence and merit are often sacrificed in the process. Various forms of criminal and sociopathic maneuvering also keep a sizeable portion of the population (the disenfranchised) in a perpetual and unnecessary state of desperation. That’s the class struggle.

So here’s my beef: if privilege (earned or unearned) is categorically bad because it’s been concentrated in a narrow class (who then position themselves to retain and/or grow it), why do activists seek to redistribute privilege by bestowing it on the downtrodden? Isn’t that a recipe for destroying ambition? If the game instead becomes about deploying one or more identifiers of oppression to claim privilege rather than working diligently to achieve a legitimate goal, such as acquiring skill or understanding, why bother to try hard? Shortcuts magically appear and inherent laziness is incentivized. Result: the emergent dynamic flattens valuable, nay necessary, competence hierarchies. In its communist formulation, social justice is achieved by making everyone equally precarious and miserable. Socialism fares somewhat better. Ideologues throughout history have wrecked societies (and their members) by redistributing or demolishing privilege forcibly while hypocritically retaining privilege for themselves. Ideology never seems to work out as theorized, though the current state of affairs (radical inequality) is arguably no more just.

More to unpack in further installments.

My inquiries into media theory long ago led me to Alan Jacobs and his abandoned, reactivated, then reabandoned blog Text Patterns. Jacobs is a promiscuous thinker and even more promiscuous technologist in that he has adopted and abandoned quite a few computer apps and publishing venues over time, offering explanations each time. Always looking for better tools, perhaps, but this roving public intellectual requires persistent attention lest one lose track of him. His current blog (for now) is The Homebound Symphony (not on my ruthlessly short blogroll), which is updated roughly daily, sometimes with linkfests or simply an image, other times with thoughtful analysis. Since I’m not as available as most academics to spend all day reading and synthesizing what I’ve read to put into a blog post, college class, or book, I am not on any sort of schedule and only publish new blog posts when I’m ready. Discovered in my latest visit to The Homebound Symphony was a plethora of super-interesting subject matter, which I daresay is relevant to the more literate and literary among us. Let me draw out the one that most piqued my interest. (That was the long way of tipping my hat to Jacobs for the link.)

In an old (by Internet standards) yet fascinating book review by Michael Walzer of Siep Stuurman’s The Invention of Humanity: Equality and Cultural Difference in World History (2017), Walzer describes the four inequalities that have persisted throughout human history, adding a fifth identified by Stuurman:

  • geographic inequality
  • racial inequality
  • hierarchical inequality
  • economic inequality
  • temporal inequality

I won’t unpack what each means if they’re not apparent on their face. Read for yourself. Intersections and overlapping are common in taxonomies of this sort, so don’t expect categories to be completely separate and distinct. The question of equality (or its inverse, inequality) is a fairly recent development, part of a stew of 18th-century thought in the West that was ultimately distilled to one famous phrase: “all men are created equal.” Seems obvious, but the phrase is fraught, and we’ve never really been equal, have we? So is it equal before god? Equal before the law? Equal in all opportunities and outcomes as social justice warriors now insist? On a moment’s inspection, no one can possibly believe we’re all equal despite aspirations that everyone be treated fairly. The very existence of perennial inequalities puts the lie to any notion of equality trucked in with the invention of humanity during the Enlightenment.

To those inequalities I would add a sixth: genetic inequality. Again, overlap with the others is acknowledged, but it might be worth observing that divergent inherited characteristics (other than wealth) appear quite early in life among siblings and peers, before most others manifest. By that, I certainly don’t mean race or sex, though differences clearly exist there as well. Think instead of intelligence, height, beauty, athletic ability, charisma, health and constitution, and even longevity (life span). Each of us has a mixture of characteristics that are plainly different from those of others and which either provide springboards or impose disadvantages. Just as it’s unusual to find someone in possession of all positive characteristics at once — the equivalent of rolling an 18 for each attribute of a new D&D character — few possess all negatives (straight 3s), either. Also, there’s probably no good way to rank best to worst, strongest to weakest, or most to least successful. Bean counters from one discipline or another might try, but that runs counter to the mythology “all men are created equal” and thus becomes a taboo to acknowledge, much less scrutinize.
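For the curious, the odds behind that D&D analogy are easy to brute-force (a throwaway sketch, assuming the classic 3d6 roll per attribute and six attributes):

```python
from itertools import product

# Enumerate every 3d6 outcome to get P(sum == 18), i.e., one maxed attribute.
rolls = list(product(range(1, 7), repeat=3))
p_18 = sum(sum(r) == 18 for r in rolls) / len(rolls)  # 1/216

print(p_18)       # ~0.0046: one attribute maxed
print(p_18 ** 6)  # ~9.8e-15: all six attributes maxed at once
```

By symmetry, straight 3s are exactly as rare, which is the point: nearly everyone lands somewhere in the vast middle of the distribution.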

What to do with the knowledge that all men are not in fact created equal and never will be? That some are stronger; more charming; smarter; taller with good teeth (or these days, dentists), hair, vision, and square jaws; luckier in the genetic lottery? Well, chalk it up, buddy. We all lack some things and possess others.

A bunch of unrelated things I have been reading and hearing suddenly and rather unexpectedly came together in a synthesis. The profusion is too great to provide a handy set of links, and backstage analytics indicate that almost no one follows the links I provide anyway, so this will be free-form.

Hyperanalysis

As I have previously blogged, peering (or staring) too intently at a physical object or subject of inquiry tends to result in the object or subject being examined at the wrong resolution. An obtuse way of restating this is that one can’t study the cosmos under a microscope or cell biology through a telescope. The common mistake is to study minutiae and fail to perceive the whole, rarely the reverse. Iain McGilchrist suggests that hyperanalysis is a hallmark of an imbalance in brain lateralization, where the left brain (the Emissary) takes primacy over the right brain (the Master). Others building on McGilchrist’s thesis have theorized that human culture and styles of cognition have swung in and out of proper balance periodically throughout history. One theory used portraiture to show how depiction of the human face can be either humanistic or clinical in orientation. What artists perceive, then produce, yields divergent aesthetics where the eyes and mouth in particular suggest different ways of encountering the world. Using art to develop theories of psychohistory fits well with one of the preoccupations of this blog, namely, the nature of consciousness.

Intervention

Armed with a precise appreciation of some small aspect of a whole, often abstracted and idealized apart from easily observed fact, some make the leap that human activity can be acted upon confidently with entirely foreseeable outcomes. Thus, incautious decision-makers intervene to correct, adjust, or game areas within their concern and control. The applicable aphorism that restrains cautious people is “fools rush in where angels fear to tread.” Perhaps the Dunning-Kruger effect helps to explain (at least in part) why actors plunge forward despite ample evidence that outcomes don’t adhere reliably to planning. Indeed, the rise and fall of many institutions demonstrate the fallibility of interventions in dynamic systems. Just ask the Federal Reserve. Periodic recourse to warfare is also characterized as a breakdown of peaceful stratagems, accompanied by a fog in which clarity of purpose is lost. Faced with deteriorating conditions, the demand that something be done, not nothing, also testifies to human frailty and frequent refusal to accept unknown contingency or inevitability.

Micromanagement

When ideology truly misfires, strenuous interventions take the form of tyranny in an attempt to force resistant outcomes into predetermined channels. Too many awful examples of abject failure on that account to mention. Small-scale tyranny might be better recognized as micromanagement. Control freaks adopt that behavior out of any number of motivations, such as distrust of others’ competence, inability to delegate or share responsibility, and hero syndrome. Ironically, the targets of control are often petty and do not contribute meaningfully to enhanced behavior or function. Rather, micromanagers fixate on a subsidiary metric as a proxy for overall health, wellbeing, and effectiveness.

As people encounter ideologies and learn to systematize, they are especially prone to using incomplete understandings of complex systems and then seizing upon one intervention or another to attempt to solve problems. This is not limited to students in the early years of college who discover a supposed Rosetta stone for fixing ills but rather includes any reform political candidate or newly minted CEO with a mandate to remove corruption, dead weight, or inefficiency. This is also the domain of specialists, experts, and academics who have concentrated their efforts in a narrow subfield and feel confident generalizing that acquired skill and applied knowledge outside the area of focus. Of course, it’s warranted to rely on brain surgeons for brain surgery and car mechanics for automotive repairs. But no one expects them to offer advice on intractable social problems, correct historical wrongs, or develop a centered philosophy of anything. Indeed, institutions and societies are so complex and inherently unsteerable that, despite many futile attempts, no one has ever developed a comprehensive understanding sufficient to engineer programs that lead anywhere remotely in the vicinity of utopia. Yet with an abundance of confidence, agitators and activists — sometimes quite young but unafraid to instruct their elders — seek to implement their ideological designs, by force if necessary, to enact desired change. Maybe those changes are considered humble baby steps toward improved social justice and greater equity, tinkering around the edges perhaps, but I doubt most ambitions are so constrained. While that energy is absolutely necessary in a pluralistic society to avoid cynicism and stagnation, it often becomes a victim of its own success when radicals gain power and impose orthodoxies that are ultimately self-defeating and thus short-lived. History is full of movements, civil wars, and revolutions that demonstrate the point. Dystopian fiction also forewarns how tyrannies develop out of misguided application of ideology and power.

I probably haven’t described any too well the power law that coalesced in my thinking. Nor do I pretend to have solutions for anything. As I’ve often written, given the myriad problems global civilization now faces, it’s well nigh impossible to know what to wish for with much clarity since deliverance from one set of problems often exacerbates others. World systems theorists try to piece together various dynamics into a coherent unified theory, and I admire the effort to understand how the world really works, but I still subscribe to the precautionary principle when it comes to implementing globe-spanning programs.

A few years ago, Knives Out (2019) unexpectedly solidified the revival of the whodunit and introduced its modern-day master sleuth: Benoit Blanc. The primary appeal of the whodunit has always been smartly constructed plots that unfold slowly and culminate in a final reveal or unmasking that invites readers to reread in search of missed clues. The two early masters of this category of genre fiction were Sir Arthur Conan Doyle and Agatha Christie, both succeeding in making their fictional detectives iconic. Others followed their examples, though the genre arguably shifted onto (into?) TV with shows such as Perry Mason, Columbo, and Murder, She Wrote. No surprise, Hollywood transformed what might have been a one-and-done story into the beginnings of a franchise, following up Knives Out with Glass Onion: A Knives Out Mystery (the subtitle displayed unnecessarily to ensure audiences make the connection — wouldn’t a better subtitle be A Benoit Blanc Mystery?). Both movies are entertaining enough to justify munching some popcorn in the dark, but neither observes the conventions of the genre — novel, TV, or film — any too closely. Spoilers ahead.

I harbor a sneaking suspicion that Benoit Blanc is actually a bumbling fool the way poor, rumpled Columbo only pretended to be. Although I can’t blame Daniel Craig for taking roles that allow him to portray someone other than James Bond, Craig is badly miscast and adopts a silly Southern accent others complain sounds laughably close to Foghorn Leghorn. (Craig was similarly miscast in the American remake of The Girl with the Dragon Tattoo, but that’s an entirely different, unwritten review.) So long as Blanc is a nitwit, I suppose the jokey accent provides some weak characterization and enjoyment. Problem is, because the film is only superficially a whodunit, there is no apparent crime to solve after Blanc figures out the staged murder mystery (sorta like an escape room) just after the vacation weekend gets started but before the faux murder even occurs. Kinda ruins the momentum. As a result, the film digresses to a lengthy flashback to establish the real crime that Blanc is there to solve. Maybe good mystery novels have partial reveals in the middle, reframing the entire mystery. I dunno but rather doubt it.

The plot is by no means tightly knit or clever as a whodunit normally demands. Rather, it employs lazy, pedestrian devices that irritate as much as entertain. Such as one of the characters (the real murdered character) having an identical twin who substitutes herself for the dead one; such as trapping attendees on a remote island without servants or transportation but largely ignoring their suggested captivity; such as uncovering an orgy of evidence better suited to misdirection and framing of an innocent; such as mixing faux violence with real violence, though none of the characters appears even modestly afraid at any point; such as bullets being fortuitously stopped by items in a breast pocket; such as sleuthing and detecting — done by the twin, not Blanc! — being presented in a montage of coinkidinks that demonstrate more luck than skill. I could go on. The worst cinematic trick is reprising scenes in flashback but altered to insert clues viewers would have noticed initially. Those aren’t reveals; they’re revisions. Moreover, instead of inviting viewers to rewatch, this gimmick jams supposedly unnoticed clues down their throats. How insulting. If Benoit Blanc is really an overconfident, dandified nincompoop, I suppose it’s better and more convenient (for bad storytelling) to be lucky than good. He doesn’t solve anything; he’s just there to monologue incessantly.

The weekend party is hosted by a character patterned after … oh never mind, you know who. I decline to provide the name of that real-life narcissist. Members of the entourage are mostly sycophants, originally good friends but later ruined in different ways by proximity to a hyper-successful fraud. As a group, they’re known as The Shitheads, which just about sums it up. Critics have observed a shift in entertainment toward depicting super-wealthy pretty people as heels of the highest order. Not sure what makes that entertaining exactly. I enjoy no Schadenfreude witnessing the high and mighty brought low, much as they may deserve it. It’s just another lazy cliché (like its inverse: the dignity of the downtrodden everyman a/k/a the noble savage) trotted out in the absence of better ideas.

/rant on

New Year’s Day (or just prior) is the annual cue for fools full of loose talk to provide, unasked, their year-in-review and “best of” articles summarizing the previous calendar year. I don’t go in for such clichéd forms of curation but certainly recognize an appetite among Web denizens for predigested content that tells them where to park their attention and what or how to think rather than thinking for themselves. Considering how mis- and under-educated the public has grown to be since the steady slippage of educational standards and curricula began in the 1970s (says me), I suppose that appetite might be better characterized as need, in much the same way children need guidance and rules enforced by wiser authorities, beginning with parents yet never truly ending, only shifting over to various institutions that inform and restrain society as a whole. I continue to be flabbergasted by the failure of parents (and teachers) to curb the awful effects of electronic media. I also find it impossible not to characterize social media and other hyperstimuli as gateways into the minds of impressionable youth (and permanent adult children), very much like certain drugs (e.g., nicotine, alcohol, and cannabis) are characterized as gateways to even worse drugs. No doubt everyone must work out a relationship with these unavoidable, ubiquitous influences, but that’s not equivalent to throwing wide open the gate for everything imaginable to parade right in, as many do.

Hard to assess whether the foundations beneath American institutions (to limit my focus) were allowed to decay through neglect and inattention or were actively undermined. Either way, their corruption and now intergenerational inability to function effectively put everyone in a wildly precarious position. The know-how, ambition, and moral focus needed to do anything other than game sclerotic systems for personal profit and acquisition of power are eroding so quickly that operations requiring widespread subscription by the public (such as English literacy) or taking more than the push of a button or click of a mouse to initiate preprogrammed commands are entering failure mode. As in the accidental horror film Idiocracy, the point will come when too few possess the knowledge and skills to get things done, leaving the rest to indulge in crass spectacle with their undeveloped minds. Because this is a date-related blog post, I point out that Idiocracy depicts the results of cultural decay 500 years hence. It won’t take nearly that long. Just one miserable example is the fascist, censorious mood — a style of curation — that has swept through government agencies and Silicon Valley offices intent on installing unchallenged orthodoxies, or for that matter, news junkies and social media platform users content to accept coerced thinking. Religions of old played that gambit, but no need to wait for a new Inquisition to arise. Heretics are already persecuted via cancel culture, which includes excommunication (social expulsion), suspension and/or cancellation of media accounts, and confiscation of bank deposits.

A similar point can be made about the climate emergency. Fools point to weather rather than climate to dispel urgency. Reports extrapolating trends often focus on the year 2100, well after almost all of us now alive will have departed this Earth, as a bogus target date for eventualities like disappearance of sea and glacial ice, sea level rise, unrecoverable greenhouse gas concentrations in the atmosphere, pH imbalance in the oceans, and other runaway, self-reinforcing consequences of roughly 300 years of industrial activity that succeeded unwittingly in terraforming the planet, along the way making it fundamentally uninhabitable for most species. The masses labor in 2023 under the false impression that everyone is safely distanced from those outcomes or indeed any of the consequences of institutional failure that don’t take geological time to manifest fully. Such notions are like assurances offered to children who seek to understand their own mortality: no need to worry about that now, that’s a long, long way off. Besides, right now there are hangovers to nurse, gifts to return for cash, snow to shovel, and Super Bowl parties to plan. Those are right now or at least imminent. Sorry to say, so is the full-on collapse of institutions that sustain and protect everyone. The past three years have already demonstrated just how precarious modern living arrangements are, yet most mental models can’t or won’t contemplate the wholesale disappearance of this way of life, and if one has learned of others pointing to this understanding, well, no need to worry about that just yet, that’s a long, long way off. However, the slide down the opposite side of all those energy, population, and wealth curves won’t take nearly as long as it took to climb up them.

/rant off

/rant on

The previous time I was prompted to blog under this title was regarding the deplorable state of public education in the U.S., handily summarized at Gin and Tacos (formerly on my blogroll). The blogger there is admirable in many respects, but he has turned his attention away from blogging toward podcasting and professional writing with the ambition of becoming a political pundit. (I have disclaimed any desire on my part to be a pundit. Gawd … kill me first.) I check in at Gin and Tacos rarely anymore, politics not really being my focus. However, going back to reread the linked blog post, his excoriation of U.S. public education holds up. Systemic rot has since graduated into institutions of higher learning. Their mission statements, crafted in fine, unvarying academese, may exhibit unchanged idealism but the open secret is that the academy has become a network of brainwashing centers for vulnerable young adults. See this blog post on that subject. What prompts this new reality check is the ongoing buildup of truly awful news, but especially James Howard Kunstler’s recent blog post “The Four Fuckeries” over at Clusterfuck Nation, published somewhat in advance of his annual year-end-summary-and-predictions post. Kunstler pulls no punches, delivering assessments of activities in the public interest that have gone so abysmally wrong it beggars the imagination. I won’t summarize; go read for yourself.

At some point, I realized when linking to my own past blog posts that perhaps too many include the word wrong in the title. By that, I don’t mean merely incorrect or bad or unfortunate but rather purpose-built for comprehensive damage that mere incompetence could not accomplish or explain. Some may believe the severity of damage is the simple product of lies compounding lies, coverups compounding coverups, and crimes compounding crimes. That may well be true in part. But there is far too much evidence of Manichean manipulation and heedless damn-the-torpedoes-full-steam-ahead garbage decision-making to wave off widespread institutional corruption as mere conspiracy. Thus, Kunstler’s choice of the term fuckeries. Having already reviewed the unmitigated disaster of public education, let me instead turn to other examples.


This blog has never been obliged to observe every passing holiday or comment on celebrity deaths or public events via press release, public statement, command performance, ritual oversharing, or other characterization more closely associated with institutions and public figures who cannot keep from thrusting themselves wantonly onto the public despite having nothing of value to say. The chattering class maintains noise levels handily, so no need to add my voice to that cacophonous chorus. To wit, the recent Thanksgiving holiday prompts each of us every year to take stock anew and identify some area(s) of contentedness and gratitude, which can be challenging considering many Americans feel abandoned, betrayed, or worse as human history and civilization lurch despotically toward their end states. However, one overheard statement of gratitude this year made a strong impression on me, and as is my wont, I couldn’t help but connect a variety of disparate ideas. Let me digress, starting with music.

Decades ago, the London Philharmonic under Jorge Mester recorded a collection of fanfares commissioned during WWII. American composers represented include (in no particular order) Henry Cowell, Howard Hanson, Roy Harris, Morton Gould, Leonard Bernstein, Virgil Thomson, and Walter Piston. None of their respective fanfares has entered the standard repertoire. However, the sole composer whose stirring fanfare has become legitimate and instantly recognizable Americana is Aaron Copland. His fanfare celebrates no famous figure or fighting force but rather the common man. Copland’s choice to valorize the common man was a masterstroke, and the music possesses an appealing directness and simplicity that are unexpectedly difficult to achieve. Far more, um, common is elaborate, noisy surface detail that fails to please the ear nearly so well as Copland’s stately fanfare. Indeed, the album is called Twenty Fanfares for the Common Man even though that title applies only to Copland’s entry.

The holiday comment that stuck with me was a son’s gratitude for the enduring example set by his father, a common man. Whether one is disposed to embrace or repudiate the patriarchy, there can be no doubt that a father’s role within a family and community is unique. (So, too, is the mother’s. Relax, it’s not a competition; both are important and necessary.) The father-protector and father-knows-best phase of early childhood is echoed in the humorous observation that a dog sees its master as a god. Sadly, the my-dad-can-beat-up-your-dad taunt lives on, transmuted into … superhero flicks. As most of us enter adulthood, coming to terms with the shortcomings of one or both parents (nobody’s perfect …) is part of the maturation process: establishing one’s own life and identity independent of, yet somehow continuous with, those of one’s parents. So it’s not unusual to find young men in particular striking out on their own, distancing from and disapproving of their fathers (sometimes sharply) but later circling back to reflect and reconcile. How many of us can honestly express unalloyed admiration for our fathers and their character examples? I suspect frustration when feet of clay are revealed is more typical.
