/rant on

A NYT opinion piece by Jessica Grose called “Get Tech Out of the Classroom Before It’s Too Late” (link behind a paywall) came to my attention when Alan Jacobs blogged about it (both from a couple days ago). Typical of crap journalism, the article opens with contextual anecdotes to provide connection to real life, mentions and links to numerous other articles and studies to support its obvious arguments, and ends with banal, unactionable recommendations about what to do next. The template is predictable and easy to cast aside considering that, in the case of tech in the classroom, it’s already too late: tech is embedded so deeply that dislodging it now would require a Herculean effort no one wants to contemplate or undertake. Here is the dumb recommendation from the end of the article:

We need to reframe the entire conversation around tech in schools because it’s far from clear that we’re getting the results we want as a society and because parents are in a defensive crouch, afraid to appear anti-progress or unwilling to prepare the next generation for the future … If we don’t hit pause now and try to roll back some of the excesses, we’ll be doing our children — and society — a profound disservice.

Are you kidding me? We need to talk about …? Roll back some excesses? A profound disservice? The state of K–12 education is a shambles precisely because (one reason among many) it’s been co-opted by tech companies that managed over the course of decades to sell their snake oil (programmed learning, computer-assisted instruction, distance and remote learning, online courses, etc.) to educators and administrators who ought to know better but are nonetheless lured by sales pitches (and, one presumes, kickbacks and technocratic control over large tech budgets). Parents are only slightly less culpable in their willingness to allow the erosion of educational standards (slowly, then all at once, as the saying goes) that has transformed primary and secondary education into a sacrifice zone. Higher ed. has fared no better, for different reasons perhaps (an entirely different rant).

Alan Jacobs enumerates numerous ways that everyone knows tech in schools is a disaster. His most forceful statement is this:

Everyone knows that the big Silicon Valley companies do not care how much damage they do to society or the environment; they care only about what Mark Zuckerberg likes to call DOMINATION. [emphasis added]

Adding my comments to the heap — not that anyone will pay me any attention, either — if a young child, whose natural impulse in explorer mode is to put things in the mouth to test and ingest, were to get hold of, say, rat poison or drain cleaner from under the kitchen sink, any parent would leap into action to keep the kid from swallowing the poison. Maybe the kid has sufficient maturity to recognize that certain common household dangers (heavy furniture, electrical sockets, knives, etc.) are not to be messed with. However, kids are not trusted or empowered to assess those dangers. Tech in the home and classroom is poison to children, deranging and (arguably) ruining them for life (varying degrees of damage over large populations, obviously). Yet unlike household poisons that are either removed from the childhood environment or locked away out of reach by responsible parents, irresponsible parents put the damned devices in their kids’ hands, usually to keep them occupied. During my childhood, it was the TV, which in hindsight was far less damaging to development because we kids would eventually tire of lying on the floor in front of the boob tube, get bored, and go outside to play. The infinite scroll and FOMO have apparently defeated the fatigue factor so that users keep going and going until sleep finally comes. Yeah, I know that childhood curiosity and attention-seeking are like unquenchable fires — emergencies that must be addressed immediately at all costs — at least until the child is plopped in front of the TV or handed an Internet-enabled tablet. Doesn’t matter. It’s the parents’ job to keep poison out of their kids’ hands and mouths, to keep them alive. Failure to do so is criminal negligence.

If an older child, whose natural impulse in explorer mode is to examine and test various objects, tools, and ideas, gets hold of, say, a loaded gun not properly secured by a responsible parent or caregiver, the gun owner would leap into action to keep the kid from firing the gun and potentially injuring or killing someone. Maybe the kid has the maturity to recognize the loaded gun is not a toy, or maybe the kid acts in imitation of the shoot-em-up TV shows and movies everyone sees. Again, results vary widely across large populations. However, kids are not trusted to make potentially life-and-death decisions with loaded guns. The Internet and its lures (wholesome, prurient, educational, entertaining, doesn’t matter) are the equivalent of a loaded gun. It’s the parents’ job to keep the loaded gun from their kids, to keep them alive. Failure to do so is negligence. Firing it (the loaded gun is the Internet, dummy) at or into the minds of youth does damage, yet parents and educators who already know this can’t bring themselves to take away the damned weapon for reasons that — let me put this bluntly — are insane.

The 21st century has thus become another occasion (multiple earlier episodes) for society to go mad. Everyone knows this; just look around at what goes on in neighborhoods on up to nation states. Alan Jacobs chalks it up to “a deficiency of will and a malformation of desire.” I’m more inclined toward Iain McGilchrist’s description of the usurpation of the Master by the Emissary (the right and left brain hemispheres, respectively). Madness is by no means limited to children in explorer mode, though they are being sacrificed at the scale of entire generations. Other vulnerable populations (prisoners, warfighters, the poor and homeless, etc.) are similarly sacrificed because of deficiencies of will, malformations of desire, and/or inability to address giant social problems effectively. I guess we “need to reframe the entire conversation.” Sure, that’s the ticket.

/rant off

Be forewarned: long post ahead.

The Bulletin of the Atomic Scientists popped up again in the news. Apparently, artificial intelligence (AI) was added to the already burgeoning list of threats capable of ramping up into global catastrophe and ending humanity handily (not to discount the rest of the living planet). That early 2024 Bulletin update joins a growing number of pronouncements from authorities, experts, pundits, politicians, think tanks, and government agencies — each clamoring for attention like the boy who cried “wolf” — that insist, in so many words, “I mean it this time. Not gonna tell you again. Last time issuing this warning. Pay attention! Time to take immediate action! No more kicking the can down the road. Gotta rush the cockpit and regain control of the plane. Revolt now and reclaim government. Yadda yadda ….”

Given recent history, it’s rather astonishing that various Cassandras still pretend to believe in the power of collective action, government action, or any other mechanism to right the sinking ship or indeed save ourselves from the accumulated weight and consequence of past activity. Hopium is a powerful drug indeed. Frankly, it’s not clear (to me at least) that government has even a modest ability to manage human affairs in the narrow sense or over the short term, much less shepherd the entire planet. (Government at the town and village levels may still have some mojo when not overdriven by state and Federal governments.) Collective action? Political, economic, and/or climate activists? The great unwashed masses? The revolutionary mob? Fuggeddaboudit. All one has to do is turn away from electronic distractions, look around, and gather a modest amount of information to observe directly what a complete, dystopian shit show the 21st century has become. No particular need to list the many atrocities at work right now — this very minute. That survey would likely destroy what little is left of my soul.

So are things really coming to their bitter end? Three prescient titles to that effect spring to mind.


goblin mode: originally referring to the Covid pandemic, an intuition that one is living in a fundamentally new normal, that there can be no return to previous social conditions because the world shifted irrevocably

remote theoretics: purely ideological positions, often for the sake of argument but used as guides for policy, that have little or no possibility of manifesting in reality

digital modesty: overdue recognition that maybe it’s not such a good idea after all to overshare online

pathway confusion: anxiety due to available options forward being so unclear that decisions cannot be made

crank on confirmation: entering into negotiation (in bad faith) and springing a last-minute demand for, say, a 20% discount or premium to finalize the deal

screenager: an obvious portmanteau of screen and teenager to describe adolescents who spend an inordinate amount of their waking hours absorbed in screens

dry promotion: a job upgrade (often a mere change of job title) without an increase in compensation

bougie broke: a consumerist lifestyle beyond one’s means (even with a handsome income) adopted to impress others, similar to keeping up with the Joneses

rage bait: news reports and online trolling designed to enrage the viewer/listener, similar to hate bait

FAFO: an acronym for Fuck Around and Find Out, a taunt that challenges one’s expectation of cause and effect

… the best means of career advancement is to attach yourselves to initiatives which are purely ideological, and so cannot fail in practice, because there is no practice. —Aurelien

I keep returning in my mind to Thomas Chatterton Williams’ term The Age of Theory as a one-size-fits-all explanation of how the postmodern era has gone positively batty. That’s clearly reductio ad absurdum, but let me make the argument anyway. What follows owes more to my own survey of the scene than anything by Williams, but he got the ball rolling.

In modern politico-speak (and elsewhere), it’s not unusual to hear phrases such as seeking justice, protecting freedom, saving democracy, and the like. Everyone knows what is meant, sorta, but if pressed to define them in any rigorous way, failures abound. Such is the character of any concept or body of ideas cited frequently but lacking in specificity: it becomes reified. Cynic that I am, my immediate reaction upon hearing a reified concept invoked is to withdraw consideration and support. That’s because it’s a loose, emotional appeal that typically can’t stand up to scrutiny. And although a commonplace in speech making, news reporting, and political argument, garbage phrases such as these mark the speaker or writer as someone not to be sought out or consulted for thoughtful analysis, principally because there are no meaningful starting lines or endpoints. Even if one uses the current state of affairs as the takeoff point, even for the purpose of preserving the status quo (the true meaning of conservation), there can never be a point of successful arrival because no conventional measure allows one to know that the goal has been achieved. The goal of, say, “saving democracy” by voting for a certain candidate can’t be demonstrated reliably even if/when the candidate wins. I reject the supposition that the candidate winning is itself the measure; that’s just racehorse politics. The real work of governance begins after taking office. The lack of concrete evidence of achieving a purported goal is also what’s meant by the epigraph above from Aurelien, a writer on Substack I read on occasion.

But wait, it gets worse (it always gets worse). If reification is to concretize otherwise amorphous concepts that lack specificity and measurability, the process also works paradoxically in reverse to render concrete practice more purely ideological. The term ideation is sometimes used pejoratively to describe measurable, achievable goals that have been turned into open-ended and/or disembodied concepts. For example, capitalism is a complex socioeconomic system that nonetheless has knowable, measurable attributes and outcomes. But as economists theorize about the practice of capitalism and its measurable attributes (e.g., the inflation rate and the Gini coefficient), typically under political pressure, those measures are regularly shifted, revised, and distorted by various bureaucratic offices until useful meaning is lost in a maze of obfuscation and outright lying. Even more egregiously, now that the U.S. dollar is no longer pegged to gold (“real money,” according to those hawking precious metals), fiat currency and modern monetary theory (MMT) have turned money (and its inverse: debt) into mere social consensus, quite literally transforming cold, hard cash into pixelated notions on ledgers somewhere. Cryptocurrencies and nonfungible tokens (NFTs) are further examples of entirely theoretical stores of wealth that in truth mask the destruction of wealth.

For me, this is the essence of The Age of Theory: the disintegration and dematerialization of real things in favor of their symbolic tokens. Other instances of this effect include individual flight from reality into social media, augmented reality, and virtual reality (lots of overlap in those). Rather than live one’s life in actuality (meat world to some), many prefer to watch others’ fantasy lives confabulated on social media, project their own fantasy avatars into digital spaces, and/or seek the unalloyed goal of Transhumanism by escaping the body entirely and uploading one’s consciousness into a computer. Thus far, only modest steps on that slippery slope are possible, but the offer not just to augment oneself but to dematerialize and embrace digital oblivion is out there and often hotly desired. As observed by psychologists and psychometricians, deranging and destabilizing effects are already evident across the culture. Chief among them is renewed interest in and even willingness to let the nuclear genie back out of the bottle with strategic strikes, as though escalation would not ensue quickly and put a resounding end to civilization in a series of blinding flashes. That’s the ultimate disintegration and dematerialization, made possible by theorizing undertaken by those dangerously estranged from reality.

Next in my series of ridiculous dance steps is The Connoisseur Cakewalk. Why a cakewalk? Because participants promenade (often in a puffed-up, preening display of pride and pomposity) around a circle divided into chalk squares until the music stops, someone ends on the prizewinning square, and a cake is awarded. Used to be a staple of county fairs and small-town municipal events. No mad scramble at the end as with Musical Chairs.

I heard someone remark that the new currency is fame and/or prestige. Fame tends to go hand-in-hand with the old currency (wealth), but for those not yet among the 1%, wealth can be simulated pretty well, at least for the camera. Is that what’s behind all the so-called influencers parading their best lives in front of everyone (via social media, natch), with little or no apparent concern for genteel good taste? Indeed, social media appears to be built for positioning oneself, often through fakery, as high as possible on a comparative or competitive hierarchy as a connoisseur of the good life (implicit assumption: the good life can be purchased rather than demonstrated through good character and service to others). The term bougie broke (bougie = bourgeois) describes those who, trying to keep up with wealthy lifestyles, live hand-to-mouth even though they may earn decent incomes.

The venerable once-per-year holiday missive is also filled with selective news from extended family, carefully curated to report only the most favorable highlights (e.g., achievements, exotic travel, adventurous hobbies). This is also the domain of alumni magazines. (I receive issues from my two alma maters reporting institutional successes and seeking donations for new projects. Negativity is quietly expunged.) As I interact with co-workers and other acquaintances, I get similar surface-level reports of how wonderful was their latest restaurant outing or vacation travel. Of course, we’re all guilty of selective memory that makes possible forgetting minor humiliations suffered at the hands of the maître d’ or airline, so only positive highlights are reported.

I suppose one’s best life is by necessity accompanied by some luxuries, honors, and accolades. But the lopsided reports and incessant selfies typical of this era of nearly cost-free communications irk me. Indeed, I’m a terrible curmudgeon in my relatively limited circle because I don’t admire wealthy and/or famous people for their fame and wealth and don’t for a moment care about the climbers and fakers. They aren’t prizewinners in The Connoisseur Cakewalk. My values are amplified by the following passage from Sentences from Seneca (or is it The Madness of Hercules? the title is a little unclear when seeking links) by Dana Gioia.

Reading Seneca, I immediately recognized the men of my Sicilian and Mexican families. I knew the word stoic but nothing about Stoicism as a philosophy. In Roman Stoicism, I saw the hard wisdom of the men who had raised me. They had suffered sorrow, poverty, and discrimination. Some had been beaten, extorted, or unjustly arrested. They took pride in their capacity for work and self-sufficiency. They remembered their troubles and deprivations, but they did not define themselves by their trauma. They were fatalistic but happy. They controlled their passions and bore suffering without complaint. No one had much money or property, but they felt what they had sufficed. They were suspicious of luxuries. They viewed their troubles with laughter rather than anger.

The stoicism of the ancient Romans eschews the boastful projection so commonplace in today’s culture. Remnants of stoicism have undoubtedly survived among some downtrodden demographics. Perhaps all of us could learn from those examples as the Age of Abundance reverses and rejoins past Ages of Scarcity, which account for most of human history.

I again heard X (formerly Twitter) called a complaint or outrage engine. No surprise there. Something similar could be said of this blog, which often takes a dispirited and/or disapproving tone when analyzing the dystopia we have created for ourselves. News and entertainment media that rely on up-to-the-moment coverage — fast culture as distinguished from slow culture found in, say, magazines and books — similarly skew toward scandal and salacious content. Crowd-sourced social media is yet more of the same, with all manner of awfulness driving engagement. One might think that all this negativity would be purposely avoided by those who don’t want their innocence punctured or equanimity disturbed, and for some, avoidance (turtling? ostriching? hermiting?) works pretty well. Others take seriously the idea of active engagement with news and ideas of the day, which means accepting the bad with the good and not shrinking from discomfiting knowledge. That said, here’s what I don’t get.

If one is truly engaged and inevitably learns of some sort of abject awfulness, current day or historical, it should not cause full psychological meltdown. (Coulda posed that as a question, but I’ll be more decisive and emphatic for a change.) For instance, discovery at some tender stage of childhood development that the Holocaust occurred (still just within living memory) or that, long before it, the Golden Horde rampaged across Asia, both responsible for millions of deaths, should be disquieting but not unduly difficult to absorb into one’s worldview. These are facts of history that remain significant through time no matter how much awareness one may have of them, and there is an inexhaustible supply of such facts depending on how many rocks are rolled over to reveal history’s dirty underbelly. Yet the urge to sanitize our past by denying, burying, reinterpreting, and falsifying history is quite strong, demonstrating clearly that many can simply no longer handle the truth (if ever they could).

Many historical facts are practices rather than events. The whopper for the U.S. — chattel slavery — has so unhinged some that it has morphed into original sin, which cannot be erased, overcome, or forgiven. That scandalous practice in U.S. history (eliding its much longer history around the globe) is in fact so traumatizing that it has recently (after a relatively brief interregnum) become a renewed launching point for social justice movements. Think of it as a second-wave Civil Rights Movement, similar to second- and third-wave feminism, that has somehow lost the plot. Woke ideology is arguably a further expansion of attempts to expunge discrimination on any basis, often under the banner of inclusion. Regrettably, Woke has grown into an incoherent stew of best intentions with unplanned collateral effects. Smuggled in during such out-of-control cultural moments are foolhardy plans to design and implement utopian societies, missing the lesson that when fervent ideologies have been tried in the past, they have led repeatedly (perhaps not inevitably, but close) to tyranny and atrocity. Disclaimer: I lack the confidence to advocate for or against any ideology. Still, fools rush in where angels fear to tread.


Although I have previously prophesied the death of the skyscraper as a prestige project, it seems new life has been breathed into this moribund competition for vertical supremacy. China and other Asian countries continue erecting skyscrapers despite prohibitive costs, often billions of dollars per building, and concerns about the possibility of operating supertall structures effectively or profitably in a looming era of diminishing energy. One might wonder about that last remark. Peak oil was a common concern in 2006, but the temporary shale oil boom has effectively kicked that can down the road (like so many others) for roughly 20 years. But that’s not the primary focus of this blog post.

For a while still, the Burj Khalifa in Dubai remains the tallest completed building. However, construction of the completely unnecessary Jeddah Tower in Saudi Arabia is rumored to have resumed in 2023 after a five-year hiatus; upon completion, it is expected to be the world’s tallest (until some other challenger appears). The final height of the Jeddah Tower may well exceed that of the Burj Khalifa, but only completion of the former will establish which wins the sweepstakes.

In other weird news, a new North American skyscraper rivaling One World Trade Center in height (final design and height also not yet established) was pitched in a rather unlikely location: Oklahoma City. The need or demand for such a building is highly questionable, to say the least, though it’s expected to be a tourism draw. Some commentators find its position in the middle of so-called Tornado Alley to be a recipe for disaster. Seriously, what could possibly go wrong?

Continuing my series of dances that are, even at a glance, actually tragic follies, let me suggest The Unipolar Strut (following The Rebranding Shuffle, The Builder’s Waltz, and The Twist). As the last remaining superpower and world hegemon (but not sole nuclear power), at least temporarily, the U.S. has taken upon itself unwarranted responsibility for the entire world at considerable cost. Why else maintain a giant military with bases encircling the globe, so wildly out of proportion to any other country’s? In such a unipolar world, as yet unchallenged (preparations underway), one might think that the U.S. would maneuver and arrange everything within the realm of possibility to its benefit and reap handsome rewards for itself and its allies. However, that would require competent leadership even if the means employed were illicit and maniacal. Instead, illicit and maniacal still describe U.S. treatment of foreigners, but U.S. leadership is fundamentally incompetent to achieve its objectives (unless its true aim is simply to wreak havoc across the global stage, the same basic motivation as vandals and bullies). Accordingly, alliances are forming to resist and/or bypass the bully.

This assessment was reinforced by an article in the June 2023 issue of Harper’s Magazine by Benjamin Schwarz and Christopher Layne posing the obvious question “Why Are We in Ukraine?” It’s a thoughtful history of great power politics that used to sorta function in the nuclear era (once the power to destroy the world was created) but has been fundamentally betrayed by the U.S. for roughly thirty years (if not longer). It’s also precisely the sort of information needed to evaluate the position of the U.S. in the 21st century. Regrettably, lessons the article provides are easy to ignore if one functions primarily within the electronically mediated oral culture I discussed in this post. The article also called to mind a joke meme that appeared in 2006, much embellished since then, that depicts a military officer beginning to wonder if he isn’t in fact one of the baddies. It’s funny because he’s wearing one of those snazzy Yahtzee uniforms, which permits even the dullest among us to render snap judgment in the affirmative: you’re a baddie.

Were anyone to pose that same question for the U.S. (or at least U.S. leadership and its many water-carriers in the professional managerial class), the inescapable response ought to be “Um, yeah, well, sure. We ARE the baddies this time around.” Just consider the many 21st-century wars (only one quarter of the way in) for which the U.S. is directly responsible. Uniforms and demonic emblems unnecessary.

Austrian symphonist Anton Bruckner was born in 1824 (died 1896), making 2024 his bicentennial. Naturally, musicians have taken note of this anniversary with recordings newly issued and more expected throughout the year. For a major composer of monumental works, Bruckner is a strange case. Many of my musician friends report that they just don’t get his musical language or find him coarse and so ignore him. This is the case with conductors, too, some of whom never conduct Bruckner (or at least never record him) for reasons stated and unstated. Yet I wouldn’t characterize Bruckner as a niche composer. Rather, he was a significant contributor to symphonic development extending from Haydn and Mozart to Mahler and Shostakovich.

In my review of the Wing Cycle (which pulls in views but no comments), I mentioned that a few orchestras distinguish themselves as quintessentially Brucknerian. Indeed, every orchestra/conductor combo must come to terms with how to shape the sound — the strings no less than the brass. Seriously, it’s an issue, most noticeably when the brass fail to blaze or the strings are too compact. I’ve performed Bruckner symphonies, and the instinct of the conductor (in my experience) has been to damp down the brass after hearing the initial fortissimo. Perhaps it’s the quality of the orchestras in which I was performing, but I rather suspect it was the shock of that sound. Of course, I’ve also heard recordings where the brass play like absolute pigs (offenders unnamed), so a judicious balance must be struck.


A second item came to me in a podcast where Jordan Peterson interviewed Ben Shapiro. Considering Peterson is among the most significant public intellectuals of the last decade, I sorta follow him. I pay Ben Shapiro no particular heed but am not opposed to hearing from him periodically. In the course of the interview, Shapiro stumbled into the answer to a question that has frustrated me for some time. I’ve argued that the typographic mind (or chirographic mind, to use Walter Ong’s term) is on the wane and that society is regressing to oral culture. However, the character of that orality is not the same as that of preliterate or nonliterate cultures, precisely because everyone is now enveloped in omnimedia.

Shapiro’s observation was that when elders are valued as repositories of experience and wisdom, that resource is relatively simple to access advantageously, but doing so requires trust and authority, which are built through facework (physical presence) and come naturally to those who have already traversed life’s phases. When elders are discarded, discounted, or forgotten, there are few authorities to whom one can turn reliably because trust cannot be established at a distance, especially when mediated through electronics. So although newscasters, politicians, celebrities, academics, and others may be issuing scripted statements from positions of considerable power (i.e., bully pulpits), it’s normal to be suspicious that an ulterior motive is present (usually selling something) or that one is the mark in a straightforward confidence game. Clergy might be a good counterexample if secularism and ecclesiastical scandals had not invalidated them.

Here are a few examples. Scam phone calls fail abysmally at establishing trust: some anonymized nobody (often with a foreign accent but a Christian name such as Dave or Susan) on the other end of the phone connects with lures and promises only to ask for personal information that is frankly none of their business. Who buys, say, car or burial insurance that way? Similarly, the Federal government has for decades run information/narrative campaigns (or more insidiously, psyops, propaganda organs, and censorship networks) to convince Americans to accept such things as austerity, foreign wars, and forfeiture of privacy in exchange for bogus safety/security. Campaigns are conducted primarily through oral engagement (talking heads on a screen) because so few bother to read anymore. (Twitter hardly counts as reading.) What may seem initially plausible in the flow of speech, with talkers adding to the flow without pausing to allow time for consideration (a necessity, frankly), is not how a chirographic mind best processes information. As a result, the latest canard or malign justification tends to be what sticks in an electronically mediated oral culture that lacks real authority or even the pretense of trust.

If this subject is inscrutable, well, that’s because it’s deep culture. The observation that a fish doesn’t notice the water in which it swims is an analogy for being so completely embedded in an environment that its basic attributes are simply assumed. Omnimedia is the information environment now; it has gone from its primarily textual form in literate culture to an oral culture fed by video (broadcast and cable TV, streaming, YouTube, cinema, etc.) delivered from a distance to one’s preferred screen — a fundamental distinction from oral cultures of the past.