Archive for the ‘Idle Nonsense’ Category

For this blog post, let me offer short and long versions of the same assertion and argument, the short version being one of Caitlin Johnstone’s many aphorisms:

Short version: Modern mainstream feminism is just one big celebration of the idea that women can murder, predate, oppress, and exploit for power and profit just as well as any man.

Long version: Depicting strength in terms of quintessential masculine characteristics is ruining (fictional) storytelling. (Offenders in contemporary cinema and streaming will go unnamed, but examples abound now that the strong-female-lead meme has overwhelmed characters, plots, and stories. Gawd, I tire of it.) One could survey the past few decades to identify ostensibly strong women basically behaving like asshole men just to — what? — show that it can be done? Is this somehow better than misogynist depictions of characters using feminine wiles (euphemism alert) to get what they want? These options coexist today, plus some mixture of the two. However, the main reason the strong female lead fails as storytelling — punching, fighting, and shooting toe-to-toe with men — is that it bears little resemblance to reality.

In sports (combat sports especially), men and women are simply not equally equipped, for reasons physiological, not ideological. Running, jumping, throwing, swinging, and punching in any sport where speed and power are principal attributes favor male physiology. Exceptions under extraordinary conditions (e.g., ultradistance running) only demonstrate the rule. Sure, a well-trained and -conditioned female in her prime can defeat an untrained and poorly conditioned male. If one of those MMA females came after me, I’d be toast because I’m entirely untrained and I’m well beyond the age of a cage fighter. But that’s not what’s usually depicted onscreen. Instead, it’s one badass going up against another badass, trading blows until a victor emerges. If the female is understood as the righteous one, she is typically shown victorious despite the egregious mismatch.

Nonadherence to reality can be entertaining, I suppose, which might explain why the past two decades have delivered so many overpowered superheroes and strong female leads, both of which are quickly becoming jokes and producing backlash. Do others share my concern that, as fiction bleeds into reality, credulous women might be influenced by what they see onscreen to engage recklessly in fights with men (or, for that matter, other women)? Undoubtedly, a gallant or chivalrous man would take a few shots before fighting back, but if he isn’t felled quickly, my expectation is that the fight is far more likely to go very badly for the female. Moreover, what sort of message does it communicate to have females engaging in violence and inflicting their will on others, whether in the service of justice or otherwise? That’s the patriarchy in a nutshell. Rebranding matriarchal social norms in terms of negative male characteristics, even for entertainment purposes, serves no one particularly well. I wonder whether hindsight will prompt the question “what on Earth were we thinking?” Considering how human culture is stuck in permanent adolescence, I rather doubt it.

From an article in City Journal by Andrey Mir (by way of Alan Jacobs’s blog Snakes and Ladders) called “The Medium Is the Menace”:

Digital natives are fit for their new environment but not for the old one. Coaches complain that teenagers are unable to hold a hockey stick or do pull-ups. Digital natives’ peripheral vision — required for safety in physical space — is deteriorating. With these deficits come advantages in the digital realm. The eye is adjusting to tunnel vision — a digital native can see on-screen details that a digital immigrant can’t see. When playing video games, digital immigrants still instinctively dodge bullets or blows, but digital natives do not. Their bodies don’t perceive an imaginary digital threat as a real one, which is only logical. Their sensorium has readjusted to ignore fake digital threats that simulate physical ones. No need for an instinctive fear of heights or trauma: in the digital world, even death can be overcome by re-spawning. Yet what will happen when millions of young people with poor grip strength, peripheral blindness, and no instinctive fear of collision start, say, driving cars? Will media evolution be there in time to replace drivers with autopilots in self-driving vehicles?

Got one of those chain e-mail messages from who knows who or where, ending with the exhortation to pass it on. My comments follow each of the titular things. Read at your peril. (I could nit-pick the awfulness of the writing of the quoted paragraphs, but I’ll just let that go.) Before commenting, however, let me point out that the anonymous writer behind this listicle assumes that systems will function long enough for predictions to prove out. The last two years have already demonstrated that the world is entering a period of extreme flux where many styles and functions of social organization will break down irreparably. Supply chain difficulties with computer chips (and relatedly, fossil fuels) are just one example of nonlinear change that is making owning and operating a personal vehicle far less affordable (soon impossible for many) than decades past. Impossible to predict when breakdown reaches critical mass, but when it does, all bets are off.

1. The Post Office. Get ready to imagine a world without the post office. They are so deeply in financial trouble that there is probably no way to sustain it long term. Email, Fed Ex, and UPS have just about wiped out the minimum revenue needed to keep the post office alive. Most of your mail every day is junk mail and bills. 

Despite its popularity among the general public, the U.S. Postal Service (USPS link ends in .com, not .gov) has been under attack for generations already with the ostensible goal of privatizing it. Financial trouble is by design: the USPS is being driven to extinction so that its services can be handed off to for-profit alternatives, jacking up prices in the process. So yeah, it might fail and go away like other cherished American institutions.


Following up the two previous entries in this series, the Feb. 2022 issue of Scientific American has a cover article by Adam Becker called “The Origins of Space and Time” with the additional teaser “Does spacetime emerge from a more fundamental reality?” (Oddly, the online title is different, making it awkward to find.) I don’t normally read Scientific American, which has become a bit like Time and Newsweek in its blurb-laden, graphics-heavy presentation intended patronizingly for general-interest readers. In fact, I’ll quote the pullout (w/o graphics) that summarizes the article:

How Spacetime Emerges. Space and time are traditionally thought of as the backdrop to the universe. But new research suggests they might not be fundamental; instead, spacetime could be an emergent property of a more basic reality, the true backdrop to the cosmos. This idea comes from two theories that attempt to bridge the divide between general relativity and quantum mechanics. The first, string theory, recasts subatomic particles as tiny loops of vibrating string. The second, loop quantum gravity, envisions spacetime being broken down into chunks — discrete bits that combine to create a seemingly smooth continuum.

Being a layperson in such matters, I’ll admit openly that I don’t fully grasp the information presented. Indeed, I typically greet every breathless announcement from CERN (or elsewhere) about a new subatomic particle discovery, or some research group’s new conjectures into quantum this-or-that, passively at best. Were I a physicist or cosmologist, my interest would no doubt be more acute, but these topics are so far removed from everyday life that they essentially become arcane inquiries into the number of angels dancing on the head of a pin. I don’t feel strongly enough to muster denunciation, but discussion of another aspect of pocket reality is worth some effort.

My understanding is that the “more basic reality, the true backdrop” discussed in the article is multidimensionality, something Eric Weinstein has also been grappling with under the name Geometric Unity. (Bizarrely, Alex Jones has also raved about interdimensional beings.) If the universe indeed has several more undetectable dimensions (as memory serves, Weinstein says as many as 14) and human reality is limited to only a few, potentially breaking through to other dimensions and/or escaping the boundaries of a mere four is tantalizing yet terrifying. Science fiction often explores these topics, usually in the context of space travel and human colonization of the galaxy. As thought experiments, fictional stories can be authentically entertaining and enjoyable. Within nonfiction reality, desire to escape off-world or into extra- or interdimensionality is an expression of desperation considering just how badly humans have fucked up the biosphere and guaranteed an early extinction for most species (including ours). I also chafe at the notion that this world, this reality, is not enough and that pressing forward like some unstoppable chemical reaction or biological infiltration is the obvious next step.

This intended follow-up has been stalled (pt. 1 here) for one simple reason: the premise presented in the embedded YouTube video is (for me at least) easy to dismiss out of hand, and I haven’t wanted to revisit it. Nevertheless, here’s the blurb at the top of the comments on the webpage:

Is reality created in our minds, or are the things you can touch and feel all that’s real? Philosopher Bernardo Kastrup holds doctorates in both philosophy and computer science, and has made a name for himself by arguing for metaphysical idealism, the idea that reality is essentially a mental phenomenon.

Without going into point-by-point discussion, the top-level assertion, if I understand it correctly (not assured), is that material reality comes out of mental experience rather than the reverse. It’s a chicken-and-egg question with materialism and idealism (fancy technical terms not needed) each vying for primacy. The string of conjectures (mental gymnastics, really, briefly impressive until one recognizes how quickly they lose correlation with how most of us think about and experience reality) that inverts the basic relationship of inner experience to outer reality is an example of waaaay overthinking a problem. No doubt quite a lot of erudition can be brought to bear on such questions, but even if those questions were resolved satisfactorily on an intellectual level and an internally coherent structure or system were developed or revealed, it wouldn’t matter or lead anywhere. Humans are unavoidably embodied beings. Each individual existence also occupies a tiny sliver of time (the timeline extending in both directions to infinity). Suggesting that mental experience is briefly instantiated in personhood but is actually drawn out of some well of souls, collective consciousness, or panpsychism, and rejoins that source in heaven, hell, or elsewhere upon expiration, is essentially a religious claim. It’s also an attractive supposition, granting each of us not permanence or immortality but rather something somehow better (?) though inscrutable because it lies beyond perception (but not conceptualization). Except for an eternity of torments in hell, I guess, if one deserves that awful fate.

One comment about Kastrup. He presents his perspective (his findings?) with haughty derision of others who can’t see or understand what is so (duh!) obvious. He falls victim to the very same over-intellectualized flim-flam he mentions when dismissing materialists who need miracles and shortcuts to smooth over holes in their scientific/philosophical systems. The very existence of earnest disagreement among those who occupy themselves with such questions might suggest some humility, as in “here’s my explanation, they have theirs, judge for yourself.” But there’s a third option: the great unwashed masses (including nearly all our ancestors) for whom such questions are never even fleeting thoughts. It’s all frankly immaterial (funnily, both the right and wrong word at once). Life is lived and experienced fundamentally on its surface — unless, for instance, one has been incubated too long within the hallowed halls of academia, lost touch with one’s brethren, and become preoccupied with cosmic investigations. Something quite similar happens to politicians and the wealthy, who typically hyperfocus on gathering power to themselves and then exercising that power over others (typically misunderstood as simply pulling the levers and operating the mechanisms of society). No wonder their pocket of reality looks so strikingly different.

Heard a remark (can’t remember where) that most people these days would attack as openly ageist. Basically, if you’re young (let’s say below 25 years of age), then it’s your time to shut up, listen, and learn. Some might even say that true wisdom doesn’t typically emerge until much later in life, if indeed it appears at all. Exceptions only prove the rule. On the flip side, the energy, creativity, and indignation (e.g., “it’s not fair!”) needed to drive social movements are typically the domain of those who have less to lose and everything to gain, meaning those just starting out in adult life. A full age range is needed, I suppose, since society isn’t generally age stratified except at the extremes (childhood and advanced age). (Turns out that deciding what to call old people, and what counts as old, is rather clumsy, though probably not especially controversial.)

With this in mind, I can’t help but wonder what’s going on with recent waves of social unrest and irrational ideology. Competing factions agitate vociferously in favor of one social/political ideology or another as though most of the ideas presented have no history. (Resemblances to Marxism, Bolshevism, and white supremacy are quite common. Liberal democracy, not so much.) Although factions aren’t by any means populated solely by young people, I observe that roughly a decade ago, higher education in particular transformed itself into an incubator for radicals and revolutionaries. Whether dissatisfaction began with the faculty and infected the students is impossible for me to assess. I’m not inside that intellectual bubble. However, urgent calls for radical reform have since moved well beyond the academy. A political program or ideology has yet to be put forward that I can support fully. (My doomer assessment of what the future holds forestalls knowing with any confidence what sort of program or ideology to pour my waning emotional and intellectual energy into.) It’s still fairly simple to criticize and denounce, of course. Lots of things egregiously wrong in the world.

My frustration with what passes for political debate (if Twitter is any indication) is the marked tendency to resort immediately to comparisons with Yahtzees in general or Phitler in particular. It’s unhinged and unproductive. Yahtzees are cited as an emotional trigger, much like baseless accusations of racism send everyone scrambling for cover lest they be cancelled. Typically, the Yahtzee/Phitler comparison or accusation is itself enough to put someone on their heels, but wised-up folks (those lucky few) recognize the cheap rhetorical trick. The Yahtzee Protocol isn’t quite the same as Godwin’s Law, which states that the longer a discussion goes on (on Usenet in the earliest examples), the more likely it becomes that someone brings up Yahtzees and Phitler, ruining useful participation. The protocol has been deployed effectively in the Russia-Ukraine conflict, though I’m at a loss to determine in which direction. The mere existence of the now-infamous Azov Battalion, purportedly composed of Yahtzees, means that automatically, reflexively, the fight is on. Who can say what the background rate of Yahtzee sympathizers (whatever that means) might be in any fighting force or indeed the general population? Not me. Similarly, what threshold qualifies a tyrant to stand beside Phitler on a list of worst evers? Those accusations are flung around like cooked spaghetti thrown against the wall just to see what sticks. Even if the accusation does stick, what possible good does it do? Ah, I know: it makes the accuser look like a virtuous fool.

I had that dream again. You know the one: I have to go take a final test in a class I forgot about, never attended, or dropped from my schedule. Most higher-ed students have this dream repeatedly, as do former students (or, for those who take the educational enterprise seriously as a life-long endeavor, perpetual students). The dream usually features open-ended anxiety because it’s all anticipation — one never steps into the classroom to sit for the test. But this time, the twist was that the final test transformed into a group problem-solving seminar. The subject matter was an arcane post-calculus specialty (maybe I’ve seen too many Big Bang Theory whiteboards strewn with undecipherable equations), and the student group was stumped trying to solve some sort of engineering problem. In heroic dream mode, I recontextualized the problem despite my lack of expertise, which propelled the group past its block. Not a true test of knowledge or understanding, since I hadn’t attended class and didn’t learn its subject matter, but a reminder that problem-solving is often not the straightforward application of factors easily set forth and manipulated.

Outside of the dream, in my morning twilight (oxymoron alert), I mused on the limitations of tackling social issues as though they were engineering problems, an approach that typically regards materials, processes, and personnel as mere resources to be marshaled and acted upon to achieve a goal but with little consideration — at least in the moment — of downstream effects or indeed human values. The Manhattan Project is a prime example, which (arguably) helped the Allied powers win WWII but launched the world into the Atomic Age, complete with its own Cold War and the awful specter of mutually assured destruction (MAD). Borrowing a term from economics, it’s easy to rationalize negative collateral effects in terms of creative destruction. I object: the modifier creative masks that the noun is still destruction (cracked eggs needed to make omelets, ya know). Otherwise, maybe the term would be destructive creation. Perhaps I misunderstand, but the breakthrough with the Manhattan Project came about through synthesis of knowledge that lay beyond the purview of most narrowly trained engineers.

That is precisely the problem with many social ills today, at least those that actually have solutions. The political class, meant to manage and administer, views problems primarily through a political lens (read: campaigning) and is not especially motivated to solve anything. Similarly, charitable organizations aimed at eradicating certain problems (e.g., hunger, homelessness, crime, educational disadvantage) can’t actually solve those problems because doing so would end their fundraising and/or government funding, meaning that the organizations themselves would cease to exist. Synthesizing the knowledge needed to solve a problem and then terminating the project is anathema to how society now functions; better that problems persist.

Past blog posts on this topic include “Techies and Fuzzies” and “The Man Who Knew Too Little,” each of which has a somewhat different emphasis. I’m still absorbed by the conflict between generalists and specialists while recognizing that both are necessary for full effectiveness. That union is the overarching message, too, of Iain McGilchrist’s The Master and His Emissary (2010), the subject of many past blog posts.

In the spirit of today’s holiday, let me wish everyone on Earth a bit of peace, albeit temporary. Take a brief respite from the ongoing storm of bad news and propaganda. I know that I throw out that word, propaganda, with too much frequency these days, but it’s nearly always the appropriate term. Leading up to the holiday, we have been told not to enjoy ourselves, not to celebrate (what to celebrate in this sacred-secular amalgam is highly selective and individual), and to fear and avoid family gatherings because, well, pestilence will invade every hearth and kill all inhabitants. It’s quite ridiculous on even a moment’s consideration. Yet that’s the message of the leading emissaries of official wisdom, many of whom seem to believe they are commandants exercising unlimited power over cities, states, and nations as though running prison camps. They’re spreading Christmas fear rather than Christmas cheer. Near as I can tell, most people are saying, sensibly, “bah, humbug” to cowering alone at home.

Also offering here one of Caitlin Johnstone’s many aphorisms, which I often find darkly funny:

Coming up next on CBS News, the uplifting story of a little girl who is constantly being kicked in the head by government officials and the small town that raised money to buy her a helmet.

/rant on

Remember when the War on Christmas meme materialized a few years ago out of thin air, even though no one in particular was on the attack? Might have to rethink that one. Seems that every holiday now carries an obligation to revisit the past, recontextualize origin stories, and confess guilt over tawdry details of how the holiday came to be celebrated. Nearly everyone who wants to know knows by now that Christmas is a gross bastardization of pagan celebrations of the winter solstice, co-opted by various organized churches (not limited to Christians!) before succumbing to the awesome marketing onslaught (thanks, Coca-Cola) that makes Xmas the “most wonderful time of the year” (as the tune goes) and returning the holiday to its secular roots. Thanksgiving is now similarly ruined, no longer able to be celebrated and enjoyed innocently (like a Disney princess story reinterpreted as a white or male savior story — or even worse, a white male) but instead used as an excuse to admit America’s colonial and genocidal past and its extermination and mistreatment of native populations as white Europeans encroached ever more persistently on lands the natives held more or less as a commons. Gone are the days when one could gather among family and friends, enjoy a nice meal and good company, and give humble, sincere thanks for whatever bounty fortuna had bestowed. Now it’s history lectures and acrimony and families rent asunder along generational lines, typically initiated by those newly minted graduates of higher education and their newfangled ideas about equity, justice, and victimhood. Kids these days … get off my lawn!

One need not look far afield to find alternative histories that position received wisdom about the past in the cross-hairs just to enact purification rituals that make it/them, what, clean? accurate? whole? I dunno what the real motivation is except perhaps to force whites to self-flagellate over sins of our ancestors. Damn us to hell for having cruelly visited iniquity upon everyone in the process of installing white, patriarchal Europeanness as the dominant Western culture. I admit all of it, though I’m responsible for none of it. Moreover, history stops for no man, no culture, no status quo. White, patriarchal Europeanness is in serious demographic retreat and has arguably already lost its grip on cultural dominance. The future is female (among other things), amirite? Indeed, whether intended or not, that was the whole idea behind the American experiment: the melting pot. Purity was never the point. Mass migration out of politically, economically, and ecologically ravaged regions means that the experiment is no longer uniquely American.

Interdisciplinary approaches to history, if academic rigidity can be overcome, regularly develop new understandings to correct the historical record. Accordingly, the past is remarkably dynamic. (I’ve been especially intrigued by Graham Hancock’s work on ancient civilizations, mostly misunderstood and largely forgotten except for megalithic projects left behind.) But the past is truly awful, with disaster piled upon catastrophe followed by calamity and cataclysm. Still waiting for the apocalypse. Peering too intently into the past is like staring at the sun: it scorches the retinas. Moreover, the entire history of history is replete with stories being told and retold, forgotten and recovered, transformed in each iteration from folklore into fable into myth into legend before finally passing entirely out of human memory systems. How many versions of Christmas are there across cultures and time? Or Thanksgiving, or Halloween, or any Hallmark® holiday that has crossed oceans and settled into foreign lands? What counts as the true spirit of any of them when their histories are so mutable?

/rant off

What if everyone showed up to an event with an expectation of using all their tech and gadgets to facilitate the group objective only to discover that nothing worked? You go to a fireworks display but the fireworks won’t ignite. You go to a concert but the instruments and voices make no sound. You go to a sporting event but none of the athletes can move. Does everyone just go home? Come back another day to try again? Or better yet, you climb into your car to go somewhere but it won’t start. Does everyone just stay home and the event never happens?

Those questions relate to a new “soft” weapons system called Scorpius (no link). The device or devices are said to disrupt and disable enemy tech by issuing a narrowly focused electromagnetic beam. (Gawd, just call it a raygun or phaser. No embarrassment over on-the-nose naming of other things.) Does the beam fry the electronics of its target, like a targeted Carrington event, or simply scramble the signals, making the tech inoperable? Can tech be hardened against attack, such as being encased in a Faraday cage? Wouldn’t such isolation itself make tech nonfunctional, since electronic communication between locations is the essence of modern devices, especially for targeting and telemetry? These are a few more idle questions (unanswered, since the announcements of new weaponry I consulted read like advertising copy) about this latest advance (euphemism alert) in the arms race. Calling a device that can knock a plane (um, warplane) out of the sky (crashing somewhere, obviously) “soft protection” because the mechanism is a beam rather than a missile obfuscates the point. Sure, ground-based technologies might be disabled without damage, but would that require continuous beam-based defense?

I recall an old Star Trek episode, the one with the Gorn, where omnipotent aliens disabled all weapons systems of two spaceships postured for battle by superheating controls, making them too hot to handle. Guess no one thought of oven mitts or pencils to push the “Fire!” buttons. (Audiences were meant to think, considering Star Trek was a thinking person’s entertainment, but not too much.) Instead of mass carnage, the two captains were transported to the surface of a nearby planet to battle by proxy (human vs. reptile). In quintessential Star Trek fashion — imagining a hopeful future despite militaristic trappings — the human captain won not only the physical battle but the moral battle (with self) by refusing to dispatch the reptile captain after he/it was disabled. The episode posed interesting questions so long as no one searched in the weeds for plausibility.

We’re confronted now, and again, with many of these same questions, some technical, some strategic, but more importantly, others moral and ethical. Thousands of years of (human) history have already demonstrated the folly of war (but don’t forget profitability). It’s a perennial problem, and from my vantage point, combatants on all sides are no closer to Trekkie moral victory now than in the 1960s. For instance, the U.S. and its allies are responsible for millions of deaths in Iraq, Afghanistan, Syria, and elsewhere just in the last two decades. Go back further in time and imperial designs look more and more like sustained extermination campaigns. But hey, we came to play, and any strategic advantage must be developed and exploited, moral quandaries notwithstanding.

It’s worth pointing out that in the Gorn episode, the captains were deprived of their weapons and resorted to brute force before the human improvised a projectile weapon out of materials handily strewn about, suggesting perhaps that intelligence is the most deadly weapon. Turns out to have been just another arms race.