Archive for the ‘Idle Nonsense’ Category

Following up the two previous entries in this series, the Feb. 2022 issue of Scientific American has a cover article by Adam Becker called “The Origins of Space and Time” with the additional teaser “Does spacetime emerge from a more fundamental reality?” (Oddly, the online title is different, making it awkward to find.) I don’t normally read Scientific American, which has become a bit like Time and Newsweek in its blurb-laden, graphics-heavy presentation intended patronizingly for general-interest readers. In fact, I’ll quote the pullout (w/o graphics) that summarizes the article:

How Spacetime Emerges. Space and time are traditionally thought of as the backdrop to the universe. But new research suggests they might not be fundamental; instead spacetime could be an emergent property of a more basic reality, the true backdrop to the cosmos. This idea comes from two theories that attempt to bridge the divide between general relativity and quantum mechanics. The first, string theory, recasts subatomic particles as tiny loops of vibrating string. The second, loop quantum gravity, envisions spacetime being broken down into chunks — discrete bits that combine to create a seemingly smooth continuum.

Being a layperson in such matters, I’ll admit openly that I don’t fully grasp the information presented. Indeed, every breathless announcement from CERN (or elsewhere) about a new subatomic particle discovery or some research group’s new conjectures into quantum this-or-that I typically greet passively at best. Were I a physicist or cosmologist, my interest would no doubt be more acute, but these topics are so far removed from everyday life they essentially become arcane inquiries into the number of angels dancing on the head of a pin. I don’t feel strongly enough to muster denunciation, but discussion of another aspect of pocket reality is worth some effort.

My understanding is that the “more basic reality, the true backdrop” discussed in the article is multidimensionality, something Eric Weinstein has also been grappling with under the name Geometric Unity. (Bizarrely, Alex Jones has also raved about interdimensional beings.) If the universe indeed has several more undetectable dimensions (if memory serves, Weinstein says as many as 14) and human reality is limited to only a few, potentially breaking through to other dimensions and/or escaping the boundaries of a mere four is tantalizing yet terrifying. Science fiction often explores these topics, usually in the context of space travel and human colonization of the galaxy. As thought experiments, fictional stories can be authentically entertaining and enjoyable. Within nonfiction reality, desire to escape off-world or into extra- or interdimensionality is an expression of desperation considering just how badly humans have fucked up the biosphere and guaranteed an early extinction for most species (including ours). I also chafe at the notion that this world, this reality, is not enough and that pressing forward like some unstoppable chemical reaction or biological infiltration is the obvious next step.

This intended follow-up has been stalled (pt. 1 here) for one simple reason: the premise presented in the embedded YouTube video is (for me at least) easy to dismiss out of hand and I haven’t wanted to revisit it. Nevertheless, here’s the blurb at the top of the comments at the webpage:

Is reality created in our minds, or are the things you can touch and feel all that’s real? Philosopher Bernardo Kastrup holds doctorates in both philosophy and computer science, and has made a name for himself by arguing for metaphysical idealism, the idea that reality is essentially a mental phenomenon.

Without going into point-by-point discussion, the top-level assertion, if I understand it correctly (not assured), is that material reality comes out of mental experience rather than the reverse. It’s a chicken-and-egg question with materialism and idealism (fancy technical terms not needed) each vying for primacy. The string of conjectures (mental gymnastics, really, briefly impressive until one recognizes how quickly they lose correlation with how most of us think about and experience reality) that inverts the basic relationship of inner experience to outer reality is an example of waaaay overthinking a problem. No doubt quite a lot of erudition can be brought to bear on such questions, but even if those questions were resolved satisfactorily on an intellectual level and an internally coherent structure or system were developed or revealed, it doesn’t matter or lead anywhere. Humans are unavoidably embodied beings. Each individual existence also occupies a tiny sliver of time (the timeline extending in both directions to infinity). Suggesting that mental experience is briefly instantiated in personhood but is actually drawn out of some well of souls, collective consciousness, or panpsychism and rejoins them in heaven, hell, or elsewhere upon expiration is essentially a religious claim. It’s also an attractive supposition, granting each of us not permanence or immortality but rather something somehow better (?) though inscrutable because it lies beyond perception (but not conceptualization). Except for an eternity of torments in hell, I guess, if one deserves that awful fate.

One comment about Kastrup. He presents his perspective (his findings?) with haughty derision of others who can’t see or understand what is so (duh!) obvious. He falls victim to the very same over-intellectualized flim-flam he mentions when dismissing materialists who need miracles and shortcuts to smooth over holes in their scientific/philosophical systems. The very existence of earnest disagreement by those who occupy themselves with such questions might suggest some humility, as in “here’s my explanation, they have theirs, judge for yourself.” But there’s a third option: the great unwashed masses (including nearly all our ancestors) for whom such questions are never even fleeting thoughts. It’s all frankly immaterial (funnily, both the right and wrong word at once). Life is lived and experienced fundamentally on its surface — unless, for instance, one has been incubated too long within the hallowed halls of academia, lost touch with one’s brethren, and become preoccupied with cosmic investigations. Something quite similar happens to politicians and the wealthy, who typically hyperfocus on gathering to themselves power and then exercising that power over others (typically misunderstood as simply pulling the levers and operating the mechanisms of society). No wonder their pocket of reality looks so strikingly different.

Heard a remark (can’t remember where) that most people these days would attack as openly ageist. Basically, if you’re young (let’s say below 25 years of age), then it’s your time to shut up, listen, and learn. Some might even say that true wisdom doesn’t typically emerge until much later in life, if indeed it appears at all. Exceptions only prove the rule. On the flip side, the energy, creativity, and indignation (e.g., “it’s not fair!”) needed to drive social movements are typically the domain of those who have less to lose and everything to gain, meaning those just starting out in adult life. A full age range is needed, I suppose, since society isn’t generally age stratified except at the extremes (childhood and advanced age). (Turns out that what to call old people and what counts as old is rather clumsy, though probably not especially controversial.)

With this in mind, I can’t help but to wonder what’s going on with recent waves of social unrest and irrational ideology. Competing factions agitate vociferously in favor of one social/political ideology or another as though most of the ideas presented have no history. (Resemblances to Marxism, Bolshevism, and white supremacy are quite common. Liberal democracy, not so much.) Although factions aren’t by any means populated solely by young people, I observe that roughly a decade ago, higher education in particular transformed itself into an incubator for radicals and revolutionaries. Whether dissatisfaction began with the faculty and infected the students is impossible for me to assess. I’m not inside that intellectual bubble. However, urgent calls for radical reform have since moved well beyond the academy. A political program or ideology has yet to be put forward that I can support fully. (My doomer assessment of what the future holds forestalls knowing with any confidence what sort of program or ideology into which to pour my waning emotional and intellectual energy.) It’s still fairly simple to criticize and denounce, of course. Lots of things egregiously wrong in the world.

My frustration with what passes for political debate (if Twitter is any indication) is the marked tendency to immediately resort to comparisons with Yahtzees in general or Phitler in particular. It’s unhinged and unproductive. Yahtzees are cited as an emotional trigger, much like baseless accusations of racism send everyone scrambling for cover lest they be cancelled. Typically, the Yahtzee/Phitler comparison or accusation itself is enough to put someone on their heels, but wiser folks (those lucky few) recognize the cheap rhetorical trick. The Yahtzee Protocol isn’t quite the same as Godwin’s Law, which states that the longer a discussion goes on (at Usenet in the earliest examples), the greater the likelihood of someone bringing up Yahtzees and Phitler and ruining useful participation. The protocol has been deployed effectively in the Russia-Ukraine conflict, though I’m at a loss to determine in which direction. The mere existence of the now-infamous Azov Battalion, purportedly comprised of Yahtzees, means that automatically, reflexively, the fight is on. Who can say what the background rate of Yahtzee sympathizers (whatever that means) might be in any fighting force or indeed the general population? Not me. Similarly, what threshold qualifies a tyrant to stand beside Phitler on a list of worst evers? Those accusations are flung around like cooked spaghetti thrown against the wall just to see what sticks. Even if the accusation does stick, what possible good does it do? Ah, I know: it makes the accuser look like a virtuous fool.

I had that dream again. You know the one: I have to go take a final test in a class I forgot about, never attended or dropped from my schedule. Most higher-ed students have this dream repeatedly, as do former students (or for those who take the educational enterprise seriously as a life-long endeavor, perpetual students). The dream usually features open-ended anxiety because it’s all anticipation — one never steps into the classroom to sit for the test. But this time, the twist was that the final test transformed into a group problem-solving seminar. The subject matter was an arcane post-calculus specialty (maybe I’ve seen too many Big Bang Theory whiteboards strewn with undecipherable equations), and the student group was stumped trying to solve some sort of engineering problem. In heroic dream mode, I recontextualized the problem despite my lack of expertise, which propelled the group past its block. Not a true test of knowledge or understanding, since I hadn’t attended class and didn’t learn its subject matter, but a reminder that problem-solving is often not straight application of factors easily set forth and manipulable.

Outside of the dream, in my morning twilight (oxymoron alert), I mused on the limitations of tackling social issues as though they were engineering problems, an approach that typically regards materials, processes, and personnel as mere resources to be marshaled and acted upon to achieve a goal but with little consideration — at least in the moment — of downstream effects or indeed human values. The Manhattan Project is a prime example, which (arguably) helped the Allied powers win WWII but launched the world into the Atomic Age, complete with its own Cold War and the awful specter of mutually assured destruction (MAD). Borrowing a term from economics, it’s easy to rationalize negative collateral effects in terms of creative destruction. I object: the modifier creative masks that the noun is still destruction (cracked eggs needed to make omelets, ya know). Otherwise, maybe the term would be destructive creation. Perhaps I misunderstand, but the breakthrough with the Manhattan Project came about through synthesis of knowledge that lay beyond the purview of most narrowly trained engineers.

That is precisely the problem with many social ills today, those that actually have solutions anyway. The political class meant to manage and administer views problems primarily through a political lens (read: campaigning) and is not especially motivated to solve anything. Similarly, charitable organizations aimed at eradicating certain problems (e.g., hunger, homelessness, crime, educational disadvantage) can’t actually solve any problems because that would be the end of their fundraising and/or government funding, meaning that the organization itself would cease. Synthetic knowledge needed to solve a problem and then terminate the project is anathema to how society now functions; better that problems persist.

Past blog posts on this topic include “Techies and Fuzzies” and “The Man Who Knew Too Little,” each of which has a somewhat different emphasis. I’m still absorbed by the conflict between generalists and specialists while recognizing that both are necessary for full effectiveness. That union is the overarching message, too, of Iain McGilchrist’s The Master and His Emissary (2010), the subject of many past blog posts.

In the spirit of today’s holiday, let me wish everyone on Earth a bit of peace, albeit temporary. Take a brief respite from the ongoing storm of bad news and propaganda. I know that I throw out that word, propaganda, with too much frequency these days, but it’s nearly always the appropriate term. Leading up to the holiday, folks have been told not to enjoy themselves, not to celebrate (what to celebrate in this sacred-secular amalgam is highly selective and individual), and to fear and avoid family gatherings because, well, pestilence will invade every hearth and kill all inhabitants. It’s quite ridiculous on even a moment’s consideration. Yet that’s the message of the leading emissaries of official wisdom, many of whom seem to believe they are commandants exercising unlimited power over cities, states, and nations as though running prison camps. They’re spreading Christmas fear rather than Christmas cheer. Near as I can tell, most people are saying, sensibly, “bah, humbug” to cowering alone at home.

Also offering here one of Caitlin Johnstone’s many aphorisms, which I often find darkly funny:

Coming up next on CBS News, the uplifting story of a little girl who is constantly being kicked in the head by government officials and the small town that raised money to buy her a helmet.

/rant on

Remember when the War on Christmas meme materialized a few years ago out of thin air, even though no one in particular was on the attack? Might have to rethink that one. Seems that every holiday now carries an obligation to revisit the past, recontextualize origin stories, and confess guilt over tawdry details of how the holiday came to be celebrated. Nearly everyone who wants to know knows by now that Christmas is a gross bastardization of pagan celebrations of the winter solstice, co-opted by various organized churches (not limited to Christians!) before succumbing to the awesome marketing onslaught (thanks, Coca-Cola) that makes Xmas the “most wonderful time of the year” (as the tune goes) and returning the holiday to its secular roots. Thanksgiving is now similarly ruined, no longer able to be celebrated and enjoyed innocently (like a Disney princess story reinterpreted as a white or male savior story — or even worse, a white male) but instead used as an excuse to admit America’s colonial and genocidal past and its extermination and mistreatment of native populations as white Europeans encroached ever more persistently on lands the natives held more or less as a commons. Gone are the days when one could gather among family and friends, enjoy a nice meal and good company, and give humble, sincere thanks for whatever bounty fortuna had bestowed. Now it’s history lectures and acrimony and families rent asunder along generational lines, typically initiated by those newly minted graduates of higher education and their newfangled ideas about equity, justice, and victimhood. Kids these days … get off my lawn!

One need not look far afield to find alternative histories that position received wisdom about the past in the cross-hairs just to enact purification rituals that make it/them, what, clean? accurate? whole? I dunno what the real motivation is except perhaps to force whites to self-flagellate over sins of our ancestors. Damn us to hell for having cruelly visited iniquity upon everyone in the process of installing white, patriarchal Europeanness as the dominant Western culture. I admit all of it, though I’m responsible for none of it. Moreover, history stops for no man, no culture, no status quo. White, patriarchal Europeanness is in serious demographic retreat and has arguably already lost its grip on cultural dominance. The future is female (among other things), amirite? Indeed, whether intended or not, that was the whole idea behind the American experiment: the melting pot. Purity was never the point. Mass migration out of politically, economically, and ecologically ravaged regions means that the experiment is no longer uniquely American.

Interdisciplinary approaches to history, if academic rigidity can be overcome, regularly develop new understandings to correct the historical record. Accordingly, the past is remarkably dynamic. (I’ve been especially intrigued by Graham Hancock’s work on ancient civilizations, mostly misunderstood and largely forgotten except for megalithic projects left behind.) But the past is truly awful, with disaster piled upon catastrophe followed by calamity and cataclysm. Still waiting for the apocalypse. Peering too intently into the past is like staring at the sun: it scorches the retinas. Moreover, the entire history of history is replete with stories being told and retold, forgotten and recovered, transformed in each iteration from folklore into fable into myth into legend before finally passing entirely out of human memory systems. How many versions of Christmas are there across cultures and time? Or Thanksgiving, or Halloween, or any Hallmark® holiday that has crossed oceans and settled into foreign lands? What counts as the true spirit of any of them when their histories are so mutable?

/rant off

What if everyone showed up to an event with an expectation of using all their tech and gadgets to facilitate the group objective only to discover that nothing worked? You go to a fireworks display but the fireworks won’t ignite. You go to a concert but the instruments and voices make no sound. You go to a sporting event but none of the athletes can move. Does everyone just go home? Come back another day to try again? Or better yet, you climb into your car to go somewhere but it won’t start. Does everyone just stay home and the event never happens?

Those questions relate to a new “soft” weapons system called Scorpius (no link). The device or devices are said to disrupt and disable enemy tech by issuing a narrowly focused electromagnetic beam. (Gawd, just call it a raygun or phaser. No embarrassment over on-the-nose naming of other things.) Does the beam fry the electronics of its target, like a targeted Carrington event, or simply scramble the signals, making the tech inoperable? Can tech be hardened against attack, such as being encased in a Faraday cage? Wouldn’t such isolation itself make tech nonfunctional, since electronic communication between locations is the essence of modern devices, especially for targeting and telemetry? These are a few more idle questions (unanswered, since announcements of new weaponry I consulted read like advertising copy) about this latest advance (euphemism alert) in the arms race. Calling a device that can knock a plane (um, warplane) out of the sky (crashing somewhere, obviously) “soft protection” because the mechanism is a beam rather than a missile rather obfuscates the point. Sure, ground-based technologies might potentially be disabled without damage, but would that require continuous beam-based defense?

I recall an old Star Trek episode, the one with the Gorn, where omnipotent aliens disabled all weapons systems of two spaceships postured for battle by superheating controls, making them too hot to handle. Guess no one thought of oven mitts or pencils to push the “Fire!” buttons. (Audiences were meant to think, considering Star Trek was a thinking person’s entertainment, but not too much.) Instead of mass carnage, the two captains were transported to the surface of a nearby planet to battle by proxy (human vs. reptile). In quintessential Star Trek fashion — imagining a hopeful future despite militaristic trappings — the human captain won not only the physical battle but the moral battle (with self) by refusing to dispatch the reptile captain after he/it was disabled. The episode posed interesting questions so long as no one searched in the weeds for plausibility.

We’re confronted now, and again, with many of these same questions, some technical, some strategic, but more importantly, others moral and ethical. Thousands of years of (human) history have already demonstrated the folly of war (but don’t forget profitability). It’s a perennial problem, and from my vantage point, combatants on all sides are no closer to Trekkie moral victory now than in the 1960s. For instance, the U.S. and its allies are responsible for millions of deaths in Iraq, Afghanistan, Syria, and elsewhere just in the last two decades. Go back further in time and imperial designs look more and more like sustained extermination campaigns. But hey, we came to play, and any strategic advantage must be developed and exploited, moral quandaries notwithstanding.

It’s worth pointing out that in the Gorn episode, the captains were deprived of their weapons and resorted to brute force before the human improvised a projectile weapon out of materials handily strewn about, suggesting perhaps that intelligence is the most deadly weapon. Turns out to have been just another arms race.

Continuing from part 2. I’m so slow ….

If cognitive inertia (i.e., fear of change) used to manifest as technophobia, myriad examples demonstrate how technology has fundamentally penetrated the social fabric and shared mental space, essentially flipping the script to fear of missing out (FOMO) on whatever latest, greatest innovation comes down the pike (laden with fraud and deception — caveat emptor). With FOMO, a new phobia has emerged: fear of technological loss, or more specifically, inability to connect to the Internet. This is true especially among the young, born and bred after the onset of the computing and digital communications era. Who knows when, why, or how loss of connectivity might occur? Maybe a Carrington Event, maybe rolling blackouts due to wildfires (such as those in California and Oregon), maybe a ransomware attack on ISPs, or maybe a totalitarian clampdown by an overweening government after martial law is declared (coming soon to a neighborhood near you!). Or maybe something simpler: infrastructure failure. For some, inability to connect digitally, electronically, is tantamount to total isolation. Being cut off from the thoughts of others and abandoned to one’s own thoughts, even in the short term, is thus roughly equivalent to the torture of solitary confinement. Forget the notion of digital detox.

/rant on

Cheerleaders for technocracy are legion, of course, while the mind boggles at how society might or necessarily will be organized differently when it all fails (as it must, if for no other reason than energy depletion). Among the bounties of the communications era is a surfeit of entertainments, movies and TV shows especially, that are essentially new stories to replace or supplant old stories. It’s no accident, however, that the new ones come wrapped up in the themes, iconography, and human psychology (is there any other kind, really?) of old ones. Basically, everything old is new again. And because new stories are delivered through hyperpalatable media — relatively cheap, on demand, and removed from face-to-face social contexts — they arguably cause as much disorientation as reorientation. See, for instance, the embedded video, which is rather long and rambling but nevertheless gets at how religious instincts manifest differently throughout the ages and are now embedded in comic book stories and superheroes that have overtaken the entertainment landscape.

Mention is made that the secular age coincides roughly with the rise of video stores, a form of on-demand selection of content more recently made even simpler with ubiquitous streaming services. Did people really start hunkering down in their living rooms, eschewing group entertainments and civic involvements only in the 1980s? The extreme lateness of that development in Western history is highly suspect, considering the death of god had been declared back in the middle of the 19th century. Moreover, the argument swings around to the religious instinct, a cry for meaning if you will, being blocked by organized churches and their endemic corruption and instead finding expression in so-called secular religions (oxymoron alert). Gawd, how I tire of everything that functions as psychological grounding being called a religion. Listen, pseudo-religious elements can be found in Cheerios if one twists up one’s mind sufficiently. That doesn’t make General Mills or Kellogg’s new secular-religious prophets.

Back to the main point. Like money grubbing, technophilia might quiet the desperate search for meaning temporarily, since there’s always more of both to acquire. Can’t get enough, ever. But after even partial acquisition, the soul feels strangely dissatisfied and disquieted. Empty, one might even say. So out roving into the public sphere one goes, seeking and pursuing something to fill one’s time and appetites. Curiously, many traditional solutions to this age-old problem taught the seeker to probe within as an alternative. Well screw that! In the hyper-connected 21st-century world, who has time for that measly self-isolation? More reified Cheerios!

/rant off

I’ve often thought that my father was born at just the right time in the United States: too young to remember much of World War II, too old to be drafted into either the Korean War or the Vietnam War, yet well positioned to enjoy the fruits of the postwar boom and the 1960s counterculture. He retired early with a pension from the same company for which he had worked nearly the entirety of his adult life. Thus, he enjoyed the so-called Happy Days of the 1950s (the Eisenhower era) and all of the boom years, including the Baby Boom, the demographer’s term for my cohort (I came at the tail end). Good for him, I suppose. I admit some envy at his good fortune as most of the doors open to him were closed by the time I reached adulthood. It was the older (by 10–15 years) siblings of Boomers who lodged themselves in positions of power and influence. Accordingly, I’ve always felt somewhat like the snotty little brother clamoring for attention but who was overshadowed by the older brother always in the way. Luckily, my late teens and early twenties also fell between wars, so I never served — not that I ever supported the American Empire’s foreign escapades, then or now.

Since things have turned decidedly for the worse and industrial civilization can’t simply keep creaking along but will fail and collapse soon enough, my perspective has changed. Despite some life options having been withdrawn and my never having risen to world-beater status (not that that was ever my ambition), I recognize that, similar to my father, I was born at the right time to enjoy the relative peace and tranquility of the second half of the otherwise bogus “American Century.” My good fortune allowed me to lead a quiet, respectable life and reach a reasonable age (not yet retired) at which I now take stock. Mistakes were made, of course; that’s how we learn. But I’ve avoided the usual character deformations that spell disaster for lots of folks. (Never mind that some of those deformations are held up as admirable; the people who suffer them are in truth cretins of the first order, names withheld.)

Those born at the wrong time? Any of those drafted into war (conquest, revolutionary, civil, regional, or worldwide), and certainly anyone in the last twenty years or so. Millennials appeared at the twilight of empire; many of them are now mature enough to witness its fading glory but generally unable to participate in its bounties meaningfully. They are aware of their own disenfranchisement the same way oppressed groups (religious, ethnic, gender, working class, etc.) have always known they’re getting the shaft. Thus, the window of time one might claim optimal to have been born extends from around 1935 to around 1995, and my father and I both slot in. Beyond that fortuitous window, well, them’s the breaks.

Reblogged from here, also offered without comment.