Posts Tagged ‘Technophilia’

Cynics knew it was inevitable: weaponized drones and robots. Axon Enterprise, Inc., maker of police weaponry (euphemistically termed “public safety technologies”), announced its development of taser-equipped drones presumed capable of neutralizing an active shooter inside of 60 seconds. Who knows what sorts of operating parameters restrict their functions, whether they can be made invulnerable to hacking, or whether they can be barred from use as offensive weapons?

A sane, civilized society would recognize that, despite bogus memes about an armed society being a polite society, the prospect of everyone being strapped (like the fabled Old American West) and public spaces (schools, churches, post offices, laundromats, etc.) each being outfitted with neutralizing technologies is fixing the wrong problem. But we are no longer a sane society (begging the question whether we ever were). So let me suggest something radical yet obvious: the problem is not technological, it’s cultural. The modern world has made no progress with respect to indifference toward the suffering of others. Dehumanizing attitudes and technologies are no longer, well, medieval, but they’re no less cruel. For instance, people are not put in public stocks or drawn and quartered anymore, but they are shamed, cancelled, tortured, terrorized, propagandized, and abandoned in other ways that allow maniacs to pretend to others and to themselves that they are part of the solution. Hard to believe that one could now feel nostalgia for the days when, in the aftermath of yet another mass shooting, calls for gun control were met with inaction (other than empty rhetoric) rather than escalation.

The problem with diagnosing the problem as cultural is that no one is in control. Like water, culture goes where it goes and apparently seeks its lowest level. Attempts to channel, direct, and uplift culture might work on a small scale, but at the level of society — and with the distorted incentives freedom is certain to deliver — malefactors are guaranteed to appear. Indeed, anything that contributes to the arms race (now tiny, remote-controlled, networked killing devices rather than giant atomic/nuclear ones) only invites greater harm and is not a solution. Those maniacs (social and technical engineers promising safety) have the wrong things wrong.

Small, insular societies with strict internal codes of conduct may have figured out something that large, free societies have not, namely, that mutual respect, knowable communities, and repudiation of advanced technologies give individuals something and someone to care about, a place to belong, and things to do. When the entire world is thrown open, such as with social media, populations become atomized and anonymized, unable to position or understand themselves within a meaningful social context. Anomie and nihilism are often the rotten fruit. Splintered family units, erosion of community involvement, and dysfunctional institutions add to the rot. Those symptoms of cultural collapse need to be addressed even if they are among the most difficult wrong things to get right.

Following up the two previous entries in this series, the Feb. 2022 issue of Scientific American has a cover article by Adam Becker called “The Origins of Space and Time” with the additional teaser “Does spacetime emerge from a more fundamental reality?” (Oddly, the online title is different, making it awkward to find.) I don’t normally read Scientific American, which has become a bit like Time and Newsweek in its blurb-laden, graphics-heavy presentation intended patronizingly for general-interest readers. In fact, I’ll quote the pullout (w/o graphics) that summarizes the article:

How Spacetime Emerges. Space and time are traditionally thought of as the backdrop to the universe. But new research suggests they might not be fundamental; instead spacetime could be an emergent property of a more basic reality, the true backdrop to the cosmos. This idea comes from two theories that attempt to bridge the divide between general relativity and quantum mechanics. The first, string theory, recasts subatomic particles as tiny loops of vibrating string. The second, loop quantum gravity, envisions spacetime being broken down into chunks — discrete bits that combine to create a seemingly smooth continuum.

Being a layperson in such matters, I’ll admit openly that I don’t fully grasp the information presented. Indeed, every breathless announcement from CERN (or elsewhere) about a new subatomic particle discovery or some research group’s new conjectures into quantum this-or-that I typically greet passively at best. Were I a physicist or cosmologist, my interest would no doubt be more acute, but these topics are so far removed from everyday life they essentially become arcane inquiries into the number of angels dancing on the head of a pin. I don’t feel strongly enough to muster denunciation, but discussion of another aspect of pocket reality is worth some effort.

My understanding is that the “more basic reality, the true backdrop” discussed in the article is multidimensionality, something Eric Weinstein has also been grappling with under the name Geometric Unity. (Bizarrely, Alex Jones has also raved about interdimensional beings.) If the universe indeed has several more undetectable dimensions (if memory serves, Weinstein says as many as 14) and human reality is limited to only a few, potentially breaking through to other dimensions and/or escaping the boundaries of a mere four is tantalizing yet terrifying. Science fiction often explores these topics, usually in the context of space travel and human colonization of the galaxy. As thought experiments, fictional stories can be authentically entertaining and enjoyable. Within nonfiction reality, desire to escape off-world or into extra- or interdimensionality is an expression of desperation considering just how badly humans have fucked up the biosphere and guaranteed an early extinction for most species (including ours). I also chafe at the notion that this world, this reality, is not enough and that pressing forward like some unstoppable chemical reaction or biological infiltration is the obvious next step.

Let me first restate axioms developed in previous blog posts. Narrative is the essential outward form of consciousness. Cognition has many preverbal and nonverbal subtleties, but the exchange of ideas occurs predominantly through narrative, and the story of self (told to oneself) can be understood as stream of consciousness: ongoing self-narration of sensations and events. The principal characteristic of narrative, at least that which is not pure fantasy, is in-the-moment sufficiency. Snap-judgment heuristics are merely temporary placeholders until, ideally at least, thoughtful reconsideration and revision that take time and discernment can be brought to bear. Stories we tell and are told, however, often do not reflect reality well, partly because our perceptual apparatuses are flawed, partly because individuals are untrained and unskilled in critical thinking (or overtrained and distorted), and partly because stories are polluted with emotions that make clear assessments impossible (to say nothing of malefactors with agendas). Some of us struggle to remove confabulation from narrative (as best we can) whereas others embrace it because it’s emotionally gratifying.

A good example of the reality principle is recognition, similar to the 1970s energy crisis, that energy supplies don’t magically appear by simply digging and drilling more of the stuff out of the ground. Those easy-to-get resources have been plundered already. The term peak oil refers to eventual decline in energy production (harvesting, really) when the easy stuff is more than half gone and undiminished (read: increasing) demand impels energy companies to go in search of more exotic supply (e.g., underwater or embedded in shale). If that reality is dissatisfying, a host of dreamt-up stories offer us deliverance from inevitable decline and reduction of lifestyle prerogatives by positing extravagant resources in renewables, hydrogen fuel cells, fusion (not to be confused with fission), or as-yet unexploited regions such as The Arctic National Wildlife Refuge. None of these represent plausible realities (except going into heretofore protected regions and bringing ecological devastation).

The relationship of fictional stories to reality is quite complex. For this blog post, a radically narrow description is that fiction is the imaginary space wherein ideas can be tried out and explored safely in preparation for implementation in reality. Science fiction (i.e., imagining interstellar space travel despite its flat impossibility in Newtonian physics) is a good example. Some believe humans can eventually accomplish what’s depicted in sci-fi, and in certain limited examples we already have. But many sci-fi stories simply don’t present a plausible reality. Taken as vicarious entertainment, they’re AOK superfine with me. But given that Western cultures (I can’t opine on cultures outside the West) have veered dangerously into rank ideation and believing their own hype, too many people believe fervently in aspirational futures that have no hope of ever instantiating. Just like giant pools of oil hidden under the Rocky Mountains (to cite something sent to me just today offering illusory relief from skyrocketing gasoline prices).

Among the many genres of narrative now on offer in fiction, there is no better example of sought-after power than the superhero story. Identifying with the technological and financial power of Iron Man and Batman or the god-powers of Thor and Wonder Woman is thrilling, perhaps, but again, these are not plausible realities. Yet these superrich, superstrong, superintelligent superheroes are everywhere in fiction, attesting to liminal awareness of lack of power and indeed frailty. Many superhero stories are couched as coming-of-age stories for girls, who with grit and determination can fight toe-to-toe with any man and dominate. (Too many BS examples to cite.) Helps, of course, if the girl has magic at her disposal. Gawd, do I tire of these stories, told as origins in suffering, acquisition of skills, and coming into one’s own with the mature ability to force one’s will on others, often in the form of straight-up killing and assassination. Judge, jury, and executioner all rolled into one but entirely acceptable vigilantism if done wearing a supersuit and claiming spurious, self-appointed moral authority.

There are better narratives that don’t conflate power with force or lack plausibility in the world we actually inhabit. In a rather complicated article by Adam Tooze entitled “John Mearsheimer and the Dark Origins of Realism” at The New Statesman, after a lengthy historical and geopolitical analysis of competing narratives, a mode of apprehending reality is described:

… adopting a realistic approach towards the world does not consist in always reaching for a well-worn toolkit of timeless verities, nor does it consist in affecting a hard-boiled attitude so as to inoculate oneself forever against liberal enthusiasm. Realism, taken seriously, entails a never-ending cognitive and emotional challenge. It involves a minute-by-minute struggle to understand a complex and constantly evolving world, in which we are ourselves immersed, a world that we can, to a degree, influence and change, but which constantly challenges our categories and the definitions of our interests. And in that struggle for realism – the never-ending task of sensibly defining interests and pursuing them as best we can – to resort to war, by any side, should be acknowledged for what it is. It should not be normalised as the logical and obvious reaction to given circumstances, but recognised as a radical and perilous act, fraught with moral consequences. Any thinker or politician too callous or shallow to face that stark reality, should be judged accordingly.

What if everyone showed up to an event with an expectation of using all their tech and gadgets to facilitate the group objective only to discover that nothing worked? You go to a fireworks display but the fireworks won’t ignite. You go to a concert but the instruments and voices make no sound. You go to a sporting event but none of the athletes can move. Does everyone just go home? Come back another day to try again? Or better yet, you climb into your car to go somewhere but it won’t start. Does everyone just stay home and the event never happens?

Those questions relate to a new “soft” weapons system called Scorpius (no link). The device or devices are said to disrupt and disable enemy tech by issuing a narrowly focused electromagnetic beam. (Gawd, just call it a raygun or phaser. No embarrassment over on-the-nose naming of other things.) Does the beam fry the electronics of its target, like a targeted Carrington event, or just simply scramble the signals, making the tech inoperable? Can tech be hardened against attack, such as being encased in a Faraday cage? Wouldn’t such isolation itself make tech nonfunctional, since electronic communications between locations are the essence of modern devices, especially for targeting and telemetry? These are a few more idle questions (unanswered, since announcements of new weaponry I consulted read like advertising copy) about this latest advance (euphemism alert) in the arms race. Calling a device that can knock a plane (um, warplane) out of the sky (crashing somewhere, obviously) “soft protection” because the mechanism is a beam rather than a missile rather obfuscates the point. Sure, ground-based technologies might be potentially disabled without damage, but would that require continuous beam-based defense?

I recall an old Star Trek episode, the one with the Gorn, where omnipotent aliens disabled all weapons systems of two spaceships postured for battle by superheating controls, making them too hot to handle. Guess no one thought of oven mitts or pencils to push the “Fire!” buttons. (Audiences were meant to think, considering Star Trek was a thinking person’s entertainment, but not too much.) Instead of mass carnage, the two captains were transported to the surface of a nearby planet to battle by proxy (human vs. reptile). In quintessential Star Trek fashion — imagining a hopeful future despite militaristic trappings — the human captain won not only the physical battle but the moral battle (with self) by refusing to dispatch the reptile captain after he/it was disabled. The episode posed interesting questions so long as no one searched in the weeds for plausibility.

We’re confronted now, and again, with many of these same questions, some technical, some strategic, but more importantly, others moral and ethical. Thousands of years of (human) history have already demonstrated the folly of war (but don’t forget profitability). It’s a perennial problem, and from my vantage point, combatants on all sides are no closer to Trekkie moral victory now than in the 1960s. For instance, the U.S. and its allies are responsible for millions of deaths in Iraq, Afghanistan, Syria, and elsewhere just in the last two decades. Go back further in time and imperial designs look more and more like sustained extermination campaigns. But hey, we came to play, and any strategic advantage must be developed and exploited, moral quandaries notwithstanding.

It’s worth pointing out that in the Gorn episode, the captains were deprived of their weapons and resorted to brute force before the human improvised a projectile weapon out of materials handily strewn about, suggesting perhaps that intelligence is the most deadly weapon. Turns out to have been just another arms race.

Continuing from part 2. I’m so slow ….

If cognitive inertia (i.e., fear of change) used to manifest as technophobia, myriad examples demonstrate how technology has fundamentally penetrated the social fabric and shared mental space, essentially flipping the script to fear of missing out (FOMO) on whatever latest, greatest innovation comes down the pike (laden with fraud and deception — caveat emptor). With FOMO, a new phobia has emerged: fear of technological loss, or more specifically, inability to connect to the Internet. This is true especially among the young, born and bred after the onset of the computing and digital communications era. Who knows when, why, or how loss of connectivity might occur? Maybe a Carrington Event, maybe rolling blackouts due to wildfires (such as those in California and Oregon), maybe a ransomware attack on ISPs, or maybe a totalitarian clampdown by an overweening government after martial law is declared (coming soon to a neighborhood near you!). Or maybe something simpler: infrastructure failure. For some, inability to connect digitally, electronically, is tantamount to total isolation. Being cut off from the thoughts of others and left to one’s own thoughts, even in the short term, is thus roughly equivalent to the torture of solitary confinement. Forget the notion of digital detox.

/rant on

Cheerleaders for technocracy are legion, of course, while the mind boggles at how society might or necessarily will be organized differently when it all fails (as it must, if for no other reason than energy depletion). Among the bounties of the communications era is a surfeit of entertainments, movies and TV shows especially, that are essentially new stories to replace or supplant old stories. It’s no accident, however, that the new ones come wrapped up in the themes, iconography, and human psychology (is there any other kind, really?) of old ones. Basically, everything old is new again. And because new stories are delivered through hyperpalatable media — relatively cheap, on demand, and removed from face-to-face social contexts — they arguably cause as much disorientation as reorientation. See, for instance, the embedded video, which is rather long and rambling but nevertheless gets at how religious instincts manifest differently throughout the ages and are now embedded in comic book stories and superheroes that have overtaken the entertainment landscape.

Mention is made that the secular age coincides roughly with the rise of video stores, a form of on-demand selection of content more recently made even simpler with ubiquitous streaming services. Did people really start hunkering down in their living rooms, eschewing group entertainments and civic involvements only in the 1980s? The extreme lateness of that development in Western history is highly suspect, considering the death of god had been declared back in the middle of the 19th century. Moreover, the argument swings around to the religious instinct, a cry for meaning if you will, being blocked by organized churches and their endemic corruption and instead finding expression in so-called secular religions (oxymoron alert). Gawd, how I tire of everything that functions as psychological grounding being called a religion. Listen, pseudo-religious elements can be found in Cheerios if one twists up one’s mind sufficiently. That doesn’t make General Mills or Kellogg’s new secular-religious prophets.

Back to the main point. Like money grubbing, technophilia might quiet the desperate search for meaning temporarily, since there’s always more of both to acquire. Can’t get enough, ever. But after even partial acquisition, the soul feels strangely dissatisfied and disquieted. Empty, one might even say. So out roving into the public sphere one goes, seeking and pursuing something to fill one’s time and appetites. Curiously, many traditional solutions to this age-old problem taught the seeker to probe within as an alternative. Well screw that! In the hyper-connected 21st-century world, who has time for that measly self-isolation? More reified Cheerios!

/rant off

Continuing from part 1.

So here’s the dilemma: knowing a little bit about media theory and how the medium shapes the message, I’m spectacularly unconvinced that the cheerleaders are correct and that an entirely new mediascape (a word I thought maybe I had just made up, but alas, no) promises to correct the flaws of the older, inherited mediascape. It’s clearly not journalists leading the charge. Rather, comedians, gadflies, and a few academics (behaving as public intellectuals) command disproportionate attention among the digital chattering classes as regular folks seek entertainment and stimulation superior to the modal TikTok video. No doubt a significant number of news junkies still dote on their favorite journalists, but almost no journalist has escaped self-imposed limitations of the chosen media to offer serious reporting. Rather, they offer “commentary” and half-assed observations on human nature (much like comedians who believe themselves especially insightful — armchair social critics like me probably fit that bill, too). If the sheer count of aggregate followers and subscribers across social media platforms is any indication (it isn’t …), athletes, musicians (mostly teenyboppers and former pop tarts, as I call them), and the irritatingly ubiquitous Kardashian/Jenner clan are the most influential, especially among Millennials and Gen Z, whose tastes skew toward the frivolous. Good luck getting insightful analysis out of those folks. Maybe in time they’ll mature into thoughtful, engaged citizens. After all, Kim Kardashian apparently completed a law degree (but has yet to pass the bar). Don’t quite know what to think of her three failed marriages (so far). Actually, I try not to.

I’ve heard arguments that the public is voting with its attention and financial support for new media and increasingly disregarding the so-called prestige media (no such thing anymore, though legacy media is still acceptable). That may well be, but it seems vaguely ungrateful for established journalists and comedians, having enjoyed the opportunity to apprentice under seasoned professionals, to take acquired skills to emerging platforms. Good information gathering and shaping — even for jokes — doesn’t happen in a vacuum, and responsible journalism in particular can’t simply be repackaging information gathered by others (i.e., Reuters, the Associated Press, and Al Jazeera) with the aforementioned “commentary.” A frequent reason cited for jumping ship is the desire to escape editorial control and institutional attempts to distort the news itself according to some corporate agenda or ideology. Just maybe new platforms have made that possible in a serious way. However, the related desire to take a larger portion of the financial reward for one’s own work (typically as celebrities seeking to extend their 15 minutes of fame — ugh) is a surefire way to introduce subtle, new biases and distortions. The plethora of metrics available online, for instance, allows content creators to see what “hits” or goes viral, inviting service to public interest that is decidedly less than wholesome (like so much rubbernecking).

It’s also curious that, despite all the talk about engaging with one’s audience, new media is mired in broadcast mode, meaning that most content is presented to be read or heard or viewed with minimal or no audience participation. It’s all telling, and because comments sections quickly run off the rails, successful media personalities ignore them wholesale. One weird feature some have adopted during livestreams is to display viewer donations accompanied by brief comments and questions, the donation being a means of separating and promoting one’s question to the top of an otherwise undifferentiated heap. To my knowledge, none has yet tried the established talk radio gambit of taking live telephone calls, giving the public a chance to make a few (unpurchased) remarks before the host resumes control. Though I’ve never been invited (an invitation is required) and would likely decline to participate, the Clubhouse smartphone app appears to offer regular folks a venue to discuss and debate topics of the day. However, reports on the platform dynamics suggest that the number of eager participants quickly rises to an impossible number for realistic group discussion (the classroom, or better yet, graduate seminar establishes better limitations). A workable moderation mechanism has yet to emerge. Instead, participants must “raise their hand” to be called upon to speak (i.e., be unmuted) and can be kicked out of the “room” arbitrarily if the moderator(s) so decide. This is decidedly not how conversation flows face-to-face.

What strikes me is that while different broadcast modes target and/or capture different demographics, they all still package and organize content around the same principle: purporting to have obtained information and expertise to be shared with or taught to audiences. Whether subject matter is news, science, psychology, comedy, politics, etc., they have something ostensibly worth telling you (and me), hopefully while enhancing fame, fortune, and influence. So it frankly doesn’t matter that much whether the package is a 3-minute news segment, a brief celebrity interview on a late night talk show, an article published in print or online, a blog post, a YouTube video of varying duration, a private subscription to a Discord server, a subreddit, or an Instagram or Twitter feed; they are all lures for one’s attention. Long-form conversations hosted by Jordan Peterson, Joe Rogan, and Lex Fridman break out of self-imposed time limitations of the typical news segment and flow more naturally, but they also meander and get seriously overlong for anyone but long-haul truckers. (How many times have I tuned out partway into Paul VanderKlay’s podcast commentary or given up on Matt Taibbi’s Substack (tl;dr)? Yeah, lost count.) Yet these folks enthusiastically embrace the shifting mediascape. The digital communications era is already mature enough that several generations of platforms have come and gone as well-developed media are eventually coopted or turned commercial and innovators drive out weaker competitors. Remember MySpace, Google Plus, or America Online? The list of defunct social media is actually quite long. Because public attention is a perpetually moving target, I’m confident that those now enjoying their moment in the sun will face new challenges until it all eventually goes away amidst societal collapse. What then?

Happy to report that humans have finally outgrown their adolescent fixation, obsession, and infatuation surrounding technology and gadgetry, especially those that blow up things (and people), part of a maladaptive desire to watch the world burn (like a disturbed 14-year-old playing with fire to test the boundaries of control while hoping for the boundary to be breached). We are now in the process of correcting priorities and fixing the damage done. We’re also free from the psychological prison in which we trapped ourselves through status seeking and insistence on rigid ego consciousness by recognizing instead that, as artifacts of a hypersocial species, human cognition is fundamentally distributed among us as each of us is for all intents and purposes a storyteller retelling, reinforcing, and embellishing stories told elsewhere — even though it’s not quite accurate to call it mass mind or collective consciousness — and that indeed all men are brothers (an admitted anachronism, since that phrase encompasses women/sisters, too). More broadly, humans also now understand that we are only one species among many (a relative late-comer in evolutionary time, as it happens) that coexist in a dynamic balance with each other and with the larger entity some call Mother Earth or Gaia. Accordingly, we have determined that our relationship can no longer be that of abuser (us) and abused (everything not us) if the dynamism built into that system is not to take us out (read: trigger human extinction, like most species suffered throughout evolutionary time). If these pronouncements sound too rosy, well, get a clue, fool!

Let me draw your attention to the long YouTube video embedded below. These folks have gotten the clues, though my commentary follows anyway, because SWOTI.

After processing all the hand-waving and calls to immediate action (with inevitable nods to fundraising), I was struck by two things in particular. First, XR’s co-founder Roger Hallam gets pretty much everything right despite an off-putting combination of alarm, desperation, exasperation, and blame. He argues that to achieve the global awakening needed to alter humanity’s course toward (self-)extinction, we actually need charismatic speakers and heightened emotionalism. Scientific dispassion and neutered, measured political discourse (such as the Intergovernmental Panel on Climate Change (IPCC) or as Al Gore attempted for decades before going Hollywood already fifteen years ago now) have simply failed to accomplish anything. (On inspection, what history has actually delivered is not characterized by the lofty rhetoric of statesmen and boosters of Enlightenment philosophy but rather resembles a sociologist’s nightmare of dysfunctional social organization, where anything that could possibly go wrong pretty much has.) That abysmal failure is dawning quite strongly on people under the age of 30 or so, whose futures have been not so much imperiled as actively robbed. (HOW DARE YOU!? You slimy, venal, incompetent cretins above the age of 30 or so!) So it’s not for nothing that Roger Hallam insists that the XR movement ought to be powered and led by young people, with old people stepping aside, relinquishing positions of power and influence they’ve already squandered.


Second, Chris Hedges, easily the most erudite and prepared speaker/contributor, describes his first-hand experience reporting on rebellion in Europe leading to (1) the collapse of governments and (2) disintegration of societies. He seems to believe that the first is worthwhile, necessary, and/or inevitable even though the immediate result is the second. Civil wars, purges, and genocides are not uncommon throughout history in the often extended periods preceding and following social collapse. The rapidity of governmental collapse once the spark of citizen rebellion becomes inflamed is, in his experience, evidence that driving irresponsible leaders from power is still possible. Hedges’ catchphrase is “I fight fascists because they’re fascists,” which as an act of conscience allows him to sleep at night. A corollary is that fighting may not necessarily be effective, at least in the short term, or be undertaken without significant sacrifice, but needs to be done anyway to imbue life with purpose and meaning, as opposed to anomie. Although Hedges may entertain the possibility that social disintegration and collapse will be far, far more serious and widespread once the armed-to-the-teeth American empire cracks up fully (already under way to many observers) than with the Balkan countries, conscientious resistance and rebellion is still recommended.

Much as my attitudes are aligned with XR, Hallam, and Hedges, I’m less convinced that we should all go down swinging. That industrial civilization is going down and all of us with it no matter what we do is to me an inescapable conclusion. I’ve blogged about this quite a bit. Does ethical behavior demand fighting to the bitter end? Or can we fiddle while Rome burns, so to speak? There’s a lot of middle ground between those extremes, including nihilistic mischief (euphemism alert) and a bottomless well of anticipated suffering to alleviate somehow. More than altering the inevitable, I’m inclined to focus on forestalling eleventh-hour evil and finding some grace in how we ultimately, collectively meet species death.

Evil exists in the world. History and current events both bear this out amply. Pseudo-philosophers might argue that, like emotions and other immaterial sensations, good and evil are merely reified concepts, meaning they are human constructs with no palpable external reality. Go tell that to victims of evildoers. Human suffering can’t be anonymized, rationalized, or philosophized away quite so handily.

It was sort of refreshing, back in the day, when Google’s motto and/or corporate code of conduct was simple: “Don’t Be Evil.” It acknowledged the potential for being or becoming evil (like any of the Bigs: Big Tobacco, Big Soda, Big Oil, Big Pharma, Big Media, Big Agriculture, etc.) and presumably aspired to resist obvious temptations. That was then (from 2000 to 2018), this is now (2021 until death take us — soon enough, I fear). But like all entities possessed of absurd levels of wealth and power, Google (now reorganized as a subsidiary of Alphabet, but who actually refers to it that way?) and its Silicon Valley brethren have succumbed to temptation and become straight-up evil.

One might charitably assess this development as something unbidden, unanticipated, and unexpected, but that’s no excuse, really. I certainly don’t envy celebrity executives experiencing difficulty for having created unmanageable behemoths, loosed on a public and polity unable to recognize the beast’s fangs until already clamped on their necks. As often occurs, dystopian extrapolations are explored in fiction, sometimes satirically. The dénouement of the HBO show Silicon Valley depicts tech mogul wannabes succeeding in creating an AI (or merely a sophisticated algorithm? doesn’t matter …) that would in time become far too powerful in blind execution of its inner imperative. In the show, characters recognize what they had done and kill their own project rather than allow it to destroy the world. In reality, multiple developers of computer tech platforms (and their embedded dynamic, including the wildly unhelpful albeit accurate term algorithm) lacked the foresight to anticipate awful downstream effects of their brainchildren. Yet now that those effects are manifesting recognizably, these corporations continue to operate and wreak havoc.

Silicon Valley shows an extended software development period of bungling ineptitude punctuated by brilliant though momentary breakthroughs. Characters are smart, flawed people laughably unable to get out of the way of their own success. The pièce de résistance was yoking one so-called “learning machine” to another and initiating what would become a runaway doomsday process (either like ecological collapse, building slowly then making the biosphere uninhabitable all at once, or like the gray goo problem, progressively “processing” biomass at the molecular level until all that remains is lifeless goo). It was a final act of bumbling that demanded the characters’ principled, ethical response before the window of opportunity closed. Real Silicon Valley tech platforms are in the (ongoing) process of rending the social fabric, which is no laughing matter. The issue du jour surrounds free speech and its inverse, censorship. More broadly, real Silicon Valley succeeded in gaming human psychology for profit in at least two aspects (could be more as yet unrecognized): (1) mining behavioral data as an exploitable resource, and (2) delivering inexhaustible streams of extremely divisive content (not its own) to drive persistent engagement with its platforms. Yoked together, they operate to drive society mad, and yet, mounting evidence of this development has not produced even an inkling that maybe the damned doomsday devices ought to be shut off. As with the environment, we operate with freedom enough to destroy ourselves. Instead, politicians issue stunningly ineffectual calls for regulation or break-up of monopolies. In the meantime, ever more absurd wealth and power are concentrated in the hands of a few executives who have clearly punted and decided “let’s be evil.” No restraints on their behavioral experimentation across whole societies exist.

Much more to say on this topic in additional parts to come.

The end of every U.S. presidential administration is preceded by a spate of pardons and commutations — the equivalents of a get-out-of-jail-free card offered routinely to conspirators and collaborators with the outgoing executive and general-purpose crony capitalists. This practice, along with diplomatic immunity and supranational elevation of people (and corporations-as-people) beyond the reach of prosecution, is a deplorable workaround obviating the rule of law. Whose brilliant idea it was to offer special indulgence to miscreants is unknown to me, but it’s pretty clear that, with the right connections and/or with enough wealth, you can essentially be as bad as you wanna be with little fear of real consequence (a/k/a too big to fail a/k/a too big to jail). Similarly, politicians, whose very job it is to manage the affairs of society, are free to be incompetent and destructive in their brazen disregard for needs of the citizenry. Only modest effort (typically a lot of jawing directed to the wrong things) is necessary to enjoy the advantages of incumbency.

In this moment of year-end summaries, I could choose from among an array of insane, destructive, counter-productive, and ultimately self-defeating nominees (behaviors exhibited by elite powers that be) as the very worst, the baddest of the bad. For me, in the largest sense, that would be the abject failure of the rule of law (read: restraints), which has (so far) seen only a handful of high-office criminals prosecuted successfully (special investigations leading nowhere and failed impeachments don’t count) for their misdeeds and malfeasance. I prefer to be more specific. Given my indignation over the use of torture, that would seem an obvious choice. However, those news stories have been shoved to the back burner, where they generate little heat, including the ongoing torture of Julian Assange for essentially revealing truths cynics like me already suspected and now know to be accurate. Instead, I choose war as the very worst, an example of the U.S. (via its leadership) being as bad as it can possibly be. The recent election cycle offered a few candidates who bucked the consensus that U.S. involvement in every unnecessary, undeclared war since WWII is justified. They were effectively shut out by the military-industrial complex. And as the incoming executive tweeted on November 24, 2020, America’s back, baby! Ready to do our worst again (read: some more, since we [the U.S. military] never stopped [making war]). A sizeable portion of the American public is aligned with this approach, too.

So rule of law has failed and we [Americans] are infested with crime and incompetence at the highest levels. Requirements, rights, and protections found in the U.S. Constitution are handily ignored. That means every administration since Truman has been full of war criminals, because torture and elective war are crimes. The insult to my sensibilities is far worse than the unaffordability of war, the failure to win or end conflicts, or the lack of righteousness in our supposed cause. It’s that we [America, as viewed from outside] are belligerent, bellicose aggressors. We [Americans] are predators. And we [Americans, but really all humans] are stuck in an adolescent concept of conduct in the world shared with animals that must kill just to eat. We [humans] make no humanitarian progress at all. But the increasing scale of our [human] destructiveness is progress if drones, robots, and other DARPA-developed weaponry impress.


I learned (quickly for once) that Emporis has awarded its annual prize, the 2019 skyscraper of the year, to the Lakhta Center in St. Petersburg, Russia. Although I have blogged quite a bit about skyscrapers and possessed passing familiarity with the name Emporis, I didn’t know buildings actually received awards. In fact, I had suggested that architects held a silent sweepstakes no one actually wins except perhaps in preposterous prestige points for being the tallest building du jour. Guess I was wrong.

Anyway, the Lakhta Center is plenty tall (1,516 ft., more than three times the height of any other building in St. Petersburg) but not a challenger in the international supertall category. Not even in the (current) top ten. But it does feature a version of the twisting design (blogged about here), an apparent antidote to the dreaded box.

So the Lakhta Center can twist, but it can’t exactly shout from the rooftop about its award (since it’s a spire and has no roof). Meanwhile, I remain puzzled that these projects continue to be funded and get built in an era of increasing desperation among peoples who can’t feed, clothe, and house themselves. Tent cities and homeless encampments stand in stark contrast to gleaming skyscrapers. Indeed, if the pandemic has shown us anything, it’s that demand for prime office and/or hotel and condo space in a supertall building is cratering with more of the workforce telecommuting instead of working on site and travelers staying home. I’ve expected these massive, multiyear, multibillion-dollar projects to be abandoned any time now. Yet they continue to move forward, and at no modest pace. My shouts aren’t being heard, either.