Posts Tagged ‘Futurism’

An old Star Trek episode called “A Taste of Armageddon” depicts Capt. Kirk and crew confronting a planetary culture that has adopted purely administrative warfare with a nearby planet, where computer simulations determine the outcomes of battles and citizens are notified to report to disintegration chambers to comply with those outcomes. Narrative resolution is tidied up within the roughly 1-hour span of the episode, of course, but it was and is nonetheless a thought-provoking scenario. The episode, now 50 years old, prophesies a hyper-rational approach to conflict. (I was 4 years old at the time it aired on broadcast television, and I don’t recall having seen it since. Goes to show how influential high-concept storytelling can be even on someone quite young.) The episode came to mind as I happened across a video showing how robot soldiers are being developed to supplement and eventually replace human combatants. See, for example, this:

The robot in the video above is not overtly militarized, but there is no doubt that it could be. Why the robot takes bipedal, humanoid form with an awkwardly high center of gravity is unclear to me beyond our obvious self-infatuation. Additional videos with two-wheeled, quadruped, and even insect-like multilegged designs having much improved movement and flexibility can be found with a simple search. Any of them can be transformed into ground-based killing machines, as suggested more plainly in the video below highlighting various walking, rolling, flying, floating, and swimming machines developed to do our dirty work:


First, a few reminders:

  • The United States has been in an undeclared state of war for 15 years, the longest in U.S. history and long enough that young people today can say legitimately, “we’ve always been at war with Oceania.” The wars encompass the entirety of both terms of the Obama Administration.
  • The inciting events were attacks on U.S. soil carried out on September 11, 2001 (popularly, 9/11), which remain shrouded in controversy and conspiracy despite the official narrative assigning patsy blame to al-Qaida operating in Afghanistan and Iraq.
  • On the heels of the attacks, the Bush Administration commenced a propaganda campaign to sell invasion and regime change in those two countries and, over widespread public protest, went ahead and launched preemptive wars, ostensibly because an existential threat existed with respect to weapons of mass destruction (WMDs) possessed by Iraq in particular.
  • The propaganda campaign has since been revealed to have been fabricated, yet it buffaloed a lot of people into believing (even to this day) that Iraq was somehow responsible for 9/11.
  • Our preemptive wars succeeded quickly in toppling governments and capturing (and executing) their leaders but immediately got bogged down securing a peace that never came.
  • Even with an embarrassing mismatch of force, periodic troop surges and drawdowns, trillions of dollars wasted prosecuting the wars, and incredible, pointless loss of life (especially on the opposing sides), our objective in the Middle East (other than the oil, stupid!) has never been clear. The prospect of final withdrawal is nowhere on the horizon.

Continuous war — declared or merely waged — has been true of the U.S. my whole life, though one would be hard pressed to argue that it truly represents an immediate threat to U.S. citizens except to those unlucky enough to be deployed in war zones. Still, the monkey-on-the-back is passed from administration to administration. One might hope, based on campaign rhetoric, that the new executive (45) might recognize continuous war as the hot potato it is and dispense with it, but the proposed federal budget, with its $52 billion increase in military spending (+10% over 2016), suggests otherwise. Meanwhile, attention has been turned away from true existential threats that have been bandied about in the public sphere for at least a decade: global warming and climate change leading to Near-Term Extinction (NTE). Proximal threats, largely imagined, have absorbed all our available attention, and depending on whom one polls, our worst fears have already been realized.

The 20th and 21st centuries (so far) have been a series of “hot” wars (as distinguished from the so-called Cold War). Indeed, there has scarcely been a time when the U.S. has not been actively engaged fighting phantoms. If the Cold War was a bloodless, ideological war to stem the nonexistent spread of communism, we have adopted and coopted the language of wartime to launch various rhetorical wars. First was LBJ’s War on Poverty, the only “war” aimed at truly helping people. Nixon got into the act with his War on Drugs, which was punitive. Reagan expanded the War on Drugs, which became the War on Crime. Clinton increased the punitive character of the War on Crime by instituting mandatory minimum sentencing, which had the side effect of establishing what some call the prison-industrial complex, inflating the incarceration rate of Americans to the point that the U.S. is now ranked second in the world behind the Seychelles (!), a ranking far, far higher than any other industrialized nation.

As if U.S. authoritarians hadn’t found enough people to punish, and still seeking to convince the public that threats exist on all sides, requiring constant vigilance and a massive security apparatus including military, civil police, and intelligence services comprising 16 separate agencies (of which we know), Bush coined and declared the War on Terror, aimed at punishing those foreign and domestic who dare challenge U.S. hegemony in all things. It’s not called a national security state for nuthin’, folks. I aver that the rhetorical War on Poverty has inverted and now become a War on the Poverty-Stricken. De facto debtors’ prisons have reappeared, predatory lending has become commonplace, and income inequality grows more exaggerated with every passing year, leaving behind large segments of the U.S. population as income and wealth pool in an ever-shrinking number of hands. Admittedly, the trend is global.

At some point, perhaps in the 1960s when The Establishment (or more simply, The Man) became a thing to oppose, the actual Establishment must have decided it was high time to circle the wagons and protect its privileges, essentially going to war with (against, really) the people. Now five decades on, holders of wealth and power demonstrate disdain for those outside their tiny circle, and the government can no longer be said with a straight face to be of, by, and for the people (paraphrasing the last line of Lincoln’s Gettysburg Address). Rather, the government has been hijacked and turned into something abominable. Yet the people are strangely complicit, having allowed history to creep along with social justice in marked retreat. True threats do indeed exist, though not the ones that receive the lion’s share of attention. I surmise that, as with geopolitics, the U.S. government has brought into being an enemy and conflict that bode not well for its legitimacy. Which collapse occurs first is anyone’s guess.

As a boy, my home included a coffee table book, title unknown, likely published circa 1960, about the origins of human life on Earth. (A more recent book of this type attracting lots of attention is Sapiens: A Brief History of Humankind (2015) by Yuval Harari, which I haven’t yet read.) It was heavily enough illustrated that my siblings and I consulted it mostly for the pictures, which can probably be excused since we were youngsters at the time. What became of the book escapes me. In the intervening decades, I made no particular study of the ancient world — ancient meaning beyond the reach of human memory systems. Thus, ancient could potentially refer to anthropological history in the tens of thousands of years, evolutionary history stretching across tens of millions of years, geological history over hundreds of millions of years, or cosmological time going back a few billion years. For the purpose of this blog post, let’s limit ancient to no more than fifty thousand years ago.

A few months ago, updates (over the 1960 publication) to the story of human history and civilization finally reached me (can’t account for the delay of several decades) via webcasts published on YouTube featuring Joe Rogan, Randall Carlson, and Graham Hancock. Rogan hosts the conversations; Carlson and Hancock are independent researchers whose investigations converge on evidence of major catastrophes that struck the ancient world during the Younger Dryas period, erasing most but not all evidence of an antediluvian civilization. Whereas I’m a doomer, they are catastrophists. To call this subject matter fascinating is a considerable understatement. And yet, it’s neither here nor there with respect to how we conduct our day-to-day lives. Their revised history connects to religious origin stories, but such narratives have been relegated to myth and allegory for a long time already, making them more symbolic than historical.

In the tradition of Galileo, Copernicus, Newton, and Darwin, all of whom went against the scientific orthodoxy of their times but were ultimately vindicated, Carlson and Hancock appear to be rogue scientists/investigators exploring deep history and struggling against the conventional story of the beginnings of civilization around 6,000 years ago in the Middle East and Egypt. John Anthony West is another who disputes the accepted narratives and timelines. West is also openly critical of “quackademics” who refuse to consider accumulating evidence but instead collude to protect their cherished ideological and professional positions. The vast body of evidence being pieced together is impressive, and I truly appreciate their collective efforts. I’m about 50 pp. into Hancock’s Fingerprints of the Gods (1995), which contains copious detail not well suited to the conversational style of a webcast. His follow-up Magicians of the Gods (2015) will have to wait. Carlson’s scholarly work is published at the website Sacred Geometry International (and elsewhere, I presume).

So I have to admit that my blog, launched in 2006 as a culture blog, turned partially into a doomer blog as that narrative gained the weight of overwhelming evidence. What Carlson and Hancock in particular present is evidence of major catastrophes that struck the ancient world and are going to repeat: a different sort of doom, so to speak. Mine is ecological, financial, cultural, and finally civilizational collapse born of exhaustion, hubris, frailty, and most importantly, poor stewardship. Theirs is periodic cataclysmic disaster including volcanic eruptions and explosions, great floods (following ice ages, I believe), meteor strikes, earthquakes, tsunamis, and the like, each capable of ending civilization all at once. Indeed, those inevitable events are scattered throughout our geological history, though at unpredictable intervals often spaced tens or hundreds of thousands of years apart. For instance, the supervolcano under Yellowstone is known to blow roughly every 600,000 years, and we’re overdue. Further, the surface of the Moon indicates bombardment by meteors; the Earth’s history of the same is hidden somewhat by continuous transformation of the landscape, a process lacking on the Moon. The number of near misses, also known as near-Earth objects, in the last few decades is rather disconcerting. Any of these disasters could strike at any time, or we could wait another 10,000 years.

Carlson and Hancock warn that we must recognize the dangers, drop our petty international squabbles, and unite as a species to prepare for the inevitable. To do otherwise would be to court disaster. However, far from dismissing the prospect of doom I’ve been blogging about, they merely add another category of things likely to kill us off. They give the impression that we should turn our attention away from sudden climate change, the Sixth Extinction, and other perils to which we have contributed heavily and worry instead about death from above (the skies) and below (the Earth’s crust). It’s impossible to say which is the most worrisome prospect. As a fatalist, I surmise that there is little we can do to forestall any of these eventualities. Our fate is already sealed in one respect or another. That foreknowledge makes life precious for me, and frankly, is another reason to put aside our petty squabbles.

This past Thursday was an occasion of protest for many immigrant laborers who did not show up to work. Presumably, this action was in response to recent executive attacks on immigrants and was meant to demonstrate how businesses would suffer without immigrant labor doing jobs Americans frequently do not want. Tensions between the ownership and laboring classes have a long, tawdry history I cannot begin to summarize. As with other contextual failures, I daresay the general public believes incorrectly that such conflicts date from the 19th century, when formal sociopolitical theories like Marxism, which intersect heavily with labor economics, were published. An only slightly better understanding is that the labor movement commenced in the United Kingdom some fifty years after the Industrial Revolution began, with the Luddites, for example. I pause to remind that the most basic, enduring, and abhorrent labor relationship, extending back millennia, is slavery, which ended in the U.S. only 152 years ago but continues even today in slightly revised forms around the globe.

Thursday’s work stoppage was a faint echo of the general strikes and unionism of the middle of the 20th century. Gains in wages and benefits, working conditions, and negotiating position transferred some power from owners to laborers during that period, but today, laborers must sense they are back on their heels, defending conditions fought for by their grandparents but ultimately losing considerable ground. Of course, I’m sympathetic to labor, considering I’m not in the ownership class. (It’s all about perspective.) I must also admit, however, to once quitting, after only one day, a job that was simply too, well, laborious. I had that option at the time, though it nearly led to bankruptcy for me — a life lesson that continues to inform my attitudes. As I survey the scene today, however, I suspect many laborers — immigrants and native-born Americans alike — have the unenviable choice of accepting difficult, strenuous labor for low pay or being unemployed. Gradual reduction of demand for labor has two main causes: globalization and automation.


So the deed is done: the winning candidate has been duly delivered and solemnly sworn in as President of the United States. As I expected, he wasted no time and repaired to the Oval Office immediately after the inauguration (before the inaugural ball!) to sign an executive order aimed at the Affordable Care Act (a/k/a Obamacare), presumably to “ease the burden” as the legislative branch gets underway repealing and replacing the ACA. My only surprise is that he didn’t have a stack of similar executive orders awaiting signature at the very first opportunity. Of course, the president had not held back in the weeks leading up to the inauguration from issuing intemperate statements, or for that matter, indulging in his favorite form of attack: tweet storms against his detractors (lots of those). The culmination (in the very short term at least — it’s still only the weekend) may well have been the inaugural address itself, where the president announced that American interests come first (when has that ever not been the case?), which is being interpreted by many around the globe as a declaration of preemptive war.

The convention with each new presidential administration is to focus on the first hundred days. Back in November 2016, just after the election, National Public Radio (NPR) fact-checked the outline for the first hundred days provided by the campaign at the end of October 2016. With history speeding by, it’s unclear what portion of those plans has survived. Time will tell, of course, and I don’t expect it will take long — surely nowhere near 100 days.

So what is the difference between fulfilling one’s destiny and meeting one’s fate? The latter has a rather unsavory character to it, like the implied curse of the genie’s granted wish. The former smells vaguely of success. Both have a distinctly tragic whiff of inevitability. Either way, this new president appears to be hurrying headlong to effect changes promised during his campaign. If any wisdom is to be gathered at this most unpredictable moment, perhaps it should be a line offered today by fellow blogger the South Roane Agrarian (which may have in turn been stolen from the British version of House of Cards): “Beware of old men in a hurry.”

Aside: I was going to call this post “Fools Rush In,” but I already have one with that title and the slight revision above seems more accurate, at least until the bandwagon fills up.

Addendum: Seems I was partially right. There was a stack of executive orders ready to sign. However, they’ve been meted out over the course of the week rather than dumped in the hours shortly after the inauguration. What sort of calculation is behind that is pure conjecture. I might point out, though, that attention is riveted on the new president and will never subside, so there is no need, as in television, to keep priming the pump.

Stray links build up over time without my being able to handle them adequately, so I have for some time wanted a way of purging them. I am aware of other bloggers who curate and aggregate links with short commentaries quite well, but I have difficulty making my remarks pithy and punchy. That said, here are a few I’m ready to purge in this first attempt to clear out some of the backlog.

Skyfarm Fantasies

Futurists have offered myriad visions of technologies that have no hope of being implemented, from flying cars to 5-hour workweeks to space elevators. The newest pipe dream is the Urban Skyfarm, a roughly 30-story tree-like structure with 24 acres of space using solar panels and hydroponics to grow food close to the point of consumption. Utopian engineering such as this crops up frequently (pun intended) and may be fun to contemplate, but in the U.S. at least, we can’t even build high-speed rail, and that technology is already well established elsewhere. I suppose that’s why cities such as Seoul and Singapore, straining to make everything vertical for lack of horizontal space, are the logical test sites.

Leaving Nashville

The City of Nashville is using public funds to buy homeless people bus tickets to leave town and go be poor somewhere else. Media spin is that the city is “helping people in need,” but it’s obviously a NIMBY response to a social problem city officials and residents (not everyone, but enough) would rather not have to address more humanely. How long before cities begin competing with each other in the numbers of people they can ship off to other cities? Call it the circle of life when the homeless start gaming the programs, revisiting multiple cities in an endless circuit.

Revisioneering

Over at Rough Type, Nick Carr points to an article in The Nation entitled “Instagram and the Fantasy of Mastery,” which argues that a variety of technologies now give “artists” the illusion of skill, merit, and vision by enabling work to be easily executed using prefab templates and stylistic filters. For instance, in pop music, the industry standard is to auto-tune everyone’s singing to hide imperfections. Carr’s summary is probably better than the article itself and shows us the logical endpoint of production art in various media undertaken without the difficult work necessary to develop true mastery.

Too Poor to Shop

The NY Post reported over the summer that many Americans are too poor to shop except for necessities. Here are the first two paragraphs:

Retailers have blamed the weather, slow job growth and millennials for their poor results this past year, but a new study claims that more than 20 percent of Americans are simply too poor to shop.

These 26 million Americans are juggling two to three jobs, earning just around $27,000 a year and supporting two to four children — and exist largely under the radar, according to America’s Research Group, which has been tracking consumer shopping trends since 1979.

Current population in the U.S. is around 325 million. Twenty percent of that number is 65 million; twenty-six million is 8 percent. Pretty basic math, but I guess NY Post is not to be trusted to report even simple things accurately. Maybe it’s 20% of U.S. households. I dunno and can’t be bothered to check. Either way, that’s a pretty damning statistic considering the U.S. stock market continues to set new all-time highs — an economic recovery not shared with average Americans. Indeed, here are a few additional newsbits and links stolen ruthlessly from theeconomiccollapseblog.com:

  • The number of Americans that are living in concentrated areas of high poverty has doubled since the year 2000.
  • In 2007, about one out of every eight children in America was on food stamps. Today, that number is one out of every five.
  • 46 million Americans use food banks each year, and lines start forming at some U.S. food banks as early as 6:30 in the morning because people want to get something before the food supplies run out.
  • The number of homeless children in the U.S. has increased by 60 percent over the past six years.
  • According to Poverty USA, 1.6 million American children slept in a homeless shelter or some other form of emergency housing last year.

For further context, theeconomiccollapseblog also points to “The Secret Shame of Middle Class Americans” in The Atlantic, which reports, among other things, that fully 47% of Americans would struggle to scrape together a mere $400 in an emergency.

How do such folks respond to the national shopping frenzy kicking off in a few days with Black Friday, Small Business Saturday, Charitable Sunday, and Cyber Monday? I suggest everyone stay home.

Back in the day, I studied jazz improvisation. Like many endeavors, it takes dedication and continuous effort to develop the ear and learn to function effectively within the constraints of the genre. Most are familiar with the simplest form: the 12-bar blues. Whether more attuned to rhythm, harmony, lyrics, or structure doesn’t much matter; all elements work together to define the blues. As a novice improviser, structure is easy to grasp and lyrics don’t factor in (I’m an instrumentalist), but harmony and rhythm, simple though they may be to understand, are formidable when one is making up a solo on the spot. That’s improvisation. In class one day, after two passes through the chord changes, the instructor asked me how I thought I had done, and I blurted out that I was just trying to fill up the time. Other students heaved a huge sigh of recognition and relief: I had put my finger on our shared anxiety. None of us were skilled enough yet to be fluent or to actually have something to say — the latter especially the mark of a skilled improviser — but were merely trying to plug the hole when our turn came.
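For anyone who doesn’t know the form, here is one common version of the basic changes (many variants exist, such as the quick-change in bar 2), one chord per bar in the key of C:

| C7 | C7 | C7 | C7 | F7 | F7 | C7 | C7 | G7 | F7 | C7 | G7 |

Twelve bars, three chords: the map is trivially simple, which is precisely why having something to say while traversing it is the hard part.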

These days, weekends feel sorta the same way. On Friday night, the next two days often feel like a yawning chasm where I plan what I know from experience will be an improvisation, filling up the available time with shifting priorities, some combination of chores, duties, obligations, and entertainments (and unavoidable bodily functions such as eating, sleeping, etc.). Often enough I go back to work with stories to tell about enviable weekend exploits, but just as often I have a nagging feeling that I’m still a novice with nothing much to say or contribute, just filling up the time with noise. And as I contemplate what years and decades may be left to me (if the world doesn’t crack up first), the question arises: what big projects would I like to accomplish before I’m done? That, too, seems an act of improvisation.

I suspect recent retirees face these dilemmas with great urgency until they relax and decide “who cares?” What is left to do, really, before one finally checks out? If careers are completed, children are raised, and most of life’s goals are accomplished, what remains besides an indulgent second childhood of light hedonism? Or more pointedly, what about one’s final years keeps them from feeling like quiet desperation or simply waiting for the Grim Reaper? What last improvisations and flourishes are worth undertaking? I have no answers to these questions. They don’t press upon me just yet with any significance, and I have suffered no midlife crisis (so far) that would spur me to address them head on. But I can feel them gathering in the back of my mind like a shadow — especially with the specters of American-style fascism, financial and industrial collapse, and NTE looming.

The U.S. election has come and gone. Our long national nightmare is finally over; another one is set to begin after a brief hiatus. (I’m not talking about Decision 2020, though that specter has already reared its ugly head.) Although many were completely surprised by the result of the presidential race in particular, having placed their trust in polls, statistical models, and punditry to project a winner (who then lost), my previous post should indicate that I’m not too surprised. Michael Moore did much better taking the temperature of the room (more accurately, the nation) than all the other pundits, and even if the result had differed, the underlying sentiments remain. It’s fair to say, I think, that people voted with their guts more than their heads, meaning they again voted their fears, hates, and above all, for revolution, er, change. No matter that the change in store for us will very likely be destructive and against self-interest. Truth is, it would have ended in destruction with any of the candidates on the ballot.

Given the result, my mind wandered to Hillary Clinton’s book It Takes a Village, probably because we, the citizens of the United States of America, have effectively elected the village idiot to the nation’s highest office. Slicing and dicing the voting tallies between the popular vote, electoral votes, and states and counties carried will no doubt be done to death. Paths to victory and defeat will be offered with the handsome benefit of hindsight. Little of that matters, really, when one considers lessons never learned despite ample opportunity. For me, the most basic lesson is that for any nation of people, leaders must serve the interests of the widest constituency, not those of a narrow class of oligarchs and plutocrats. Donald Trump addressed the people far more successfully than did Hillary Clinton (with her polished political doubletalk) and appealed directly to their interests, however base and misguided.

My previous post called Barstool Wisdom contained this apt quote from The Brothers Karamazov by Dostoevsky:

The more stupid one is, the closer one is to reality. The more stupid one is, the clearer one is. Stupidity is brief and artless, while intelligence squirms and hides itself.

We have already seen that our president-elect has a knack for stating obvious truths no one else dares utter aloud. His clarity in that regard, though coarse, contrasts completely with Hillary’s squirmy evasions. Indeed, her high-handed approach to governance, more comfortable in the shadows, bears a remarkable resemblance to that of Richard Nixon, who also failed to convince the public that he was not a crook. My suspicion is that as Donald Trump gets better acquainted with statecraft, he will also learn obfuscation and secrecy. Some small measure of that is probably good, actually, though Americans are pining for greater transparency, one of the contemporary buzzwords thrown around recklessly by those with no real interest in it. My greater worry is that through sheer stupidity and bullheadedness, other obvious truths, such as commission of war crimes and limits of various sorts (ecological, energetic, financial, and psychological), will go unheeded. No amount of barstool wisdom can overcome those.

This is a continuation from part 1.

A long, tortured argument could be offered as to how we (in the U.S.) are governed by a narrow class of plutocrats (both now and at the founding) who not-so-secretly distrust the people and the practice of direct democracy, employing instead mechanisms found in the U.S. Constitution (such as the electoral college) to transfer power away from the people to so-called experts. I won’t indulge in a history lesson or other analysis, but it should be clear to anyone who bothers to look that typical holders of elected office (and their appointees) more nearly resemble yesteryear’s landed gentry than the proletariat. Rule by elites is thus quite familiar to us despite plenty of lofty language celebrating the common man and stories repeated ad nauseam of a few exceptional individuals (exceptional being the important modifier here) who managed to bootstrap their way into the elite from modest circumstances.

Part 1 started with deGrasse Tyson’s recommendation that experts/elites should pitch ideas at the public’s level and ended with my contention that some have lost their public by adopting style or content that fails to connect. In the field of politics, I’ve never quite understood the obsession with how things present to the public (optics) on the one hand and obvious disregard for true consent of the governed on the other. For instance, some might recall pretty serious public opposition before the fact to invasion of Afghanistan and Iraq in response to the 9/11 attacks. The Bush Administration’s propaganda campaign succeeded in buffaloing a fair percentage of the public, many of whom still believe the rank lie that Saddam Hussein had WMDs and represented enough of an existential threat to the U.S. to justify preemptive invasion. Without indulging in conspiratorial conjecture about the true motivations for invasion, the last decade plus has proven that opposition pretty well founded, though it went unheeded.


The last time I blogged about this topic, I took an historical approach, locating the problem (roughly) in time and place. In response to recent blog entries by Dave Pollard at How to Save the World, I’ve delved into the topic again. My comments at his site are the length of most of my own blog entries (3–4 paras.), whereas Dave tends to write in chapter form. I’ve condensed to my self-imposed limit.

Like culture and history, consciousness is a moving train that yields its secrets long after it has passed. Thus, assessing our current position is largely conjectural. Still, I’ll be reckless enough to offer my intuitions for consideration. Dave has been pursuing radical nonduality, a mode of thought characterized by losing one’s sense of self and becoming selfless, which diverges markedly from ego consciousness. That mental posture, described elsewhere by nameless others as participating consciousness, is believed to be what preceded the modern mind. I commented that losing oneself in intense, consuming flow behaviors is commonplace but temporary, a familiar, even transcendent place we can only visit. Its appeals are extremely seductive, however, and many people want to be there full-time, as we once were. The problem is that ego consciousness is remarkably resilient and self-reinforcing. Despite losing oneself from time to time, we can’t be liberated from the self permanently, and pathways to even temporarily getting out of one’s own head are elusive and sometimes self-destructive.

My intuition is that we are fumbling toward just such a quieting of the mind, a new dark age if you will, or what I called self-lite in my discussion with Dave. As we stagger forth, groping blindly in the dark, the transitional phase is characterized by numerous disturbances to the psyche — a crisis of consciousness wholly different from the historical one described previously. The example uppermost in my thinking is people lost down the rabbit hole of their handheld devices and desensitized to the world beyond the screen. Another is the ruined, wasted minds of (arguably) two or more generations of students done great disservice by their parents and educational institutions at all levels, a critical mass of intellectually stunted and distracted young adults by now. Yet another is those radicalized by their close identification with one or more special interest groups, also known as identity politics. A further example is the growing prevalence of confusion surrounding sexual orientation and gender identity. In each example, the individual’s ego is confused, partially suppressed, and/or under attack. Science fiction and horror genres have plenty of instructive examples of people who are no longer fully themselves, their bodies zombified or made into hosts for another entity that takes up residence, commandeering or shunting aside the authentic, original self.

Despite having identified modern ego consciousness as a crisis and feeling no small amount of empathy for those seeking radical nonduality, I find myself in the odd position of defending the modern mind precisely because transitional forms, if I have understood them properly, are so abhorrent. Put another way, while I can see the potential value and allure of extinguishing the self even semi-permanently, I will not be an early adopter. Indeed, if the modern mind took millennia to develop as one of the primary evolutionary characteristics of homo sapiens sapiens, it seems foolish to presume that it can be uploaded into a computer, purposely discarded by an act of will, or devolved in even a few generations. Meanwhile, though the doomer in me recognizes that ego consciousness is partly responsible for bringing us to the brink of (self-)annihilation (financial, geopolitical, ecological), individuality and intelligence are still highly prized where they can be found.