Archive for the ‘History’ Category

This past Thursday was an occasion of protest for many immigrant laborers who did not show up to work. Presumably, this action was a response to recent executive attacks on immigrants and was meant to demonstrate how businesses would suffer without immigrant labor doing jobs Americans frequently do not want. Tensions between the ownership and laboring classes have a long, tawdry history I cannot begin to summarize. As with other contextual failures, I daresay the general public believes incorrectly that such conflicts date from the 19th century, when formal sociopolitical theories like Marxism, which intersect heavily with labor economics, were published. An only slightly better understanding is that the labor movement commenced in the United Kingdom some fifty years after the Industrial Revolution began, such as with the Luddites. I pause to remind readers that the most basic, enduring, and abhorrent labor relationship, extending back millennia, is slavery, which ended in the U.S. only 152 years ago but continues even today in slightly revised forms around the globe.

Thursday’s work stoppage was a faint echo of the general strikes and unionism of the middle of the 20th century. Gains in wages and benefits, working conditions, and negotiating position transferred some power from owners to laborers during that period, but today, laborers must sense they are back on their heels, defending conditions fought for by their grandparents but ultimately losing considerable ground. Of course, I’m sympathetic to labor, considering I’m not in the ownership class. (It’s all about perspective.) I must also admit, however, to once quitting, after only one day, a job that was simply too, well, laborious. I had that option at the time, though it ultimately led me nearly to bankruptcy — a life lesson that continues to inform my attitudes. As I survey the scene today, however, I suspect many laborers — immigrants and native-born Americans alike — have the unenviable choice of accepting difficult, strenuous labor for low pay or being unemployed. Gradual reduction of demand for labor has two main causes: globalization and automation.


I don’t have the patience or expertise to prepare and offer a detailed political analysis such as those I sometimes (not very often) read on other blogs. Besides, once the comments start filling up at those sites, every possible permutation is trotted out, muddying the initial or preferred interpretation with alternatives that make at least as much sense. They’re interesting brainstorming sessions, but I have to wonder what is accomplished.

My own back-of-the-envelope analysis is much simpler and probably no closer to (or farther from) being correct, what with everything being open to dispute. So the new POTUS was born in 1946, which puts the bulk of his boyhood in the 1950s, overlapping with the Eisenhower Administration. That period has lots of attributes, but the most significant (IMO), which would impact an adolescent, was the U.S. economy launching into the stratosphere, largely on the back of the manufacturing sector (e.g., automobiles, airplanes, TVs, etc.), and creating the American middle class. The interstate highway system also dates from that decade. Secondarily, there was a strong but misplaced sense of American moral leadership (one might also say authority or superiority), since we took (too much) credit for winning WWII.

However, it wasn’t great for everyone. Racism, misogyny, and other forms of bigotry were open and virulent. Still, if one was lucky enough to be a white, middle-class male, things were arguably about as good as they would get, which many remember rather fondly, whether through rose-colored glasses or otherwise. POTUS as a boy wasn’t middle class, but the culture around him supported a worldview that he embodies even now. He’s also never been an industrialist, but he is a real estate developer (some would say slumlord) and media figure, and his models are taken from the 1950s.

The decade of my boyhood was the 1970s, which were the Nixon, Ford, and Carter Administrations. Everyone could sense the wheels were already coming off the bus, and white male entitlement was far diminished from previous decades. The Rust Belt was already a thing. Like children from the 1950s forward, however, I spent a lot of time in front of the TV. Much of it was goofy fun such as Gilligan’s Island, The Brady Bunch, and interestingly enough, Happy Days. It was innocent stuff. What are the chances that, as a boy plopped in front of the TV, POTUS would have seen the show below (excerpted) and taken special notice considering that the character shares his surname?

Snopes confirms that this is a real episode from the TV show Trackdown. Not nearly as innocent as the shows I watched. The coincidences that the character is a con man, promises to build a wall, and claims to be the only person who can save the town are eerie, to say the least. Could that TV show be lodged in the back of POTUS’ brain, along with so many other boyhood memories, misremembered and revised the way memory tends to do?

Some have said that the great economic expansion of the 1950s and 60s was an anomaly. A constellation of conditions configured to produce an historical effect, a Golden Era by some reckonings, that cannot be repeated. We simply cannot return to an industrial or manufacturing economy that had once (arguably) made America great. And besides, the attempt would accelerate the collapse of the ecosystem, which is already in free fall. Yet that appears to be the intention of POTUS, whose early regression to childhood is a threat to us all.

So the deed is done: the winning candidate has been duly delivered and solemnly sworn in as President of the United States. As I expected, he wasted no time and repaired to the Oval Office immediately after the inauguration (before the inaugural ball!) to sign an executive order aimed at the Affordable Care Act (a/k/a Obamacare), presumably to “ease the burden” as the legislative branch gets underway repealing and replacing the ACA. My only surprise is that he didn’t have a stack of similar executive orders awaiting signature at the very first opportunity. Of course, the president had not held back in the weeks leading up to the inauguration from issuing intemperate statements or, for that matter, indulging in his favorite form of attack: tweet storms against his detractors (lots of those). The culmination (in the very short term at least — it’s still only the weekend) may well have been the inaugural address itself, where the president announced that American interests come first (when has that ever not been the case?), which is being interpreted by many around the globe as a declaration of preemptive war.

The convention with each new presidential administration is to focus on the first hundred days. Back in November 2016, just after the election, National Public Radio (NPR) fact-checked the outline for the first hundred days provided by the campaign at the end of October 2016. With history speeding by, it’s unclear what portion of those plans have survived. Time will tell, of course, and I don’t expect it will take long — surely nowhere near 100 days.

So what is the difference between fulfilling one’s destiny and meeting one’s fate? The latter has a rather unsavory character to it, like the implied curse of the granted genie’s wish. The former smells vaguely of success. Both have a distinctly tragic whiff of inevitability. Either way, this new president appears to be hurrying headlong to effect changes promised during his campaign. If any wisdom is to be gathered at this most unpredictable moment, perhaps it should be a line offered today by fellow blogger the South Roane Agrarian (which may have in turn been stolen from the British version of House of Cards): “Beware of old men in a hurry.”

Aside: I was going to call this post “Fools Rush In,” but I already have one with that title and the slight revision above seems more accurate, at least until the bandwagon fills up.

Addendum: Seems I was partially right. There was a stack of executive orders ready to sign. However, they’ve been meted out over the course of the week rather than dumped in the hours shortly after the inauguration. What sort of calculation is behind that is pure conjecture. I might point out, though, that attention is riveted on the new president and will never subside, so there is no need, as in television, to keep priming the pump.

I pause periodically to contemplate deep time, ancient history, and other subjects that lie beyond most human conceptual abilities. Sure, we sorta get the idea of a very long ago past out there in the recesses or on the margins, just like we get the idea of U.S. sovereign debt now approaching $20 trillion. Problem is, numbers lose coherence when they mount up too high. Scales differ widely with respect to time and currency. Thus, we can still think reasonably about human history back to roughly 6,000 years ago, but 20,000 years ago or more draws a blank. We can also think about how $1 million might have utility, but $1 billion and $1 trillion are phantoms that appear only on ledgers and contracts and in the news (typically mergers and acquisitions). If deep time or deep debt feel like they don’t exist except as conceptual categories, try wrapping your head around the deep state, which in the U.S. is understood to be a surprisingly large rogues’ gallery of plutocrats, kleptocrats, and oligarchs drawn from the military-industrial-corporate complex, the intelligence community, and Wall Street. It exists but does so far enough outside the frame of reference most of us share that it effectively functions in the shadow of daylight where it can’t be seen for all the glare. Players are plain enough to the eye as they board their private jets to attend annual meetings of the World Economic Forum in Davos-Klosters, Switzerland, or, two years ago, the Jackson Hole [Economic] Summit in Jackson Hole, WY, in connection with the American Principles Project, whatever that is. They also enjoy plausible deniability precisely because most of us don’t really believe self-appointed masters of the universe can or should exist.

Another example of a really bad trip down the rabbit hole, what I might call deep cynicism (and a place I rarely allow myself to go), appeared earlier this month at Gin and Tacos (on my blogroll):

The way they [conservatives] see it, half the kids coming out of public schools today are basically illiterate. To them, this is fine. We have enough competition for the kinds of jobs a college degree is supposed to qualify one for as it is. Our options are to pump a ton of money into public schools and maybe see some incremental improvement in outcomes, or we can just create a system that selects out the half-decent students for a real education and future and then warehouse the rest until they’re no longer minors and they’re ready for the prison-poverty-violence cycle [add military] to Hoover them up. Vouchers and Charter Schools are not, to the conservative mind, a better way to educate kids well. They are a cheaper way to educate them poorly. What matters is that it costs less to people like six-figure income earners and home owners. Those people can afford to send their kids to a decent school anyway. Public education, to their way of thinking, used to be about educating people just enough that they could provide blue collar or service industry labor. Now that we have too much of that, a public high school is just a waiting room for prison. So why throw money into it? They don’t think education “works” anyway; people are born Good or Bad, Talented or Useless. So it only makes sense to find the cheapest possible way to process the students who were written off before they reached middle school. If charter schools manage to save 1% of them, great. If not, well, then they’re no worse than public schools. And they’re cheaper! Did I mention that they’re cheaper?

There’s more. I provided only the main paragraph. I wish I could reveal that the author is being arch or ironic, but there is no evidence of that. I also wish I could refute him, but there is similarly no useful evidence for that. Rather, the explanation he provides is a reality check that fits the experience of wide swaths of the American public, namely, that “public high school is just a waiting room for prison” (soon and again, debtor’s prison) and that it’s designed to be just that because it’s cheaper than actually educating people. Those truly interested in being educated will take care of it themselves. Plus, there’s additional money to be made operating prisons.

Deep cynicism is a sort of radical awareness that stares balefully at the truth and refuses to blink or pretend. A psychologist might call it the reality principle; a scientist might aver that it relies unflinchingly on objective evidence; a philosopher might call it strict epistemology. To get through life, however, most of us deny abundant evidence presented to us daily in favor of dreams and fantasies that assemble into the dominant paradigm. That paradigm includes the notions that evil doesn’t really exist, that we’re basically good people who care about each other, and that our opportunities and fates are not, on the whole, established long before we begin the journey.

Anthropologists, pundits, armchair cultural critics (like me), and others sometimes offer an aspect or characteristic, usually singular, that separates the human species from other animals. (Note: humans are animals, not the crowning creation of god in his own image, the dogma of major religions.) Typical singular aspects include tool use (very early on, fire), language, agriculture, self-awareness (consciousness), and intelligence, that last including especially the ability to conceptualize time and thus remember and plan ahead. The most interesting candidate suggested to me is our ability to kill from a distance. Without going into a list of things we don’t think we share with other species but surprisingly do, it interests me that no other species possesses the ability to kill at a distance (someone will undoubtedly prove me wrong on this).

Two phrases spring to mind: nature is red in tooth and claw (Tennyson) and human life is nasty, brutish, and short (Hobbes). Both encapsulate what it means to have to kill to eat, which is hardly unique to animals. All sorts of plants, insects, and microorganisms embed themselves in hosts, sometimes killing the host and themselves. Symbiotic relationships also exist. The instance that interests me, though, is the act of killing in the animal kingdom that requires putting one’s own body at risk in life-or-death attack. Examples falling short of killing abound, such as intimidation to establish hierarchy, but to eat, an animal must kill its prey.

Having watched my share of historical fiction (pre-1800, say, but especially sword-and-sandal and medieval epics) on the TeeVee and at the cinema, I find that the dramatic appeal of warring armies slamming into each other never seems to get old. Fighting is hand-to-hand or sword-to-sword, which are tantamount to the same. Archers’ arrows, projectiles launched from catapults and trebuchets, thrown knives, spears, and axes, and boiling oil poured over parapets are killing from a relatively short distance, but the action eventually ends up being very close. The warrior code in fighting cultures honors the willingness to put oneself in harm’s way, to risk one’s own body. Leaders often exhibit mutual respect and may even share some intimacy. War may not be directly about eating, since humans are not cannibals under most circumstances; rather, it’s usually about control of resources, so secondarily about eating by amassing power. Those historical dramas often depict victors celebrating by enjoying lavish feasts.

Modern examples of warfare and killing from a distance make raining down death from above a bureaucratic action undertaken with little or no personal risk. Artillery, carpet bombing from 20,000 feet, drone strikes (controlled from the comfort of some computer lab in the Utah desert), and nuclear bombs are the obvious examples. No honorable warrior code attaches to such killing. Indeed, the chain of command separates the execution of kill orders from moral responsibility — probably a necessary disconnect when large numbers of casualties (collateral damage, if one prefers the euphemism) can be expected. Only war criminals, either high on killing or banally impervious to empathy and compassion, would dispatch hundreds of thousands at a time.

If killing from a distance is in most cases about proximity or lack thereof, one further example is worth mentioning: killing across time. While most don’t really conceptualize the space-time continuum as interconnected, the prospect of choices made today manifesting in megadeath in the foreseeable future is precisely the sort of bureaucratized killing from a distance that should be recognized and forestalled. Yet despite our supposed intellectual superiority over other species, we cannot avoid waging war, real and rhetorical, to control resources and narratives that enable us to eat. Eating the future would be akin to consuming seed corn, but that metaphor is not apt. Better perhaps to say that we’re killing the host. We’re embedded in the world, as indeed is everything we know to be alive, and rely upon the profundity of the biosphere for survival. Although the frequent charge is that humanity is a parasite or has become a cancer on the world, that tired assessment, while more accurate than not, is a little on the nose. A more charitable view is that humanity, as the apex predator, has expanded its habitat to include the entire biosphere, killing to eat, and is slowly consuming and transforming it into a place uninhabitable by us, just as a yeast culture consumes its medium and grows to fill the space before dying all at once. So the irony or Pyrrhic victory is that while we may fatten ourselves (well, some of us) in the short term, we have also created conditions leading to our own doom. Compared to other species whose time on Earth lasted tens of millions of years, human life on Earth turns out to be exactly what Hobbes said: nasty, brutish, and short.

I watched John Pilger’s excellent documentary film The War You Don’t See (2010), which deals with perpetual and immoral wars, obfuscations of the governments prosecuting them, and the journalistic media’s failure to question effectively the lies and justifications that got us into war and keep us there. The documentary reminded me of The Fog of War (2003), Robert McNamara’s rueful rethinking of his activities as Secretary of Defense during the Kennedy and Johnson administrations (thus, the Vietnam War). Seems that lessons a normal, sane person might draw from experience at war fail to find their way into the minds of decision makers, who must somehow believe themselves to be masters of the universe with immense power at their disposal but are really just war criminals overseeing genocides. One telling detail from Pilger’s film is that civilian deaths (euphemistically retermed collateral damage in the Vietnam era) as a percentage of all deaths (including combatants) have increased from 10% (WWI) to 50% (WWII) to 70% (Vietnam) to 90% (Afghanistan and Iraq). That’s one of the reasons why I call them war criminals: we’re depopulating the theaters of war in which we operate.

After viewing the Pilger film, the person sitting next to me asked, “How do you know what he’s saying is true?” More fog. I’m ill-equipped to handle such a direct epistemological challenge; it felt to me like a non sequitur. Ultimately, I was relieved to hear that the question was mere devil’s advocacy, but it’s related to the epistemological crisis I’ve blogged about before. Since the date of that blog post, the crisis has only worsened, which is what I expect as legitimate authority is undermined, expertise erodes, and the public sphere devolves into gamification and gotchas (or a series of ongoing cons). If late-stage capitalism has become a nest of corruption, the same is true — with unexpected rapidity — of the computer era and the Information Superhighway (a term no one uses anymore). One early expectation was that enhanced (24/7/365) access to information would yield impressive educational gains, as though the only thing missing were more information, but human nature being what it is, the first valuable innovations resulted from commercializing erotica and porn. Later debate and hand-wringing over the inaccuracy of Wikipedia and the slanted results of Google searches disappeared as everyone simply got used to not being able to trust those sources too much, just as everyone got used to forfeiting their privacy online.

Today, everything coughed up in our media-saturated information environment is understood either with a grain of salt (read: a mountain of skepticism) and held in abeyance until solid confirmation can be had (which often never comes) or simply run with because, well, what the hell? Journalists, the well-trained ones possessing integrity anyway, used to be in the first camp, but market forces and the near instantaneity of (faulty, spun) information, given how the Internet has lowered the bar to publication, have pushed journalists into the second camp. As Pilger notes, they have become echo chambers and amplifiers of the utterances of press agents of warmongering governments. Sure, fact checking still occurs, when it’s easy (such as on the campaign trail), but with war reporting in particular, which poses significant hurdles to information gathering, too many reporters simply repeat what they’re told or believe the staging they’re shown.

The U.S. election has come and gone. Our long national nightmare is finally over; another one is set to begin after a brief hiatus. (I’m not talking about Decision 2020, though that spectre has already reared its ugly head.) Although many were completely surprised by the result of the presidential race in particular, having placed their trust in polls, statistical models, and punditry to project a winner (who then lost), my previous post should indicate that I’m not too surprised. Michael Moore did much better taking the temperature of the room (more accurately, the nation) than all the other pundits, and even if the result had differed, the underlying sentiments remain. It’s fair to say, I think, that people voted with their guts more than their heads, meaning they again voted their fears, hates, and above all, for change (read: revolution). No matter that the change in store for us will very likely be destructive and against self-interest. Truth is, it would have ended in destruction with any of the candidates on the ballot.

Given the result, my mind wandered to Hillary Clinton’s book It Takes a Village, probably because we, the citizens of the United States of America, have effectively elected the village idiot to the nation’s highest office. Slicing and dicing the voting tallies between the popular vote, electoral votes, and states and counties carried will no doubt be done to death. Paths to victory and defeat will be offered with the handsome benefit of hindsight. Little of that matters, really, when one considers lessons never learned despite ample opportunity. For me, the most basic lesson is that for any nation of people, leaders must serve the interests of the widest constituency, not those of a narrow class of oligarchs and plutocrats. Donald Trump addressed the people far more successfully than did Hillary Clinton (with her polished political doubletalk) and appealed directly to their interests, however base and misguided.

My previous post called Barstool Wisdom contained this apt quote from The Brothers Karamazov by Dostoevsky:

The more stupid one is, the closer one is to reality. The more stupid one is, the clearer one is. Stupidity is brief and artless, while intelligence squirms and hides itself.

We have already seen that our president-elect has a knack for stating obvious truths no one else dares utter aloud. His clarity in that regard, though coarse, contrasts completely with Hillary’s squirmy evasions. Indeed, her high-handed approach to governance, more comfortable in the shadows, bears a remarkable resemblance to that of Richard Nixon, who also failed to convince the public that he was not a crook. My suspicion is that as Donald Trump gets better acquainted with statecraft, he will also learn obfuscation and secrecy. Some small measure of that is probably good, actually, though Americans are pining for greater transparency, one of the contemporary buzzwords thrown around recklessly by those with no real interest in it. My greater worry is that through sheer stupidity and bullheadedness, other obvious truths, such as commission of war crimes and limits of various sorts (ecological, energetic, financial, and psychological), will go unheeded. No amount of barstool wisdom can overcome those.

This is a continuation from part 1.

A long, tortured argument could be offered as to how we (in the U.S.) are governed by a narrow class of plutocrats (both now and at the founding) who not-so-secretly distrust the people and the practice of direct democracy, employing instead mechanisms found in the U.S. Constitution (such as the electoral college) to transfer power away from the people to so-called experts. I won’t indulge in a history lesson or other analysis, but it should be clear to anyone who bothers to look that typical holders of elected office (and their appointees) more nearly resemble yesteryear’s landed gentry than the proletariat. Rule by elites is thus quite familiar to us despite plenty of lofty language celebrating the common man and stories repeated ad nauseam of a few exceptional individuals (exceptional being the important modifier here) who managed to bootstrap their way into the elite from modest circumstances.

Part 1 started with deGrasse Tyson’s recommendation that experts/elites should pitch ideas at the public’s level and ended with my contention that some have lost their public by adopting style or content that fails to connect. In the field of politics, I’ve never quite understood the obsession with how things present to the public (optics) on the one hand and obvious disregard for true consent of the governed on the other. For instance, some might recall pretty serious public opposition before the fact to invasion of Afghanistan and Iraq in response to the 9/11 attacks. The Bush Administration’s propaganda campaign succeeded in buffaloing a fair percentage of the public, many of whom still believe the rank lie that Saddam Hussein had WMDs and represented enough of an existential threat to the U.S. to justify preemptive invasion. Without indulging in conspiratorial conjecture about the true motivations for invasion, the last decade plus has proven that opposition pretty well founded, though it went unheeded.


In what has become a predictable status quo, President Obama recently renewed our official state of emergency with respect to the so-called War on Terror. It’s far too late to declare a new normal; we’ve been in this holding pattern for 16 years now. The article linked above provides this useful context:

There are now 32 states of national emergency pending in the United States, with the oldest being a 1979 emergency declared by President Jimmy Carter to impose sanctions during the Iran hostage crisis. Most are used to impose economic sanctions — mostly as a formality, because Congress requires it under the International Emergency Economic Powers Act.

In his term in office, Obama has declared 13 new emergencies, continued 21 declared by his predecessors and revoked just two, which imposed sanctions on Liberia and Russia.

Pro forma renewal of multiple states of national emergency is comparable to the 55-year-old U.S. embargo against Cuba, due for reauthorization next month, though indications are that the embargo may finally be relaxed or deauthorized. Both are examples of miserably failed policy, but they confer a semblance of power on the executive branch. Everyone knows by now that no one relinquishes power willingly, so Obama, like chief executives before him, keeps on keeping on ad nauseam.

Considering Obama’s credentials as a constitutional scholar, rare among U.S. presidents, one might expect him to weigh his options with greater circumspection and with an eye toward restoring suspended civil liberties. However, he has shown little interest in doing so (as far as I know). In combination with the election only a couple months away, the U.S. appears to be in a position similar to Germany in 1932 — ready and willing to elect a despot (take your pick …) and continue its slide into fascism. Can’t even imagine avoiding that outcome now.

The surprising number of ongoing emergencies points me toward James Howard Kunstler and his book The Long Emergency (2006). Though I haven’t read the book (I’m a failed doomer, I suppose), my understanding is that his prediction of a looming and lingering emergency is based on two intertwined factors currently playing out in geopolitics: peak oil and global warming. (“Climate change” is now preferred over “global warming.”) Those two dire threats (and the California drought) have faded somewhat from the headlines, partially due to fatigue, replaced primarily by terrorism and economic stresses, but the dangers never went away. Melting ice caps and glaciers are probably the clearest incontrovertible indications of anthropogenic global warming, which is poised to trigger nonlinear climate change and hasten the Sixth Extinction. We don’t know when, precisely, though time is growing short. Similarly, reports on energy production and consumption are subject to considerable falsification in the public sphere, making it impossible to know just how close in time we are to a new energy crisis. That inevitability has also been the target of a disinformation campaign, but even a rudimentary understanding of scientific principles is sufficient to enable clear thinkers to penetrate the fog.

I have no plans to return to doom blogging with any vigor. One emergency stacked upon the next, ready to collapse in a cascade of woe, has defeated me, and I have zero expectation that any real, meaningful response can be formulated and executed, especially while we are distracted with terrorism and creeping fascism.

The last time I blogged about this topic, I took an historical approach, locating the problem (roughly) in time and place. In response to recent blog entries by Dave Pollard at How to Save the World, I’ve delved into the topic again. My comments at his site are the length of most of my own blog entries (3–4 paras.), whereas Dave tends to write in chapter form. I’ve condensed to my self-imposed limit.

Like culture and history, consciousness is a moving train that yields its secrets long after it has passed. Thus, assessing our current position is largely conjectural. Still, I’ll be reckless enough to offer my intuitions for consideration. Dave has been pursuing radical nonduality, a mode of thought characterized by losing one’s sense of self and becoming selfless, which diverges markedly from ego consciousness. That mental posture, described elsewhere by nameless others as participating consciousness, is believed to be what preceded the modern mind. I commented that losing oneself in intense, consuming flow behaviors is commonplace but temporary, a familiar, even transcendent place we can only visit. Its appeals are extremely seductive, however, and many people want to be there full-time, as we once were. The problem is that ego consciousness is remarkably resilient and self-reinforcing. Despite losing oneself from time to time, we can’t be liberated from the self permanently, and pathways to even temporarily getting out of one’s own head are elusive and sometimes self-destructive.

My intuition is that we are fumbling toward just such a quieting of the mind, a new dark age if you will, or what I called self-lite in my discussion with Dave. As we stagger forth, groping blindly in the dark, the transitional phase is characterized by numerous disturbances to the psyche — a crisis of consciousness wholly different from the historical one described previously. The example uppermost in my thinking is people lost down the rabbit hole of their handheld devices and desensitized to the world beyond the screen. Another is the ruined, wasted minds of (arguably) two or more generations of students done great disservice by their parents and educational institutions at all levels, a critical mass of intellectually stunted and distracted young adults by now. Yet another is those radicalized by their close identification with one or more special interest groups, also known as identity politics. A further example is the growing prevalence of confusion surrounding sexual orientation and gender identity. In each example, the individual’s ego is confused, partially suppressed, and/or under attack. Science fiction and horror genres have plenty of instructive examples of people who are no longer fully themselves, their bodies zombified or made into hosts for another entity that takes up residence, commandeering or shunting aside the authentic, original self.

Despite having identified modern ego consciousness as a crisis and feeling no small amount of empathy for those seeking radical nonduality, I find myself in the odd position of defending the modern mind precisely because transitional forms, if I have understood them properly, are so abhorrent. Put another way, while I can see the potential value and allure of extinguishing the self even semi-permanently, I will not be an early adopter. Indeed, if the modern mind took millennia to develop as one of the primary evolutionary characteristics of Homo sapiens sapiens, it seems foolish to presume that it can be uploaded into a computer, purposely discarded by an act of will, or devolved in even a few generations. Meanwhile, though the doomer in me recognizes that ego consciousness is partly responsible for bringing us to the brink of (self-)annihilation (financial, geopolitical, ecological), individuality and intelligence are still highly prized where they can be found.