From a statement (PDF link) by Supreme Court Justice Neil Gorsuch regarding Title 42 (quite long for this type of post but worthwhile):

[T]he history of this case illustrates the disruption we have experienced over the last three years in how our laws are made and our freedoms observed.

Since March 2020, we may have experienced the greatest intrusions on civil liberties in the peacetime history of this country. Executive officials across the country issued emergency decrees on a breathtaking scale. Governors and local leaders imposed lockdown orders forcing people to remain in their homes.

They shuttered businesses and schools public and private. They closed churches even as they allowed casinos and other favored businesses to carry on. They threatened violators not just with civil penalties but with criminal sanctions too.

They surveilled church parking lots, recorded license plates, and issued notices warning that attendance at even outdoor services satisfying all state social-distancing and hygiene requirements could amount to criminal conduct. They divided cities and neighborhoods into color-coded zones, forced individuals to fight for their freedoms in court on emergency timetables, and then changed their color-coded schemes when defeat in court seemed imminent.

Federal executive officials entered the act too. Not just with emergency immigration decrees. They deployed a public-health agency to regulate landlord-tenant relations nationwide. They used a workplace-safety agency to issue a vaccination mandate for most working Americans.

They threatened to fire noncompliant employees, and warned that service members who refused to vaccinate might face dishonorable discharge and confinement. Along the way, it seems federal officials may have pressured social-media companies to suppress information about pandemic policies with which they disagreed.

While executive officials issued new emergency decrees at a furious pace, state legislatures and Congress—the bodies normally responsible for adopting our laws—too often fell silent. Courts bound to protect our liberties addressed a few—but hardly all—of the intrusions upon them. In some cases, like this one, courts even allowed themselves to be used to perpetuate emergency public-health decrees for collateral purposes, itself a form of emergency-lawmaking-by-litigation.

Doubtless, many lessons can be learned from this chapter in our history, and hopefully serious efforts will be made to study it. One lesson might be this: Fear and the desire for safety are powerful forces. They can lead to a clamor for action—almost any action—as long as someone does something to address a perceived threat. 

A leader or an expert who claims he can fix everything, if only we do exactly as he says, can prove an irresistible force. We do not need to confront a bayonet, we need only a nudge, before we willingly abandon the nicety of requiring laws to be adopted by our legislative representatives and accept rule by decree. Along the way, we will accede to the loss of many cherished civil liberties—the right to worship freely, to debate public policy without censorship, to gather with friends and family, or simply to leave our homes. 

We may even cheer on those who ask us to disregard our normal lawmaking processes and forfeit our personal freedoms. Of course, this is no new story. Even the ancients warned that democracies can degenerate toward autocracy in the face of fear.

But maybe we have learned another lesson too. The concentration of power in the hands of so few may be efficient and sometimes popular. But it does not tend toward sound government. However wise one person or his advisors may be, that is no substitute for the wisdom of the whole of the American people that can be tapped in the legislative process.

Decisions produced by those who indulge no criticism are rarely as good as those produced after robust and uncensored debate. Decisions announced on the fly are rarely as wise as those that come after careful deliberation. Decisions made by a few often yield unintended consequences that may be avoided when more are consulted. Autocracies have always suffered these defects. Maybe, hopefully, we have relearned these lessons too.

In the 1970s, Congress studied the use of emergency decrees. It observed that they can allow executive authorities to tap into extraordinary powers. Congress also observed that emergency decrees have a habit of long outliving the crises that generate them; some federal emergency proclamations, Congress noted, had remained in effect for years or decades after the emergency in question had passed.

At the same time, Congress recognized that quick unilateral executive action is sometimes necessary and permitted in our constitutional order. In an effort to balance these considerations and ensure a more normal operation of our laws and a firmer protection of our liberties, Congress adopted a number of new guardrails in the National Emergencies Act.

Despite that law, the number of declared emergencies has only grown in the ensuing years. And it is hard not to wonder whether, after nearly a half-century and in light of our Nation’s recent experience, another look is warranted. It is hard not to wonder, too, whether state legislatures might profitably reexamine the proper scope of emergency executive powers at the state level. 

At the very least, one can hope that the Judiciary will not soon again allow itself to be part of the problem by permitting litigants to manipulate our docket to perpetuate a decree designed for one emergency to address another. Make no mistake—decisive executive action is sometimes necessary and appropriate. But if emergency decrees promise to solve some problems, they threaten to generate others. And rule by indefinite emergency edict risks leaving all of us with a shell of a democracy and civil liberties just as hollow.


Douglas Murray wrote a book called The Strange Death of Europe (2017), which is behind a long list of books as yet unread by me. Doubtful I will ever get to it. My familiarity with his thesis stems from his many appearances in interviews and webcasts describing the book. Murray continues a long line of declinists (mentioned here) prophesying and/or chronicling the long, slow demise of Europe, and in an only slightly wider sense, the West. Analyses of the causes and manifestations of decline range all over the map but typically include a combination of incompetence, exhaustion, and inevitability. For Murray, the strangeness is that it appears self-inflicted and openly desired out of some misplaced sense of guilt and shame following an entirely atypical (for the West) moral reckoning, perhaps for having been the dominant global culture for roughly five hundred years. This blog already says plenty about disaster, destruction, and doom, so let me instead pose a different question: Is the cultural inheritance of the West at all worth honoring and preserving? Won’t be an easy answer.

The question is partly prompted by the breathless announcement last fall in the alumni magazine of one of my alma maters of a newly created position (a further example of needless administrative bloat): Associate Dean of Equity and Inclusion (among other flowery academic titles attached to the position). For reasons unknown, the full sequence Diversity, Equity, and Inclusion (a/k/a DIE — yes, the initialism is purposely disordered) was avoided, but as with many trendy turns of phrase, precision hardly matters since the omitted word is inferred. Let’s first consider the term cultural inheritance. If one takes a warts-and-all approach, the full spectrum from glory to atrocity accumulated throughout history informs what descends from our forebears in the form of inheritance. Indeed, highs and lows are scarcely separable. Whereas individual memory tends to select mostly good parts, cultural memory records everything — albeit with interpretive bias — so long as one has the inclination to consult history books.

If one takes a charitable view of history, admirable events, practices, norms, and artifacts become the celebratory focus while nasty parts may be acknowledged but are nonetheless elided, forgotten, or shoved forcibly down the memory hole. But if one takes a pessimistic view, the very worst parts represent an irredeemable, permanent stain on everything else (the aforementioned moral reckoning). Recent arguments (revisionist history, say many) that the founding of the United States of America, a political entity, should be redated to 1619 to coincide with the establishment of slavery rather than 1776 with the Declaration of Independence are an example. Uncharacteristically perhaps, I take the charitable view while the très chic view in academe (echoed in corporate life and far-left government policy) is to condemn the past for failing to embody Woke standards of the present. Multiple influential segments of Western culture have thus succumbed to ideological possession as demonstrated by self-denunciation, self-flagellation, and complete loss of faith in Western institutions because of, well, slavery, genocide, colonialism, warfare, torture, racism, child labor, human trafficking, and other abominations, not all of which are relegated to the past.


Caitlin Johnstone at her website on the subject of hero worship and false idols:

Pranksters mowing a giant penis on the grounds of an elite coronation party feels like the beginning of the end of something. People find the idea of a British king in 2023 absurd, and public opinion of the monarchy will only go down from here. Idols keep falling off their pedestals.

This sort of thing is happening everywhere; public figures once held in high esteem just keep losing face. The Catholic Church pedophile scandal kind of started it off. Twitter showed everyone that celebrities are just idiots with bad opinions. US presidents are ridiculous cartoons now, with the last one an incoherent buffoon and the current one a disintegrating dementia patient.

And it feels like it’s happening faster and faster. The Dalai Lama trying to tongue kiss that kid. Chomsky meeting with Epstein. Bernie Sanders falling all over himself to serve the establishment he once vocally decried. People just don’t get to keep their heroes anymore.

People of even modest wisdom know that Three-Card Monte is a con. The scam goes by myriad other names, including The Shell Game, Follow the Queen, Find the Lady, Chase the Ace, and Triplets (even more in foreign languages). The underlying trick is simple: distract players’ attention from the important thing by directing attention elsewhere. Pickpockets do the same, bumping into marks or patting them on the shoulder to obscure awareness of the lift from the back pocket or purse. A conman running The Shell Game may also use a plant (someone in on the con allowed to succeed) to give the false impression that winning the game is possible. New marks appear with each new generation, waiting to be initiated and perhaps lose a little money in the process of learning that they are being had. Such wariness and suspicion are worthwhile traits to acquire and redeploy as each of us is enticed and exploited by gotcha capitalism when not being outright scammed and cheated.

Recognizing all variations of this tactic — showing an unimportant thing while hiding something far more important (or more simply: look here, pay no attention there like in The Wizard of Oz) — and protecting oneself is ultimately a losing proposition considering how commonplace the behavior is. For instance, to make a positive first impression while, say, on a date or at a job interview, everyone projects a subtly false version of themselves (i.e., being on best behavior) to mask the true self that emerges inevitably over time. Salespeople and savvy negotiators and shoppers are known to feign interest (or disinterest) to be better positioned psychologically at the bargaining table. My previous blog post called “Divide and Conquer” is yet another example. My abiding frustration with the practice (or malpractice?) of politics led me to the unhappy realization that politicians are running their own version of The Shell Game.

Lengthy analysis might be undertaken regarding which aspects of governance and statecraft should be hidden and which exposed. Since this isn’t an academic blog and without indulging in highbrow philosophizing of interest to very few, my glossy perspective stems from the classical liberal values most Americans are taught (if taught civics at all) underpin the U.S. Constitution. Indeed, the Bill of Rights enumerates some of what should remain private with respect to governmental interest in citizens. In contrast, the term transparency is often bandied about to describe how government should ideally operate. Real-world experience demonstrates that the relationship is now inverted: they (gov’t) know nearly everything about us (citizens); we are allowed to know very little about them. Surveillance and prying into the lives of citizens by governments and corporations are the norm while those who surveil operate in the shadows, behind veils of secrecy, and with no real scrutiny or legal accountability. One could argue that with impunity already well established, malefactors no longer bother to pretend to serve the citizenry, consult public opinion, or tell the truth but instead operate brazenly in full public view. That contention is for another blog post.


Back to an old complaint of mine: living in a world so polluted that it’s impossible to assert that pristine, untouched (by human interventions) places exist anymore. Every bit of soil, water, and air is now affected by chemical alteration, degradation, and pollutants subtle and gross. Was intrigued to learn that despite despoliation, life finds a way (for now, anyway): a new marine ecosystem has formed in the middle of the ocean, normally devoid of much life, amongst the Great Pacific Garbage Patch (which goes by other names). Here’s the abstract from a scientific paper on the subject:

… the high seas are colonized by a diverse array of coastal species, which survive and reproduce in the open ocean, contributing strongly to its floating community composition. Analysis of rafting plastic debris in the eastern North Pacific Subtropical Gyre revealed 37 coastal invertebrate taxa, largely of Western Pacific origin, exceeding pelagic taxa richness by threefold. Coastal taxa, including diverse taxonomic groups and life history traits, occurred on 70.5% of debris items. Most coastal taxa possessed either direct development or asexual reproduction, possibly facilitating long-term persistence on rafts … results suggest that the historical lack of available substrate limited the colonization of the open ocean by coastal species, rather than physiological or ecological constraints as previously assumed. It appears that coastal species persist now in the open ocean as a substantial component of a neopelagic community sustained by the vast and expanding sea of plastic debris.

This news reminds me of flourishing ecosystems (significantly, absent humans) in la Zone Rouge and the Chernobyl exclusion zone. To bright-siders, maybe these examples are glass-half-full observations, meaning take whatever good news presents itself and move on. Of course, I’m a glass-half-empty type, disgusted and dispirited by the very existence of all our floating refuse, which is expected to triple by 2060, largely from single-use plastics. (Readers are spared the indignity of stomach-churning pictures of trash floating in the ocean or beaches strewn with trash.) Everyone knows by now a good portion of human trash ends up in garbage gyres in all the major oceanic bodies. Yet transition to a better mode of packaging and consumption to avoid further wreckage appears nowhere on the horizon. Although each of us participates at some level (by virtue of being alive) in the cycle of extraction, production, distribution, consumption, and disposal, I really blame industry for refusing steadfastly to adopt practices that don’t lead to obvious harms; I blame governments for not regulating industries that pollute wantonly; and I blame economists for failing to account for externalities such as the Great Pacific Garbage Patch because, well, to do so would invalidate most business models.

So chalk another one up for the bad guys. A tiny thread of silver lining pokes through an otherwise suffocating blanket of darkness.

From an article by Bruce Abramson called “Pity the Child” (ostensibly a review of the book Stolen Youth), published on April 4, 2023, at RealClearWire:

Though few recognized it as such at the time, the decision to shutter much of the world in March 2020 unraveled the entire socioeconomic fabric of modern life. As anyone who has ever studied or worked with any complex system can confirm, nothing ever restarts quite as it was before a shutdown. 

American society was hardly the exception. The hibernation derailed every pre-existing positive trend and accelerated all the negative. The restart, unfolding in uneven fits-and-starts over the course of two years, introduced an entirely new sociology. Though its precise contours are still taking shape, a few things are clear: Woke reigns supreme and children are expendable.

Let’s say one has a sheet or sheaf of paper to cut. Lots of tools available for that purpose. The venerable scissors can do the job for small projects, though the cut line is unlikely to be very straight if that’s among the objectives. (Heard recently that blunt-nosed scissors for cutting construction paper are no longer used in kindergarten and the early grades, resulting in kids failing to develop the dexterity to manipulate that ubiquitous tool.) A simple razor blade (i.e., utility knife) drawn along a straightedge can cut 1–5 sheets at once but loses effectiveness at greater thicknesses. The machete-blade paper cutter found in copy centers cuts more pages at once but requires skill to use properly and safely. The device usually (but not always) includes an alignment guide for the paper and a guard for the blade to discourage users from slicing fingers and hands. A super-heavy-duty paper cutter I learned to use for bookbinding could cut two reams of paper at a time and produced an excellent cut line. It had a giant clamp so that media (paper, card stock, etc.) didn’t shift during the cut (a common weakness of the machete blade) and required the operator to press buttons located at two corners of the standing machine (one at each hip) so that no one who became too complacent could be tempted to reach in and slice their fingers clean off. That idiot-proofing feature was undoubtedly developed after mishaps that could be attributed to either faulty design or user error depending on which side of the insurance claim one found oneself.

Fool-proofing is commonplace throughout the culture, typically sold with the idea of preserving health and wellness or saving lives. For instance, the promise (still waiting for convincing evidence) that self-driving cars can manage the road better in aggregate than human drivers hides the entirely foreseeable side effect of eroding attention and driving skill (already under assault from the ubiquitous smart phone no one can seem to put down). Plenty of anecdotes of gullible drivers who believed the marketing hype, forfeited control to autodrive, stopped paying attention, and ended up dead put the lie to that canard. In another example, a surprising upswing in homeschooling (not synonymous with unschooling) is also underway, keeping kids out of state-run public schools. Motivations for opting out include poor academic quality, incompatible beliefs (typically related to religious faith or lack thereof), botched response to the pandemic, and the rise of school shootings. If one responds with fear at every imaginable provocation or threat, many entirely passive and unintentional, the bunker mentality that develops is somewhat understandable. Moreover, demands multiply that others (parents, teachers, manufacturers, civil authorities, etc.) take responsibility for protecting individual citizens. If extended across all thinking, it doesn’t take long before a pathological complex develops.

Another protective trend is plugging one’s ears and refusing to hear discomfiting truth, which is already difficult to discern from the barrage of lies and gaslighting that pollute the infosphere. Some go further by silencing the messenger and restricting free speech as though that overreach somehow protects against uncomfortable ideas. Continuing from the previous post about social contagion, the viral metaphor for ideas and thinking, i.e., how the mind is “infected” by ideas from outside itself, is entirely on point. I learned about memes long before the “meme” meme (i.e., “going viral”) popularized and debased the term. The term originated in Richard Dawkins’ book The Selfish Gene (1976), though I learned about memes from Daniel Dennett’s book Consciousness Explained (1991). As part of information theory, Dennett describes the meme as an information carrier similar to genes (phonetic similarity was purposeful). Whether as cognition or biology, the central characteristic is that of self-replicating (and metamorphosing or mutating) bits or bytes of info. The viral metaphor applies to how one conceptualizes the body’s and/or mind’s defensive response to inevitable contact with nastiness (bugs, viruses, ideas). Those who want to remain unexposed to either biological pathogens (uninfected) or dangerous ideas (ideologically pure) are effectively deciding to live within a bubble that indeed provides protection but then renders them more vulnerable if/when they exit the bubble. In effect, they trap themselves inside. That’s because the immune system is dynamic and can’t harden itself against virulent invaders except through ongoing exposure. Obviously, there’s a continuum between exposure to everything and nothing, but by veering too close to the negative pole, the immune system is weakened, making individuals vulnerable to pathogens healthy people fend off easily.

The hygiene hypothesis suggests that children not allowed to play in the sand and dirt or otherwise interact messily with the environment (including pets) are prone to asthma, allergies, and autoimmune diseases later in life. Jonathan Haidt makes a similar argument with respect to behavior in his book The Coddling of the American Mind (2018) (co-authored with Greg Lukianoff), namely, that overprotecting children by erecting too many guides, guards, and fool-proofing ironically ends up hobbling children and making them unable to cope with the rigors of life. Demands for trigger warnings, safe spaces, deplatforming, and outright censorship are precisely that inability to cope. There is no easy antidote because, well, life is hard sometimes. However, unless one is happy to be trapped inside a faux protective bubble of one’s own making, then maybe consider taking off the training wheels and accepting some risk, fully recognizing that to learn, grow, and develop, stumbling and falling are part of the process. Sure, life will leave some marks, but isn’t that at least partly the point?

Can’t remember how I first learned the term conversion hysteria (a/k/a conversion disorder a/k/a functional neurologic symptom disorder) but it was early in adulthood. The meaning is loose and subject to interpretation, typically focusing more on symptoms presented than triggers or causes. My lay understanding is that conversion hysteria occurs when either an individual or group works themselves into a lather over some subject and loses psychological mooring. I had my own experience with it when younger and full of raging hormones but later got myself under control. I also began to recognize that numerous historical events bore strong enough resemblance to categorize them as instances of group conversion hysteria. In recent years, clinical psychologist Mattias Desmet’s description of mass formation psychosis, which he elaborates more formally, fits the same pattern. Some reports refer to Desmet’s description as “discredited.” I decline to referee the debate.

Two historical events where people lost their minds in response to severe disruption of social norms are the Salem witch trials and the Weimar/Nazi era in Germany. Two further, more recent episodes are Trump Derangement Syndrome in the U.S. and the Covid Cult worldwide, neither of which is over. The latter features governments and petty bureaucrats everywhere misapplying authoritarian force to establish biosecurity regimes over what turns out to have been a hypochondriac response to a bad flu virus (and yes, it was pretty bad) along with a maniacal power grab. What all episodes share is the perception — real or imagined — of some sort of virulent infection that triggers fear-laden desperation to purge the scourge at literally any cost, including destroying the host. The viral metaphor applies whether the agent is literally a virus alien to the physical body or merely an idea (meme) alien to the social body.

Let me acknowledge (as suggested here) Jordan Peterson’s comments in his appearance on The Joe Rogan Experience that such events resemble social contagions that come and go in waves. However, those waves are not like the regular, established intervals of the tides or daylight/nighttime. Rather, they’re more like rogue waves or tsunamis that break across segments of a culture unpredictably. Peterson’s primary example was the very thing that brought him to prominence: Canadian legislation requiring that teachers use students’ preferred pronouns. While initially part of a broad social movement in support of transgender students in Canada and elsewhere, the issue has since become foundational to Woke ideology. Peterson said to Rogan that by pushing the matter into the mainstream (rather than it being at issue for a tiny fraction of students), Canadian legislators were opening the floodgates to a wave of confusion among youths already wrestling with identity. I can’t recall if Peterson said as much at the time (2017?) or is projecting onto the past.


Even without being a historian (you or me), it’s easy to recognize seminal figures in U.S. history who have articulated the basic ideology behind what has grown to be a maniacal notion of what a world power can and should be. For instance, not very long after the American Revolution and the purported end of the Colonial Era, President James Monroe established the Monroe Doctrine, claiming the entire Western Hemisphere as being within America’s sphere of influence and warning others across the Atlantic not to intervene. Later in the 19th century, Abraham Lincoln responded to the Southern Secession by launching the American Civil War, establishing that no state could leave the Union. A period of isolationism followed, broken when the U.S. joined WWI (unclear to me why the U.S. fought that war). Woodrow Wilson laid out the principles of liberal internationalism in 1917:

The American military, the president told a joint session of Congress, was a force that could be used to make the world “safe for democracy” … Wilson’s doctrine was informed by two main ideas: first, the Progressive Era fantasy that modern technologies and techniques — especially those borrowed from the social sciences — could enable the rational management of foreign affairs, and second, the notion that “a partnership of democratic nations” was the surest way to establish a “steadfast concert for peace.”

from “Empire Burlesque” by Daniel Bessner (Harper’s Magazine, July 2022)

Note that that bit of rhetoric, “safe for democracy,” has been trotted out for over a century now yet shows no sign of losing its mojo. It helps, of course, that no one really knows what democracy is anymore. The public is subjected to relentless narrative spin and propaganda, bread and circuses, and inferior to nonexistent education that muddies the concept beyond recognition. Ten months prior to the U.S. entry into the next world war, influential American magazine publisher (Time, Life, Fortune, Sports Illustrated) Henry Luce added further justification for growing U.S. geopolitical ambitions:

… the Japanese attacked Pearl Harbor, and the United States, which had already been aiding the Allies, officially entered the war. Over the next four years, a broad swath of the foreign policy elite arrived at Luce’s conclusion [from just before the war]: the only way to guarantee the world’s safety was for the United States to dominate it. By the war’s end, Americans had accepted this righteous duty, of becoming, in Luce’s words, “the powerhouse … lifting the life of mankind from the level of the beasts to what the Psalmist called a little lower than the angels.”

from “Empire Burlesque” by Daniel Bessner (Harper’s Magazine, July 2022)

There has since been no going back, only solidification and strengthening of what is called The American Century (thanks again to Luce) but really represents the spread of a global empire. So much for the end of colonialism, now pursued primarily through other means but still reverting to overt militarism whenever and wherever necessary. Just like civilizations, empires have come and gone throughout human history with power centers shifting somewhat reliably if unpredictably. The American Empire will undoubtedly join others in the dustbin of history no matter whether anyone survives the 21st century to survey the wreckage. Moreover, the illusion that The American Century can be extended is handily dispelled by the Macrofutilist, who notes that corporations are leading the charge into the abyss:

Humans have no agency in this world dominated, at every institution and at every level of those institutions, by corporations and states that function as corporations. Under the rubric of the corporation, every public good or resource is under relentless exploitation, subject only to the fictional “control” by political or legal structures. Bolstered by near-total capture of every ancillary human social event or condition, corporations are wonderfully positioned to lead humanity off its cliff of resource degradation and impending scarcity … The horror is so monumental, so vast in its iniquity, so above any moderation, so all-consuming in its reach, so supreme in its command, that the subject of corporate ownership of the means of species destruction risks becoming boring. Who has the right to speak of “resistance” or “change” or “giving back” when all forms of social control are under obdurate corporate ownership?

from Corporations Are the Perfect Vehicle to Drive Humanity to Its Self-Extinction

Although it’s impossible to establish beyond reasonable doubt who’s actually driving the bus — corporations, the military-industrial complex (those two form a tautology by now), elected members of government, the Deep State, or some other nefarious cabal — it’s probably fair to say that members of each group have taken into their hearts the desire for full-spectrum dominance. That term originally meant complete military control of a theater of war. However, as its very name frankly admits, activities of the Select Subcommittee on the Weaponization of the Federal Government signal that a new style of Hobbesian war of all against all has begun. Indeed, what I used to call creeping fascism no longer needs the modifier creeping. The end game may have finally arrived, the evidence being everywhere if one has the fortitude to look.

Another from Hari Kunzru’s “Easy Chair” column, this time the July 2022 issue of Harper’s Magazine:

We might hear in [Thomas] Nagel’s cosmic detachment an echo of anatta — the Buddhist doctrine that there is no essence or soul grounding human existence. For Buddhists, the clear light of reality is visible only to those who abandon the illusion of selfhood. Objectivity, in the way non-Buddhists usually think about it, doesn’t erase the self, even if it involves a flight from individuality. It actually seems to make the self more powerful, more authoritative. The capacity to be objective is seen as something to strive for, an overcoming of the cognitive biases that smear or smudge the single window and impair our ability to see the world “as it really is.” Objectivity is earned through rigor and discipline. It is selfhood augmented.
