Posts Tagged ‘Politics’

From a statement (PDF link) by Supreme Court Justice Neil Gorsuch regarding Title 42 (quite long for this type of post but worthwhile):

[T]he history of this case illustrates the disruption we have experienced over the last three years in how our laws are made and our freedoms observed.

Since March 2020, we may have experienced the greatest intrusions on civil liberties in the peacetime history of this country. Executive officials across the country issued emergency decrees on a breathtaking scale. Governors and local leaders imposed lockdown orders forcing people to remain in their homes.

They shuttered businesses and schools public and private. They closed churches even as they allowed casinos and other favored businesses to carry on. They threatened violators not just with civil penalties but with criminal sanctions too.

They surveilled church parking lots, recorded license plates, and issued notices warning that attendance at even outdoor services satisfying all state social-distancing and hygiene requirements could amount to criminal conduct. They divided cities and neighborhoods into color-coded zones, forced individuals to fight for their freedoms in court on emergency timetables, and then changed their color-coded schemes when defeat in court seemed imminent.

Federal executive officials entered the act too. Not just with emergency immigration decrees. They deployed a public-health agency to regulate landlord-tenant relations nationwide. They used a workplace-safety agency to issue a vaccination mandate for most working Americans.

They threatened to fire noncompliant employees, and warned that service members who refused to vaccinate might face dishonorable discharge and confinement. Along the way, it seems federal officials may have pressured social-media companies to suppress information about pandemic policies with which they disagreed.

While executive officials issued new emergency decrees at a furious pace, state legislatures and Congress—the bodies normally responsible for adopting our laws—too often fell silent. Courts bound to protect our liberties addressed a few—but hardly all—of the intrusions upon them. In some cases, like this one, courts even allowed themselves to be used to perpetuate emergency public-health decrees for collateral purposes, itself a form of emergency-lawmaking-by-litigation.

Doubtless, many lessons can be learned from this chapter in our history, and hopefully serious efforts will be made to study it. One lesson might be this: Fear and the desire for safety are powerful forces. They can lead to a clamor for action—almost any action—as long as someone does something to address a perceived threat. 

A leader or an expert who claims he can fix everything, if only we do exactly as he says, can prove an irresistible force. We do not need to confront a bayonet, we need only a nudge, before we willingly abandon the nicety of requiring laws to be adopted by our legislative representatives and accept rule by decree. Along the way, we will accede to the loss of many cherished civil liberties—the right to worship freely, to debate public policy without censorship, to gather with friends and family, or simply to leave our homes. 

We may even cheer on those who ask us to disregard our normal lawmaking processes and forfeit our personal freedoms. Of course, this is no new story. Even the ancients warned that democracies can degenerate toward autocracy in the face of fear.

But maybe we have learned another lesson too. The concentration of power in the hands of so few may be efficient and sometimes popular. But it does not tend toward sound government. However wise one person or his advisors may be, that is no substitute for the wisdom of the whole of the American people that can be tapped in the legislative process.

Decisions produced by those who indulge no criticism are rarely as good as those produced after robust and uncensored debate. Decisions announced on the fly are rarely as wise as those that come after careful deliberation. Decisions made by a few often yield unintended consequences that may be avoided when more are consulted. Autocracies have always suffered these defects. Maybe, hopefully, we have relearned these lessons too.

In the 1970s, Congress studied the use of emergency decrees. It observed that they can allow executive authorities to tap into extraordinary powers. Congress also observed that emergency decrees have a habit of long outliving the crises that generate them; some federal emergency proclamations, Congress noted, had remained in effect for years or decades after the emergency in question had passed.

At the same time, Congress recognized that quick unilateral executive action is sometimes necessary and permitted in our constitutional order. In an effort to balance these considerations and ensure a more normal operation of our laws and a firmer protection of our liberties, Congress adopted a number of new guardrails in the National Emergencies Act.

Despite that law, the number of declared emergencies has only grown in the ensuing years. And it is hard not to wonder whether, after nearly a half-century and in light of our Nation’s recent experience, another look is warranted. It is hard not to wonder, too, whether state legislatures might profitably reexamine the proper scope of emergency executive powers at the state level. 

At the very least, one can hope that the Judiciary will not soon again allow itself to be part of the problem by permitting litigants to manipulate our docket to perpetuate a decree designed for one emergency to address another. Make no mistake—decisive executive action is sometimes necessary and appropriate. But if emergency decrees promise to solve some problems, they threaten to generate others. And rule by indefinite emergency edict risks leaving all of us with a shell of a democracy and civil liberties just as hollow.


Douglas Murray wrote a book called The Strange Death of Europe (2017), which is behind a long list of books as yet unread by me. Doubtful I will ever get to it. My familiarity with his thesis stems from his many appearances in interviews and webcasts describing the book. Murray continues a long line of declinists (mentioned here) prophesying and/or chronicling the long, slow demise of Europe, and in an only slightly wider sense, the West. Analyses of the causes and manifestations of decline range all over the map but typically include a combination of incompetence, exhaustion, and inevitability. For Murray, the strangeness is that it appears self-inflicted and openly desired out of some misplaced sense of guilt and shame following an entirely atypical (for the West) moral reckoning, perhaps for having been the dominant global culture for roughly five hundred years. This blog already says plenty about disaster, destruction, and doom, so let me instead pose a different question: Is the cultural inheritance of the West at all worth honoring and preserving? Won’t be an easy answer.

The question is partly prompted by the breathless announcement last fall in the alumni magazine of one of my alma maters of a newly created position (a further example of needless administrative bloat): Associate Dean of Equity and Inclusion (among other flowery academic titles attached to the position). For reasons unknown, the full sequence Diversity, Equity, and Inclusion (a/k/a DIE — yes, the initialism is purposely disordered) was avoided, but as with many trendy turns of phrase, precision hardly matters since the omitted word is inferred. Let’s first consider the term cultural inheritance. If one takes a warts-and-all approach, the full spectrum from glory to atrocity accumulated throughout history informs what descends from our forebears in the form of inheritance. Indeed, highs and lows are scarcely separable. Whereas individual memory tends to select mostly good parts, cultural memory records everything — albeit with interpretive bias — so long as one has the inclination to consult history books.

If one takes a charitable view of history, admirable events, practices, norms, and artifacts become the celebratory focus while nasty parts may be acknowledged but are nonetheless elided, forgotten, or shoved forcibly down the memory hole. But if one takes a pessimistic view, the very worst parts represent an irredeemable, permanent stain on everything else (the aforementioned moral reckoning). Recent arguments (revisionist history, say many) that the founding of the United States of America, a political entity, should be redated to 1619 to coincide with the establishment of slavery rather than 1776 with the Declaration of Independence are an example. Uncharacteristically perhaps, I take the charitable view while the très chic view in academe (echoed in corporate life and far-left government policy) is to condemn the past for failing to embody Woke standards of the present. Multiple influential segments of Western culture have thus succumbed to ideological possession as demonstrated by self-denunciation, self-flagellation, and complete loss of faith in Western institutions because of, well, slavery, genocide, colonialism, warfare, torture, racism, child labor, human trafficking, and other abominations, not all of which are relegated to the past.


People of even modest wisdom know that Three-Card Monte is a con. The scam goes by myriad other names, including The Shell Game, Follow the Queen, Find the Lady, Chase the Ace, and Triplets (even more in foreign languages). The underlying trick is simple: distract players’ attention from the important thing by directing attention elsewhere. Pickpockets do the same, bumping into marks or patting them on the shoulder to obscure awareness of the lift from the back pocket or purse. A conman running The Shell Game may also use a plant (someone in on the con allowed to succeed) to give the false impression that winning the game is possible. New marks appear with each new generation, waiting to be initiated and perhaps lose a little money in the process of learning that they are being had. Such wariness and suspicion are worthwhile traits to acquire and redeploy as each of us is enticed and exploited by gotcha capitalism when not being outright scammed and cheated.

Recognizing all variations of this tactic — showing an unimportant thing while hiding something far more important (or more simply: look here, pay no attention there, as in The Wizard of Oz) — and protecting oneself is ultimately a losing proposition considering how commonplace the behavior is. For instance, to make a positive first impression while, say, on a date or at a job interview, everyone projects a subtly false version of themselves (i.e., being on best behavior) to mask the true self that emerges inevitably over time. Salespeople and savvy negotiators and shoppers are known to feign interest (or disinterest) to be better positioned psychologically at the bargaining table. My previous blog post called “Divide and Conquer” is yet another example. My abiding frustration with the practice (or malpractice?) of politics led me to the unhappy realization that politicians are running their own version of The Shell Game.

Lengthy analysis might be undertaken regarding which aspects of governance and statecraft should be hidden and which exposed. Since this isn’t an academic blog and without indulging in highbrow philosophizing of interest to very few, my glossy perspective stems from the classical liberal values most Americans are taught (if taught civics at all) underpin the U.S. Constitution. Indeed, the Bill of Rights enumerates some of what should remain private with respect to governmental interest in citizens. In contrast, the term transparency is often bandied about to describe how government should ideally operate. Real-world experience demonstrates that the relationship is now inverted: they (gov’t) know nearly everything about us (citizens); we are allowed to know very little about them. Surveillance and prying into the lives of citizens by governments and corporations are the norm while those who surveil operate in the shadows, behind veils of secrecy, and with no real scrutiny or legal accountability. One could argue that with impunity already well established, malefactors no longer bother to pretend to serve the citizenry, consult public opinion, or tell the truth but instead operate brazenly in full public view. That contention is for another blog post.


Can’t remember how I first learned the term conversion hysteria (a/k/a conversion disorder a/k/a functional neurologic symptom disorder), but it was early in adulthood. The meaning is loose and subject to interpretation, typically focusing more on symptoms presented than on triggers or causes. My lay understanding is that conversion hysteria occurs when either an individual or a group works themselves into a lather over some subject and loses psychological mooring. I had my own experience with it when younger and full of raging hormones but later got myself under control. I also began to recognize that numerous historical events bore strong enough resemblance to categorize them as instances of group conversion hysteria. In recent years, clinical psychologist Mattias Desmet’s description of mass formation psychosis fits the same pattern, which he elaborates more formally. Some reports refer to Desmet’s description as “discredited.” I decline to referee the debate.

Two historical events where people lost their minds in response to severe disruption of social norms are the Salem witch trials and the Weimar/Nazi era in Germany. Two further, more recent episodes are Trump Derangement Syndrome in the U.S. and the Covid Cult worldwide, neither of which is over. The latter features governments and petty bureaucrats everywhere misapplying authoritarian force to establish biosecurity regimes over what turns out to have been a hypochondriac response to a bad flu virus (and yes, it was pretty bad) along with a maniacal power grab. What all episodes share is the perception — real or imagined — of some sort of virulent infection that triggers fear-laden desperation to purge the scourge at literally any cost, including destroying the host. The viral metaphor applies whether the agent is literally a virus alien to the physical body or merely an idea (meme) alien to the social body.

Let me acknowledge (as suggested here) Jordan Peterson’s comments in his appearance on The Joe Rogan Experience that such events resemble social contagions that come and go in waves. However, those waves are not like the regular, established intervals of the tides or daylight/nighttime. Rather, they’re more like rogue waves or tsunamis that break across segments of a culture unpredictably. Peterson’s primary example was the very thing that brought him to prominence: Canadian legislation requiring that teachers use students’ preferred pronouns. While initially part of a broad social movement in support of transgender students in Canada and elsewhere, the issue has since become foundational to Woke ideology. Peterson said to Rogan that by pushing the matter into the mainstream (rather than leaving it at issue for a tiny fraction of students), Canadian legislators were quite literally opening the floodgates to a wave of confusion among youths already wrestling with identity. I can’t recall if Peterson said as much at the time (2017?) or is projecting onto the past.


Even without being a historian (you or me), it’s easy to recognize seminal figures in U.S. history who have articulated the basic ideology behind what has grown to be a maniacal notion of what a world power can and should be. For instance, not very long after the American Revolution and the purported end of the Colonial Era, President James Monroe established the Monroe Doctrine, claiming the entire Western Hemisphere as being within America’s sphere of influence and warning others across the Atlantic not to intervene. Later in the 19th century, Abraham Lincoln responded to the Southern Secession by launching the American Civil War, establishing that no state could leave the Union. A period of isolationism followed, broken when the U.S. joined WWI (unclear to me why the U.S. fought that war). Woodrow Wilson laid out the principles of liberal internationalism in 1917:

The American military, the president told a joint session of Congress, was a force that could be used to make the world “safe for democracy” … Wilson’s doctrine was informed by two main ideas: first, the Progressive Era fantasy that modern technologies and techniques — especially those borrowed from the social sciences — could enable the rational management of foreign affairs, and second, the notion that “a partnership of democratic nations” was the surest way to establish a “steadfast concert for peace.”

from “Empire Burlesque” by Daniel Bessner (Harper’s Magazine, July 2022)

Note that that bit of rhetoric, “safe for democracy,” has been trotted out for over a century now yet shows no sign of losing its mojo. It helps, of course, that no one really knows what democracy is anymore. The public is subjected to relentless narrative spin and propaganda, bread and circuses, and inferior to nonexistent education that muddies the concept beyond recognition. Ten months prior to the U.S. entry into the next world war, influential American magazine publisher (Time, Life, Fortune, Sports Illustrated) Henry Luce added further justification for growing U.S. geopolitical ambitions:

… the Japanese attacked Pearl Harbor, and the United States, which had already been aiding the Allies, officially entered the war. Over the next four years, a broad swath of the foreign policy elite arrived at Luce’s conclusion [from just before the war]: the only way to guarantee the world’s safety was for the United States to dominate it. By the war’s end, Americans had accepted this righteous duty, of becoming, in Luce’s words, “the powerhouse … lifting the life of mankind from the level of the beasts to what the Psalmist called a little lower than the angels.”

from “Empire Burlesque” by Daniel Bessner (Harper’s Magazine, July 2022)

There has since been no going back, only solidification and strengthening of what is called The American Century (thanks again to Luce) but which really represents the spread of a global empire. So much for the end of colonialism, now pursued primarily through other means but still reverting to overt militarism whenever and wherever necessary. Just like civilizations, empires have come and gone throughout human history with power centers shifting somewhat reliably if unpredictably. The American Empire will undoubtedly join others in the dustbin of history no matter whether anyone survives the 21st century to survey the wreckage. Moreover, the illusion that The American Century can be extended is handily dispelled by the Macrofutilist, noting that corporations are leading the charge into the abyss:

Humans have no agency in this world dominated, at every institution and at every level of those institutions, by corporations and states that function as corporations. Under the rubric of the corporation, every public good or resource is under relentless exploitation, subject only to the fictional “control” by political or legal structures. Bolstered by near-total capture of every ancillary human social event or condition, corporations are wonderfully positioned to lead humanity off its cliff of resource degradation and impending scarcity … The horror is so monumental, so vast in its iniquity, so above any moderation, so all-consuming in its reach, so supreme in its command, that the subject of corporate ownership of the means of species destruction risks becoming boring. Who has the right to speak of “resistance” or “change” or “giving back” when all forms of social control are under obdurate corporate ownership?

from Corporations Are the Perfect Vehicle to Drive Humanity to Its Self-Extinction

Although it’s impossible to establish beyond reasonable doubt who’s actually driving the bus — corporations, the military-industrial complex (those two form a tautology by now), elected members of government, the Deep State, or some other nefarious cabal — it’s probably fair to say that members of each group have taken into their hearts the desire for full-spectrum dominance. That term originally meant complete military control of a theater of war. However, as its very name frankly admits, activities of the Select Subcommittee on the Weaponization of the Federal Government signal that a new style of Hobbesian war of all against all has begun. Indeed, what I used to call creeping fascism no longer needs the modifier creeping. The end game may have finally arrived, the evidence being everywhere if one has the fortitude to look.

The Bulletin of the Atomic Scientists has reset its doomsday clock ten seconds closer to midnight (figuratively, nuclear Armageddon), bringing the world closer to catastrophe than at any point in its history. Bizarrely, its statement published today points squarely at Russia as the culprit but fails to mention the participation of the United States and other NATO countries in the conflict. Seems to me a rather unethical deployment of distinctly one-sided rhetoric. With the clock poised so close to midnight, there’s nearly nowhere left to reset the clock before the bombs fly. In contrast, two of the blogs I read that are openly critical of provocations and escalation against Russia, not by Russia, are Bracing Views (on my blogroll) and Caitlin Johnstone (not on my blogroll). Neither minces words, and both either suggest or say openly that U.S. leadership are the bad guys indulging in nuclear brinkmanship. Many more examples of that ongoing debate are available. Judge for yourself whose characterizations are more accurate and useful.

Although armed conflict at one scale or another, at one time or another, is inescapable given mankind’s violent nature, it takes no subtle morality or highfalutin ethics to be antiwar by default, whereas U.S. leadership is uniformly prowar. Can’t know for sure what the motivations are, but the usual suspects include fear of appearing weak (um, it’s actually that fear that’s weak, dummies) and the profit motive (war is a racket). Neither convinces me in the slightest that squandering lives, energy, and lucre makes war worth contemplating except in extremis. Military action (and its logistical support, such as the U.S. is providing to Ukraine) should only be undertaken with the gravest regret and reluctance. Instead, war is glorified and valorized. Fools rush in ….

No need to think hard on this subject, no matter where one gets information and reporting (alternatively, disinformation and misreporting). Also doesn’t ultimately matter who are the good guys or the bad guys. What needs to happen in all directions and by all parties is deescalation and diplomacy. Name calling, scapegoating, and finger pointing might reassure some that they’re on the right side of history. The bad side of history will be when nuclear powers go MAD, at which point no one can stake any moral high ground.

The difference between right and wrong is obvious to almost everyone by the end of kindergarten. Temptations persist, and everyone does things, great and small, known to be wrong when enticements and advantages outweigh punishments. C’mon, you know you do it. I do, too. Only at the conclusion of a law degree or the start of a political career (funny how those two often coincide) do things get particularly fuzzy. One might add military service to those exceptions except that servicemen are trained not to think but simply to do (i.e., follow orders without question). Anyone with functioning ethics and morality also recognizes that in legitimate cases of things getting unavoidably fuzzy in a hypercomplex world, the dividing line often can’t be established clearly. Thus, venturing into the wide, gray middle area is really a signal that one has probably already gone too far. And yet, demonstrating that human society has not really progressed ethically despite considerable gains in technical prowess, egregiously wrong things are getting done anyway.

The whopper of which nearly everyone is guilty (thus, guilty pleasure) is … the Whopper. C’mon, you know you eat it. I know I do. Of course, the irresistible and ubiquitous fast-food burger is really only one example of a wide array of foodstuffs known to be unhealthy, cause obesity, and pose long-term health problems. Doesn’t help that, just like Big Tobacco, the food industry knowingly refines its products (processed foods, anyway) to be hyperstimuli impossible to ignore or resist unless one is iron willed or develops an eating disorder. Another hyperstimulus most can’t escape is the smartphone (or a host of other electronic gadgets). C’mon, you know you crave the digital pacifier. I don’t, having managed to avoid that particular trap. For me, electronics are always only tools. However, railing against them with respect to how they distort cognition (as I have) convinces exactly no one, so that argument goes on the deferral pile.

Another giant example, not in terms of participation but in terms of effect, is the capitalist urge to gather to oneself as much filthy lucre as possible only to sit heartlessly on top of that nasty dragon’s hoard while others suffer in plain sight all around. C’mon, you know you would do it if you could. I know I would — at least up to a point. Periods of gross inequality come and go over the course of history. I won’t make direct comparisons between today and any one of several prior Gilded Ages in the U.S., but it’s no secret that the existence today of several hundy billionaires and an increasing number of mere multibillionaires represents a gross misallocation of financial resources: funneling the productivity of the masses (and fiat dollars whiffed into existence with keystrokes) into the hands of a few. Fake philanthropy to launder reputations fails to convince me that such folks are anything other than miserly Scrooges fixated on maintaining and growing their absurd wealth, influence, and bogus social status at the cost of their very souls. Seriously, who besides sycophants and climbers would want to even be in the same room as one of those people (names withheld)? Maybe better not to answer that question.


In the sense that all news is local and all learning is individual, meaning that it’s only when something is individualized and particularized that it takes on context and meaning, I may finally understand (some doubt still) Sheldon Wolin’s term “inverted totalitarianism,” part of the subtitle of his 2008 book Democracy Incorporated: Managed Democracy and the Specter of Inverted Totalitarianism. Regrettably, this book is among the (many) dozens that await my attention, so I can’t yet claim to have done the work. (I did catch a long YouTube interview of Wolin conducted by Chris Hedges, but that’s a poor substitute for reading the book.) My process is to percolate on a topic and its ancillary ideas over time until they come together satisfactorily, and my provisional understanding of the issues is closer to “proxy tyranny” than “inverted totalitarianism.”

I daresay most of us conceptualize tyranny and totalitarianism in the bootheel versions that manifested in several 20th-century despotic regimes (and survive in several others in the 21st century, names and locations withheld) where population management is characterized by stomping people down, grinding them into dust, and treating them as an undifferentiated mass. Administrators (e.g., secret police) paid close attention to anyone who might pose a problem for the regimes, and neighbors and family members were incentivized to inform on anyone who might be on officialdom’s radar. The 21st-century manifestation is different in that computers do most information gathering — a dragnet thrown over everyone — and we inform on ourselves by oversharing online. Close attention is still paid, but human eyes may never see the extensive dossiers (forever records?) kept on each of us until something prompts attention. A further distinction is that in bootheel totalitarianism, intense scrutiny and punishment were ubiquitous, whereas at least in 21st-century America, a sizeable portion of the population can be handily ignored, abandoned, and/or forgotten. They’re powerless, harmless, and inconsequential, not drawing attention. There is also no bottom to how low they can sink, as the burgeoning homeless population demonstrates.

If tyranny is normally understood as emanating from the top down, its inversion is bottom up. Wolin’s inverted totalitarianism is not a grassroots phenomenon but rather corporate capture of government. While Wolin’s formulation may be true (especially at the time his book was published), government has relinquished none of its power so much as realigned its objectives to fit corporate profit motives, and in doing so, shifted administrative burdens to proxies. Silicon Valley corporations (of the big-data type especially) are the principal water carriers, practicing surveillance capitalism and, as private entities, exercising censorious cancellation of dissenting opinion that no formal government could countenance. Similarly, an entire generation of miseducated social justice warriors scours social media for evidence of nonconforming behavior, usually some offense against the meme of the moment a/k/a “I support the current thing” (though racism is the perennial accusation — an original sin that can never be forgiven or assuaged), waiting to pounce in indignation and destroy lives and livelihoods. Cancel culture is a true bottom-up phenomenon, with self-appointed emissaries doing the work that the government is only too happy to hand off to compliant, brainwashed ideologues.

In the Covid era, nonconforming individuals (e.g., those who refuse the jab(s) or call bullshit on continuously shifting narratives announced by various agencies that lack legal standing to compel anything) are disenfranchised in numerous ways even while the wider culture accepts that the pandemic is indeed endemic and simply gets on with life. Yet every brick-and-mortar establishment has been authorized, deputized, and indeed required to enforce unlawful policies of the moment as proxies for government application of force. Under threat of extended closure, every restaurant, retailer, arts organization, and sports venue demanded the literal or figurative equivalent of “papers please” to enter and assemble. As with the airlines, people are increasingly regarded as dehumanized cargo, treated roughly like the famous luggage ape (and not always without good reason). In most places, restrictions have been lifted; in others they persist. But make no mistake, this instantiation of proxy tyranny — compelling others to do the dirty work so that governments can not so plausibly deny direct responsibility — is the blueprint for future mistreatment. Personally, I’m rather ashamed that so few Americans stood up for what is right and true (according to me, obviously), echoing this famous admission of moral failure. For my own part, I’ve resisted (and paid the price for that resistance) in several instances.

In sales and marketing (as I understand them), one of the principal techniques to close a sale is to generate momentum by getting the prospective buyer (i.e., the mark) to agree to a series of minor statements (small sells) leading to the eventual purchasing decision (the big sell or final sale). It’s narrow to broad, the reverse of the broad-to-narrow paragraph form many of us were taught in school. Both organizational forms proceed through assertions that are easy to swallow before getting to the intended conclusion. That conclusion could be either an automotive purchase or adoption of some argument or ideology. When the product, service, argument, or ideology is sold effectively by a skilled salesman or spin doctor (a/k/a narrative manager), that person may be recognized as a closer, as in sealing the deal.

Many learn to recognize the techniques of the presumptive closer and resist being drawn in too easily. One of those techniques is to reframe the additional price of something as equivalent to, say, one’s daily cup of coffee purchased at some overpriced coffee house. The presumption is that if one has the spare discretionary income to buy coffee every day, then one can put that coffee money instead toward a higher monthly payment. Suckers might fall for it — even if they don’t drink coffee — because the false equivalence is an easily recognized though bogus substitution. The canonical too-slick salesman no one trusts is the dude on the used car lot wearing some awful plaid jacket and sporting a pornstache. That stereotype, born out of the 1970s, barely exists anymore but is kept alive by repetitive reinforcement in TV and movies set in that decade or at least citing the stereotype for cheap effect (just as I have). But how does one spot a skilled rhetorician, spewing social and political hot takes to drive custom narratives? Let me identify a few markers.

Thomas Sowell penned a brief article entitled “Point of No Return.” I surmise (admitting my lack of familiarity) that creators.com is a conservative website, which all by itself does not raise any flags. Indeed, in heterodox fashion, I want to read well reasoned arguments with which I may not always agree. My previous disappointment that Sowell fails in that regard was only reinforced by the linked article. Take note that the entire article uses paragraphs that are reduced to bite-sized chunks of only one or two sentences. Those are small sells, inviting closure with every paragraph break.

Worse yet, only five (small) paragraphs in, Sowell succumbs to Godwin’s Law and cites Nazis recklessly to put the U.S. on a slippery slope toward tyranny. The obvious learned function of mentioning Nazis is to trigger a reaction, much like baseless accusations of racism, sexual misconduct, or baby eating. It puts everyone on the defensive without having to demonstrate the assertion responsibly, which is why the first mention of Nazis in argument is usually sufficient to disregard anything else written or said by the person in question. I might have agreed with Sowell in his more general statements, just as conservatism (as in conservation) appeals as more and more slips away while history wears on, but after writing “Nazi,” he lost me entirely (again).

Sowell also raises several straw men just to knock them down, assessing (correctly or incorrectly, who can say?) what the public believes as though there were a monolithic consensus. I won’t defend the general public’s grasp of history, ideological placeholders, legal maneuvers, or cultural touchstones. Plenty of comedy bits demonstrate the deplorable level of awareness of individual members of society as though they were fully representative of the whole. Yet plenty of people pay attention and accordingly don’t make the cut when offering up idiocy for entertainment. (What fun, ridiculing fools!) The full range of opinion on any given topic is not best characterized by however many idiots and ignoramuses can be found by walking down the street and shoving a camera and mic in their astonishingly unembarrassed faces.

So in closing, let me suggest that, in defiance of the title of this blog post, Thomas Sowell is in fact not a closer. Although he drops crumbs and morsels gobbled up credulously by those unable to recognize they’re being sold a line of BS, they do not make a meal. Nor should Sowell’s main point, i.e., the titular point of no return, be accepted when his burden of proof has not been met. That does not necessarily mean Sowell is wrong, in the sense that even a stopped clock tells the time correctly twice a day. The danger is that even if he’s partially correct some of the time, his perspective and program (nonpartisan freedom! whatever that may mean) must be considered with circumspection and disdain. Be highly suspicious before buying what Sowell is selling. Fundamentally, he’s a bullshit artist.

/rant on

One of my personal favorites among my own blog posts is my remapping of the familiar Rock, Paper, Scissors contest to Strong, Stupid, and Smart, respectively. In that post, I concluded (among other things) that, despite a supposedly equal three-way power dynamic, in the long run, nothing beats stupid. I’ve been puzzling recently over this weird dynamic in anticipation of a mass exodus of boomers from the labor force as they age into retirement (or simply die off). (Digression about the ruling gerontocracy withheld.) It’s not by any stretch clear that their younger cohorts, divided into not-so-cleverly named demographics, are prepared to bring competence or a work ethic to bear on labor needs, which include job descriptions ranging across the spectrum from blue collar to white collar to bona fide expert.

Before I’m accused of ageism and boomerism, let me say that I don’t regard the issue as primarily a function of age but rather as the result of a gradual erosion of educational standards that has now reached such a startling level of degradation that many American institutions are frankly unable to accomplish their basic missions for lack of qualified, competent, engaged workers and managers. See, for example, this Gallup poll showing how confidence in U.S. institutions is ebbing. Curious that the U.S. Congress is at the very bottom, followed closely and unsurprisingly by the TV news. Although the poll only shows year-over-year decline, it’s probably fair to say that the overall consensus is that institutions simply cannot be relied upon anymore to do their jobs effectively. I’ve believed that for some time about Cabinet-level agencies that, administration after administration, never manage to improve the worsening conditions that are the very reason for their existence. Some of those failures are arguably because solutions to the issues simply do not exist (such as with the renewed energy crisis or the climate emergency). But when addressing concerns below the level of global civilization, my suspicion is that failure is the result of a combination of corruption (including careerism) and sheer incompetence.

The quintessential example came to my attention in the embedded YouTube video, which spells out in gruesome detail how schools of education are wickedly distorted by ideologues pushing agendas that don’t produce either better educational results or social justice. Typically, it’s quite the reverse.

In short, school administrators and curriculum designers are incompetent boobs (much like many elected government officials) possessed of decision-making authority. They have managed to quell dissent among the ranks precisely because many who know better are invested in careers and pension programs that would have to be sacrificed in order to call bullshit on the insane things now being forced on everyone within those institutions. Those of us who attended college often witnessed how, over the course of several decades, faculties have essentially caved repeatedly on issues where administrators acted in contravention of respectable educational goals and policies. Fortitude to resist is not in abundance for reasons quite easy to understand. Here’s one cry from the wilderness by a college professor retiring early to escape the madness. One surmises that whoever is hired as a replacement will check a number of boxes, including compliance with administrative diktats.

Viewpoint diversity may well be the central value being jettisoned, along with the ability to think clearly. If cultism is truly the correct characterization, administrators have adopted an open agenda of social reform and appear to believe that they, uniquely, have arrived at the correct answers (final solutions?) to be brainwashed into both teachers and students as doctrine. Of course, revolutions are launched on the strength of such misguided convictions, often purging resistance violently and revealing that best intentions matter little in the hellscapes that follow. But in the short term, the basic program is to silence dissent, as though banishing disallowed thinking from the public sphere collapses viewpoint diversity. Nope, sorry. That’s not how cognition works except in totalitarian regimes that remove all awareness of options and interpretations we in open societies currently take for granted. It’s barking mad, for instance, to think that all the propaganda flung at the American public about, say, the proxy war in Ukraine is truly capable of buffaloing the entire population into believing we (the U.S. and NATO) are the good guys in the conflict. (There are no good guys.) Even the Chinese government, with its restricted Internet and social credit system, can’t squelch entirely the yearning to think and breathe freely. Instead, that’s the domain of North Korea, which only despots hold up as a salutary model.

My point, which bears reiteration, is that poorly educated, miseducated, and uneducated ignoramuses (the ignorati, whose numbers have swelled) in positions of power and influence embody precisely the unmovable, unreachable, slow, grinding stupidity that can’t be overcome by knowledge, understanding, expertise, or appeals to reason and good faith. Stupid people don’t know just how stupid they are but sally forth with blind confidence in themselves, and their abject stupidity becomes like kryptonite used to weaken others. One can use smarts (scissors) once in a while to cut through stupidity (paper), but in the end, nothing beats stupid.

/rant off