Archive for the ‘Legal Matters’ Category

Coming back to this topic after some time (pt. 1 here). My intention was to expand upon demands for compliance, and unsurprisingly, relevant tidbits continuously pop up in the news. The dystopia American society is building for itself doesn’t disappoint — not that anyone is hoping for such a development (one would guess). It’s merely that certain influential elements of society reliably move toward consolidation of power and credulous citizens predictably forfeit their freedom and autonomy with little or no hesitation. The two main examples to discuss are Black Lives Matter (BLM) and the response to the global pandemic, which have occurred simultaneously but are not particularly related.

The BLM movement began in summer 2013 but boiled over in summer 2020 on the heels of the George Floyd killing, with protests spilling over into straightforward looting, mayhem, and lawlessness. That fit of high emotional pique found many protesters accosting random strangers in public and demanding a raised fist in support of the movement, which was always ideologically disorganized but became irrational and power-hungry as Wokedom discovered its ability to submit others to its will. In response, many businesses erected what I’ve heard called don’t-hurt-me walls in apparent support of BLM and celebration of black culture so that windows would not be smashed and stores ransacked. Roving protests in numerous cities demanded shows of support from anyone encountered, though support of what exactly was never clear. Ultimately, protests morphed into a sort of protection racket, and agitators learned to enjoy making others acquiesce to arbitrary demands. Many schools and corporations now conduct mandatory training to, among other things, identify unconscious bias, which has the distinct aroma of original sin that can never be assuaged or forgiven. It’s entirely understandable that many individuals, under considerable pressure to conform as moral panic seized the country, play along to keep the peace or keep their jobs. Backlash is building, of course.

The much larger example affecting everyone, nationwide and globally, is the response to the pandemic. Although quarantines have been used in the past to limit regional outbreaks of infectious disease, the global lockdown of business and travel was something entirely new. Despite a lack of evidence of efficacy, the precautionary principle prevailed and nearly everyone was forced into home sequestration and later, after an embarrassingly stupid scandal (in the U.S.), made to don masks when venturing out in public. As waves of viral infection and death rolled across the globe, political leaders learned to enjoy making citizens acquiesce to capricious and often contradictory demands. Like BLM, a loose consensus emerged about the “correct” way to handle the needs of the moment, but the science and demographics of the virus produced widely divergent interpretations of such correctness. A truly coordinated national response in the U.S. never coalesced, and hindsight has judged the whole morass a fundamentally botched job of maintaining public health in most countries.

But political leaders weren’t done demanding compliance. An entirely novel vaccine protocol was rushed into production after emergency use authorization was obtained and indemnification (against what?) was granted to the pharma companies that developed competing vaccines. Whether this historical moment will turn out to be something akin to the thalidomide scandal remains to be seen, but at the very least, the citizenry is being driven heavily toward participation in a global medical experiment. Some states even offer million-dollar lotteries to incentivize individuals to comply and take the jab. Open discussion of risks associated with the new vaccines has been largely off limits, and a two-tier society is already emerging: the vaccinated and the unclean (which is ironic, since many of the unclean have never been sick).

Worse yet (and like the don’t-hurt-me walls), many organizations are adopting as-yet-unproven protocols and requiring vaccination for participants in their activities (e.g., schools, sports, concerts) or simply to keep one’s job. The mask mandate was a tolerable discomfort (though not without many principled refusals), but forcing others to become experimental test subjects is well beyond the pale. Considering how the narrative continues to evolve and transform, thoughtful individuals trying to evaluate competing truth claims for themselves are unable to get clear, authoritative answers. Indeed, it’s hard to imagine a situation where authorities in politics, medicine, science, and journalism could have worked so assiduously to undermine their own credibility. Predictably, heads (or boards of directors) of many organizations are learning to enjoy the newly discovered power to transform their organizations into petty fiefdoms and demand compliance from individuals — usually under the claim of public safety (“for the children” being unavailable this time). Considering how little efficacy has yet been truly demonstrated with any of the various regimes erected to contain or stall the pandemic, the notion that precautions undertaken have been worth giving injudicious authority to people up and down various power hierarchies to compel individuals remains just that: a notion.

Tyrants and bullies never seem to tire of watching others do the submission dance. In the next round, be ready to hop on one leg and/or bark like a dog when someone flexes on you. Land of the free and home of the brave no longer.

Addendum

The CDC just announced an emergency meeting to be held (virtually) June 18 to investigate reports (800+ via the Vaccine Adverse Event Reporting System (VAERS), which almost no one had heard of only a month ago) of heart inflammation in adolescents following vaccination against the covid virus. Significant underreporting is anticipated following the circular logic that since authorities declared the vaccines safe prematurely (without standard scientific evidence to support such a statement), the effects cannot be due to the vaccine. What will be the effect of over 140 million people having been assured that vaccination is entirely safe, taken the jab, and then discovering “wait! maybe not so much ….”? Will the complete erosion of trust in what we’re told by officialdom and its mouthpieces in journalism spark widespread, organized, grassroots defiance once the bedrock truth is laid bare? Should it?

For more than a decade, I’ve had in the back of my mind a blog post called “The Power of Naming” to remark that bestowing a name gives something power, substance, and in a sense, reality. That post never really came together, but its inverse did. Anyway, here’s a renewed attempt.

The period of language acquisition in early childhood is suffused with learning the names of things, most of which is passive. Names of animals (associated closely with sounds they make) are often a special focus using picture books. The kitty, doggie, and horsie eventually become the cat, dog, and horse. Similarly, the moo-cow and the tweety-bird shorten to cow and bird (though songbird may be an acceptable holdover). Words in the abstract are signifiers of the actual things, aided by the text symbols learned in literate cultures to reinforce mere categories instead of examples grounded in reality. Multiply the names of things several hundred thousand times into adulthood and indeed throughout life and one can develop a formidable vocabulary supporting expressive and nuanced thought and speech. Do you know the differences between acute, right, obtuse, straight, and reflex angles? Does it matter? Does your knowledge of barware inform when to use a flute, coupe, snifter, shot (or shooter or caballito), nosing glass (or Glencairn), tumbler, tankard, goblet, sling, and Stein? I’d say you’ve missed something by never having drunk dark beer (Ger.: Schwarzbier) from a frosted schooner. All these varieties developed for reasons that remain invisible to someone content to drink everything from the venerable red Solo cup. Funnily enough, the red Solo cup now comes in different versions, fooling precisely no one.

Returning to book blogging, Walter Ong (in Orality and Literacy) has curious comparisons between primarily oral cultures and literate cultures. For example:

Oral people commonly think of names (one kind of words) as conveying power over things. Explanations of Adam’s naming of the animals in Genesis 2:20 usually call condescending attention to this presumably quaint archaic belief. Such a belief is in fact far less quaint than it seems to unreflective chirographic and typographic folk. First of all, names do give human beings power over what they name: without learning a vast store of names, one is simply powerless to understand, for example, chemistry and to practice chemical engineering. And so with all other intellectual knowledge. Secondly, chirographic and typographic folk tend to think of names as labels, written or printed tags imaginatively affixed to an object named. Oral folk have no sense of a name as a tag, for they have no idea of a name as something that can be seen. Written or printed representations of words can be labels; real, spoken words cannot be. [p. 33]

This gets at something that has been developing over the past few decades, namely, that as otherwise literate (or functionally literate) people gather more and more information through electronic media (screens that serve broadcast and cable TV, YouTube videos, prerecorded news for streaming, and podcasts, and most importantly, audiobooks — all of which speak content to listeners), the spoken word (re)gains primacy and the printed word fades into disuse. Electronic media may produce a hybrid of orality/literacy, but words are no longer silent, internal, and abstract. Indeed, words — all by themselves — are understood as being capable of violence. Gone are the days when “sticks and stones ….” Now, fighting words incite and insults sting again.

Not so long ago, it was possible to provoke a duel with an insult or gesture, such as a glove across the face. Among some people, defense of honor never really disappeared (though dueling did). History has taken a strange turn, however. Proposed legislation to criminalize deadnaming (presumably to protect a small but growing number of transgender and nonbinary people who have redefined their gender identity and accordingly adopted different names) recognizes the violence of words but then tries to transmute the offense into an abstract criminal law. It’s deeply mixed up, and I don’t have the patience to sort it out.

More to say in later blog posts, but I’ll raise the Counter-Enlightenment once more to say that the nature of modern consciousness is shifting somewhat radically in response to stimuli and pressures that grew out of an information environment, roughly 70 years old now but transformed even more fundamentally in the last 25 years, that is substantially discontinuous from centuries-old traditions. Those traditions displaced even older traditions inherited from antiquity. Such is the way of the world, I suppose, and with the benefit of Walter Ong’s insights, my appreciation of the outlines is taking better shape.

Considering the acceleration of practically everything in the late-modern world (postmodern refers to something quite different), planning one’s higher education has become somewhat fraught: the subject matter studied may be rendered flatly out-of-date or moribund by the time of either graduation or entry into the workforce. Accordingly, I’ve heard it recommended that expertise in any particular subject area may be less important than developing expertise in at least one subject that takes a systems approach. That system might be language and communications, mathematics (or any other hard science), history, economics and finance, business administration, computer coding, law and governance, etc. So long as a rigorous understanding of procedures and rules is developed, a structuralist mindset can be repeated and transferred into other subject areas. Be careful, however, not to conflate this approach with a liberal arts education, which is sometimes described as learning how to learn and is widely applicable across disciplines. The liberal arts have fallen distinctly out of favor in the highly technological and technocratic world, which cares little for human values resistant to quantification. Problem is, Western societies in particular are based on liberal democratic institutions now straining due to their sclerotic old age. And because a liberal arts education is scarcely undertaken anymore, civics and citizenship are no longer taught. Even the study of English has now been corrupted (postmodern does apply here) to the point that the basic liberal arts skill of critical thinking is being lost through attrition. Nowhere is that more abundantly clear than in bristling debate over free speech and censorship.

Aside. Although society tinkers and refines itself (sometimes declines) over time, a great body of cultural inheritance informs how things are done properly within an ideology or system. When tinkering and refinement become outright intransigence and defiance of an established order, it’s commonplace to hear the objection “but that’s not how _______ works.” For instance, debate over climate science or the utility of vaccines often has one party proclaiming “trust [or believe] the science.” However, that’s not how science works (i.e., through unquestioning trust or belief). The scientific method properly understood includes verification, falsification, and revision when results and assertions fail to establish reasonable certainty (not the same as consensus). Similarly, critical thinking includes a robust falsification check before “facts” can be accepted at face value. So-called “critical studies” (a/k/a grievance studies), like religious faith, typically position bald assertions beyond the reach of falsification. Well, sorry, that’s not how critical thinking works.

Being older and educated before critical studies were fully legitimized (or gave rise to things as risible as feminist glaciology), my understanding has always been that free speech and other rights are absolutes that cannot be sliced and diced into bits. That way lies casuistry, where law founders frequently. Thus, if one wishes, say, to trample or burn the U.S. flag in protest, no law can be passed or constitutional amendment enacted to carve out an exception disallowing that instance of dissenting free speech. A lesser example is kneeling silently rather than participating in singing the national anthem before a sporting event. Though offensive to certain individuals’ sensibilities, silencing speech is far worse according to liberal democratic values. Whatever our ideological or political differences are, we cannot work them out when one party has the power to place topics out of bounds or remove others from discussion entirely. The point at which spirited debate crosses over into inciting violence or fomenting insurrection is a large gray area, which is the subject of the second impeachment of 45. Civil law covers such contingencies, so abridging free speech, deplatforming, and adopting the formulation “language is violence” are highly improper responses under the liberal form of government codified in the U.S. Constitution, which includes the Bill of Rights, originally omitted but quickly added to articulate those rights fully.

Liberal democratic ideology arose in mercantile, substantially agrarian Western societies before scientific, industrial, and capitalist revolutions built a full head of steam, so to speak. Considering just how much America has developed since the Colonial Period, it’s no surprise society has outgrown its own founding documents. More pointedly, the intellectual commons was a much smaller environment, often restricted to a soapbox in the town square and the availability of books, periodicals, and broadsides. Today, the public square has moved online to a bewildering array of social media platforms that enable publication of one’s ideas well beyond the sound of one’s voice over a crowd or the bottleneck of a publisher’s printing press. It’s an entirely new development, and civil law has not kept pace. Whether Internet communications are regulated like the airwaves or nationalized like the U.S. military, it’s clear that the Wild West uber-democratic approach (where anyone can basically say anything) has failed. Demands for regulation (restrictions on free speech) are being taken seriously and acted upon by the private corporations that run social media platforms. During this interim phase, it’s easy for me, as a subscriber to liberal democratic values, to insist reflexively on free speech absolutism. The apparent mood of the public lies elsewhere.

The end of every U.S. presidential administration is preceded by a spate of pardons and commutations — the equivalents of a get-out-of-jail-free card offered routinely to collaborators with the outgoing executive and general-purpose crony capitalists. This practice, along with diplomatic immunity and supranational elevation of people (and corporations-as-people) beyond the reach of prosecution, is a deplorable workaround obviating the rule of law. Whose brilliant idea it was to offer special indulgence to miscreants is unknown to me, but it’s pretty clear that, with the right connections and/or with enough wealth, you can essentially be as bad as you wanna be with little fear of real consequence (a/k/a too big to fail a/k/a too big to jail). Similarly, politicians, whose very job it is to manage the affairs of society, are free to be incompetent and destructive in their brazen disregard for needs of the citizenry. Only modest effort (typically a lot of jawing directed to the wrong things) is necessary to enjoy the advantages of incumbency.

In this moment of year-end summaries, I could choose from among an array of insane, destructive, counter-productive, and ultimately self-defeating nominees (behaviors exhibited by elite powers that be) as the very worst, the baddest of the bad. For me, in the largest sense, that would be the abject failure of the rule of law (read: restraints), which has (so far) seen only a handful of high-office criminals prosecuted successfully (special investigations leading nowhere and failed impeachments don’t count) for their misdeeds and malfeasance. I prefer to be more specific. Given my indignation over the use of torture, that would seem an obvious choice. However, those news stories have been shoved to the back burner, where they generate little heat, including the ongoing torture of Julian Assange for essentially revealing truths cynics like me already suspected and now know to be accurate. Instead, I choose war as the very worst, an example of the U.S. (via its leadership) being as bad as it can possibly be. The recent election cycle offered a few candidates who bucked the consensus that U.S. involvement in every unnecessary, undeclared war since WWII is justified. They were effectively shut out by the military-industrial complex. And as the incoming executive tweeted on November 24, 2020, America’s back, baby! Ready to do our worst again (read: some more, since we [the U.S. military] never stopped [making war]). A sizeable portion of the American public is aligned with this approach, too.

So rule of law has failed and we [Americans] are infested with crime and incompetence at the highest levels. Requirements, rights, and protections found in the U.S. Constitution are handily ignored. That means every administration since Truman has been full of war criminals, because torture and elective war are crimes. The insult to my sensibilities is far worse than the unaffordability of war, the failure to win or end conflicts, or the lack of righteousness in our supposed cause. It’s that we [America, as viewed from outside] are belligerent, bellicose aggressors. We [Americans] are predators. And we [Americans, but really all humans] are stuck in an adolescent concept of conduct in the world shared with animals that must kill just to eat. We [humans] make no humanitarian progress at all. But the increasing scale of our [human] destructiveness is progress if drones, robots, and other DARPA-developed weaponry impress.


I’ve never before gone straight back with a redux treatment of a blog post. More typically, it takes more than a year before revisiting a given topic, sometimes several years. This time, supplemental information came immediately, though I’ve delayed writing about it. To wit, a Danish study published November 18, 2020, in the Annals of Internal Medicine indicates our face mask precautions against the Coronavirus may be ineffective:

Our results suggest that the recommendation to wear a surgical mask when outside the home among others did not reduce, at conventional levels of statistical significance, the incidence of SARS-CoV-2 infection in mask wearers in a setting where social distancing and other public health measures were in effect, mask recommendations were not among those measures, and community use of masks was uncommon. Yet, the findings were inconclusive and cannot definitively exclude a 46% reduction to a 23% increase in infection of mask wearers in such a setting. It is important to emphasize that this trial did not address the effects of masks as source control or as protection in settings where social distancing and other public health measures are not in effect.

The important phrase there is “did not reduce, at conventional levels of statistical significance,” which is followed by the caveat that the study was partial and so is inconclusive. To say a result is statistically insignificant means the observed difference is small enough that it could plausibly have arisen from random variation alone — it does not exceed the calculated margin of error. A fair bit of commentary follows the published study, which I have not reviewed.
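To make the quoted interval concrete, here is a minimal sketch of the standard log-normal approximation for a risk ratio between two groups. The counts are illustrative, chosen only to roughly reproduce the interval quoted from the study (a 46% reduction to a 23% increase); they are not taken as authoritative figures:

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs. group B with an approximate 95% CI.

    Uses the common log-normal approximation: the standard error of
    ln(RR) is sqrt(1/a - 1/n_a + 1/b - 1/n_b).
    """
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Illustrative counts: 42 infections among 2392 mask wearers
# vs. 53 among 2470 controls (hypothetical, for demonstration).
rr, lo, hi = risk_ratio_ci(42, 2392, 53, 2470)
# rr ≈ 0.82, CI ≈ (0.55, 1.22)
```

Because the interval spans 1.0 (no effect), the data are compatible both with a substantial benefit (lower bound ≈ 0.55, i.e., roughly a 45% reduction) and with a modest harm (upper bound ≈ 1.22, roughly a 22% increase). That is exactly what “inconclusive” means here: the study cannot distinguish among those possibilities, not that it proved masks do nothing.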

We’re largely resorting to conventional wisdom with respect to mask wearing. Most businesses and public venues (if open at all) have adopted the mask mandate out of conformity and despite wildly conflicting reports of their utility. Compared to locking down all nonessential social and economic activity, however, I remain resigned to their adoption even though I’m suspicious (as any cynic or skeptic should be) that they don’t work — at least not after the virus is running loose. There is, however, another component worth considering, namely, the need to be seen doing something, not nothing, to address the pandemic. Some rather bluntly call that virtue signalling, such as the pathologist at this link.

In the week since publication of the Danish study and the pathologist’s opinion (note the entirely misleading title), there has been a deluge of additional information, editorials, and protests (no more links, sorry) calling into question recommendations from health organizations and responses by politicians. Principled and unprincipled dissent has been underway since May 2020 and grows with each month the hardship persists. Of particular note is the Supreme Court’s 5-4 decision against New York Gov. Andrew Cuomo’s mandate that religious services be restricted to no more than 10 people in red zones and no more than 25 in orange zones. Score one for the Bill of Rights being upheld even in a time of crisis.

This 9-year-old blog post continues to attract attention. I suspect the reason behind sustained interest is use of the term structural violence, which sits adjacent to voguish use of the term structural racism. Existence of permanent, institutionalized violence administered procedurally rather than through blunt, behavioral force (arguably still force but obfuscated through layers of bureaucracy) seems pretty plain to most observers. Typical analyses cite patriarchy and white supremacism as principal motivators, and those elements are certainly present throughout American history right up to today. I offer a simpler explanation: greed. Thus, most (though not all) institutionalized violence can be chalked up to class warfare, with the ownership class and its minions openly exacting tribute and stealing everyone’s future. Basically, all resources (material, labor, tax dollars, debt, etc.) can be attached, and those best positioned to bend administrative operations to their will — while pretending to help commoners — stand to gain immensely.

It doesn’t much matter anymore whose resources are involved (pick your oppressed demographic). Any pool is good enough to drain. But because this particular type of violence has become structural, after gathering the demographic data, it’s an easy misdirection to spin the narrative according to divergent group results (e.g., housing, educational opportunity, incarceration rates) where such enduring structures have been erected. While there is certainly some historical truth to that version of the story, the largest inanimate resource pools are not readily divisible that way. For instance, trillions of dollars currently being created out of nothingness to prop up Wall Street (read: the ownership class) redound upon the entire U.S. tax base. It’s not demographically focused (besides the beneficiaries, obviously) but is quite literally looting the treasury. Much the same can be said of subscriber and customer bases of commercial behemoths such as Walmart, Amazon, McDonald’s, and Netflix. Those dollars are widely sourced. One can observe, too, that the ownership class eschews such pedestrian fare. Elites avoiding common American experience is reflected as well in the U.S. armed services, where (depending on whom one believes: see here and here) participation (especially enlisted men and women) skews toward the working class. Consider numerous recent U.S. presidents (and their offspring) who manage to skip out on prospective military service.

What’s surprising, perhaps, is that it’s taken so long for this entrenched state of affairs (structural violence visited on all of us not wealthy enough to be supranational) to be recognized and acted upon by the masses. The Occupy Movement was a recent nonviolent suggestion that we, the 99%, have had quite enough of this shit. Of course, it got brutally shut down and dispersed. A couple days ago, a caravan of looters descended upon the so-called Magnificent Mile in Chicago, the site of numerous flagship stores of luxury brands and general retailers. I don’t approve of such criminal activity any more than the ownership class looting the treasury. But it’s not hard to imagine that, in the minds of some of the Chicago looters at least, their livelihoods and futures have been actively stolen from them. “Look, over there! In that window! Resources for the benefit of rich people. They’ve been stealing from us for generations. Now let’s steal from them.” The big difference is that designer handbags, electronics, and liquor hauled away from breached storefronts are relatively minor compared to the structural violence of which we’ve become more acutely aware recently. Put another way, complaining about these looters while ignoring those looters is like complaining about someone pulling your hair while someone else is severing your legs with a chainsaw, leaving you permanently disabled (if not dead). They’re not even remotely in the same world of harm.

Most news I gather is for me unsurprising. That’s the regrettable condition of a doomer continuously learning of different sorts of corruption and awfulness piling up. For instance, the coronavirus crisis is unsurprising to me, as I’ve opined many times that a pandemic was overdue. The previous time I remember being surprised — sickened actually — was learning of the Great Pacific Garbage Patch. (Similar garbage gyres are found in all oceanic bodies.) I’m surprised and sickened yet again upon learning that the Environmental Protection Agency (EPA) has suspended enforcement of environmental laws against industries that despoil the environment in the course of their activities. Polluters are being granted, in effect, a license to kill. The 7-pp. memo can be found here.

I tried to read the memo, but it’s formulated in that dry, bureaucratic style that obfuscates meaning and puts readers to sleep. The news is reported here in a more readable fashion. The EPA’s action is purportedly a temporary response to the pandemic, but the crisis and the response seem to me unrelated except in the sense of “never let a serious crisis go to waste.” I fully expect opportunists to further consolidate power at the Federal level; I never suspected the crisis would be used to enable rape and pillage of the earth’s resources without consequence. No doubt, free rein to relax precautions is a dream many industrialists harbor, which aligns handily with GOP politics. Even to a cynic, however, this revision of policy is astonishing.

The earth has suffered quite a series of insults and injuries at the hands of its apex predator. How much more the earth can absorb is an impossible question to answer. However, it will obviously outlast us. We depend wholly on it, while it is indifferent to our needs. So the decision to loosen up and accept destruction not normally countenanced only hastens us early into the grave we have been digging for ourselves for the past three centuries or so. The pandemic and industrial civilization are already in the process of killing us (and in truth, probably most everything else). No need to accelerate further.

Much ado over nothing was made this past week regarding a technical glitch (or control room error) during the first of two televised Democratic presidential debates where one pair of moderators’ mics was accidentally left on and extraneous, unintended speech leaked into the broadcast. It distracted the other pair of moderators enough to cause a modest procedural disruption. Big deal. This was not the modal case of a hot mic where someone, e.g., a politician, swears (a big no-no despite the shock value being almost completely erased in today’s media landscape) or accidentally reveals callous attitudes (or worse) thinking that no one important was listening or recording. Hot mics in the past have led to public outrage and criminal investigations. One recent example that still sticks in everyone’s craw was a novice political candidate who revealed he could use his fame and impudent nerve to “grab ’em by the pussy.” Turned out not to be the career killer everyone thought it would be.

The latest minor furor over a hot mic got me thinking, however, about inadvertent revelation of matters of genuine public interest. Three genres spring to mind: documentary films, whistle-blowing, and investigative journalism, the last of which includes category outliers such as Wikileaks. Whereas a gaffe on a hot mic usually means the leaker/speaker exposes him- or herself and thus has no one else to blame, disclosures occurring in the other categories are often against the will of those exposed. It’s obviously in the public interest to know about corruption, misbehavior, and malfeasance in corporate and political life, but the manner in which such information is made public is controversial. Those who expose others suffer harassment and persecution. Documentarians probably fare the best with respect to being left alone following release of information. Michael Moore, for all his absurd though entertaining theatrics, is free (so far as I know) to go about his business and do as he pleases. However, gestures to protect whistle-blowers are just that: gestures. Those who have leaked classified government information in particular, because they gained access to such information through security clearances and signed nondisclosure agreements (before knowing what secrets they were obliged to keep, which is frankly the way such obligations work), are especially prone to reprisal and prosecution. Such information is literally not theirs to disclose, but when keeping others’ secrets is heinous enough, some people feel their conscience and moral duty are superior to job security and other risks involved. Opinions vary, sometimes passionately. And now even journalists who uncover or merely come into possession of evidence of wrongdoing and later publish it — again, decidedly in the public interest — are subject to (malicious?) prosecution. Julian Assange is the current test case.

The free speech aspect of revealing someone else’s amoral and criminal acts is a fraught argument. However, it’s clear that as soon as damaging information comes to light, focus shifts away from the acts and their perpetrators to those who publish the information. Shifting the focus is a miserable yet well-established precedent by now, the result being that most folks who might consider coming forward to speak up now keep things to themselves rather than suffer entirely foreseeable consequences. In that light, someone who comes forward anyway, knowing he or she will be hounded, vilified, arrested, and worse, deserves more respect for courage and self-sacrifice than is generally offered in the aftermath of disclosure. The flip side — condemnation, prosecution, and death threats — is already abundant in the public sphere.

Some time after reports of torture at Guantánamo, Abu Ghraib, and Bagram went public, a handful of low-level servicemen (“bad apples” used to deflect attention down the command hierarchy) were prosecuted, but high-level officials (e.g., former U.S. presidents Bush and Obama, anyone in their respective administrations, and commanding officers on site) were essentially immunized from prosecution. That example is not quite the same as going after truth-tellers, but it’s a rather egregious instance of bad actors going unprosecuted. I’m still incensed by it. And that’s why I’m blogging about the hot mic. Lots of awful things go on behind the scenes without public knowledge or sanction. Those who commit high crimes (including war crimes) clearly know what they’re doing is wrong. Claims of national security are often invoked and gag laws are legislated into existence on behalf of private industry. When leaks do inevitably occur, those accused immediately attack the accuser, often with the aid of others in the media. Denials may also be issued (sometimes not — why bother?), but most bad actors hide successfully behind the deflecting shift of focus. When will those acting in the shadows against the public interest and in defiance of domestic and international law ever be brought to justice? I daresay the soul of the nation is at stake, and as long as officialdom escapes all but temporary public relations problems to be spun, the pride everyone wants to take as Americans eludes us. In the meantime, there’s a lot to answer for, and it keeps piling up.

Throughout human history, the question “who should rule?” has been answered myriad ways. The most enduring answer is simple: he who can muster and deploy the most force of arms and then maintain control over those forces. Genghis Khan is probably the most outrageously successful example and is regarded by the West as a barbarian. Only slightly removed from barbarians is the so-called Big Man, who perhaps adds a layer of diplomacy by running a protection racket while selectively providing and distributing spoils. As societies move further away from subsistence and immediacy, various absolute rulers are established, often through hereditary title. Call them Caesar, chief, dear leader, emir, emperor (or empress), kaiser, king (or queen), pharaoh, premier, el presidente, sultan, suzerain, or tsar, they typically acquire power through the accident of birth and are dynastic. Some are female but most are male, and they typically extract tribute and sometimes demand loyalty oaths.

Post-Enlightenment, rulers are frequently democratically elected administrators (e.g., legislators, technocrats, autocrats, plutocrats, kleptocrats, and former military) ideally meant to be representative of common folks. In the U.S., members of Congress (and of course the President) are almost wholly drawn from the ranks of the wealthy (insufficient wealth being a de facto bar to office) and are accordingly estranged from American life in the many different ways most of us experience it. Below the top level of visible, elected leaders is a large, hidden apparatus of high-level bureaucratic functionaries (often appointees), the so-called Deep State, that is relatively stable and made up primarily of well-educated, white-collar careerists whose ambitions for themselves and the country are often at odds with the citizenry.

I began to think about this in response to a rather irrational reply to an observation I made here. Actually, it wasn’t even originally my observation but that of Thomas Frank, namely, that the Deep State is largely made up of the liberal professional class. The reply reinforced the notion: who better to rule than the “pros”? History makes the alternatives unthinkable. Thus, the Deep State’s response to the veritable one-man barbarian invasion of the Oval Office has been to seek removal of the interloper by hook or by crook. (High office in this case was won unexpectedly and with scant precedent by rhetorical force — base populism — rather than by military coup, making the current occupant a quasi-cult leader; similarly, extracted tribute is merely gawking attention rather than riches.)

History also reveals that all forms of political organization suffer endemic incompetence and corruption, lending truth to Winston Churchill’s witticism “Democracy is the worst form of government, except for all the others.” Indeed, recent rule by technocrats has been supremely awful, leading to periodic market crashes, extreme wealth inequality, social stigmatization, and forever wars. Life under such rule is arguably better than under various other political styles; after all, we gots our vaunted freedoms and enviable material comforts. But the exercise of those freedoms does not reliably deliver either the ontological security or the psychological certainty we humans crave. In truth, our current form of self-governance has let nothing get in the way of plundering the planet for short-term profit. That ongoing priority is making Earth uninhabitable not just for other species but for humans, too. In light of this fact, liberal technocratic democracy could be a far worse failure than most: it will have killed billions (an inevitability now under delayed effect).

Two new grassroots movements (to my knowledge) have appeared that openly question who should rule: the Sunrise Movement (SM) and the Extinction Rebellion (ER). SM is a youth political movement in the U.S. that acknowledges climate change and supports the Green New Deal as a way of prioritizing the desperate existential threat modern politics and society have become. For now at least, SM appears to be content with working within the system, replacing incumbents with candidates it supports. More intensely, ER is a global movement centered in the U.K. that also acknowledges that familiar modern forms of social and political organization (there are several) no longer function but in fact threaten all of us with, well, extinction. One of its unique demands is that legislatures be drawn via sortition from the general population to be more representative of the people. Further, sortition avoids the established pattern whereby those elected to lead representative governments are corrupted by the very process of seeking and attaining office.

I surmise attrition and/or replacement (the SM path) are too slow and leave candidates vulnerable to corruption. In addition, since no one relinquishes power willingly, current leaders will have to be forced out via open rebellion (the ER path). I’m willing to entertain either path but must sadly conclude that both are too little, too late to address climate change and near-term extinction effectively. Though difficult to establish convincingly, I suspect the time to act was in the 1970s (or even before) when the Ecology Movement arose in recognition that we cannot continue to despoil our own habitat without consequence. That message (social, political, economic, and scientific all at once) was as inert then as it is now. However, fatalism acknowledged, some other path forward is better than our current systems of rule.

First, a bit of history. The U.S. Constitution was ratified in 1788 and superseded the Articles of Confederation. The first ten Amendments, ratified in 1791 (rather quickly after the initial drafting and adoption of the main document — oops, forgot these obvious assumptions), are known as the Bill of Rights. The final amendment to date, the 27th Amendment, though proposed in 1789 along with others, was not ratified until 1992. A half dozen additional amendments approved by Congress have not yet been ratified, and a large number of other unapproved amendments have been proposed.

The received wisdom is that, by virtue of its lengthy service as the supreme law of the land, the U.S. Constitution has become sacrosanct and invulnerable to significant criticism and further amendment. That wisdom has begun to be questioned actively as a result of (at least) two factors: (1) recognition that the Federal government serves the common good and citizenry rather poorly, having become corrupt and dysfunctional, and (2) the Electoral College, an anachronism from the Revolutionary Era that skews voting power away from cities, having handed two recent presidential elections to candidates who failed to win the popular vote. For a numerical analysis of how electoral politics is gamed to subvert public opinion, resulting in more government seats held by Republicans than voting (expressing the will of the people) would indicate, see this article by the Brookings Institution.

These are issues of political philosophy and ongoing public debate, spurred by dissatisfaction over periodic Federal shutdowns, power struggles between the executive and legislative branches that are tantamount to holding each other hostage, and income inequality that pools wealth and power in the hands of ever fewer people. The judicial branch (especially the U.S. Supreme Court) is also a significant point of contention; its newly appointed members are increasingly right wing but have not (yet) taken openly activist roles (e.g., reversing Roe v. Wade). As philosophy, questioning the wisdom of the U.S. Constitution requires considerable knowledge of history and comparative government to undertake with equanimity (as opposed to emotionalism). I don’t possess such expert knowledge but will observe that the U.S. is an outlier among nations in relying on a centuries-old constitution, which may not have been the expectation or intent of the drafters.

It might be too strong to suggest just yet that the public feels betrayed by its institutions. Better to say that, for instance, the U.S. Constitution is now regarded as a flawed document — not for its day (with limited Federal powers) but for the needs of today (where the Federal apparatus, including the giant military, has grown into a leviathan). This would explain renewed interest in direct democracy (as opposed to representative government), flirtations with socialism (expanded over the blended system we already have), and open calls for revolution to remove a de facto corporatocracy. Whether the U.S. Constitution can or should survive these challenges is the question.

Update

Seems I was roughly half a year early. Harper’s Magazine has as its feature article for the October 2019 issue a serendipitous article: “Constitution in Crisis” (not behind a paywall, I believe). The cover of the issue, however, poses a more provocative question: “Do We Need the Constitution?” Decide for yourself, I suppose, if you’re aligned with the revolutionary spirit.