
“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signalling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but to human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently, albeit temporarily, live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations to funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coin. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are not effective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. It used to be said that the Internet is forever, so wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) can be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

Coming back to this topic after some time (pt. 1 here). My intention was to expand upon demands for compliance, and unsurprisingly, relevant tidbits continuously pop up in the news. The dystopia American society is building for itself doesn’t disappoint — not that anyone is hoping for such a development (one would guess). It’s merely that certain influential elements of society reliably move toward consolidation of power and credulous citizens predictably forfeit their freedom and autonomy with little or no hesitation. The two main examples to discuss are Black Lives Matter (BLM) and the response to the global pandemic, which have occurred simultaneously but are not particularly related.

The BLM movement began in summer 2013 but boiled over in summer 2020 on the heels of the George Floyd killing, with protests spilling over into straightforward looting, mayhem, and lawlessness. That fit of high emotional pique found many protesters accosting random strangers in public and demanding a raised fist in support of the movement, which was always ideologically disorganized but became irrational and power-hungry as Wokedom discovered its ability to submit others to its will. In response, many businesses erected what I’ve heard called don’t-hurt-me walls in apparent support of BLM and celebration of black culture so that windows would not be smashed and stores ransacked. Roving protests in numerous cities demanded shows of support, though with what exactly was never clear, from anyone encountered. Ultimately, protests morphed into a sort of protection racket, and agitators learned to enjoy making others acquiesce to arbitrary demands. Many schools and corporations now conduct mandatory training to, among other things, identify unconscious bias, which has the distinct aroma of original sin that can never be assuaged or forgiven. It’s entirely understandable that many individuals, under considerable pressure to conform as moral panic seized the country, play along to keep the peace or keep their jobs. Backlash is building, of course.

The much larger example affecting everyone, nationwide and globally, is the response to the pandemic. Although quarantines have been used in the past to limit regional outbreaks of infectious disease, the global lockdown of business and travel was something entirely new. Despite a lack of evidence of efficacy, the precautionary principle prevailed and nearly everyone was forced into home sequestration and later, after an embarrassingly stupid scandal (in the U.S.), made to don masks when venturing out in public. As waves of viral infection and death rolled across the globe, political leaders learned to enjoy making citizens acquiesce to capricious and often contradictory demands. Like BLM, a loose consensus emerged about the “correct” way to handle the needs of the moment, but the science and demographics of the virus produced widely variant interpretations of such correctness. A truly coordinated national response in the U.S. never coalesced, and hindsight has judged the whole morass a fundamentally botched job of maintaining public health in most countries.

But political leaders weren’t done demanding compliance. An entirely novel vaccine protocol was rushed into production after emergency use authorization was obtained and indemnification (against what?) was granted to the pharma companies that developed competing vaccines. Whether this historical moment will turn out to be something akin to the thalidomide scandal remains to be seen, but at the very least, the citizenry is being driven heavily toward participation in a global medical experiment. Some states even offer million-dollar lotteries to incentivize individuals to comply and take the jab. Open discussion of risks associated with the new vaccines has been largely off limits, and a two-tier society is already emerging: the vaccinated and the unclean (which is ironic, since many of the unclean have never been sick).

Worse yet (and like the don’t-hurt-me walls), many organizations are adopting as-yet-unproven protocols and requiring vaccination for participants in their activities (e.g., schools, sports, concerts) or simply to keep one’s job. The mask mandate was a tolerable discomfort (though not without many principled refusals), but forcing others to be experimental test subjects is well beyond the pale. Considering how the narrative continues to evolve and transform, thoughtful individuals trying to evaluate competing truth claims for themselves are unable to get clear, authoritative answers. Indeed, it’s hard to imagine a situation where authorities in politics, medicine, science, and journalism could have worked so assiduously to undermine their own credibility. Predictably, heads (or boards of directors) of many organizations are learning to enjoy the newly discovered power to transform their organizations into petty fiefdoms and demand compliance from individuals — usually under the claim of public safety (“for the children” being unavailable this time). Considering how little efficacy has yet been truly demonstrated by any of the various regimes erected to contain or stall the pandemic, the notion that the precautions undertaken have been worth handing injudicious authority to people up and down various power hierarchies to compel individuals remains just that: a notion.

Tyrants and bullies never seem to tire of watching others do the submission dance. In the next round, be ready to hop on one leg and/or bark like a dog when someone flexes on you. Land of the free and home of the brave no longer.

Addendum

The CDC just announced an emergency meeting to be held (virtually) June 18 to investigate reports (800+ via the Vaccine Adverse Event Reporting System (VAERS), which almost no one had heard of only a month ago) of heart inflammation in adolescents following vaccination against the covid virus. Significant underreporting is anticipated following the circular logic that since authorities declared the vaccines safe prematurely (without standard scientific evidence to support such a statement), the effects cannot be due to the vaccine. What will be the effect of over 140 million people having been assured that vaccination is entirely safe, having taken the jab, and then having discovered “wait! maybe not so much …”? Will the complete erosion of trust in what we’re told by officialdom and its mouthpieces in journalism spark widespread, organized, grassroots defiance once the bedrock truth is laid bare? Should it?

This article at Scientific American argues in support of a fundamental change to its style sheet. A style sheet, for the uninitiated, is a guide to how a publication presents its output, including formatting, commonly used spellings, and preferred grammar. For instance, should ordinals (i.e., 1st, 2nd, 3rd, etc.) be raised? Or should web be capitalized when referring to the World Wide Web? The change Scientific American just adopted is dropping the term climate change in favor of climate emergency. Well, good for Scientific American, I guess. My lack of enthusiasm or urgency — the very urgency signaled by the revised term now that the emergency is upon us (um, has been for decades already if one thinks in terms of geological or evolutionary time rather than mere decades of human history) — stems not from the truthfulness or effectiveness of the arguments but from my assessment that the problem is flatly unsolvable at this late date and that, as a global civilization, we’re doing almost nothing to combat it anyway. That’s been the case since the basic problem swung into public view in the 1970s, and it’s been the case since James Howard Kunstler published The Long Emergency in 2005.

Climate emergency deniers have claimed that recent volcanic eruptions in the Caribbean, Iceland, and Hawaii have erased or nullified all the efforts by humans to stem the release of greenhouse gases from industrial activity. According to this link, that’s comparing apples and oranges: peak volcanic activity vs. a sliver of human activity. Since 1750 (a conventional start date of the Industrial Revolution), it’s human activity driving the climate emergency, not volcanic activity. Moreover, industrial activity shows no signs of abating, at least until it all creaks to a halt, when the infernal machine will no longer crank. The blocked Suez Canal and deep freeze in Texas both remind us how fragile industrial infrastructure is; just wait for a Carrington Event to fry everything at once. This link explains human carbon emissions (and also mentions volcanoes), which continue to increase in volume every single year. (This past year might (might!) be an anomaly due to the pandemic, but things are already ramping back up.) And besides, humans can’t control volcanoes (did someone suggest dropping nukes into them to “seal them up”?). We can’t even control ourselves.

Some while back, I purged from my blogroll all the doom links and determined that industrial civilization is in its death throes, so why bother blogging about it anymore? Similarly, the last time I cited the Doomsday Clock in January 2020, it was (metaphorically) 100 seconds to midnight. The Clock today still sits at that harrowing eve of destruction, and I didn’t link to the January 2021 statement, which includes discussions of the novel coronavirus, nuclear threats, and climate change (the older term), summarizing them together as a wake-up call. Really? So now it’s time to get serious? Nope, don’t think so. The proper time is long past due, the catastrophic future is already locked in, and we’ve been steadfastly pretending there is nothing to see (so that there will eventually be nothing to do — a self-fulfilling prophecy). It’s merely unknown when members of the human species will begin dropping like flies.

From Alan Jacobs’s Breaking Bread with the Dead (2020):

The German sociologist Gerd-Günter Voss outlined the development, over many centuries, of three forms of the “conduct of life.” The first is the traditional: in this model your life takes the forms that the lives of people in your culture and class have always taken, at least for as long as anyone remembers. The key values in the traditional conduct of life are “security and regularity.” The second model is the strategic: people who follow this model have clear goals in mind (first, to get into an elite university; later, to become a radiologist or own their own company or retire at fifty) and form a detailed strategic plan to achieve those goals. But, Voss suggests, those two models, while still present in various parts of the world, are increasingly being displaced by a third model for the conduct of life: the situational.

The situational model has arisen in recent social orders that are unprecedentedly dynamic and fluid. People are less likely to plan to be radiologists when they hear that radiologists may be replaced by computers. They are less likely to plan to own a company when whatever business they’re inclined toward may not exist in a decade … they are less likely to plan to have children … They might not even want to plan to have dinner with a friend a week from Friday …

… the situational conduct of life is … a way of coping with social acceleration. But it’s also, or threatens to be, an abandonment of serious reflection on what makes life good. You end up just managing the moment … The feeling of being at a “frenetic standstill” is highly characteristic of the depressed person.

/rant on

The self-appointed Thought Police continue their rampage through the public sphere, campaigning to disallow certain thoughts and fence off unacceptable, unsanitary, unhygienic, unhealthy utterances lest they spread, infect, and distort their host thinkers. Entire histories are being purged from, well, history, to pretend they either never happened or will never happen again, because (doncha know?) attempting to squeeze disreputable thought out of existence can’t possibly result in those forbidden fruits blossoming elsewhere, in the shadows, in all their overripe color and sweetness. The restrictive impulse — policing free speech and free thought — is as old as it is stupid. For example, it’s found in the use of euphemisms that pretend to mask the true nature of all manner of unpleasantness, such as death, racial and national epithets, unsavory ideologies, etc. However, farting is still farting, and calling it “passing wind” does nothing to reduce its stink. Plus, we all fart, just like we all inevitably traffic in ideas from time to time that are unwholesome. Manners demand some discretion when broaching some topics, but the point is that one must learn how to handle such difficulty responsibly rather than attempting to drive it out of thought entirely, which simply doesn’t work. No one knows automatically how to navigate through these minefields.

Considering that the body and mind possess myriad inhibitory-excitatory mechanisms that push and/or pull (i.e., good/bad, on/off, native/alien), a wise person might recognize that both directions are needed to achieve balance. For instance, exposure to at least some hardship has the salutary effect of building character, whereas constant indulgence results in spoiled children (later, adults). Similarly, the biceps/triceps operate in tandem and opposition and need each other to function properly. However, most inhibitory-excitatory mechanisms aren’t nearly so binary as our language tends to imply but rather rely on an array of inputs. Sorting them all out is like trying to answer the nature/nurture question. Good luck with that.

Here’s a case in point: student and professional athletes in the U.S. are often prohibited from kneeling in dissent during the playing of the national anthem. The prohibition does nothing to ameliorate the roots of dissent but only suppresses its expression under narrow, temporary circumstances. Muzzling speech (ironically in the form of silent behavior) prior to sports contests may actually boomerang to inflame it. Some athletes knuckle under and accept the deal they’re offered (STFU! or lose your position — note the initialism used to hide the curse word) while others take principled stands (while kneeling, ha!) against others attempting to police thought. Some might argue that the setting demands good manners and restraint, while others argue that, by not stomping around the playing field carrying placards, gesticulating threateningly, or chanting slogans, restraint is being used. Opinions differ, obviously, and so the debate goes on. In a free society, that’s how it works. Societies with too many severe restrictions, often bordering on or going fully into fascism and totalitarianism, are intolerable to many of us fed current-day jingoism regarding democracy, freedom, and self-determination.

Many members of the U.S. Congress, sworn protectors of the U.S. Constitution, fundamentally misunderstand the First Amendment, or at least they conveniently pretend to. (I suspect it’s the former.) Here it is for reference:

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

Defending the First Amendment against infringement requires character and principles. What we have instead, both in Congress and in American society, are ideologues and authorities who want to make some categories flatly unthinkable and subject others to prosecution. Whistleblowing falls into the latter category. They are aided by the gradual erosion of educational achievement and shift away from literacy to orality, which robs language of its richness and polysemy. If words are placed out of bounds, made unutterable (but not unthinkable), the very tools of thought and expression are removed. The thoughts themselves may be driven underground or reduced to incoherence, but that’s not a respectable goal. Only under the harshest conditions (Orwell depicted them) can specific thoughts be made truly unthinkable, which typically impoverishes and/or breaks the mind of the thinker or at least results in pro forma public assent while private dissent gets stuffed down. To balance and combat truly virulent notions, exposure and discussion is needed, not suppression. But because many public figures have swallowed a bizarre combination of incoherent notions and are advocating for them, the mood is shifting away from First Amendment protection. Even absolutists like me are forced to reconsider, as for example with this article. The very openness to consideration of contrary thinking may well be the vulnerability being exploited by crypto-fascists.

Calls to establish a Ministry of Truth have progressed beyond the Silicon Valley tech platforms’ arbitrary and (one presumes) algorithm-driven removal of huge swaths of user-created content to a new bill introduced in the Colorado State Senate to establish a Digital Communications Regulation commission (summary here). Maybe this first step toward hammering out a legislative response to surveillance capitalism will rein in the predatory behaviors of big tech. The cynic in me harbors doubts. Instead, resulting legislation is far more likely to be aimed at users of those platforms.

/rant off

Evil exists in the world. History and current events both bear this out amply. Pseudo-philosophers might argue that, like emotions and other immaterial sensations, good and evil are merely reified concepts, meaning they are human constructs with no palpable external reality. Go tell that to victims of evildoers. Human suffering can’t be anonymized, rationalized, or philosophized away quite so handily.

It was sort of refreshing, back in the day, when Google’s motto and/or corporate code of conduct was simple: “Don’t Be Evil.” It acknowledged the potential for being or becoming evil (like any of the Bigs: Big Tobacco, Big Soda, Big Oil, Big Pharma, Big Media, Big Agriculture, etc.) and presumably aspired to resist obvious temptations. That was then (from 2000 to 2018), this is now (2021 until death take us — soon enough, I fear). But like all entities possessed of absurd levels of wealth and power, Google (now reorganized as a subsidiary of Alphabet, but who actually refers to it that way?) and its Silicon Valley brethren have succumbed to temptation and become straight-up evil.

One might charitably assess this development as something unbidden, unanticipated, and unexpected, but that’s no excuse, really. I certainly don’t envy celebrity executives experiencing difficulty resulting from having created unmanageable behemoths loosed on a public and polity unable to recognize the beast’s fangs until already clamped on their necks. As often occurs, dystopian extrapolations are explored in fiction, sometimes satirically. The dénouement of the HBO show Silicon Valley depicts tech mogul wannabes succeeding in creating an AI (or merely a sophisticated algorithm? doesn’t matter …) that would in time become far too powerful in blind execution of its inner imperative. In the show, characters recognize what they had done and kill their own project rather than allow it to destroy the world. In reality, multiple developers of computer tech platforms (and their embedded dynamics, including the wildly unhelpful albeit accurate term algorithm) lacked the foresight to anticipate the awful downstream effects of their brainchildren. Yet now that those effects are manifesting recognizably, these corporations continue to operate and wreak havoc.

Silicon Valley shows an extended software development period of bungling ineptitude punctuated by brilliant though momentary breakthroughs. Characters are smart, flawed people laughably unable to get out of the way of their own success. The pièce de résistance was yoking one so-called “learning machine” to another and initiating what would become a runaway doomsday process (either like ecological collapse, building slowly then making the biosphere uninhabitable all at once, or like the gray goo problem, progressively “processing” biomass at the molecular level until all that remains is lifeless goo). It was a final act of bumbling that demanded the characters’ principled, ethical response before the window of opportunity closed. Real Silicon Valley tech platforms are in the (ongoing) process of rending the social fabric, which is no laughing matter. The issue du jour surrounds free speech and its inverse, censorship. More broadly, real Silicon Valley succeeded in gaming human psychology for profit in at least two aspects (could be more as yet unrecognized): (1) mining behavioral data as an exploitable resource, and (2) delivering inexhaustible streams of extremely divisive content (not its own) to drive persistent engagement with its platforms. Yoked together, they operate to drive society mad, and yet, mounting evidence of this development has not produced even an inkling that maybe the damned doomsday devices ought to be shut off. As with the environment, we operate with freedom enough to destroy ourselves. Instead, politicians issue stunningly ineffectual calls for regulation or break-up of monopolies. In the meantime, ever more absurd wealth and power are concentrated in the hands of a few executives who have clearly punted and decided “let’s be evil.” No restraints on their behavioral experimentation across whole societies exist.

Much more to say on this topic in additional parts to come.

Black Friday has over the past decades become the default kickoff of annual consumer madness associated with the holiday season and its gift-giving tradition. Due to the pandemic, this year has been considerably muted in comparison to other years — at least in terms of crowds. Shopping has apparently moved online fairly aggressively, which is an entirely understandable result of everyone being locked down and socially distanced. (Lack of disposable income ought to be a factor, too, but American consumers have shown remarkable willingness to take on substantial debt when able in support of mere lifestyle.) Nevertheless, my inbox has been deluged over the past week with incessant Black Friday and Cyber Monday advertising. Predictably, retailers continue feeding the frenzy.

Uncharacteristically, perhaps, this state of affairs is not the source of outrage on my part. I recognize that we live in a consumerist, capitalist society that will persist in buying and selling activities even in the face of increasing hardship. I’m also cynical enough to expect retailers (and the manufacturers they support, even if those manufacturers are Chinese) to stoke consumer desire through advertising, promotions, and discount sales. It’s simply what they do. Why stop now? Thus far, I’ve seen no rationalizations or other arguments excusing how it’s a little ghoulish to be profiting while so many are clearly suffering and facing individual and household fiscal cliffs. Instead, we rather blandly accept that the public needs to be served no less by mass market retailers than by, say, grocery and utility services. Failure by the private sector to maintain functioning supply lines (including nonessentials, I suppose) during a crisis would look too much like the appalling mismanagement of the same crisis by local, state, and federal governments. Is it ironic that centralized bureaucracies reveal themselves as incompetent at the very same time they consolidate power? Or more cynically, isn’t it outrageous that they barely even try anymore to address the true needs of the public?

One of the questions I’ve posed unrhetorically is this: when will it finally become undeniably clear that instead of being geared to growth we should instead be managing contraction? I don’t know the precise timing, but the issue will be forced on us sooner or later as a result of radically diminishing return (compared to a century ago, say) on investment (ROI) in the energy sector. In short, we will be pulled back down to earth from the perilous heights we scaled as resources needed to keep industrial civilization creaking along become ever more difficult to obtain. (Maybe we’ll have to start using the term unobtainium from the Avatar movies.) Physical resources are impossible to counterfeit at scale, unlike the bogus enormous increase in the fiat money supply via debt creation. If/when hyperinflation makes us all multimillionaires because everything is grossly overvalued, the absurd paradox of being cash rich yet resource poor ought to wake up some folks.

I’ve mentioned the precautionary principle several times, most notably here. Little of our approach to precautions has changed in the two years since that blog post. At the same time, climate change and Mother Nature batter us aggressively. Eventualities remain predictable. Different precautions are being undertaken with respect to the pandemic currently gripping the planet. Arguably, the pandemic is either a subset of Mother Nature’s fury or, if the virus was created in a lab, a self-inflicted wound. Proper pandemic precautions have been confounded by undermining of authority, misinformation, lack of coordination, and politically biased narratives. I’m as confused as the next poor sap. However, low-cost precautions such as wearing masks are entirely acceptable, notwithstanding refusals of many Americans to cooperate after authorities muddied the question of their effectiveness so completely. More significant precautions such as lockdowns and business shutdowns have morphed into received wisdom among government bodies yet are questioned widely as being a cure worse than the disease, not to mention administrative overreach (conspiratorial conjecture withheld).

Now comes evidence published in the New England Journal of Medicine on November 11, 2020, that costly isolation is flatly ineffective at stemming infection rates. Here are the results and conclusions from the abstract of the published study:

Results
A total of 1848 recruits volunteered to participate in the study; within 2 days after arrival on campus, 16 (0.9%) tested positive for SARS-CoV-2, 15 of whom were asymptomatic. An additional 35 participants (1.9%) tested positive on day 7 or on day 14. Five of the 51 participants (9.8%) who tested positive at any time had symptoms in the week before a positive qPCR test. Of the recruits who declined to participate in the study, 26 (1.7%) of the 1554 recruits with available qPCR results tested positive on day 14. No SARS-CoV-2 infections were identified through clinical qPCR testing performed as a result of daily symptom monitoring. Analysis of 36 SARS-CoV-2 genomes obtained from 32 participants revealed six transmission clusters among 18 participants. Epidemiologic analysis supported multiple local transmission events, including transmission between roommates and among recruits within the same platoon.
Conclusions
Among Marine Corps recruits, approximately 2% who had previously had negative results for SARS-CoV-2 at the beginning of supervised quarantine, and less than 2% of recruits with unknown previous status, tested positive by day 14. Most recruits who tested positive were asymptomatic, and no infections were detected through daily symptom monitoring. Transmission clusters occurred within platoons.

So an initial 0.9% tested positive, then an additional 1.9%, for a total of 2.8%, compared to 1.7% in the control group (tested but not isolated as part of the study). Perhaps the experimental and control groups are a bit small (1848 and 1554, respectively), and it’s not clear why the experimental group’s infection rate is higher than the control group’s, but the evidence points to the uselessness of trying to limit the spread of the virus by quarantine and/or isolation. Once the virus is present in a population, it spreads despite precautions.
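For those who want to check the arithmetic, the percentages follow directly from the counts quoted in the abstract above. A quick back-of-the-envelope computation (using only figures given in the study, no assumptions beyond them):

```python
# Figures taken directly from the NEJM abstract quoted above.
positive_on_arrival = 16      # positives within 2 days of arrival on campus
positive_later = 35           # additional positives on day 7 or day 14
participants = 1848           # recruits in the supervised-quarantine group
control_positive = 26         # positives among non-participants on day 14
control_with_results = 1554   # non-participants with available qPCR results

rate_arrival = positive_on_arrival / participants
rate_later = positive_later / participants
rate_control = control_positive / control_with_results

print(f"initial positive rate:    {rate_arrival:.1%}")              # ~0.9%
print(f"additional positive rate: {rate_later:.1%}")                # ~1.9%
print(f"combined rate:            {rate_arrival + rate_later:.1%}") # ~2.8%
print(f"control-group rate:       {rate_control:.1%}")              # ~1.7%
```

The combined 2.8% in the quarantined group versus 1.7% among non-participants is the comparison discussed above, with the caveat that sample sizes this small leave room for noise.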

A mantra is circulating that we should “trust the science.” Are these results to be trusted? Can we call off all the lockdowns and closures? The virus has been raging throughout the U.S. for at least eight months. Although there might be some instances of isolated populations with no infection, the wider population has by now been exposed. Moreover, some individuals who self-isolated effectively may not have been exposed, but in all likelihood, most of us have been. Accordingly, renewed lockdowns, school and business closures, and destruction of entire industries are a pretense of control we never really had. Their costs are enormous and ongoing. A stay-at-home order (advisory, if you prefer) just went into effect for the City of Chicago on November 16, 2020. My anecdotal observation is that most Chicagoans are ignoring it and going about their business much as they did during the summer and fall months. It’s nothing like the ghost-town effect of March and April 2020. I daresay they may well be correct to reject the received wisdom of our civic leaders.

/rant on

Remember all those folks in the weeks and days preceding election day on November 3, 2020, who were buying guns, ammo, and other provisions in preparation for civil breakdown? (No one known personally, of course, and gawd no not actually any of us, either; just them other others who don’t read blogs or anything else.) Well, maybe they were correct in adopting the precautionary principle (notably absent from a host of other perils besetting us). But as of this writing, nothing remotely resembling widespread disruption — feared by some, hotly anticipated by others — has developed. But wait! There’s still time. Considering Americans were set up by both political parties to distrust the outcome of the presidential race no matter which candidate claimed to have prevailed, we now face weeks or months of legal challenges and impatient agitators (again, on both sides) demanding their candidate be declared the winner (now, dammit!) by the courts instead of either official ballot-counters or the liberal-biased MSM. To say our institutions have failed us, and further, that political operatives all the way up to the sitting president have been openly fomenting violence in the streets, is a statement of the obvious.

Among my concerns more pressing than who gets to sit in the big chair, however, is the whipsawing stock market. It is no longer an accurate proxy of overall economic health or asset valuation, and its thoroughly irrational daily reaction to every rumor of, say, a vaccine for the raging coronavirus, or of a resumption of full economic activity and profitability despite widespread joblessness, renewed lockdowns, and a massive wave of homelessness in the offing due to bankruptcies, evictions, and foreclosures, bodes ill for the short-term future and the maintenance of, oh, I dunno, supply lines to grocery stores. Indeed, I suspect we are rapidly approaching our very own Minsky Moment, which Wikipedia describes as “a sudden, major collapse of asset values which marks the end of the growth phase of a cycle in credit markets or business activity” [underlying links omitted]. This is another prospective event (overdue, actually) for which the set-up has been long prepared. Conspiratorial types call it “the great reset” — something quite different from a debt jubilee.

For lazy thinkers, rhyming comparisons with the past frequently resort to calling someone a Nazi (or the new Hitler) or reminding everyone of U.S. chattel slavery. At the risk of being accused of similar stupidity, I suggest that we’re not on the eve of a 1929-style market crash and ensuing second great depression (though those could well happen, too, bread lines having already formed in 2020) but are instead poised at the precipice of hyperinflation and intense humiliation akin to the Weimar Republic in 1923 or so. American humiliation will result from recognition that the U.S. is now a failed state and doesn’t even pretend anymore to look after its citizens or the commonweal. Look no further than the two preposterous presidential candidates, neither of whom made any campaign promises to improve the lives of average Americans. Rather, the state has been captured by kleptocrats. Accordingly, no more American exceptionalism and no more lying to ourselves about how we’re the model for the rest of the world to admire and emulate.

Like Germany in the 1930s, the U.S. has also suffered military defeats and stagnation (perhaps by design) and currently demonstrates a marked inability to manage itself economically, politically, or culturally. Indeed, the American people may well be ungovernable at this point, nourished on a thin gruel of rugged individualism that forestalls our coming together to address adversity effectively. The possibility of another faux-populist savior arising out of necessity only to lead us over the edge (see the Great Man Theory of history) seems eerily likely, though the specific form that descent into madness would take is unclear. Recent history already indicates a deeply divided American citizenry having lost its collective mind but not yet having gone fully apeshit, flinging feces and destroying what remains of economically ravaged communities for the sheer sport of it. (I’ve never understood vandalism.) That’s what everyone was preparing for with emergency guns, ammo, and provisions. How narrowly we escaped catastrophe (or merely delayed it) should be clear in the fullness of time.

/rant off

Supporting the Vietnam war was dumb. Supporting the Iraq invasion after being lied
to about Vietnam was an order of magnitude dumber. Supporting any US war agendas
after being lied to about Iraq is an order of magnitude even dumber than that.
—Caitlin Johnstone

Upon rereading, and with the advantage of modest hindsight, I think I got it exactly correct in this 5-year-old blog post. Even the two brief comments are correct. More specifically, the United States is understood to be the sole remaining military superpower following the collapse of the Soviet Union in 1991. Never mind that numerous countries count themselves members of the nuclear club (cue Groucho Marx joke) and thus possess sufficient power to destroy the world. Never mind that the U.S. failed to win the Korean War or the Vietnam War (the two major U.S. military involvements post-WWII), or in fact any of numerous 21st-century wars (undeclared, de facto, continuing). Never mind that the U.S. has been successful at multiple smaller regime-change actions, often on the back of a civil war instigated by the U.S. and purposefully designed to install a puppet leader. And never mind that the capitalist competition for control of economic resources and capture of perpetual growth is being won handily by China. Nope, the U.S. is no longer the only superpower but is instead busy transitioning from superpower (military and economic) to failed state. Or in the language of that old blog post, the U.S. is now a geopolitical Strong/Stupid hybrid but is actively deploying stupidity in a feverish play to be merely Stupid. The weirdest aspect, perhaps, is that it’s being done right in front of god and everybody, yet few bother to take notice.

It’s no stretch to assert that in the U.S. in particular (but also true of nearly every regime across the world), we’re piling stupidity upon stupidity. If I were inclined to go full conspiracy like some QAnon fool, I’d have to say that the power elite have adopted a deep, 3D-chess strategy that means one of two possible things using the Rock-Paper-Scissors power dynamic algorithm (which, unlike tic-tac-toe, produces a winner) modified and inverted to Strong-Stupid-Smart: it’s either (1) very Smart of them to appear so Stupid, granting victory (against all appearances) over Strong (but only Strong in a three-legged contest), or (2) they reject the algorithm entirely in the misguided belief that nuthin’ beats stoopid. That second option would indeed be entirely consistent with Stupid.
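The Rock-Paper-Scissors dynamic invoked above is just a cyclic dominance relation: each strategy beats exactly one other and loses to the third, so any matchup of unlike strategies produces a winner. A minimal sketch follows; note that the particular Strong-Stupid-Smart cycle assigned below is my own guess at the “modified and inverted” ordering, not a mapping the old blog post spells out.

```python
# Cyclic dominance relation in the style of rock-paper-scissors,
# relabeled Strong/Stupid/Smart. The specific cycle is an assumption
# for illustration: each key beats the strategy it maps to.
BEATS = {
    "Smart": "Strong",   # cunning outmaneuvers brute force
    "Strong": "Stupid",  # force overwhelms blundering
    "Stupid": "Smart",   # ...unless "nuthin' beats stoopid"
}

def winner(a, b):
    """Return the winning strategy of two, or None on a tie."""
    if a == b:
        return None  # unlike tic-tac-toe, ties only occur on identical picks
    return a if BEATS[a] == b else b

print(winner("Smart", "Strong"))  # Smart
print(winner("Stupid", "Smart"))  # Stupid
```

The point of the relation is that no strategy dominates globally; which is why appearing Stupid can, under option (1), still amount to a winning Smart play against Strong.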

Take for instance three looming issues: the pandemic (and its follow-on effects), the U.S. presidential election (ugh, sorry, it’s unavoidable), and climate change. They loom threateningly despite being well underway already. But with each, we’ve acted and behaved very stupidly, stunningly so, I would argue, boxing ourselves in and doing worse damage over time than if we had taken proper steps early on. But as suggested in a previous blog post, the truth is that decision-makers haven’t really even tried to address these issues with the purpose of solving, resolving, winning, remedying, or ameliorating entirely predictable outcomes. Rather, issues are being either swept under the rug (ignored with the futile hope that they will go away or resolve themselves on their own) or displaced in time for someone else to handle. The second option occurs quite a lot, a move also known as kicking the can down the road or stealing from the future (as with sovereign debt). What happens when there’s no more future (for humans and their institutions, anyway) because it’s been squandered in the present? You already know the answer(s) to that question.