Archive for the ‘Education’ Category

With each successive election cycle, I become more cynical (how is that even possible?) about the candidates and their supposed earnest mission to actually serve the public interest. The last couple of cycles have produced a new meme that attempts to shift blame for poor governance to the masses: the low-information voter. Ironically, considering the fact that airwaves, magazines, books, public addresses, online venues, and even dinner conversations (such as they still exist if diners aren’t face-planted in their screens) are positively awash in political commentary and pointless debate and strategizing, there is no lack of information available. However, being buried under a deluge of information is akin to a defense attorney hiding damning discovery in an ocean of irrelevance, so I have some sympathy for voters who are thwarted in attempts to make even modestly informed decisions about political issues.

Multiply this basic relationship across many facets of ordinary life and the end result is the low-information citizen (also low-information consumer). Some parties (largely sellers of things, including ideas) possess a profusion of information, whereas useful, actionable information is hidden from the citizen/consumer by an information avalanche. For example, onerous terms of an insurance contract, debt instrument, liability waiver, or even routine license agreement are almost never read prior to signing or otherwise consenting; the acronym tl;dr (stands for “too long; didn’t read”) applies. In other situations, information is withheld entirely, such as pricing comparisons one might undertake if high-pressure sales tactics were not deployed to force potential buyers into decisions right here, right now, dammit! Or citizens are disempowered from exercising any critical judgment by erecting secrecy around a subject, national security being the all-purpose excuse for everything the government doesn’t want people to know.

Add to this the concerted effort (plain enough to see if one bothers to look) to keep the population uneducated, without options and alternatives, scrambling just to get through the day/week/month (handily blocking information gathering), and thus trapped in a condition of low information. Such downward pressure (survival pressure, one might say when considering the burgeoning homeless population) is affecting a greater portion of the population than ever. The American Dream that energized and buoyed the lives of many generations of people (including immigrants) has morphed into the American Nightmare. Weirdly, the immigrant influx has not abated but rather intensified. However, I consider most of those folks (political, economic, and ecological) refugees, not immigrants.

So those are the options available to power players, where knowledge is power: (1) withhold information, (2) if information can’t be withheld, then bury it as a proverbial needle in a haystack, and (3) render a large percentage of the public unable to process and evaluate information by keeping them undereducated. Oh, here’s another: (4) produce a mountain of mis- and disinformation that bewilders everyone. This last one is arguably the same as (2) except that the intent is propaganda or psyop. One could also argue that miseducating the public (e.g., various grievance studies blown into critical race theory now being taught throughout the educational system) is the same as undereducating. Again, intent matters. Turning someone’s head and radicalizing them with a highly specialized toolkit (mostly rhetorical) for destabilizing social relations is tantamount to making them completely deranged (if not merely bewildered).

These are elements of the ongoing epistemological crisis I’ve been observing for some time now, with the side effect of a quick descent into social madness being used to justify authoritarian (read: fascist) concentration of power and rollback of individual rights and freedoms. The trending term sensemaking also applies, referring to reality checks needed to keep oneself aligned with truth, which is not the same as consensus. Groups are forming up precisely for that purpose, centered on evidentiary rigor as well as skepticism toward the obvious disinformation issuing from government agencies and journalists who shape information according to rather transparently brazen agendas. I won’t point to any particular trusted source but instead recommend everyone do their best (no passivity!) to keep their wits about them and think for themselves. Not an easy task when the information environment is so thoroughly polluted — one might even say weaponized — that it requires special workarounds to navigate effectively.

Guy McPherson used to say in his presentations that we’re all born into bondage, meaning that there is no escape from Western civilization and its imperatives, including especially participation in the money economy. The oblique reference to chattel slavery is clumsy, perhaps, but the point is nonetheless clear. For all but a very few, civilization functions like Tolkien’s One Ring, bringing everyone ineluctably under its dominion. Enlightenment cheerleaders celebrate that circumstance and the undisputed material and technological (same thing, really) bounties of the industrial age, but Counter-Enlightenment thinkers recognize reasons for profound discontent. Having blogged at intervals about the emerging Counter-Enlightenment and what’s missing from modern technocratic society, my gnawing guilt by virtue of forced participation in the planet-killing enterprise of industrial civilization is growing intolerable. Skipping past the conclusion drawn by many doomers that collapse and ecocide due to unrestrained human consumption of resources (and the waste stream that follows) have already launched a mass extinction that will extirpate most species (including large mammals such as humans), let me focus instead on gross dysfunction occurring at levels falling more readily within human control.

An Empire of War

Long overdue U.S. troop withdrawal from Afghanistan has already yielded Taliban resurgence, which was a foregone conclusion at whatever point U.S. troops left (and before them, Soviets). After all, the Taliban lives there and had only to wait. Distasteful and inhumane as it may be to Westerners, a powerful faction (religious fanatics) truly wants to live under a 7th-century style of patriarchy. Considering how long the U.S. occupied the country, a new generation of wannabe patriarchs came to adulthood — an unbroken intergenerational descent. Of course, the U.S. (and others) keeps arming them. Indeed, I heard that the U.S. military is considering bombing raids to destroy the war machines left behind as positions were so swiftly abandoned. Oops, too late! This is the handiest example of how failed U.S. military escapades extending over decades net nothing of value to anyone besides weapons and ordnance manufacturers and miserable careerists within various government branches and agencies. The costs (e.g., money, lives, honor, sanity) are incalculable and spread with each country where the American Empire engages. Indeed, the military-industrial complex chooses intervention and war over peace at nearly every opportunity (though careful not to poke them bears too hard). And although the American public’s inability to affect policy (unlike the Vietnam War era) doesn’t equate with participation, the notion that it’s a government of the people deposits some of the blame on our heads anyway. My frustration is that nothing is learned and the same mistakes (war crimes, really) keep being committed by maniacs who ought to know better.

Crony and Vulture Capitalism

Critics of capitalism are being proven correct far more often than are apologists and earnest capitalists. The two subcategories I most deplore are crony capitalism and vulture capitalism, both of which typically accrue to the benefit of those in no real need of financial assistance. Crony capitalism is deeply embedded within our political system and tilts the economic playing field heavily in favor of those willing to both pay for and grant favors rather than let markets sort themselves out. Vulture capitalism extracts value out of vulnerable resource pools by attacking and often killing them off (e.g., Microsoft, Walmart, Amazon), or more charitably, absorbing them to create monopolies, often by hostile takeover at steep discounts. Distressed mortgage holders forced into short sales, default, and eviction are the contemporary example. The rationalization of predatory behavior as mere competition is deployed regularly.

Other historical economic systems had similarly skewed hierarchies, but none have reached quite the same heartless, absurd levels of inequality as late-stage capitalism. Pointing to competing systems and the rising tide that lifts all boats misdirects people to make ahistorical comparisons. Human psychology normally restricts one’s points of comparison to contemporaries in the same country/region. Under such narrow comparison, the rank injustice of hundred-billionaires (or even simply billionaires) existing at the same time as giant populations of political/economic/climate refugees and the unhoused (the new, glossy euphemism for homelessness) demonstrates the soul-forfeiting callousness of the top quintile and/or 1% — an ancient lesson never learned. Indeed, aspirational nonsense repackages suffering and sells it back to the underclass, which as a matter of definition will always exist but need not live as though on an entirely different planet from Richistan.

Human Development

Though I’ve never been a big fan of behaviorism, the idea that a hypercomplex stew of influences, inputs, and stimuli leads to better or worse individual human development, especially in critical childhood years but also throughout life, is pretty undeniable. As individuals aggregate into societies, the health and wellbeing of a given society is linked to the health and wellbeing of those very individuals who are understood metaphorically as the masses. Behaviorism would aim to optimize conditions (as if such a thing were possible), but because American institutions and social systems have been so completely subordinated to capitalism and its distortions, society has stumbled and fumbled from one brand of dysfunction to another, barely staying ahead of revolution or civil war (except that one time …). Indeed, as the decades have worn on from, say, the 1950s (a nearly idyllic postwar reset that looms large in the memories of today’s patrician octogenarians), it’s difficult to imagine how conditions could have deteriorated any further short of a third world war.

Look no further than the U.S. educational system, both K–12 and higher ed. As with other institutions, education has had its peaks and valleys. However, the crazy, snowballing race to the bottom witnessed in the last few decades is utterly astounding. Stick a pin in it: it’s done. Obviously, some individuals manage to get educated (some doing quite well, even) despite the minefield that must be navigated, but the exception does not prove the rule. Countries that value quality education (e.g., Finland, China, Singapore, Japan, South Korea) in deed, not just in empty words trotted out predictably by every presidential campaign, routinely trounce decidedly middling results in the U.S. and reveal that dysfunctional U.S. political systems and agencies (Federal, state, municipal) just can’t get the job done properly anymore. (Exceptions are always tony suburbs populated by high-earning and -achieving parents who create opportunities and unimpeded pathways for their kids.) Indeed, the giant babysitting project that morphs into underclass school-to-prison and school-to-military service (cannon fodder) pipelines is what education has actually become for many. The opportunity cost of failing to invest in education (or by proxy, American youth) is already having follow-on effects. The low-information voter is not a fiction, and it extends to every American institution that requires clarity to see through the fog machine operated by the mainstream media.

As an armchair social critic, I often struggle to reconcile how history unfolds without a plan, and similarly, how society self-organizes without a plan. Social engineering gets a bad rap for reasons: it doesn’t work (small exceptions exist) and subverts the rights and freedoms of individuals. However, the rank failure to achieve progress (in human terms, not technological terms) does not suggest stasis. By many measures, the conditions in which we live are cratering. For instance, Dr. Gabor Maté discusses the relationship of stress to addiction in a startling interview at Democracy Now! Just how bad is it for most people?

… it never used to be that children grew up in a stressed nuclear family. That wasn’t the normal basis for child development. The normal basis for child development has always been the clan, the tribe, the community, the neighborhood, the extended family. Essentially, post-industrial capitalism has completely destroyed those conditions. People no longer live in communities which are still connected to one another. People don’t work where they live. They don’t shop where they live. The kids don’t go to school, necessarily, where they live. The parents are away most of the day. For the first time in history, children are not spending most of their time around the nurturing adults in their lives. And they’re spending their lives away from the nurturing adults, which is what they need for healthy brain development.

Does that not sound like self-hobbling? A similar argument can be made about human estrangement from the natural world, considering how rural-to-urban migration (largely completed in the U.S. but accelerating in the developing world) has rendered many Americans flatly unable to cope with, say, bugs and dirt and labor (or indeed most any discomfort). Instead, we’ve trapped ourselves within a society that is, as a result of its organizing principles, slowly grinding down everyone and everything. How can any of us (at least those of us without independent wealth) choose not to participate in this wretched concatenation? Nope, we’re all guilty.

So far, this multipart blog post has trafficked in principles and generalities. Let me try now to be more specific, starting with an excerpt from Barry Lynn’s article in Harper’s Magazine titled “The Big Tech Extortion Racket” (Sept. 2020):

… around the middle of the nineteenth century, Americans began to develop technologies that could not be broken into component pieces. This was especially true of the railroad and the telegraph … Such corporations [railroad and telegraph companies] posed one overarching challenge: they charged some people more than others to get to market. They exploited their control over an essential service in order to extort money, and sometimes political favors … Americans found the answer to this problem in common law. For centuries, the owners of ferries, stagecoaches, and inns had been required to serve all customers for the same price and in the order in which they arrived. In the late nineteenth century, versions of such “common carrier” rules were applied to the new middleman corporations.

Today we rightly celebrate the Sherman Antitrust Act of 1890, which gave Americans the power to break apart private corporations. But in many respects, the Interstate Commerce Act of 1887 was the more important document. This act was based on the understanding that monopoly networks like the railroad and the telegraph could be used to influence the actions of people who depend on them, and hence their power must be carefully restricted …

For a century and a half, Americans used common carrier policies to ensure the rule of law in activities that depended on privately held monopolies … regulations freed Americans to take full advantage of every important network technology introduced during these years, including telephones, water and electrical services, energy pipelines, and even large, logistics-powered retailers. Citizens did not have to worry that the men who controlled the technologies involved would exploit their middleman position to steal other people’s business or disrupt balances of power.

I appreciate that Barry Lynn brings up the Interstate Commerce Act. If this legal doctrine appeared in the net neutrality debate a few years ago, it must have escaped my notice. While Internet Service Providers (ISPs) enable network access and connectivity, those utilities have not yet exhibited let’s-be-evil characteristics. Similarly, phone companies (including cell phones) and public libraries may well be eavesdropping and/or monitoring activities of the citizenry, but the real action lies elsewhere, namely, on social media networks and with online retailers. Evil is arguably concentrated in the FANG (or FAANG) corporations but has now grown to be ubiquitous in all social networks (e.g., Twitter) operating as common carriers (Zoom? Slack?) and across academe, nearly all of which have succumbed to moral panic. They are interpreting correctly, sad to observe, demands to censor and sanitize others’ no-longer-free speech appearing on their networks or within their realms. How much deeper it goes toward shaping politics and social engineering is quasi-conspiratorial and impossible for me to assess.

Much as I would prefer to believe that individuals possess the good sense to shift their activities away from social networks or turn their attention from discomfiting information sources, that does not appear to be the case. Demands for trigger warnings and safe spaces commonplace a few years ago on college campuses have instead morphed into censorious removal, deplatforming, and cancellation from the entire public sphere. Those are wrong responses in free societies, but modern institutions and technologies have gotten out of hand and outstripped the limits of normal human cognition. In short, we’re a society gone mad. So rather than accept responsibility to sort out information overflow oneself, many are demanding that others do it for them, and evil private corporations are complying (after a fashion). Moreover, calls for creation of an Orwellian Ministry of Truth, rebranded as a Truth Commission and Reality Czar, could hardly be any more chillingly and fascistically bizarre. People really need someone to decide (read: brainwash) for them what is real? Has anyone at the New York Times actually read Orwell’s dystopian novel 1984 and taken to heart its lessons?

From an article in the Sept. 2020 issue (I’m lagging in my reading) of Harper’s Magazine by Laurent Dubreuil titled “Nonconforming”:

American academia is a hotbed of proliferating identities supported and largely shaped by the higher ranks of administrators, faculty, student groups, alumni, and trustees. Not all identities are equal in dignity, history, or weight. Race, gender, and sexual orientation were the three main dimensions of what in the 1970s began to be called identity politics. These traits continue to be key today. But affirmed identities are mushrooming.

… identity politics as now practiced does not put an end to racism, sexism, or other sorts of exclusion or exploitation. Ready-made identities imprison us in stereotyped narratives of trauma. In short, identity determinism has become an additional layer of oppression, one that fails to address the problems it clumsily articulates.

Considering the acceleration of practically everything in the late-modern world (postmodern refers to something quite different), which makes planning one’s higher education somewhat fraught if the subject matter studied is rendered flatly out-of-date or moribund by the time of either graduation or entry into the workforce, I’ve heard it recommended that expertise in any particular subject area may be less important than developing expertise in at least one subject that takes a systems approach. That system might be language and communications, mathematics (or any other hard science), history, economics and finance, business administration, computer coding, law and governance, etc. So long as a rigorous understanding of procedures and rules is developed, a structuralist mindset can be repeated and transferred into other subject areas. Be careful, however, not to conflate this approach with a liberal arts education, which is sometimes described as learning how to learn and is widely applicable across disciplines. The liberal arts have fallen distinctly out of favor in the highly technological and technocratic world, which cares little for human values resistant to quantification. Problem is, Western societies in particular are based on liberal democratic institutions now straining due to their sclerotic old age. And because a liberal arts education is scarcely undertaken anymore, civics and citizenship are no longer taught. Even the study of English has now been corrupted (postmodern does apply here) to the point that the basic liberal arts skill of critical thinking is being lost through attrition. Nowhere is that more abundantly clear than in bristling debate over free speech and censorship.

Aside. Although society tinkers and refines itself (sometimes declines) over time, a great body of cultural inheritance informs how things are done properly within an ideology or system. When tinkering and refinement become outright intransigence and defiance of an established order, it’s commonplace to hear the objection “but that’s not how _______ works.” For instance, debate over climate science or the utility of vaccines often has one party proclaiming “trust [or believe] the science.” However, that’s not how science works (i.e., through unquestioning trust or belief). The scientific method properly understood includes verification, falsification, and revision when results and assertions fail to establish reasonable certainty (not the same as consensus). Similarly, critical thinking includes a robust falsification check before “facts” can be accepted at face value. So-called “critical studies” (a/k/a grievance studies), like religious faith, typically position bald assertions beyond the reach of falsification. Well, sorry, that’s not how critical thinking works.

Being older and educated before critical studies were fully legitimized (or gave rise to things as risible as feminist glaciology), my understanding has always been that free speech and other rights are absolutes that cannot be sliced and diced into bits. That way lies casuistry, where law founders frequently. Thus, if one wishes, say, to trample or burn the U.S. flag in protest, no law can be passed or constitutional amendment enacted to carve out an exception disallowing that instance of dissenting free speech. A lesser example is kneeling silently rather than participating in singing the national anthem before a sporting event. Though offensive to certain individuals’ sensibilities, silencing speech is far worse according to liberal democratic values. Whatever our ideological or political differences are, we cannot work them out when one party has the power to place topics out of bounds or remove others from discussion entirely. The point at which spirited debate crosses over into inciting violence or fomenting insurrection is a large gray area, which is the subject of the second impeachment of 45. Civil law covers such contingencies, so abridging free speech, deplatforming, and adopting the formulation “language is violence” are highly improper responses under the liberal form of government codified in the U.S. Constitution, which includes the Bill of Rights originally omitted from the U.S. Constitution but quickly added to articulate the rights fully.

Liberal democratic ideology arose in mercantile, substantially agrarian Western societies before scientific, industrial, and capitalist revolutions built a full head of steam, so to speak. Considering just how much America has developed since the Colonial Period, it’s no surprise society has outgrown its own founding documents. More pointedly, the intellectual commons was a much smaller environment, often restricted to a soapbox in the town square and the availability of books, periodicals, and broadsides. Today, the public square has moved online to a bewildering array of social media platforms that enable publication of one’s ideas well beyond the sound of one’s voice over a crowd or the bottleneck of a publisher’s printing press. It’s an entirely new development, and civil law has not kept pace. Whether Internet communications are regulated like the airwaves or nationalized like the U.S. military, it’s clear that the Wild West uber-democratic approach (where anyone can basically say anything) has failed. Demands for regulation (restrictions on free speech) are being taken seriously and acted upon by the private corporations that run social media platforms. During this interim phase, it’s easy for me, as a subscriber to liberal democratic values, to insist reflexively on free speech absolutism. The apparent mood of the public lies elsewhere.

Something in an online discussion brought me back to my days as a Boy Scout. (No, not that, with your nasty, nasty assumptions.) It was one of the first merit badges I earned: Citizenship in the Community (link to PDF). I can’t remember any of the content anymore (haven’t yet consulted the PDF), and indeed, looking back with the advantage of several decades of hindsight, I have a hard time imagining any of the (morality? ethics?) lessons learned back then having had much durable impact despite remembering an emerging confidence and awareness (a commonplace delusion of youth) of my position within the community. Still, I appreciate having had many Boy Scout character-building experiences, which led to simple and enduring understandings of ideals such as honor, duty, preparedness, service, forbearance, shouldering hardships, and perhaps most of all, accepting responsibility for others, particularly those younger and weaker. (I’m not claiming to be any sort of paragon of virtue. Cynicism and misanthropy may have wrecked that aspiration.) I never served in the military, but I surmise others learn similar lessons slightly later in life when more readily absorbed and not so easily forgotten. In the past decade plus, some may seek these lessons through participation in endurance sports or martial arts (if not distorted by bad instruction like in Cobra Kai), though the focus outward (i.e., toward community and mutual reliance) may not be as strong.

The subject came up in a discussion of participants in small-scale democracy, something I’ve always known is messy, unrewarding, thankless, and sometimes costly yet still necessary to be a good citizen contributing to one’s community. Many adults get their first taste of local democratic groups (read: self-governing) through parent groups like the Parent-Teacher Association (PTA). Or maybe it’s a performing arts organization, homeowners association, church council, social work hotline, self-help group, or cooperative. Doesn’t matter which. (Political activism and organizing might be something quite different. Hard to say.) Groups run on the good will and dedication of volunteered time and skills for the benefit of members of the community. As with any population, there are always free riders: those who contribute nothing but enjoy and/or extract benefits. In fact, if everyone were integrally involved, organizational complexity would become unmanageable. If activities of such groups seem like a piece of cake or vaguely utopian, just join one and see how different character types behave. Lotta dead wood in such organizations. Moreover, power mongers and self-aggrandizers often take over small-scale democracies and run them like private fiefdoms. Or difficult policy and finance discussions divide otherwise like-minded groups into antagonists. As I said, it’s a decidedly messy undertaking.

Members of the community outside of the executive group (typically a board of directors) also have legitimate interests. Maybe community members attend meetings to keep informed or weigh in online with unconstructive complaints and criticisms (or even mockery and trolling) but then refuse to contribute anything worthwhile. Indeed, boards often have difficulty recruiting new officers or participants because no one wants to take on responsibility and face potential criticism directed at them. I’ve also seen boards settle into the same few folks year after year whose opinions and leadership grow stale and calcify.

Writ large, leadership skills learned through citizenship in the community rise to the equivalents of Boy Scout merit badges Citizenship in the Nation and Citizenship in the World (no links but searchable). Skills deployed at those strata would arguably require even greater wherewithal and wisdom, with stakes potentially being much higher. Regrettably, having just passed through an election cycle and change of leadership in the U.S., my dour assessment is that leadership has failed miserably at multiple issues. The two most significant involve how we fail to organize society for the benefit of all, namely, economic equality and resource sustainability. Once market forces came to bear on social organization and corporate entities grew too large to be rooted in community service anymore, greed and corruption destroyed high-minded ideals. More self-aggrandizers and careerists than ever (no names, fill in the blanks, they’re all famous — or infamous) rose to the tops of organizations and administrations, especially politics, news media, and the punditry. Their logical antidotes are routinely and ruthlessly disenfranchised and/or ignored. The lasting results are financial inequality run amok and unsustainable resource addictions (energy mostly) that are toxifying the environment and reducing the landscape to ruin and uninhabitability. (Perpetual war is a third institutional failure that could be halted almost immediately if moral clarity were somehow to appear.) It’s all out there, plain to see, yet continues to mount because of execrable leadership. Some argue it’s really a problem with human nature, a kind of original stain on our souls that can never be erased and so should be forgiven or at least understood (and rationalized away) within a larger context. I’m not yet ready to excuse national and world leaders. Their culpability is criminal.

David Sirota, author of Back to our Future: How the 1980s Explain the World We Live in Now — Our Culture, Our Politics, Our Everything (2011), came to my attention (how else?) through a podcast. He riffed pretty entertainingly on his book, now roughly one decade old, like a rock ‘n’ roller stuck (re)playing his or her greatest hits into dotage. However, his thesis was strong and appealing enough that I picked up a copy (read: borrowed from the library) to investigate despite the datedness of the book (and my tardiness). It promised to be an easy read.

Sirota’s basic thesis is that memes and meme complexes (a/k/a memeplexes, though Sirota never uses the term meme) developed in the 80s and deployed through a combination of information and entertainment media (thus, infotainment) form the narrative background we take for granted in the early part of the 21st century. Children fed a steady diet of clichés, catchphrases, one-liners, archetypes, and story plots have now grown to adulthood and are scarcely able to peer behind the curtain to question the legitimacy or subtext of the narrative shapes and distortions imbibed during childhood like mother’s milk. The table of contents lists four parts (boldface section titles are Sirota’s; descriptive text is mine):

  • Liking Ike, Hating Woodstock. How the 50s and 60s decades were (the first?) assigned reductive demographic signifiers, handily ignoring the true diversity of experience during those decades. More specifically, the boom-boom 50s (economics, births) were recalled nostalgically in 80s TV and films while the 60s were recast as being all about those dirty, hairy hippies and their music, drugs, and sexual licentiousness, all of which had to be invalidated somehow to regain lost wholesomeness. The one-man promotional vehicle for this pleasing self-deception was Michael J. Fox, whose screen personae (TV and film) during the 80s (glorifying the 50s but openly shitting on the 60s) were instrumental in reforming attitudes about our mixed history.
  • The Jump Man Chronicles. How the Great Man Theory of History was developed through glorification of heroes, rogues, mavericks, and iconoclasts who came into their own during the 80s. That one-man vehicle was Michael Jordan, whose talents and personal magnetism were so outsized that everyone aspired to be “like Mike,” which is to say, a superhero elevated beyond mere mortal rules and thus immortalized. The effect was duplicated many times over in popular culture, with various entertainment icons and political operatives subverting thoughtful consideration of real-world problems in favor of jingoistic portrayals.
  • Why We (Continue to) Fight. How the U.S. military was rehabilitated after losing the Vietnam War, gifting us with today’s hypermilitarism and permanent wars. Two principal tropes were deployed to shape public opinion: the Legend of the Spat-Upon Veteran and the Hands-Tied-Behind-Their-Backs Myth. Each was trotted out reliably whenever we needed to misremember our past as fictionalized in the 80s.
  • The Huxtable Effect. How “America’s dad” helped accommodate race relations to white anxiety, primarily to sell a TV show. In contrast with various “ghetto TV” shows of the 70s that depicted urban working poor (various ethnicities), The Cosby Show presented an upscale black family who transcended race by simply ignoring the issue — a privilege of wealth and celebrity. The Obama campaign and subsequent administration copied this approach, pretending American society had become postracial despite his never truly being able to escape the modifier black because the default (no modifier needed) in America is always white. This is the most fraught part of the book, demonstrating that despite whatever instructions we get from entertainment media and pundits, we remain stuck in an unresolved, unhealed, inescapable trap.


I’ve mentioned the precautionary principle several times, most notably here. Little of our approach to precautions has changed in the two years since that blog post. At the same time, climate change and Mother Nature batter us aggressively. Eventualities remain predictable. Different precautions are being undertaken with respect to the pandemic currently gripping the planet. Arguably, the pandemic is either a subset of Mother Nature’s fury or, if the virus was created in a lab, a self-inflicted wound. Proper pandemic precautions have been confounded by undermining of authority, misinformation, lack of coordination, and politically biased narratives. I’m as confused as the next poor sap. However, low-cost precautions such as wearing masks are entirely acceptable, notwithstanding refusals of many Americans to cooperate after authorities muddied the question of their effectiveness so completely. More significant precautions such as lockdowns and business shutdowns have morphed into received wisdom among government bodies yet are questioned widely as being a cure worse than the disease, not to mention administrative overreach (conspiratorial conjecture withheld).

Now comes evidence published in the New England Journal of Medicine on November 11, 2020, that costly isolation is flatly ineffective at stemming infection rates. Here are the results and conclusions from the abstract of the published study:

Results
A total of 1848 recruits volunteered to participate in the study; within 2 days after arrival on campus, 16 (0.9%) tested positive for SARS-CoV-2, 15 of whom were asymptomatic. An additional 35 participants (1.9%) tested positive on day 7 or on day 14. Five of the 51 participants (9.8%) who tested positive at any time had symptoms in the week before a positive qPCR test. Of the recruits who declined to participate in the study, 26 (1.7%) of the 1554 recruits with available qPCR results tested positive on day 14. No SARS-CoV-2 infections were identified through clinical qPCR testing performed as a result of daily symptom monitoring. Analysis of 36 SARS-CoV-2 genomes obtained from 32 participants revealed six transmission clusters among 18 participants. Epidemiologic analysis supported multiple local transmission events, including transmission between roommates and among recruits within the same platoon.
Conclusions
Among Marine Corps recruits, approximately 2% who had previously had negative results for SARS-CoV-2 at the beginning of supervised quarantine, and less than 2% of recruits with unknown previous status, tested positive by day 14. Most recruits who tested positive were asymptomatic, and no infections were detected through daily symptom monitoring. Transmission clusters occurred within platoons.

So an initial 0.9% tested positive, then an additional 1.9%, for a total of 2.8%, which compares to 1.7% in the control group (tested but not isolated as part of the study). Perhaps the experimental and control groups are a bit small (1848 and 1554, respectively), and it’s not clear why the experimental group’s infection rate is higher than the control group’s, but the evidence points to the uselessness of trying to limit the spread of the virus by quarantine and/or isolation. Once the virus is present in a population, it spreads despite precautions.
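For readers who want to check my arithmetic against the quoted abstract, here is a quick sanity check. The counts come straight from the study’s Results; the rounding (and the variable names) are mine:

```python
# Sanity check of the infection rates reported in the NEJM study abstract.
# Counts are taken from the quoted Results section; rounding is mine.

quarantine_n = 1848        # recruits in the supervised-quarantine study group
positive_on_arrival = 16   # tested positive within 2 days of arrival
positive_later = 35        # additional positives on day 7 or day 14

control_n = 1554           # recruits who declined, with qPCR results available
control_positive = 26      # positives among them on day 14

arrival_rate = positive_on_arrival / quarantine_n
later_rate = positive_later / quarantine_n
total_rate = (positive_on_arrival + positive_later) / quarantine_n
control_rate = control_positive / control_n

print(f"on arrival: {arrival_rate:.1%}")   # 0.9%
print(f"later:      {later_rate:.1%}")     # 1.9%
print(f"total:      {total_rate:.1%}")     # 2.8%
print(f"control:    {control_rate:.1%}")   # 1.7%
```

The numbers reproduce the abstract’s figures, so the 2.8% vs. 1.7% comparison above is simply the study’s own counts restated.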

A mantra is circulating that we should “trust the science.” Are these results to be trusted? Can we call off all the lockdowns and closures? It’s been at least eight months that the virus has been raging throughout the U.S. Although there might be some instances of isolated populations with no infection, the wider population has by now been exposed. Moreover, some individuals who self-isolated effectively may not have been exposed, but in all likelihood, most of us have been. Accordingly, renewed lockdowns, school and business closures, and destruction of entire industries are a pretense of control we never really had. Their costs are enormous and ongoing. A stay-at-home order (advisory, if you prefer) just went into effect for the City of Chicago on November 16, 2020. My anecdotal observation is that most Chicagoans are ignoring it and going about their business much as they did during the summer and fall months. It’s nothing like the ghost town effect of March and April 2020. I daresay they may well be correct to reject the received wisdom of our civic leaders.

Caveat: rather overlong for me, but I got rolling …

One of the better articles I’ve read about the pandemic is this one by Robert Skidelsky at Project Syndicate (a publication I hadn’t heard of before). It reads as only slightly conspiratorial, purporting to reveal the true motivation for lockdowns and social distancing, namely, so-called herd immunity. If that’s the case, it’s basically a silent admission that no cure, vaccine, or inoculation is forthcoming and the spread of the virus can only be managed modestly until it has essentially raced through the population. Of course, the virus cannot be allowed to simply run its course unimpeded, but available impediments are limited. “Flattening the curve,” or distributing the infection and death rates over time, is the only attainable strategy and objective.

Wedding mathematical and biological insights, as well as the law of mass action in chemistry, into an epidemic model may seem obvious now, but it was novel roughly a century ago. We’re also now inclined, if scientifically oriented and informed, to understand the problem and its management in terms of engineering rather than medicine (or maybe in terms of triage and palliation). Global response has also made the pandemic into a political issue as governments obfuscate and conceal true motivations behind their handling (bumbling in the U.S.) of the pandemic. Curiously, the article also mentions financial contagion, which is shaping up to be worse in both severity and duration than the viral pandemic itself.
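The century-old model alluded to here is the Kermack–McKendrick SIR model, where the mass-action term (contacts between susceptible and infected fractions) drives new infections. A minimal sketch follows; the parameter values are illustrative only, not fitted to any real outbreak:

```python
# Minimal SIR epidemic model (Kermack-McKendrick, 1927), integrated with
# simple Euler steps. Beta (contact rate) and gamma (recovery rate) values
# below are illustrative, not estimates for any actual disease.

def sir(beta, gamma, s0=0.999, i0=0.001, days=200, dt=0.1):
    """Return the (S, I, R) population fractions over time."""
    s, i, r = s0, i0, 0.0
    history = [(s, i, r)]
    for _ in range(int(days / dt)):
        new_infections = beta * s * i * dt   # mass-action contact term
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# "Flattening the curve": lowering beta (e.g., via distancing) reduces and
# delays the peak of the infected fraction, spreading cases over time.
fast_peak = max(i for _, i, _ in sir(beta=0.4, gamma=0.1))
slow_peak = max(i for _, i, _ in sir(beta=0.2, gamma=0.1))
print(f"peak infected fraction, beta=0.4: {fast_peak:.1%}")
print(f"peak infected fraction, beta=0.2: {slow_peak:.1%}")
```

Halving the contact rate roughly cuts the peak of simultaneous infections by more than half, which is the entire logic behind distancing: not fewer eventual infections necessarily, but fewer at once.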


In educational philosophy, learning is often categorized in three domains: the cognitive, the affective, and the psychomotor (called Bloom’s Taxonomy). Although formal education admittedly concentrates primarily on the cognitive domain, a well-rounded person gives attention to all three. The psychomotor domain typically relates to tool use and manipulation, but if one considers the body itself a tool, then athletics and physical workouts are part of a balanced approach. The affective domain is addressed through a variety of mechanisms, not least of which is narrative, much of it entirely fictional. We learn how to process emotions through vicarious experience as a safe way to prepare for the real thing. Indeed, dream life is described as the unconscious mind’s mechanism for consolidating memory and experience as well as rehearsing prospective events (strategizing) in advance. Nightmares are, in effect, worst-case scenarios dreamt up for the purpose of avoiding the real thing (e.g., falling from a great height or venturing too far into the dark — a proxy for the unknown). Intellectual workouts address the cognitive domain. While some are happy to remain unbalanced, focusing on strengths found exclusively in a single domain (gym rats, eggheads, actors) and thus remaining physically, emotionally, or intellectually stunted or immature, most understand that workouts in all domains are worth seeking out as elements of healthy development.

One form of intellectual workout is debate, now offered by various media and educational institutions. Debate is quite old but has been embraced with renewed gusto in a quest to develop content (using new media) capable of drawing in viewers, which mixes educational objectives with commercial interests. The time-honored political debate used to be good for determining where to cast one’s vote but has become nearly useless in the last few decades, as neither the sponsoring organizations, the moderators, nor the candidates seem to understand anymore how to run a debate or behave properly. Instead, candidates use the opportunity to attack each other, ignore questions and glaring issues at hand, and generally refuse to offer meaningful responses to the needs of voters. Indeed, this last was among the principal innovations of Bill Clinton: roll out some appealing bit of vacuous rhetoric yet offer little to no guidance as to what policies will actually be pursued once in office. Two presidential administrations later, Barack Obama did much the same, which I consider a most egregious betrayal or bait-and-switch. Opinions vary.

In a recent Munk Debate, the proposition under consideration was whether humankind’s best days lie ahead or behind. Optimists won the debate by a narrow margin (determined by audience vote); however, debate on the issue is not binding truth, nor does debate really resolve the question satisfactorily. The humor and personalities of the debaters probably had more influence than their arguments. Admitting that I possess biases, I found myself inclined favorably toward the most entertaining character, though what I find entertaining is itself a further bias not necessarily shared by many others. In addition, I suspect the audience did not include many working-class folks or others who see their prospects for better lives diminishing rapidly, which skews the resulting vote. The age-old parental desire to leave one’s children a better future than one’s own is imperiled according to this poll (polls may vary considerably — do your own search). How one understands “better off” is highly variable, but it is usually understood in terms of material wellbeing.

Folks on my radar (names withheld) range widely in their enthusiasm or disdain for debate. The poles appear to range from default refusal to accept invitations to debate (often couched as open challenges to professed opinions), regarding them as a complete waste of time, to earnest desire to participate in, host, and/or moderate debates as a means of informing the public by providing the benefit of expert argumentation. As an intellectual workout, I appreciate the opportunity to hear debates (at least when I’m not exasperated by a speaker’s lack of discipline or end-around arguments), but readers can guess from the title of this post that I expect nothing to be resolved by debate. Were I ever offered an opportunity to participate, I can well imagine accepting the invitation and having some fun flexing my intellectual muscles, but I would enter into the event with utterly no expectation of being able to convince anyone of anything. Minds are already too well made up on most issues. If I were offered a spot on some bogus news-and-opinion show to be a talking head, shot from the shoulders up and forced to shout and interrupt to get a brief comment or soundbite in edgewise, that I would decline handily as a total waste of time.