Archive for the ‘Education’ Category

So far, this multipart blog post has trafficked in principles and generalities. Let me try now to be more specific, starting with an excerpt from Barry Lynn’s article in Harper’s Magazine titled “The Big Tech Extortion Racket” (Sept. 2020):

… around the middle of the nineteenth century, Americans began to develop technologies that could not be broken into component pieces. This was especially true of the railroad and the telegraph … Such corporations [railroad and telegraph companies] posed one overarching challenge: they charged some people more than others to get to market. They exploited their control over an essential service in order to extort money, and sometimes political favors … Americans found the answer to this problem in common law. For centuries, the owners of ferries, stagecoaches, and inns had been required to serve all customers for the same price and in the order in which they arrived. In the late nineteenth century, versions of such “common carrier” rules were applied to the new middleman corporations.

Today we rightly celebrate the Sherman Antitrust Act of 1890, which gave Americans the power to break apart private corporations. But in many respects, the Interstate Commerce Act of 1887 was the more important document. This act was based on the understanding that monopoly networks like the railroad and the telegraph could be used to influence the actions of people who depend on them, and hence their power must be carefully restricted …

For a century and a half, Americans used common carrier policies to ensure the rule of law in activities that depended on privately held monopolies … regulations freed Americans to take full advantage of every important network technology introduced during these years, including telephones, water and electrical services, energy pipelines, and even large, logistics-powered retailers. Citizens did not have to worry that the men who controlled the technologies involved would exploit their middleman position to steal other people’s business or disrupt balances of power.

I appreciate that Barry Lynn brings up the Interstate Commerce Act. If this legal doctrine appeared in the net neutrality debate a few years ago, it must have escaped my notice. While Internet Service Providers (ISPs) enable network access and connectivity, those utilities have not yet exhibited let’s-be-evil characteristics. Similarly, phone companies (including cell phones) and public libraries may well be eavesdropping and/or monitoring activities of the citizenry, but the real action lies elsewhere, namely, on social media networks and with online retailers. Evil is arguably concentrated in the FANG (or FAANG) corporations but has now grown to be ubiquitous in all social networks (e.g., Twitter) operating as common carriers (Zoom? Slack?) and across academe, nearly all of which have succumbed to moral panic. Sad to observe, they are correctly interpreting demands to censor and sanitize others’ no-longer-free speech appearing on their networks or within their realms. How much deeper it goes toward shaping politics and social engineering is quasi-conspiratorial and impossible for me to assess.

Much as I would prefer to believe that individuals possess the good sense to shift their activities away from social networks or turn their attention from discomfiting information sources, that does not appear to be the case. Demands for trigger warnings and safe spaces, commonplace a few years ago on college campuses, have instead morphed into censorious removal, deplatforming, and cancellation from the entire public sphere. Those are wrong responses in free societies, but modern institutions and technologies have gotten out of hand and outstripped the limits of normal human cognition. In short, we’re a society gone mad. So rather than accept responsibility to sort out information overflow oneself, many are demanding that others do it for them, and evil private corporations are complying (after a fashion). Moreover, calls for creation of an Orwellian Ministry of Truth, rebranded as a Truth Commission and Reality Czar, could hardly be any more chillingly and fascistically bizarre. People really need someone to decide (brainwash?) for them what is real? Has anyone at the New York Times actually read Orwell’s dystopian novel 1984 and taken to heart its lessons?

From an article in the Sept. 2020 issue (I’m lagging in my reading) of Harper’s Magazine by Laurent Dubreuil titled “Nonconforming”:

American academia is a hotbed of proliferating identities supported and largely shaped by the higher ranks of administrators, faculty, student groups, alumni, and trustees. Not all identities are equal in dignity, history, or weight. Race, gender, and sexual orientation were the three main dimensions of what in the 1970s began to be called identity politics. These traits continue to be key today. But affirmed identities are mushrooming.

… identity politics as now practiced does not put an end to racism, sexism, or other sorts of exclusion or exploitation. Ready-made identities imprison us in stereotyped narratives of trauma. In short, identity determinism has become an additional layer of oppression, one that fails to address the problems it clumsily articulates.

Considering the acceleration of practically everything in the late-modern world (postmodern refers to something quite different), which makes planning one’s higher education somewhat fraught if the subject matter studied is rendered flatly out-of-date or moribund by the time of either graduation or entry into the workforce, I’ve heard it recommended that expertise in any particular subject area may be less important than developing expertise in at least one subject that takes a systems approach. That system might be language and communications, mathematics (or any other hard science), history, economics and finance, business administration, computer coding, law and governance, etc. So long as a rigorous understanding of procedures and rules is developed, a structuralist mindset can be repeated and transferred into other subject areas. Be careful, however, not to conflate this approach with a liberal arts education, which is sometimes described as learning how to learn and is widely applicable across disciplines. The liberal arts have fallen distinctly out of favor in the highly technological and technocratic world, which cares little for human values resistant to quantification. Problem is, Western societies in particular are based on liberal democratic institutions now straining due to their sclerotic old age. And because a liberal arts education is scarcely undertaken anymore, civics and citizenship are no longer taught. Even the study of English has now been corrupted (postmodern does apply here) to the point that the basic liberal arts skill of critical thinking is being lost through attrition. Nowhere is that more abundantly clear than in bristling debate over free speech and censorship.

Aside. Although society tinkers and refines itself (sometimes declines) over time, a great body of cultural inheritance informs how things are done properly within an ideology or system. When tinkering and refinement become outright intransigence and defiance of an established order, it’s commonplace to hear the objection “but that’s not how _______ works.” For instance, debate over climate science or the utility of vaccines often has one party proclaiming “trust [or believe] the science.” However, that’s not how science works (i.e., through unquestioning trust or belief). The scientific method properly understood includes verification, falsification, and revision when results and assertions fail to establish reasonable certainty (not the same as consensus). Similarly, critical thinking includes a robust falsification check before “facts” can be accepted at face value. So-called “critical studies” (a/k/a grievance studies), like religious faith, typically positions bald assertions beyond the reach of falsification. Well, sorry, that’s not how critical thinking works.

Being older and educated before critical studies were fully legitimized (or gave rise to things as risible as feminist glaciology), I have always understood free speech and other rights to be absolutes that cannot be sliced and diced into bits. That way lies casuistry, where law founders frequently. Thus, if one wishes, say, to trample or burn the U.S. flag in protest, no law can be passed or constitutional amendment enacted to carve out an exception disallowing that instance of dissenting free speech. A lesser example is kneeling silently rather than participating in singing the national anthem before a sporting event. Though offensive to certain individuals’ sensibilities, silencing speech is far worse according to liberal democratic values. Whatever our ideological or political differences are, we cannot work them out when one party has the power to place topics out of bounds or remove others from discussion entirely. The point at which spirited debate crosses over into inciting violence or fomenting insurrection is a large gray area, which is the subject of the second impeachment of 45. Civil law covers such contingencies, so abridging free speech, deplatforming, and adopting the formulation “language is violence” are highly improper responses under the liberal form of government codified in the U.S. Constitution, including the Bill of Rights, originally omitted but quickly added to articulate those rights fully.

Liberal democratic ideology arose in mercantile, substantially agrarian Western societies before scientific, industrial, and capitalist revolutions built a full head of steam, so to speak. Considering just how much America has developed since the Colonial Period, it’s no surprise society has outgrown its own founding documents. More pointedly, the intellectual commons was a much smaller environment, often restricted to a soapbox in the town square and the availability of books, periodicals, and broadsides. Today, the public square has moved online to a bewildering array of social media platforms that enable publication of one’s ideas well beyond the sound of one’s voice over a crowd or the bottleneck of a publisher’s printing press. It’s an entirely new development, and civil law has not kept pace. Whether Internet communications are regulated like the airwaves or nationalized like the U.S. military, it’s clear that the Wild West uber-democratic approach (where anyone can basically say anything) has failed. Demands for regulation (restrictions on free speech) are being taken seriously and acted upon by the private corporations that run social media platforms. During this interim phase, it’s easy for me, as a subscriber to liberal democratic values, to insist reflexively on free speech absolutism. The apparent mood of the public lies elsewhere.

Something in an online discussion brought me back to my days as a Boy Scout. (No, not that, with your nasty, nasty assumptions.) It was one of the first merit badges I earned: Citizenship in the Community (link to PDF). I can’t remember any of the content anymore (haven’t yet consulted the PDF), and indeed, looking back with the advantage of several decades of hindsight, I have a hard time imagining any of the (morality? ethics?) lessons learned back then having had much durable impact despite remembering an emerging confidence and awareness (a commonplace delusion of youth) of my position within the community. Still, I appreciate having had many Boy Scout character-building experiences, which led to simple and enduring understandings of ideals such as honor, duty, preparedness, service, forbearance, shouldering hardships, and perhaps most of all, accepting responsibility for others, particularly those younger and weaker. (I’m not claiming to be any sort of paragon of virtue. Cynicism and misanthropy may have wrecked that aspiration.) I never served in the military, but I surmise others learn similar lessons slightly later in life when more readily absorbed and not so easily forgotten. In the past decade plus, some may seek these lessons through participation in endurance sports or martial arts (if not distorted by bad instruction like in Cobra Kai), though the focus outward (i.e., toward community and mutual reliance) may not be as strong.

The subject came up in a discussion of participants in small-scale democracy, something I’ve always known is messy, unrewarding, thankless, and sometimes costly yet still necessary to be a good citizen contributing to one’s community. Many adults get their first taste of local democratic groups (read: self-governing) through parent groups like the Parent-Teacher Association (PTA). Or maybe it’s a performing arts organization, homeowners’ association, church council, social work hotline, self-help group, or cooperative. Doesn’t matter which. (Political activism and organizing might be something quite different. Hard to say.) Groups run on the good will and dedication of volunteered time and skills for the benefit of members of the community. As with any population, there are always free riders: those who contribute nothing but enjoy and/or extract benefits. In fact, if everyone were integrally involved, organizational complexity would become unmanageable. If activities of such groups seem like a piece of cake or vaguely utopian, just join one and see how different character types behave. Lotta dead wood in such organizations. Moreover, power mongers and self-aggrandizers often take over small-scale democracies and run them like private fiefdoms. Or difficult policy and finance discussions divide otherwise like-minded groups into antagonists. As I said, it’s a decidedly messy undertaking.

Members of the community outside of the executive group (typically a board of directors) also have legitimate interests. Maybe community members attend meetings to keep informed or weigh in online with unconstructive complaints and criticisms (or even mockery and trolling) but then refuse to contribute anything worthwhile. Indeed, boards often have difficulty recruiting new officers or participants because no one wants to take on responsibility and face potential criticism directed at them. I’ve also seen boards settle into the same few folks year after year whose opinions and leadership grow stale and calcify.

Writ large, leadership skills learned through citizenship in the community rise to the equivalents of Boy Scout merit badges Citizenship in the Nation and Citizenship in the World (no links but searchable). Skills deployed at those strata would arguably require even greater wherewithal and wisdom, with stakes potentially being much higher. Regrettably, having just passed through an election cycle and change of leadership in the U.S., my dour assessment is that leadership has failed miserably on multiple issues. The two most significant involve how we fail to organize society for the benefit of all, namely, economic equality and resource sustainability. Once market forces came to bear on social organization and corporate entities grew too large to be rooted in community service anymore, greed and corruption destroyed high-minded ideals. More self-aggrandizers and careerists than ever (no names, fill in the blanks, they’re all famous — or infamous) rose to the tops of organizations and administrations, especially politics, news media, and the punditry. Their logical antidotes are routinely and ruthlessly disenfranchised and/or ignored. The lasting results are financial inequality run amok and unsustainable resource addictions (energy mostly) that are toxifying the environment and reducing the landscape to ruin and uninhabitability. (Perpetual war is a third institutional failure that could be halted almost immediately if moral clarity were somehow to appear.) It’s all out there, plain to see, yet continues to mount because of execrable leadership. Some argue it’s really a problem with human nature, a kind of original stain on our souls that can never be erased and so should be forgiven or at least understood (and rationalized away) within a larger context. I’m not yet ready to excuse national and world leaders. Their culpability is criminal.

David Sirota, author of Back to our Future: How the 1980s Explain the World We Live in Now — Our Culture, Our Politics, Our Everything (2011), came to my attention (how else?) through a podcast. He riffed pretty entertainingly on his book, now roughly one decade old, like a rock ‘n’ roller stuck (re)playing his or her greatest hits into dotage. However, his thesis was strong and appealing enough that I picked up a copy (read: borrowed from the library) to investigate despite the datedness of the book (and my tardiness). It promised to be an easy read.

Sirota’s basic thesis is that memes and meme complexes (a/k/a memeplexes, though Sirota never uses the term meme) developed in the 80s and deployed through a combination of information and entertainment media (thus, infotainment) form the narrative background we take for granted in the early part of the 21st century. Children fed a steady diet of clichés, catchphrases, one-liners, archetypes, and story plots have now grown to adulthood and are scarcely able to peer behind the curtain to question the legitimacy or subtext of the narrative shapes and distortions imbibed during childhood like mother’s milk. The table of contents lists four parts (boldface section titles are Sirota’s; descriptive text is mine):

  • Liking Ike, Hating Woodstock. How the 50s and 60s decades were (the first?) assigned reductive demographic signifiers, handily ignoring the true diversity of experience during those decades. More specifically, the boom-boom 50s (economics, births) were recalled nostalgically in 80s TV and films while the 60s were recast as being all about those dirty, hairy hippies and their music, drugs, and sexual licentiousness, all of which had to be invalidated somehow to regain lost wholesomeness. The one-man promotional vehicle for this pleasing self-deception was Michael J. Fox, whose screen personae (TV and film) during the 80s (glorifying the 50s but openly shitting on the 60s) were instrumental in reforming attitudes about our mixed history.
  • The Jump Man Chronicles. How the Great Man Theory of History was developed through glorification of heroes, rogues, mavericks, and iconoclasts who came into their own during the 80s. That one-man vehicle was Michael Jordan, whose talents and personal magnetism were so outsized that everyone aspired to be “like Mike,” which is to say, a superhero elevated beyond mere mortal rules and thus immortalized. The effect was duplicated many times over in popular culture, with various entertainment icons and political operatives subverting thoughtful consideration of real-world problems in favor of jingoistic portrayals.
  • Why We (Continue to) Fight. How the U.S. military was rehabilitated after losing the Vietnam War, gifting us with today’s hypermilitarism and permanent wars. Two principal tropes were deployed to shape public opinion: the Legend of the Spat upon Veteran and the Hands Tied Behind Their Backs Myth. Each was trotted out reliably whenever we needed to misremember our past as fictionalized in the 80s.
  • The Huxtable Effect. How “America’s dad” helped accommodate race relations to white anxiety, primarily to sell a TV show. In contrast with various “ghetto TV” shows of the 70s that depicted urban working poor (various ethnicities), The Cosby Show presented an upscale black family who transcended race by simply ignoring the issue — a privilege of wealth and celebrity. The Obama campaign and subsequent administration copied this approach, pretending American society had become postracial despite his never truly being able to escape the modifier black because the default (no modifier needed) in America is always white. This is the most fraught part of the book, demonstrating that despite whatever instructions we get from entertainment media and pundits, we remain stuck in an unresolved, unhealed, inescapable trap.


I’ve mentioned the precautionary principle several times, most notably here. Little of our approach to precautions has changed in the two years since that blog post. At the same time, climate change and Mother Nature batter us aggressively. Eventualities remain predictable. Different precautions are being undertaken with respect to the pandemic currently gripping the planet. Arguably, the pandemic is either a subset of Mother Nature’s fury or, if the virus was created in a lab, a self-inflicted wound. Proper pandemic precautions have been confounded by undermining of authority, misinformation, lack of coordination, and politically biased narratives. I’m as confused as the next poor sap. However, low-cost precautions such as wearing masks are entirely acceptable, notwithstanding refusals of many Americans to cooperate after authorities muddied the question of their effectiveness so completely. More significant precautions such as lockdowns and business shutdowns have morphed into received wisdom among government bodies yet are questioned widely as being a cure worse than the disease, not to mention administrative overreach (conspiratorial conjecture withheld).

Now comes evidence published in the New England Journal of Medicine on November 11, 2020, that costly isolation is flatly ineffective at stemming infection rates. Here are the results and conclusions from the abstract of the published study:

Results
A total of 1848 recruits volunteered to participate in the study; within 2 days after arrival on campus, 16 (0.9%) tested positive for SARS-CoV-2, 15 of whom were asymptomatic. An additional 35 participants (1.9%) tested positive on day 7 or on day 14. Five of the 51 participants (9.8%) who tested positive at any time had symptoms in the week before a positive qPCR test. Of the recruits who declined to participate in the study, 26 (1.7%) of the 1554 recruits with available qPCR results tested positive on day 14. No SARS-CoV-2 infections were identified through clinical qPCR testing performed as a result of daily symptom monitoring. Analysis of 36 SARS-CoV-2 genomes obtained from 32 participants revealed six transmission clusters among 18 participants. Epidemiologic analysis supported multiple local transmission events, including transmission between roommates and among recruits within the same platoon.
Conclusions
Among Marine Corps recruits, approximately 2% who had previously had negative results for SARS-CoV-2 at the beginning of supervised quarantine, and less than 2% of recruits with unknown previous status, tested positive by day 14. Most recruits who tested positive were asymptomatic, and no infections were detected through daily symptom monitoring. Transmission clusters occurred within platoons.

So an initial 0.9% tested positive, then an additional 1.9%. This total 2.8% compares to 1.7% in the control group (tested but not isolated as part of the study). Perhaps the experimental and control groups are a bit small (1848 and 1554, respectively), and it’s not clear why the experimental group infection rate is higher than that of the control group, but the evidence points to the uselessness of trying to limit the spread of the virus by quarantining and/or isolation. Once the virus is present in a population, it spreads despite precautions.
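For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope recomputation of those percentages. The counts come straight from the abstract quoted above; the variable names and grouping labels are merely my shorthand:

```python
# Recomputing the quoted percentages from the NEJM abstract's raw counts.
participants = 1848     # recruits who volunteered for the study
early_pos = 16          # positive within 2 days of arrival
later_pos = 35          # additional positives on day 7 or day 14

nonparticipants = 1554  # recruits who declined, with qPCR results available
nonpart_pos = 26        # of those, positive on day 14

print(f"early positives:      {early_pos / participants:.1%}")                # ~0.9%
print(f"later positives:      {later_pos / participants:.1%}")                # ~1.9%
print(f"total positives:      {(early_pos + later_pos) / participants:.1%}")  # ~2.8%
print(f"nonparticipant group: {nonpart_pos / nonparticipants:.1%}")           # ~1.7%
```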

A mantra is circulating that we should “trust the science.” Are these results to be trusted? Can we call off all the lockdowns and closures? It’s been at least eight months that the virus has been raging throughout the U.S. Although there might be some instances of isolated populations with no infection, the wider population has by now been exposed. Moreover, some individuals who self-isolated effectively may not have been exposed, but in all likelihood, most of us have been. Accordingly, renewed lockdowns, school and business closures, and destruction of entire industries are a pretense of control we never really had. Their costs are enormous and ongoing. A stay-at-home order (advisory, if you prefer) just went into effect for the City of Chicago on November 16, 2020. My anecdotal observation is that most Chicagoans are ignoring it and going about their business similar to summer and fall months. It’s nothing like the ghost town effect of March and April 2020. I daresay they may well be correct to reject the received wisdom of our civic leaders.

Caveat: rather overlong for me, but I got rolling …

One of the better articles I’ve read about the pandemic is this one by Robert Skidelsky at Project Syndicate (a publication I hadn’t heard of before). It reads as only slightly conspiratorial, purporting to reveal the true motivation for lockdowns and social distancing, namely, so-called herd immunity. If that’s the case, it’s basically a silent admission that no cure, vaccine, or inoculation is forthcoming and the spread of the virus can only be managed modestly until it has essentially raced through the population. Of course, the virus cannot be allowed to simply run its course unimpeded, but available impediments are limited. “Flattening the curve,” or distributing the infection and death rates over time, is the only attainable strategy and objective.

Wedding mathematical and biological insights, as well as the law of mass action in chemistry, into an epidemic model may seem obvious now, but it was novel roughly a century ago. We’re also now inclined, if scientifically oriented and informed, to understand the problem and its potential solutions (or rather, its management) in terms of engineering rather than medicine (or maybe in terms of triage and palliation). Global response has also made the pandemic into a political issue as governments obfuscate and conceal true motivations behind their handling (bumbling in the U.S.) of the pandemic. Curiously, the article also mentions financial contagion, which is shaping up to be worse in both severity and duration than the viral pandemic itself.
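For the curious, here is a minimal sketch of the sort of century-old model in question: the standard SIR (susceptible–infected–recovered) compartments driven by a mass-action term. The parameter values are illustrative placeholders, not estimates for SARS-CoV-2, and the toy code is mine, not anything drawn from Skidelsky’s article:

```python
# Toy SIR epidemic model: the law of mass action applied to infection.
# Continuous form: dS/dt = -beta*S*I/N, dI/dt = beta*S*I/N - gamma*I, dR/dt = gamma*I
# beta = transmission rate, gamma = recovery rate (illustrative values only).

def sir(beta, gamma=0.1, n=1_000_000, i0=10, days=1000, dt=0.5):
    s, i, r = n - i0, float(i0), 0.0
    peak = 0.0
    for _ in range(int(days / dt)):
        new_infections = beta * s * i / n * dt  # mass-action term
        recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - recoveries
        r += recoveries
        peak = max(peak, i)
    return round(peak), round(r)  # peak simultaneous infections, total ever infected

# "Flattening the curve": halving beta lowers and delays the peak and leaves
# more of the population uninfected once the wave subsides.
print(sir(beta=0.3))   # fast, tall wave
print(sir(beta=0.15))  # slower, flatter wave
```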


In educational philosophy, learning is often categorized in three domains: the cognitive, the affective, and the psychomotor (called Bloom’s Taxonomy). Although formal education admittedly concentrates primarily on the cognitive domain, a well-rounded person gives attention to all three. The psychomotor domain typically relates to tool use and manipulation, but if one considers the body itself a tool, then athletics and physical workouts are part of a balanced approach. The affective domain is addressed through a variety of mechanisms, not least of which is narrative, much of it entirely fictional. We learn how to process emotions through vicarious experience as a safe way to prepare for the real thing. Indeed, dream life is described as the unconscious mind’s mechanism for consolidating memory and experience as well as rehearsing prospective events (strategizing) in advance. Nightmares are, in effect, worst-case scenarios dreamt up for the purpose of avoiding the real thing (e.g., falling from a great height or venturing too far into the dark — a proxy for the unknown). Intellectual workouts address the cognitive domain. While some are happy to remain unbalanced, focusing on strengths found exclusively in a single domain (gym rats, eggheads, actors) and thus remaining physically, emotionally, or intellectually stunted or immature, most understand that workouts in all domains are worth seeking out as elements of healthy development.

One form of intellectual workout is debate, now offered by various media and educational institutions. Debate is quite old but has been embraced with renewed gusto in a quest to develop content (using new media) capable of drawing in viewers, which mixes educational objectives with commercial interests. The time-honored political debate used to be good for determining where to cast one’s vote but has become nearly useless in the last few decades as neither the sponsoring organizations, the moderators, nor the candidates seem to understand anymore how to run a debate or behave properly. Instead, candidates use the opportunity to attack each other, ignore questions and glaring issues at hand, and generally refuse to offer meaningful responses to the needs of voters. Indeed, this last was among the principal innovations of Bill Clinton: roll out some appealing bit of vacuous rhetoric yet offer little to no guidance what policies will actually be pursued once in office. Two presidential administrations later, Barack Obama did much the same, which I consider a most egregious betrayal or bait-and-switch. Opinions vary.

In a recent Munk Debate, the proposition under consideration was whether humankind’s best days lie ahead or behind. Optimists won the debate by a narrow margin (determined by audience vote); however, debate on the issue is not binding truth, nor does debate really resolve the question satisfactorily. The humor and personalities of the debaters probably had more influence than their arguments. Admitting that I possess biases, I found myself inclined favorably toward the most entertaining character, though what I find entertaining is itself a further bias not necessarily shared by many others. In addition, I suspect the audience did not include many working-class folks or others who see their prospects for better lives diminishing rapidly, which skews the resulting vote. The age-old parental desire to leave one’s children a better future than their own is imperiled according to this poll (polls may vary considerably — do your own search). How one understands “better off” is highly variable, but it is usually taken to mean material wellbeing.

Folks on my radar (names withheld) range widely in their enthusiasm or disdain for debate. The poles appear to be, at one extreme, default refusal to accept invitations to debate (often couched as open challenges to professed opinions) as a complete waste of time and, at the other, earnest desire to participate in, host, and/or moderate debates as a means of informing the public by providing the benefit of expert argumentation. As an intellectual workout, I appreciate the opportunity to hear debates (at least when I’m not exasperated by a speaker’s lack of discipline or end-around arguments), but readers can guess from the title of this post that I expect nothing to be resolved by debate. Were I ever to be offered an opportunity to participate, I can well imagine accepting the invitation and having some fun flexing my intellectual muscles, but I would enter into the event with utterly no expectation of being able to convince anyone of anything. Minds are already too well made up on most issues. If I were offered a spot on some bogus news-and-opinion show to be a talking head, shot from the shoulders up and forced to shout and interrupt to get a brief comment or soundbite in edgewise, that I would decline handily as a total waste of time.

Nicholas Carr has a pair of thoughtful new posts at his blog Rough Type (see blogroll) under the tag “infinite media.” The second of the two is about context collapse, restoration, and content collapse. I won’t review that particular post; I’m merely pointing to it for you to read. Carr is a journalist and media theorist whose work is especially interesting to me as a partial antidote to what I’ve been calling our epistemological crisis. In short, he offers primers on how to think about stuff, that stuff being the primary medium through which most people now gather information: via screens.

Relatedly, the other media theorist to whom I pay attention is Alan Jacobs, who has a recent book (which I read but didn’t review or blog about) called more simply How to Think. It’s about recognizing and avoiding cognitive biases on the way to more disciplined, clear thinking. I mention these two fellows together because I’ve been reading their blogs and books for over a decade now and have been curious to observe how their public interactions have changed over time. They have each embraced and abandoned various new media (particularly social media) and adopted a more stringent media ecology. Carr posts occasionally now and has closed comments at his blog (a shame, since his commentariat was valuable, quite unlike the troll mob at most sites). Jacobs is even more aggressive, starting and abandoning one blog after another (he was active at multiple URLs, one formerly on my blogroll) and deleting his Twitter account entirely. Whatever goings-on occur at Facebook I can’t say; I never go there. These aren’t criticisms. We all evolve our associations and activities. But these two are unusual, perhaps, in that they evaluate and recommend with varying vehemence how to interact with electronic media tools.

The wide-open Web available to Americans (but restricted in some countries) used to be valorized as a wholly democratic, organic, grass-roots, decentralized force for good where information yearned to breathe free. Though pioneered by academic institutions, it wasn’t long before the porn industry became the first to monetize it effectively (cuz duh! that’s where the money was — at least initially) and then the whole thing was eventually overwhelmed by others with unique agendas and mechanisms, including commerce, surveillance, and propaganda. The surfeit of information demanded curation, and social media with algorithmic feeds became the default for folks either too lazy or just untrained (or uninterested) in how to think for themselves. Along the way, since a surprisingly large portion of human activity has diverted to online media, that activity turned into a resource mined, harvested, and in turn monetized, much like the voting public has become a resource tracked, polled, channeled, activated, disenfranchised, corrupted, and analyzed to death.

An earlier media theorist I read with enthusiasm, Neil Postman, recommended that curricula include the study of semantics as applied to media. (Use of a word like semantics sends nonacademics running for the hills, but the recommendation is basically about thinking critically, even skeptically, regarding information, its sources, and its means of distribution.) The rise of handheld omnimedia postdates Postman, so I can only surmise that the bewildering array of information we confront and absorb every day, which I liken to drinking from a fire hose, only compounds Postman’s concern that students are severely overmatched by media (especially advertising) intent on colonizing and controlling their minds. Thus, today’s information environment is a far cry from the stately slowness of earlier eras when teaching and learning (to say nothing of entertainment) were conducted primarily through reading, lecture, and discussion.

A comment came in on this blog chiding me for still blogging after 14 years. I admit hardly anyone reads anymore; they watch (or listen, as with audio-only podcasts). Preferred forms of media consumption have moved on from printed text, something USA Today recognized decades ago when it designed its print publication and sidewalk distribution boxes to look more like TVs. Nonetheless, the modest reproach reminded me of a cry in the wilderness by Timothy Burke: why he still blogs, though quite infrequently. (There’s a brokeback can’t-quit-you joke in there somewhere I’ll leave unformulated.) So this blog may indeed be past its proper expiration date, yet it remains for me one of the best means for organizing how I think about stuff. Without it, I’m afraid thoughts would be rattling loose inside my head, disorganized, only to be displaced by the next slurp from the fire hose.

For readers coming to this blog post lacking context, I’m currently reading and book-blogging Pankaj Mishra’s Age of Anger. It explores Western intellectual history that gives rise to feelings of radical discontent over injustices that have not been addressed or remedied successfully for the entirety of the modern era despite centuries of supposed progress.

Continuing from part 1, the case of Voltaire is a curious one. A true child of the Enlightenment, he came along too late (my inference) to participate in the formulation of foundational Enlightenment ideals but later became one of their chief proponents as they diffused throughout Europe and into Russia and elsewhere. He joined many, many others in a belief (against a preponderance of evidence) in human progress, if not perfectibility. (Technical progress is an entirely different matter.) One of the significant aspects of his ideology and writings was his sustained attack on Christianity, or more particularly, Catholicism. More than three centuries later, the secularization of Europe and diminished influence of medieval church dogma stand out as part of the same intellectual tradition.

Enlightenment canon includes aspirational faith in the ability of reason, mechanisms, systems, and administrative prowess to order the affairs of men properly. (How one defines properly, as distinct from equitably or justly, is a gaping hole primed for debate.) In the course of the last few centuries, history has demonstrated that instrumental logic spawned by this ideology has given rise to numerous totalitarian regimes that have subjugated entire populations, often quite cruelly, in modernizing and Westernizing projects. Voltaire found himself in the thick of such projects by willingly aligning himself with despots and rulers who victimized their own peoples in pursuit of industrialization and imitation of urbane French and British models. Russians Peter the Great (reigned May 7, 1682 to February 8, 1725) and Catherine the Great (reigned July 9, 1762 to November 17, 1796) were among those for whom Voltaire acted as apologist and intellectual co-conspirator. Here’s what Mishra has to say:

Voltaire was an unequivocal top-down modernizer, like most of the Enlightenment philosophes, and an enraptured chronicler in particular of Peter the Great. Russian peasants had paid a steep price for Russia’s Westernization, exposed as they were to more oppression and exploitation as Peter tried in the seventeenth century to build a strong military and bureaucratic state. Serfdom, near extinct in most of Western Europe by the thirteenth century, was actually strengthened by Peter in Russia. Coercing his nobles into lifetime service to the state, [effectively] postponing the emergence of a civil society, Peter the Great waged war endlessly. But among educated Europeans, who until 1789 saw civilization as something passed down from the enlightened few to the ignorant many, Russia was an admirably progressive model. [pp. 98–99]

and slightly later

… it was Voltaire who brought a truly religious ardour to the cult of Catherine. As the Empress entered into war with Poland and Turkey in 1768, Voltaire became her cheerleader. Catherine claimed to be protecting the rights of religious minorities residing in the territories of her opponents. The tactic, repeatedly deployed by later European imperialists in Asia and Africa, had the expected effect on Voltaire, who promptly declared Catherine’s imperialistic venture to be a crusade for the Enlightenment. [p. 102]

No doubt plenty of rulers throughout history understood in the proverbial sense that to make an omelette, a few eggs must be broken, and that by extension, their unpopular decisions must be reshaped and propagandized to the masses to forestall open revolt. Whose eggs are ultimately broken is entirely at issue. That basic script is easily recognizable as being at work even today. Justifications for administrative violence ought to fail to convince those on the bottom rungs of society who make most of the real sacrifices — except that propaganda works. Thus, the United States’ multiple, preemptive wars of aggression and regime change (never fully declared or even admitted as such) have continued to be supported or at least passively accepted by a majority of Americans until quite recently. Mishra makes this very same point using an example different from mine:

… cossetted writers and artists would in the twentieth century transfer their fantasies of an ideal society to Soviet leaders, who seemed to be bringing a superhuman energy and progressive rhetoric to Peter the Great’s rational schemes of social engineering. Stalin’s Russia, as it ruthlessly eradicated its religious and evidently backward enemies in the 1930s, came to ‘constitute … a quintessential Enlightenment utopia’. But the Enlightenment philosophes had already shown, in their blind adherence to Catherine, how reason could degenerate into dogma and new, more extensive forms of domination, authoritarian state structures, violent top-down manipulation of human affairs (often couched in terms of humanitarian concern) and indifference to suffering. [pp. 104–105]

As I reread the chapter in preparation for this blog post, I was surprised to find somewhat less characterization of Voltaire than of Rousseau. Indeed, it is more through Rousseau’s criticism of the dominant European paradigm that the schism between competing intellectual traditions is explored. Mishra circles back to Rousseau repeatedly but does not hesitate to show where his ideas, too, are insufficient. For instance, whereas pro-Enlightenment thinkers are often characterized as being lost in abstraction and idealization (i.e., ideologically possessed), thus estranged from practical reality or history, Rousseau’s empathy and identification with commoners do not provide enough structure for him to construct a viable alternative to the historical thrust of the day. Mishra quotes a contemporary critic (Joseph de Maistre) who charged Rousseau with irresponsible radicalism:

… he often discovers remarkable truths and expresses them better than anyone else, but these truths are sterile to his hands … No one shapes their materials better than he, and no one builds more poorly. Everything is good except his systems. [p. 110]

The notion that leaders (monarchs, emperors, presidents, prime ministers, social critics, and more recently, billionaires) ought to be in the business of engineering society rather than merely managing it is tacitly assumed. Indeed, there is a parallel hubris present in Rousseau as a thought leader having questionable moral superiority through his vehement criticism of the Enlightenment:

His confidence and self-righteousness derived from his belief that he had at least escaped the vices of modern life: deceit and flattery. In his solitude, he was convinced, like many converts to ideological causes and religious beliefs, that he was immune to corruption. A conviction of his incorruptibility was what gave his liberation from social pieties a heroic aura and moved him from a feeling of powerlessness to omnipotence. In the movement from victimhood to moral supremacy, Rousseau enacted the dialectic of ressentiment that has become commonplace in our time. [pp. 111–112]

This is a recapitulation of the main thesis of the book, which Mishra amplifies only a couple paragraphs later:

Rousseau actually went beyond the conventional political categories and intellectual vocabularies of left and right to outline the basic psychological outlook of those who perceive themselves as abandoned or pushed behind. He provided the basic vocabulary for their characteristic new expressions of discontent, and then articulated their longing for a world cleansed of the social sources of dissatisfaction. Against today’s backdrop of near-universal political rage, history’s greatest militant lowbrow seems to have grasped, and embodied, better than anyone the incendiary appeal of victimhood in societies built around the pursuit of wealth and power. [p. 112]

Does “the incendiary appeal of victimhood” sound like a potent component of today’s Zeitgeist? Or for that matter “militant lowbrow” (names withheld)? At the end of the 18th century, Voltaire and Rousseau were among the primary men of letters, the intelligentsia, the cognoscenti, articulating competing social views and values with major sociopolitical revolutions following shortly thereafter. The oft-observed rhyming (not repetition) of history suggests another such period may well be at hand.