Archive for the ‘Debate’ Category

In my preparations for a speech to be given in roughly two months, I stumbled across a prescient passage in an essay entitled “Jesuitism” from Latter-Day Pamphlets (1850) by Thomas Carlyle. Connect your own dots as this is offered without comment.

… this, then, is the horrible conclusion we have arrived at, in England as in all countries; and with less protest against it hitherto, and not with more, in England than in other countries? That the great body of orderly considerate men; men affecting the name of good and pious, and who, in fact, excluding certain silent exceptionary individuals one to the million, such as the Almighty Beneficence never quite withholds, are accounted our best men,–have unconsciously abnegated the sacred privilege and duty of acting or speaking the truth; and fancy that it is not truth that is to be acted, but that an amalgam of truth and falsity is the safe thing. In parliament and pulpit, in book and speech, in whatever spiritual thing men have to commune of, or to do together, this is the rule they have lapsed into, this is the pass they have arrived at. We have to report that Human Speech is not true! That it is false to a degree never witnessed in this world till lately. Such a subtle virus of falsity in the very essence of it, as far excels all open lying, or prior kinds of falsity; false with consciousness of being sincere! The heart of the world is corrupted to the core; a detestable devil’s-poison circulates in the life-blood of mankind; taints with abominable deadly malady all that mankind do. Such a curse never fell on men before.

For the falsity of speech rests on a far deeper falsity. False speech, as is inevitable when men long practise it, falsifies all things; the very thoughts, or fountains of speech and action become false. Ere long, by the appointed curse of Heaven, a man’s intellect ceases to be capable of distinguishing truth, when he permits himself to deal in speaking or acting what is false. Watch well the tongue, for out of it are the issues of life! O, the foul leprosy that heaps itself in monstrous accumulation over Human Life, and obliterates all the divine features of it into one hideous mountain of purulent disease, when Human Life parts company with truth; and fancies, taught by Ignatius or another, that lies will be the salvation of it! We of these late centuries have suffered as the sons of Adam never did before; hebetated, sunk under mountains of torpid leprosy; and studying to persuade ourselves that this is health.

And if we have awakened from the sleep of death into the Sorcerer’s Sabbath of Anarchy, is it not the chief of blessings that we are awake at all? Thanks to Transcendent Sansculottism and the long-memorable French Revolution, the one veritable and tremendous Gospel of these bad ages, divine Gospel such as we deserved, and merciful too, though preached in thunder and terror! Napoleon Campaignings, September Massacres, Reigns of Terror, Anacharsis Clootz and Pontiff Robespierre, and still more beggarly tragicalities that we have since seen, and are still to see: what frightful thing were not a little less frightful than the thing we had? Peremptory was our necessity of putting Jesuitism away, of awakening to the consciousness of Jesuitism. ‘Horrible,’ yes: how could it be other than horrible? Like the valley of Jehoshaphat, it lies round us, one nightmare wilderness, and wreck of dead-men’s bones, this false modern world; and no rapt Ezekiel in prophetic vision imaged to himself things sadder, more horrible and terrible, than the eyes of men, if they are awake, may now deliberately see. Many yet sleep; but the sleep of all, as we judge by their maundering and jargoning, their Gorham Controversies, street-barricadings, and uneasy tossings and somnambulisms, is not far from ending. Novalis says, ‘We are near awakening when we dream that we are dreaming.’ [italics in original]

A complex of interrelated findings about how consciousness handles the focus of perception has been making the rounds. Folks are recognizing the limited time each of us has to deal with everything pressing upon us for attention and are adopting the notion of the bandwidth of consciousness: the limited amount of perception / memory / thought one can access or hold at the forefront of attention compared to the much larger amount occurring continuously outside of awareness (or figuratively, under the hood). Similarly, the myriad ways attention is diverted by advertisers and social media (to name just two examples) to channel consumer behaviors or increase time-on-device metrics have become commonplace topics of discussion. I’ve used the terms information environment, media ecology, and attention economy in past posts on this broad topic.

Among the most important observations is how the modern infosphere has become saturated with content, much of it entirely pointless (when not actively disorienting or destructive), and how many of us willingly tune into it without interruption via handheld screens and earbuds. It’s a steady flow of stimulation (overstimulation, frankly) that is the new normal for those born and/or bred to the screen (media addicts). Its absence or interruption is discomfiting (like a toddler’s separation anxiety). However, mental processing of information overflow is tantamount to drinking from a fire hose: only a modest fraction of the volume rushing nonstop can be swallowed. Promoters of meditation and presencing, whether implied or manifest, also recognize that human cognition requires time and repose to process and consolidate experience, transforming it into useful knowledge and long-term memory. More and more stimulation added on top simply overflows, like a faucet filling the bathtub faster than the drain can empty it, spilling water onto the floor like digital exhaust. Too bad that the sales point of these promoters is typically getting more done, because dontcha know, more is better even when recommending less.

Quanta Magazine has a pair of articles (first and second) by the same author (Jordana Cepelewicz) describing how the spotlight metaphor for attention is only partly how cognition works. Many presume that the mind normally directs awareness or attention to whatever the self prioritizes — a top-down executive function. However, as any loud noise, erratic movement, or sharp pain demonstrates, some stimuli are promoted to awareness by virtue of their individual character — a bottom-up reflex. The fuller explanation is that neuroscientists are busy researching brain circuits and structures that prune, filter, or gate the bulk of incoming stimuli so that attention can be focused on the most important bits. For instance, the article mentions how visual perception circuits process categories of small and large differently, partly to separate figure from ground. Indeed, for cognition to work at all, a plethora of inhibitory functions enable focus on a relatively narrow subset of stimuli selected from the larger set of available stimuli.

These discussions about cognition (including philosophical arguments about (1) human agency vs. no free will or (2) whether humans exist within reality or are merely simulations running inside some computer or inscrutable artificial intelligence) so often get lost in the weeds. They read like distinctions without differences. No doubt these are interesting subjects to contemplate, but at the same time, they’re sorta banal — fodder for scientists and eggheads that most average folks dismiss out of hand. In fact, selective and inhibitory mechanisms are found elsewhere in human physiology, such as paired muscles that move limbs to and fro or appetite stimulants / depressants (alternatively, activators and deactivators) operating in tandem. Moreover, interactions are often not binary (on or off) but continuously variable. For my earlier post on this subject, see this.

Well, dammit! Guess I’m gonna have to add a SWOTI tag after all. Obviously, I’ve been paying too much attention to bogus pronouncements by economists.

/rant on

Yet more fools confidently stating that climate change is not really a serious concern have me gasping in exasperation. Take, for instance, this astounding paragraph by Egon von Greyerz:

Yes, of course global warming has taken place recently as the effect of climate cycles. But the cycle has just peaked again which means that all the global warming activists will gradually cool down with the falling temperatures in the next few decades. The sun and the planets determine climate cycles and temperatures, like they have for many millions of years, and not human beings. [emphasis added]

So no climate change worries to disturb anyone’s dreams. Sleep soundly. I’m so relieved. All the effort expended over the past decades toward understanding climate change can be waved off with a mere three sentences by a motivated nonexpert. The linked webpage offers no support whatsoever for these bald statements but instead goes on to offer economic prophecy (unironically, of certain doom). For minimal counter-evidence regarding climate change, embedded below is a two-year-old video explaining how some regions are expected to become uninhabitable due to high wet-bulb temperatures.

The article ends with these brief paragraphs:

There is no absolute protection against this scenario [economic collapse] since it will hit all aspects of life and virtually all people. Obviously, people living off the land in remote areas will suffer less whilst people in industrial and urban areas will suffer considerably.

The best financial protection is without hesitation physical gold and some silver. These metals are critical life insurance. But there are clearly many other important areas of protection to plan for. A circle of friends and family is absolutely essential. [emphasis in original]

Ok, so I’m wrong: the guy’s not an economist at all; he’s a salesman. After dismissing one catastrophe only to trot out another, his scaremongering message is clear: buy gold and silver. Might not be a bad idea, actually, but that won’t protect against TEOTWAWKI. So whose eyes are deceiving them, Egon’s or mine (or yours)? He’s selling precious metals; I’m sharing the truth (best as I can ascertain, anyway).

The other idiotic thing to darken my brow was several actual economists being asked about the economic effects of implementing Greta Thunberg’s dream world (sarcasm much?). If her dream world is spelled out somewhere, I haven’t seen it, nor is it provided (link or otherwise) in the article. Seems like the sort of invented argument attached to a trending name for the purpose of clickbait attacking the messenger and thus shooting down her message. However, let me be generous for a moment and suggest that efforts to stop climate change include, at a minimum, getting off fossil fuels, reforming Big Ag, and denying developing nations their quest to join the First-World Age of Abundance. Those are the three subjects discussed in the article. These economists’ conclusion? It will be, um, costly. Well, yeah, true! Very costly indeed. I agree entirely. But what of the cost if those things aren’t done? Isn’t that question implied? Isn’t that what Greta Thunberg has insisted upon? The answer is that it will cost far more, though perhaps not in something as cravenly quantifiable as profit or cost. Referring again to the embedded video above, it will cost us the very habitability of the planet, and not in just a few restricted regions we can add to existing sacrifice zones. Widespread species dislocation and die-off will include the human species, since we rely on all the others. Some prophesy a human death pulse of monstrous proportion (several billions, up to perhaps 90% of us) or even near-term human extinction. Is that costly enough to think about the problem differently, urgently, as Greta Thunberg does? Might the question be better framed as the cost of not implementing Greta Thunberg’s dream world so that economists are sent off on a different analytical errand?

In the middle of the 19th century, Scottish satirist Thomas Carlyle called economics The Dismal Science, a description that stuck. The full context of that coinage may have had more to do with slavery than poor scholarship, so in the context of lying, or at least misleading, with numbers, I propose instead calling it The Deceitful Science. Among the stupid habits to dispel is the risible notion that by measuring something as a means of understanding it, we grasp its fullness and, concomitantly, what’s really important. I suggest further that most economists deceive themselves by performing a fundamentally wrong kind of analysis.

The issue of deceit is of some importance beyond getting at the truth of climate change. Everything in the public sphere these days is susceptible to spin, massage, and reframing to such a degree that an epistemological crisis (my apt term) has fundamentally altered sense making, with the result that most nonexperts simply don’t know what to believe anymore. Economists are doing no one any favors digressing into areas beyond their Deceitful Science.

/rant off

Delving slightly deeper after the previous post into someone-is-wrong-on-the-Internet territory (worry not: I won’t trek far down this path), I was dispirited after reading some economist dude with the hubris to characterize climate change as fraud. At issue is the misframing of proper time periods in graphical data for the purpose of overthrowing government and altering the American way of life. (Um, that’s the motivation? Makes no sense.) Perhaps this fellow’s intrepid foray into the most significant issue of our time (only to dismiss it) is an aftereffect of Freakonomics emboldening economists to offer explanations and opinions on matters well outside their field of expertise. After all, truly accurate, relevant information is only ever all about numbers (read: the Benjamins), shaped and delivered by economists, physical sciences be damned.

The author of the article has nothing original to say. Rather, he repackages information from the first of two embedded videos (or elsewhere?), which examines time frames of several trends purportedly demonstrating global warming (a term most scientists and activists have abandoned in favor of climate change, partly to distinguish climate from weather). Those trends are heat waves, extent of Arctic ice, incidence of wildfires, atmospheric carbon, sea level, and global average temperature. Presenters of weather/climate information (such as the IPCC) are accused of cherry-picking dates (statistical data arranged graphically) to present a false picture, but then similar data with other dates are used to depict another picture supposedly invalidating the first set of graphs. It’s a case of lying with numbers and then lying some more with other numbers.

Despite the claim that “reports are easily debunked as fraud,” I can’t agree that this example of climate change denial overcomes overwhelming scientific consensus on the subject. It’s not so much that the data are wrong (I acknowledge they can be misleading) but that the interpretation of effects of industrial activity since 1750 (a more reasonable comparative baseline) isn’t so obvious as simply following shortened or lengthened trend lines and demographics up or down. That’s typically zooming in or out to render the picture most amenable to a preferred narrative, precisely what the embedded video does and in turn accuses climate scientists and activists of doing. The comments under the article indicate a chorus of agreement with the premise that climate change is a hoax or fraud. Guess those commentators haven’t caught up yet with rising public sentiment, especially among the young.

Having studied news and evidence of climate change as a layperson for roughly a dozen years now, the conclusions drawn by experts (ignoring economists) convince me that we’re pretty irredeemably screwed. The collapse of industrial civilization and accompanying death pulse are the predicted outcomes, but a precise date is impossible to provide because it’s a protracted process. An even worse possibility is near-term human extinction (NTHE), part of the larger sixth mass extinction. Absorbing this information has been an arduous, ongoing, soul-destroying undertaking for me, and evidence keeps being supplemented and revised, usually with ever-worsening prognoses. However, I’m not the right person to argue the evidence. Instead, see this lengthy article (with profuse links) by Dr. Guy McPherson, which is among the best resources outside of the IPCC.

In fairness, except for the dozen years I’ve spent studying the subject, I’m in no better position to offer inexpert opinion than some economist acting the fool. But regular folks are implored to inform and educate themselves on a variety of topics, if for no other reason than to vote responsibly. My apprehension of reality and human dynamics may be no better than the next person’s, but as history proceeds, attempting to make sense of the deluge of information confronting everyone is something I take seriously. Accordingly, I’m irked when contentious issues are warped and distorted, whether earnestly or malignantly. Maybe economists, like journalists, suffer from a professional deformation that confers supposed explanatory superpowers. However, in the context of our current epistemological crisis, I approach their utterances and certainty with great skepticism.

Periodically, I come across preposterously stupid arguments (in person and online) I can’t even begin to dispel. One such argument is that carbon is plant food, so we needn’t worry about greenhouse gases such as carbon dioxide, a byproduct of industrial activity. Although I’m unconvinced by such arrant capsule arguments, I’m also in a lousy position to contend with them because convincing evidence lies outside my scientific expertise. Moreover, evidence (should I bother to gather it) is too complex and involved to fit within a typical conversation or simple explanation. Plus, evidence relies on scientific literacy and critical reasoning often lacking in the lay public. Scientific principles work better for me than, for example, the finely tuned balances Nature is constantly tinkering with — something we humans can hope to discover only partially. Yet we sally forth aggressively and heedlessly to manipulate Nature at our peril, which often results in precisely the sort of unintended consequence scientists in Brazil found when mosquitoes altered genetically (to reduce their numbers as carriers of disease) developed into mosquitoes hardier and more difficult to eradicate than if we had done nothing. The notion that trees respond favorably to increased carbon in the atmosphere has been a thorn in my side for some time. Maybe it’s even partly true; I can’t say. However, the biological and geophysical principle I adhere to is that even small changes in geochemistry (minute according to some scales, e.g., parts per million or per billion) have wildly disproportionate effects. The main effect today is climate changing so fast that many organisms can’t adapt or evolve quickly enough to keep up. Instead, they’re dying en masse and going extinct.

The best analogy is the narrow range of healthy human body temperature centered on 98.6 °F. Vary not far up (fever) or down (hypothermia) and human physiology suffers and becomes life-threatening. Indeed, even in good health, we humans expend no small effort keeping body temperature from extending far into either margin. Earth also regulates itself through a variety of blind mechanisms that are in the process of being wrecked by human activity having risen by now to the level of terraforming, much like a keystone species alters its environment. So as the planet develops the equivalent of a fever, weather systems and climate (not the same things) react, mostly in ways that make life on the surface much harder to sustain and survive. As a result, trees are in the process of dying. Gail Zawacki’s blog At Wit’s End (on my blogroll) explores this topic in excruciating and demoralizing detail. Those who are inclined to deny offhandedly are invited to explore her blog. The taiga (boreal forest) and the Amazonian rainforest are among the most significant ecological formations and carbon sinks on the planet. Yet both are threatened biomes. Deforestation and tree die-off are widespread, of course. For example, since 2010, an estimated 129 million trees in California have died from drought and bark beetle infestation. In Colorado, more than an estimated 800 million dead trees still standing (called snags) are essentially firestarter. To my way of thinking, the slow, merciless death of trees is no small matter, and affected habitats may eventually be relegated to sacrifice zones like areas associated with mining and oil extraction.

In answer to the bait that “carbon is plant food,” let me suggest that the trees have begun to rebel by falling over at the propitious moment to injure and/or kill hikers and campers. According to this article at Outside Magazine, the woods are not safe. So if mosquitoes, rattlesnakes, mountain lions, or bears don’t getcha first, beware of the trees. Even broken branches and dead tree trunks that haven’t fallen fully to the ground (known as hung snags, widow-makers, and foolkillers) are known to take aim at human interlopers. This is not without precedent. In The Lord of the Rings, remember that the Ents (tree herders) went to war with Isengard, while the Huorns destroyed utterly the Orcs who had laid siege to Helm’s Deep. Tolkien’s tale is but a sliver of a much larger folklore regarding the enchanted forest, where men are lost or absorbed (as with another Tolkien character, Old Man Willow). Veneration of elemental forces of nature (symbols of both life and its inverse death) is part of our shared mythology, though muted in an era of supposed scientific sobriety. M. Night Shyamalan explores similar themes, weakly, in several of his films. Perhaps Tolkien understood at an intuitive level the silent anger and resentment of the trees, though slow to manifest, and their eventual rebellion over mistreatment by men. It’s happening again, right now, all around us. Go ahead: prove me wrong.

For readers coming to this blog post lacking context, I’m currently reading and book-blogging Pankaj Mishra’s Age of Anger. It explores Western intellectual history that gives rise to feelings of radical discontent over injustices that have not been addressed or remedied successfully for the entirety of the modern era despite centuries of supposed progress.

Continuing from part 1, the case of Voltaire is a curious one. A true child of the Enlightenment, my inference is that he came along too late to participate in the formulation of foundational Enlightenment ideals but later became one of their chief proponents as they diffused throughout Europe and into Russia and elsewhere. He joined many, many others in a belief (against a preponderance of evidence) in human progress, if not perfectibility. (Technical progress is an entirely different matter.) One of the significant aspects of his ideology and writings was his sustained attack on Christianity, or more particularly, Catholicism. More than three centuries later, the secularization of Europe and diminished influence of medieval church dogma stand out as part of the same intellectual tradition.

Enlightenment canon includes aspirational faith in the ability of reason, mechanisms, systems, and administrative prowess to order the affairs of men properly. (How one defines properly, as distinct from equitably or justly, is a gaping hole primed for debate.) In the course of the last few centuries, history has demonstrated that instrumental logic spawned by this ideology has given rise to numerous totalitarian regimes that have subjugated entire populations, often quite cruelly, in modernizing and Westernizing projects. Voltaire found himself in the thick of such projects by willingly aligning himself with despots and rulers who victimized their own peoples in pursuit of industrialization and imitation of urbane French and British models. Russians Peter the Great (reigned May 7, 1682 to February 8, 1725) and Catherine the Great (reigned July 9, 1762 to November 17, 1796) were among those for whom Voltaire acted as apologist and intellectual co-conspirator. Here’s what Mishra has to say:

Voltaire was an unequivocal top-down modernizer, like most of the Enlightenment philosophes, and an enraptured chronicler in particular of Peter the Great. Russian peasants had paid a steep price for Russia’s Westernization, exposed as they were to more oppression and exploitation as Peter tried in the seventeenth century to build a strong military and bureaucratic state. Serfdom, near extinct in most of Western Europe by the thirteenth century, was actually strengthened by Peter in Russia. Coercing his nobles into lifetime service to the state, [effectively] postponing the emergence of a civil society, Peter the Great waged war endlessly. But among educated Europeans, who until 1789 saw civilization as something passed down from the enlightened few to the ignorant many, Russia was an admirably progressive model. [pp. 98–99]

and slightly later

… it was Voltaire who brought a truly religious ardour to the cult of Catherine. As the Empress entered into war with Poland and Turkey in 1768, Voltaire became her cheerleader. Catherine claimed to be protecting the rights of religious minorities residing in the territories of her opponents. The tactic, repeatedly deployed by later European imperialists in Asia and Africa, had the expected effect on Voltaire, who promptly declared Catherine’s imperialistic venture to be a crusade for the Enlightenment. [p. 102]

No doubt plenty of rulers throughout history understood in the proverbial sense that to make an omelette, a few eggs must be broken, and that by extension, their unpopular decisions must be reshaped and propagandized to the masses to forestall open revolt. Whose eggs are ultimately broken is entirely at issue. That basic script is easily recognizable as being at work even today. Justifications for administrative violence ought to fail to convince those on the bottom rungs of society who make most of the real sacrifices — except that propaganda works. Thus, the United States’ multiple, preemptive wars of aggression and regime change (never fully declared or even admitted as such) have continued to be supported or at least passively accepted by a majority of Americans until quite recently. Mishra makes this very same point using an example different from mine:

… cossetted writers and artists would in the twentieth century transfer their fantasies of an ideal society to Soviet leaders, who seemed to be bringing a superhuman energy and progressive rhetoric to Peter the Great’s rational schemes of social engineering. Stalin’s Russia, as it ruthlessly eradicated its religious and evidently backward enemies in the 1930s, came to ‘constitute … a quintessential Enlightenment utopia’. But the Enlightenment philosophes had already shown, in their blind adherence to Catherine, how reason could degenerate into dogma and new, more extensive forms of domination, authoritarian state structures, violent top-down manipulation of human affairs (often couched in terms of humanitarian concern) and indifference to suffering. [pp. 104–105]

As I reread the chapter in preparation for this blog post, I was surprised to find somewhat less characterization of Voltaire than of Rousseau. Indeed, it is more through Rousseau’s criticism of the dominant European paradigm that the schism between competing intellectual traditions is explored. Mishra circles back to Rousseau repeatedly but does not hesitate to show where his ideas, too, are insufficient. For instance, whereas pro-Enlightenment thinkers are often characterized as being lost in abstraction and idealization (i.e., ideologically possessed), thus estranged from practical reality or history, Rousseau’s empathy and identification with commoners does not provide enough structure for Rousseau to construct a viable alternative to the historical thrust of the day. Mishra quotes a contemporary critic (Joseph de Maistre) who charged Rousseau with irresponsible radicalism:

… he often discovers remarkable truths and expresses them better than anyone else, but these truths are sterile to his hands … No one shapes their materials better than he, and no one builds more poorly. Everything is good except his systems. [p. 110]

The notion that leaders (monarchs, emperors, presidents, prime ministers, social critics, and more recently, billionaires) ought to be in the business of engineering society rather than merely managing it is tacitly assumed. Indeed, a parallel hubris is present in Rousseau, a thought leader whose claim to moral superiority through his vehement criticism of the Enlightenment is itself questionable:

His confidence and self-righteousness derived from his belief that he had at least escaped the vices of modern life: deceit and flattery. In his solitude, he was convinced, like many converts to ideological causes and religious beliefs, that he was immune to corruption. A conviction of his incorruptibility was what gave his liberation from social pieties a heroic aura and moved him from a feeling of powerlessness to omnipotence. In the movement from victimhood to moral supremacy, Rousseau enacted the dialectic of ressentiment that has become commonplace in our time. [pp. 111–112]

This is a recapitulation of the main thesis of the book, which Mishra amplifies only a couple paragraphs later:

Rousseau actually went beyond the conventional political categories and intellectual vocabularies of left and right to outline the basic psychological outlook of those who perceive themselves as abandoned or pushed behind. He provided the basic vocabulary for their characteristic new expressions of discontent, and then articulated their longing for a world cleansed of the social sources of dissatisfaction. Against today’s backdrop of near-universal political rage, history’s greatest militant lowbrow seems to have grasped, and embodied, better than anyone the incendiary appeal of victimhood in societies built around the pursuit of wealth and power. [p. 112]

Does “the incendiary appeal of victimhood” sound like a potent component of today’s Zeitgeist? Or for that matter “militant lowbrow” (names withheld)? At the end of the 18th century, Voltaire and Rousseau were among the primary men of letters, the intelligentsia, the cognoscenti, articulating competing social views and values with major sociopolitical revolutions following shortly thereafter. The oft-observed rhyming (not repetition) of history suggests another such period may well be at hand.

The “American character,” if one can call it into being merely by virtue of naming it (the same rhetorical trick as solutionism), is diverse and ever-changing. Numerous characterizations have been offered throughout history, with Alexis de Tocqueville’s Democracy in America (1835 and 1840) being perhaps the one cited most frequently despite its outdatedness. Much in American character has changed since that time, and it’s highly questionable to think it was unified even then. However, as a means of understanding ourselves, it’s as good a place to start as any. A standard criticism of American character as seen from outside (i.e., when Americans travel abroad) is the so-called ugly American: loud, inconsiderate, boorish, and entitled. Not much to argue with there. A more contemporary assessment by Morris Berman, found throughout his “American trilogy,” is that we Americans are actually quite stupid, unaccountably proud of it, and constantly hustling (in the pejorative sense) in pursuit of material success. These descriptions don’t quite match up with familiar jingoism about how great America is (and of course, Americans), leading to non-Americans clamoring to emigrate here, or the self-worship we indulge in every national holiday celebrating political and military history (e.g., Independence Day, Veteran’s Day, Memorial Day).

I recently ran afoul of another ugly aspect of our national character: our tendency toward aggression and violence. In truth, this is hardly unique to Americans. Yet it came up glaringly in the context of a blog post at Pharyngula citing a Tweet comparing uneven application of law (and indignation among online chatterers?) when violence is committed by the political left vs. the political right. Degree of violence clearly matters, but obvious selection bias was deployed to present an egregiously lop-sided perspective. Radicals on both the left and right have shown little compunction about using violence to achieve their agendas. Never mind how poorly conceived those agendas may be. What really surprised me, however, was that my basic objection to violence in all forms across the spectrum was met with snark and ad hominem attack. When did reluctance to enact violence (including going to war) until extremity demands it become controversial?

My main point was that resorting to violence typically invalidates one’s objective. It’s a desperation move. Moreover, using force (e.g., intimidation, threats, physical violence — including throwing milkshakes) against ideological opponents is essentially policing others’ thoughts. But they’re fascists, right? Violence against them is justified because they don’t eschew violence. No, wrong. Mob justice and vigilantism obviate the rule of law and criminalize any perpetrator of violence. It’s also the application of faulty instrumental logic, ceding any principled claim to moral authority. But to commentators at the blog post linked above, I’m the problem because I’m not in support of fighting fascists with full force. Guess all those masked, caped crusaders don’t recognize that they’re contributing to lawlessness and mayhem. Now even centrists come in for attack for not being radical (or aggressive, or violent) enough. Oddly silent in the comments is the blog host, P.Z. Myers, who has himself communicated approval of milkshake patrols and Nazi punching, as though the presumptive targets (identified rather haphazardly and incorrectly in many instances) have no right to their own thoughts and ideas, vile though they may be, and that violence is the right way to “teach them a lesson.” No one learns the intended lesson when made the victim of violence. Rather, if not simply cowed into submission (not the same as agreement), tensions tend to escalate into further and increasing violence. See also reaction formation.

Puzzling over this weird exchange with these, my fellow Americans (the ideologically possessed ones anyway), caused me to backtrack. For instance, one standard dictionary definition of fascism is “a governmental system led by a dictator having complete power, forcibly suppressing opposition and criticism, regimenting all industry, commerce, etc., and emphasizing an aggressive nationalism and often racism.” That definition sounds more like totalitarianism or dictatorship and is backward looking, specifically to Italy’s Benito Mussolini in the period 1922 to 1943. However, like national characters, political moods and mechanisms change over time, and the recent fascist thrust in American politics isn’t limited to a single leader with dictatorial power. Accordingly, the definition above has never really satisfied me.

I’ve blogged repeatedly about incipient fascism in the U.S., the imperial presidency (usually associated with George W. Bush but also characteristic of Barack Obama), James Howard Kunstler’s prediction of a cornpone fascist coming to power (the way paved by populism), and Sheldon Wolin’s idea of inverted totalitarianism. What ties these together is how power is deployed and against what targets. More specifically, centralized power (or force) is directed against domestic populations to advance social and political objectives without broad public support for the sole benefit of holders of power. That’s a more satisfactory definition of fascism to me, certainly far better than Peter Schiff’s ridiculous equation of fascism with socialism. Domination of others to achieve objectives describes the U.S. power structure (the military-industrial-corporate complex) to a tee. That doesn’t mean manufactured consent anymore; it means bringing the public into line, especially through propaganda campaigns, silencing of criticism, prosecuting whistle-blowers, and broad surveillance, all of which boil down to policing thought. The public has complied by embracing all manner of doctrine against enlightened self-interest, the very thing that was imagined to magically promote the general welfare and keep us from wrecking things or destroying ourselves unwittingly. Moreover, public support is not really obtained through propaganda and domination, only the pretense of agreement found convincing by fools. Similarly, admiration, affection, and respect are not won with a fist. Material objectives (e.g., resource reallocation, to use a familiar euphemism) achieved through force are just common theft.

So what is Antifa doing? It’s forcibly silencing others. It’s doing the work of fascist government operatives by proxy. It’s fighting fascism by becoming fascist, not unlike the Republican-led U.S. government in 2008 seeking bailouts for banks and large corporations, handily transforming our economy into a socialist experiment (e.g., crowd-funding casino capitalism through taxation). Becoming the enemy to fight the enemy is a nice trick of inversion, and many are so flummoxed by these contradictions they resort to Orwellian doublethink to reconcile the paradox. Under such conditions, there are no arguments that can convince. Battle lines are drawn, tribal affiliations are established, and the ideological war of brother against brother, American against American, intensifies until civility crumbles around us. Civil war and revolution haven’t occurred in the U.S. for 150 years, but they are popping up regularly around the globe, often at the instigation of the U.S. government (again, acting against the public interest). Is our turn coming because we Americans have been divided and conquered instead of recognizing the real source of threat?

I put aside Harari’s book from the previous blog post in favor of Pankaj Mishra’s Age of Anger: A History of the Present (2017). Mishra’s sharp cultural criticism is far more convincing than Harari’s Panglossian perspective. Perhaps some of that is due to an inescapable pessimism in my own character. Either way, I’ve found the first 35 pages dense with observations of interest to me as a blogger and armchair cultural critic. Some while back, I published a post attempting to delineate (not very well, probably) what’s missing in the modern world despite its obvious material abundance. Reinforcing my own contentions, Mishra’s thesis (as I understand it so far) is this: we today share with others post-Enlightenment an array of resentments and hatreds (Fr.: ressentiment) aimed incorrectly at scapegoats for political and social failure to deliver the promises of progressive modernity equitably. For instance, Mishra describes

… flamboyant secular radicals in the nineteenth and early twentieth centuries: the aesthetes who glorified war, misogyny and pyromania; the nationalists who accused Jews and liberals of rootless cosmopolitanism and celebrated irrational violence; and the nihilists, anarchists and terrorists who flourished in almost every continent against a background of cosy political-financial alliances, devastating economic crises and obscene inequalities. [pp. 10–11]

Contrast and/or compare his assessment of the recent past:

Beginning in the 1990s, a democratic revolution of aspiration … swept across the world, sparking longings for wealth, status and power, in addition to ordinary desires for stability and contentment, in the most unpromising circumstances. Egalitarian ambition broke free of old social hierarchies … The culture of [frantic] individualism went universal … The crises of recent years have uncovered an extensive failure to realize the ideals of endless economic expansion and private wealth creation. Most newly created ‘individuals’ toil within poorly imagined social and political communities and/or states with weakening sovereignty … individuals with very different pasts find themselves herded by capitalism and technology into a common present, where grossly unequal distributions of wealth and power have created humiliating new hierarchies. This proximity … is rendered more claustrophobic by digital communications … [S]hocks of modernity were once absorbed by inherited social structures of family and community, and the state’s welfare cushions [something mentioned here, too]. Today’s individuals are directly exposed to them in an age of accelerating competition on uneven playing fields, where it is easy to feel that there is no such thing as either society or state, and that there is only a war of all against all. [pp. 12–14]

These long quotes (the second one cut together from longer paragraphs) are here because Mishra is remarkably eloquent in his diagnosis of globalized culture. Although I’ve only read the prologue, I expect to find support for my long-held contention that disorienting disruptions of modernity (using Anthony Giddens’ sociological definition rather than the modish use of the term Postmodern to describe only the last few decades) create unique and formidable challenges to the formation of healthy self-image and personhood. Foremost among these challenges is an unexpectedly oppressive information environment: the world forced into full view and inciting comparison, jealousy, envy, and hatred stemming from routine and ubiquitous frustrations and humiliations as we each struggle to get our personal share of attention, renown, and reward.

Another reason Mishra provides for our collective anger is a deep human yearning not for anarchism or radical freedom but rather for belonging and absorption within a meaningful social context. This reminds me of Erich Fromm’s book Escape from Freedom (1941), which I read long ago but can’t remember so well anymore. I do remember quite vividly how counter-intuitive was the suggestion that absolute freedom is actually burdensome as distinguished from the usual programming we get about breaking free of all restraints. (Freedom! Liberty!) Indeed, Mishra provides a snapshot of multiple cultural and intellectual movements from the past two centuries where abandoning oneself to a cause, any cause, was preferable to the boredom and nothingness of everyday life absent purpose other than mere existence. The modern substitute for larger purpose — commodity culture — is a mere shadow of better ways of spending one’s life. Maybe commodity culture is better than sacrificing one’s life fighting wars (a common fate) or destroying others, but that’s a much longer, more difficult argument.

More to follow as my reading progresses.

Throughout human history, the question “who should rule?” has been answered myriad ways. The most enduring answer is simple: he who can muster and deploy the most force of arms and then maintain control over those forces. Genghis Khan is probably the most outrageously successful example and is regarded by the West as a barbarian. Only slightly removed from barbarians is the so-called Big Man, who perhaps adds a layer of diplomacy by running a protection racket while selectively providing and distributing spoils. As societies move further away from subsistence and immediacy, various absolute rulers are established, often through hereditary title. Call them Caesar, chief, dear leader, emir, emperor (or empress), kaiser, king (or queen), pharaoh, premier, el presidente, sultan, suzerain, or tsar, they typically acquire power through the accident of birth and are dynastic. Some are female but most are male, and they typically extract tribute and sometimes demand loyalty oaths.

Post-Enlightenment, rulers are frequently democratically elected administrators (e.g., legislators, technocrats, autocrats, plutocrats, kleptocrats, and former military) ideally meant to be representative of common folks. In the U.S., members of Congress (and of course the President) are almost wholly drawn from the ranks of the wealthy (insufficient wealth being a de facto bar to office) and are accordingly estranged from the many different ways most of us experience American life. Below the top level of visible, elected leaders is a large, hidden apparatus of high-level bureaucratic functionaries (often appointees), the so-called Deep State, that is relatively stable and made up primarily of well-educated, white-collar careerists whose ambitions for themselves and the country are often at odds with the citizenry.

I began to think about this in response to a rather irrational reply to an observation I made here. Actually, it wasn’t even originally my observation but that of Thomas Frank, namely, that the Deep State is largely made up of the liberal professional class. The reply reinforced the notion: who better to rule than the “pros”? History makes the alternatives unthinkable. Thus, the Deep State’s response to the veritable one-man barbarian invasion of the Oval Office has been to seek removal of the interloper by hook or by crook. (High office in this case was won unexpectedly and without precedent by rhetorical force — base populism — rather than by military coup, making the current occupant a quasi-cult leader; similarly, extracted tribute is merely gawking attention rather than riches.)

History also reveals that all forms of political organization suffer endemic incompetence and corruption, lending truth to Winston Churchill’s witticism “Democracy is the worst form of government, except for all the others.” Indeed, recent rule by technocrats has been supremely awful, leading to periodic market crashes, extreme wealth inequality, social stigmatization, and forever wars. Life under such rule is arguably better than under various other political styles; after all, we gots our vaunted freedoms and enviable material comforts. But the exercise of those freedoms does not reliably deliver either the ontological security or the psychological certainty we humans crave. In truth, our current form of self-governance has let nothing get in the way of plundering the planet for short-term profit. That ongoing priority is making Earth uninhabitable not just for other species but for humans, too. In light of this fact, liberal technocratic democracy could be a far worse failure than most: it will have killed billions (an inevitability now under delayed effect).

Two new grassroots movements (to my knowledge) have appeared that openly question who should rule: the Sunrise Movement (SM) and the Extinction Rebellion (ER). SM is a youth political movement in the U.S. that acknowledges climate change and supports the Green New Deal as a way of prioritizing the desperate existential threat modern politics and society have become. For now at least, SM appears to be content with working within the system, replacing incumbents with candidates it supports. More intensely, ER is a global movement centered in the U.K. that also acknowledges that familiar modern forms of social and political organization (there are several) no longer function but in fact threaten all of us with, well, extinction. One of its unique demands is that legislatures be drawn via sortition from the general population to be more representative of the people. Further, sortition avoids the established pattern whereby those elected to lead representative governments are corrupted by the very process of seeking and attaining office.

I surmise attrition and/or replacement (the SM path) are too slow and leave candidates vulnerable to corruption. In addition, since no one relinquishes power willingly, current leaders will have to be forced out via open rebellion (the ER path). I’m willing to entertain either path but must sadly conclude that both are too little, too late to address climate change and near-term extinction effectively. Though difficult to establish convincingly, I suspect the time to act was in the 1970s (or even before) when the Ecology Movement arose in recognition that we cannot continue to despoil our own habitat without consequence. That message (social, political, economic, and scientific all at once) was as inert then as it is now. However, fatalism acknowledged, some other path forward is better than our current systems of rule.

Everyone is familiar with the convention in entertainment media where characters speak without the use of recognizable language. (Not related really to the convention of talking animals.) The first instance I can recall (someone correct me if earlier examples are to be found) is the happy-go-lucky bird Woodstock from the old Peanuts cartoons (do kids still recognize that cast of characters?), whose dialog was shown graphically as a series of vertical lines.

When the cartoon made its way onto TV for holiday specials, its creator Charles Schulz used the same convention to depict adults, never shown onscreen but with dialogue voiced by a Harmon-muted trombone. Roughly a decade later, two characters from the Star Wars franchise “spoke” in languages only other Star Wars characters could understand, namely, Chewbacca (Chewie) and R2-D2. More recently, the character Groot from Guardians of the Galaxy (known to me only through the Marvel movie franchise, not through comic books) speaks only one line of dialogue, “I am Groot,” which is understood as full speech by other Guardians characters. When behemoths larger than a school bus (King Kong, Godzilla, Jurassic dinosaurs, Cloverfield, Kaiju, etc.) appear, the characters are typically denied the power of speech beyond the equivalent of a lion’s roar. (True villains talk little or not at all as they go about their machinations — no monologuing! unless it’s a James Bond film. An exception notable for its failure to charm audiences is Ultron, who wouldn’t STFU. You can decide for yourself which is the worse kind of villainy.)

This convention works well enough for storytelling and has the advantage of allowing the reader/viewer to project onto otherwise blank speech. However, when imported into the real world, especially in politics, the convention founders. There is no Babelfish universal translator inserted in the ear to transform nonsense into coherence. The obvious example of babblespeech is 45, whose speech when off the teleprompter is a series of rambling non sequiturs, free associations, slogans, and sales pitches. Transcripts of anyone’s extemporaneous speech reveal lots of restarts and blind alleys; we all interrupt ourselves to redirect. However, word salad that substitutes for meaningful content in 45’s case is tragicomic: alternately entirely frustrating or comically entertaining depending on one’s objective. Satirical news shows fall into the second category.

45 is certainly not the first. Sarah Palin in her time as a media darling (driver of ratings and butt of jokes — sound familiar?) had a knack for crazy speech combinations that were utter horseshit yet oddly effective for some credulous voters. She was even a hero to some (nearly a heartbeat away from being the very first PILF). We’ve also now been treated to a series of public interrogations where a candidate for a cabinet post or an accused criminal offers testimony before a congressional panel. Secretary of Education Betsy DeVos famously evaded simple yes/no questions during her confirmation hearing, and Supreme Court Justice Brett Kavanaugh similarly refused to provide direct answers to direct questions. Unexpectedly, sacrificial lamb Michael Cohen gave direct answers to many questions, but his interlocutors then didn’t quite know how to respond, considering their experience and expectation that no one answers appropriately.

What all this demonstrates is that there is often a wide gulf between what is said and what is heard. In the absence of what might be understood as effective communication (honest, truthful, and forthright), audiences and voters fill in the blanks. Ironically, we also can’t handle hearing too much truth when confronted by its awfulness. None of this is a problem in storytelling, but when found in political narratives, it’s emblematic of how dysfunctional our communications have become, and with them, the clear thought and principled activity of governance.