Posts Tagged ‘Memes’

Most of us are familiar with a grandpa, uncle, or father who eventually turns into a cranky old man during late middle age or in his dotage. (Why is it a mostly male phenomenon?) In the last three decades, Clint Eastwood typecast himself as a cranky old man, building on lone-wolf characters (mostly cops, criminals, and cowboys) established earlier in his career. In real life, these guys spout talking points absorbed from mainstream media and narrative managers, or if they are truly lazy and/or can’t articulate anything coherently on their own, merely forward agitprop via e-mail like chain mail of yore. They also demonstrate remarkably forgivable racism, sexism, and bigotry, as does Eastwood’s rather enjoyable and ultimately redeemed character in the film Gran Torino. If interaction with such a fellow is limited to Thanksgiving gatherings once per year, crankiness can be tolerated fairly easily. If interactions are ongoing, then a typical reaction is simply to delete e-mail messages unread, or in the case of unavoidable face-to-face interaction, to chalk it up: Well, that’s just Grandpa Joe or Uncle Bill or Dad. Let him rant; he’s basically harmless now that he’s so old he creaks.

Except that not all of them are so harmless. Only a handful of the so-called Greatest Generation (I tire of the term but it’s solidly established) remain in positions of influence. However, lots of Boomers still wield considerable power despite their advancing age, looming retirement (and death), and basic out-of-touchness with a culture that has left them behind. Nor are their rants and bluster necessarily wrong. See, for instance, this rant by Tom Engelhardt, which begins with these two paragraphs:

Let me rant for a moment. I don’t do it often, maybe ever. I’m not Donald Trump. Though I’m only two years older than him, I don’t even know how to tweet and that tells you everything you really need to know about Tom Engelhardt in a world clearly passing me by. Still, after years in which America’s streets were essentially empty, they’ve suddenly filled, day after day, with youthful protesters, bringing back a version of a moment I remember from my youth and that’s a hopeful (if also, given Covid-19, a scary) thing, even if I’m an old man in isolation in this never-ending pandemic moment of ours.

In such isolation, no wonder I have the urge to rant. Our present American world, after all, was both deeply unimaginable — before 2016, no one could have conjured up President Donald Trump as anything but a joke — and yet in some sense, all too imaginable …

If my own father (who doesn’t read this blog) could articulate ideas as well as Engelhardt, maybe I would stop deleting unread the idiocy he forwards via e-mail. Admittedly, I could well be following in my father’s footsteps, as the tag rants on this blog indicates, but at least I write my own screed. I’m far less accomplished at it than, say, Engelhardt, Andy Rooney (in his day), Ralph Nader, or Dave Barry, but then, I’m only a curmudgeon-in-training, not having fully aged (or elevated?) yet to cranky old manhood.

As the fall presidential election draws near (assuming that it goes forward), the choice in the limited U.S. two-party system is between one of two cranky old men, neither of whom is remotely capable of guiding the country through this rough patch at the doomer-anticipated end of human history. Oh, and BTW, echoing Engelhardt’s remark above, 45 has been a joke all of my life — a dark parody of success — and remains so despite occupying the Oval Office. Until only a couple of months ago, their primary opponent was Bernie Sanders, himself a cranky old man but far more endearing at it. This is what passes for the best leadership on offer?

Many Americans are ready to move on to someone younger and more vibrant, able to articulate a vision for something, well, different from the past. Let’s skip right on past candidates (names withheld) who parrot the same worn-out ideas as our fathers and grandfathers. Indeed, a meme emerged recently to the effect that the Greatest Generation saved us from various early 20th-century scourges (e.g., Nazis and Reds) only for the Boomers to proceed in their turn to mess up the planet so badly nothing will survive new scourges already appearing. It may not be fair to hang such labels uniformly around the necks of either generation (or subsequent ones); each possesses unique characteristics and opportunities (some achieved, others squandered) borne out of their particular moment in history. But this much is clear: whatever happens with the election and whichever generational cohort assumes power, the future is gonna be remarkably different.

Purpose behind consumption of different genres of fiction varies. For most of us, it’s about responding to stimuli and experiencing emotions vicariously, which is to say, safely. For instance, tragedy and horror can be enjoyed, if that’s the right word, in a fictional context to tweak one’s sensibilities without significant effect outside the story frame. Similarly, fighting crime, prosecuting war, or repelling an alien invasion in a video game can be fun but is far removed from actually doing those things in real life (not fun). For less explicitly narrative forms, such as music, feelings evoked are aesthetic and artistic in nature, which makes a sad song or tragic symphony enjoyable on its own merits without bleeding far into real sadness or tragedy. Cinema (now blurred with broadcast TV and streaming services) is the preeminent storytelling medium, provoking all manner of emotional response. After reaching a certain age (middle to late teens), emotional detachment from depictions of sexuality and violent mayhem makes it possible to digest such stimulation for the purpose of entertainment — except in cases where prior personal trauma is triggered. Before that age, nightmare-prone children are best kept away.

Dramatic conflict is central to driving plot and story forward, and naturally, folks are drawn to some stories while avoiding others. Although I’m detached enough not to be upset by, say, zombie films where people and zombies alike are dispatched horrifically, I wouldn’t say I enjoy gore or splatter. Similarly, realistic portrayals of war (e.g., Saving Private Ryan) are not especially enjoyable for me despite the larger story, whether based on true events or entirely made up. The primary reason I abandon a movie or TV show partway through is that I simply don’t enjoy watching suffering.

Another category bugs me even more: when fiction intrudes on reality to remind me too clearly of actual horrors (or is it the reverse: reality intruding on fiction?). It doesn’t happen often. One of the first instances I recall was in Star Trek: The Next Generation when the story observed that (fictional) warp travel produced some sort of residue akin to pollution. The reminder that we humans are destroying the actual environment registered heavily on me and ruined my enjoyment of the fictional story. (I also much prefer the exploration and discovery aspects of Star Trek that hew closer to Gene Roddenberry’s original vision than the militaristic approach now central to the franchise.) A much more recent intrusion occurs in the rather adolescent TV show The 100, where a global nuclear exchange launched by an artificial intelligence has the follow-on effect a century later of remaining nuclear sites going critical, melting down, and irradiating the Earth, making it uninhabitable. This bothers me because that’s my expectation of what happens in reality, probably not too long (decades) after industrial civilization collapses and most or all of us are dead. This prospect served up as fiction is simply too close to reality for me to enjoy vicariously.

Another example of fiction intruding too heavily on my doomer appreciation of reality occurred retroactively. I especially enjoyed the first Matrix movie as high-concept science fiction. As with Star Trek, the sequels degraded into run-of-the-mill war stories. But what was provocative about the original was the matrix itself: a computer-generated fiction situated within a larger reality. Inside the matrix was pleasant enough (though not without conflict), but reality outside the matrix was truly awful. It was a supremely interesting narrative and thought experiment when it came out in 1999. Now, twenty-one years later, it’s increasingly clear that we are living in a matrix-like, narrative-driven hyperreality intent on deluding us into perceiving a pleasant equilibrium that simply isn’t in evidence. In fact, as societies and as a civilization, we’re careening out of control, no brakes, no steering. Caitlin Johnstone explores this startling after-the-fact realization in an article at Medium.com, which I found only a couple days ago. Reality is in fact far worse than the constructed hyperreality. No wonder no one wants to look at it.

The first time I wrote on this title was here. I’m pretty satisfied with that 11-year-old blog post. Only recently, I copped to use of reframing to either zoom in on detail or zoom out to context, a familiar rhetorical device. Here I’m zooming out again to the god’s eye view of things.

The launching point for me is James Howard Kunstler’s recent blog post explaining and apologizing for his generation’s principal error: financialization of the U.S. economy. In that post, he identifies characteristics in the grandparents and parents of boomers as each generation responded and adapted to the difficulties of the most self-destructive century in human history. Things destroyed include more than just lives, livelihoods, and the biosphere. After several centuries of rising expectations and faith in progress (or simply religious faith), perhaps the most telling destruction is morale, first in the reckless waste of WWI (the first mechanized war), then repeatedly in the serial economic and political catastrophes and wars that litter the historical record right up to today. So it’s unsurprising (but not excusable) that boomers, seeing in unavoidable long-term destruction our powerlessness to master ourselves or indeed much of anything — despite the paradox of developing and obtaining more power at every opportunity — embarked on a project to gather to themselves as much short-term wealth and power as possible because, well, why the fuck not? Kunstler’s blog post is good, and he admits that although the masters-of-the-universe financial wizards who converted the economy into a rigged casino/carnival game for their own benefit are all boomers, not all boomers are responsible except in the passive sense that we (myself included, though I’m just as powerless as the next person) have allowed it to transpire without the necessary corrective: revolt.

Zooming out, however, I’m reminded of Jared Diamond’s assessment that the greatest mistake humans ever committed was the Agricultural Revolution 10–13 millennia ago. That context might be too wide, so let me restrict the view to the last 500 years. One theory propounded by Morris Berman in his book Why America Failed (2011) is that after the discovery of the New World, the cohort most involved in colonizing North America comprised those most desperate and thus most inclined to accept largely unknown risks. To them, the lack of ontological security and the contingent nature of their own lives were undeniable truths that in turn drove distortion of the human psyche. Thus, American history and character are full of abominations hardly compensated for by parallel glories. Are boomers, or more generally Americans, really any worse than others throughout history? Probably not. Too many counter-examples to cite.

The current endgame phase of history is difficult to assess as we experience it. However, a curious theory came to my attention that fits well with my observation of a fundamental epistemological crisis that has made human cognition into a hall of mirrors. (See also here and here, and I admit cognition may have always been a self-deception.) In a recent Joe Rogan podcast, Eric Weinstein, who comes across as equally brilliant and disturbed (admitting that not much may separate those two categories), opines that humans can handle only 3–4 layers of deception before collapsing into disorientation. It’s probably a feature, not a bug, and many have learned to exploit it. The example Weinstein discusses (derivative of others’ analyses, I think) is professional wrestling. Fans and critics knew for a very long time that wrestling looks fake, yet until the late 1980s, wrestlers and promoters held fast to the façade that wrestling matches are real sporting competitions rather than “sports entertainment.” Once the jig was up, it turned out that fans didn’t really care; it was real enough for them. Now we’ve come full circle with arguments (and the term kayfabe) that although matches are staged and outcomes known in advance, the wrestling itself is absolutely for real. So we encounter a paradox where what we’re told and shown is real, except that it isn’t, except that it sorta is, ultimately finding that it’s turtles all the way down. Enthusiastic, even rabid, embrace of the unreality of things is now a prime feature of the way we conduct ourselves.

Professional wrestling was hardly the first endeavor to offer this style of mind-bending unreality. Deception and disinformation (e.g., magic shows, fortune-telling, con jobs, psyops) have been around forever. However, wrestling may well have perfected the style for entertainment purposes, and it has in turn infiltrated nearly all aspects of modern life, not least of which are economics and politics. Thus, we have crypto- and fiat currencies based on nothing, where money can be materialized out of thin air to save itself from worthlessness, at least until that jig is up, too. We also have twin sham candidates for this fall’s U.S. presidential election, both clearly unfit for the job for different reasons. And in straightforward fictional entertainment, we have a strong revival of magical Medievalism, complete with mythical creatures, spells, and blades of fortune. As with economics and politics, we know it’s all a complex of brazen lies and gaslighting, but it’s nonetheless so tantalizing that its entertainment value outstrips and sidelines any calls to fidelity or integrity. Spectacle and fakery are frankly more interesting, more fun, more satisfying. Which brings me to my favorite Joe Bageant quote:

We have embraced the machinery of our undoing as recreation.

That man is me. Thrice in the last month I’ve stumbled headlong into subjects where my ignorance left me grasping in the dark for a ledge or foothold lest I be swept into a maelstrom of confusion by someone’s claims. This sensation is not unfamiliar, but it’s usually easy to beat back. Whereas I possess multiple areas of expertise and as an autodidact am constantly absorbing information, I nonetheless recognize that even in areas where I consider myself qualified to act and/or opine confidently, others possess authority and expertise far greater than mine. Accordingly, I’ve always considered myself a generalist. (A jack of all trades is not quite the same thing IMO, but I decline to draw that distinction here.)

Decisions must inevitably be made on insufficient information. That’s true because more information can always be added on top, which leads to paralysis or infinite regress if one doesn’t simply draw an arbitrary line and stop dithering. This is also why I aver periodically that consciousness is based on sufficiency, meaning “good enough.” A paradox results: a decision can be good enough to proceed on even though the information behind it is obviously incomplete, lacking whatever would permit a full, balanced analysis, if fullness can even be achieved. Knowledge is thus sufficient and insufficient at the same time. Banal, everyday purchasing decisions at the grocery store are low risk. Accepting a job offer, moving to a new city, and proposing marriage carry significant risks but are still decisions made on insufficient information precisely because they’re prospective. No way of knowing with certainty how things will turn out.

Didn’t expect to come back to this one so soon, but an alternative meaning behind my title just appeared. Whereas the first post was about cancel culture, this redux is about finding people willing and able to act as mouthpieces for whatever narrative the powers that be wish to foist on the public, as in “Where do they dig up these people?”

Wide-ranging opinion is not difficult to obtain in large populations, so although plenty of folks are willing to be paid handsomely to mouth whatever words are provided to them (e.g., public relations hacks, social media managers, promoters, spokespersons, actors, and straight-up shills in advertisements of all sorts), a better approach is simply to find people who honestly believe the chosen narrative so that they can do others’ bidding guilelessly, which is to say, without any need of selling their souls. This idea first came to my attention in an interview (can’t remember the source) given by Noam Chomsky, where he chided the interviewer, who had protested that no one was telling him what to say, by observing that if he didn’t already share the desired opinion, he wouldn’t have the job. The interviewer was hired and retained precisely because he was already onboard. Those who depart from the prescribed organizational perspective are simply not hired, or if their opinions evolve away from the party line, they are fired. No need to name names, but many have discovered that journalistic objectivity (or at least a pose of objectivity) and independent thought are not high values in the modern media landscape.

Here’s a good example: 19-year-old climate change denier/skeptic Naomi Seibt is being billed as the anti-Greta Thunberg. No doubt Seibt believes the opinions she will be presenting at the Heartland Institute later this week. All the more authentic if she does. But it’s a little suspicious, brazen and clumsy even, that another European teenage girl is being raised up to counter Time Magazine’s 2019 Person of the Year, Greta Thunberg. Maybe it’s even true, as conspiracists suggest, that Thunberg herself is being used to drive someone else’s agenda. The MSM is certainly using her to drive ratings. These questions are all ways to distract from the main point, which is that we’re driving ourselves to extinction (alongside most of the rest of the living world) by virtue of the way we inhabit the planet and consume its finite resources.

Here’s a second example: a “debate” on the subject of socialism between economists Paul Krugman and Richard Wolff on the show Democracy Now!


Let me disclose my biases up front. I’ve never liked economists as analysts of culture, sociology, or electoral politics. Krugman in particular has always read like more of an apologist for economic policies that support the dysfunctional status quo, so I pay him little attention. On the other hand, Wolff has engaged his public as a respectable teacher/explainer of the renewed socialist movement of which he is a part, and I give him my attention regularly. In truth, neither of these fellows needed to be “dug up” from obscurity. Both are heavily covered in the media, and they did a good job not attacking each other while making their cases in the debate.

The weird thing was how Krugman is so clearly triggered by the word socialism, even though he acknowledges that the U.S. has many robust examples of socialism already. He was clearly the one designated to object to socialism as an ideology, and he described socialism as an electoral kiss of death. Maybe he has too many childhood memories of ducking, covering, and cowering during those Atomic Era air raid drills, and so socialism and communism were imprinted on him as evils never to be entertained. At least three generations after him lack those memories, however, and are not traumatized by the prospect of socialism. In fact, that’s what the Democratic primaries are demonstrating: no fear but rather enthusiastic support for the avowed Democratic Socialist on the ballots. Who are the fearful ones? Capitalists. They would be wise to learn sooner rather than later that the public, as Wolff says plainly, is ready for change. Change is coming for them.

One of the victims of cancel culture, coming to my attention only days ago, is Kate Smith (1907–1986), a singer of American popular song. Though Smith had a singing career spanning five decades, she is best remembered for her version(s) of Irving Berlin’s God Bless America, which justifiably became a bit of Americana. The decades of Smith’s peak activity were the 1930s and 40s.

/rant on

I dunno what goes through people’s heads, performing purity rituals or character excavation on folks long dead. The controversy stems from Smith having a couple other songs in her discography: That’s Why Darkies Were Born (1931) and Pickaninny Heaven from the movie Hello, Everybody! (1933). Hate to break it to anyone still living under a rock, but these dates are not far removed from minstrelsy, blackface, and The Birth of a Nation (1915) — a time when typical Americans referred to blacks with a variety of terms we now consider slurs. Such references were still used during the American civil rights movement (1960s) and are in use among some virulent white supremacists even today. I don’t know the full context of Kate Smith having sung those songs, but I suspect I don’t need to. In that era, popular entertainment had few of the sensibilities regarding race we now have (culture may have moved on, but it’s hard to say with a straight face it’s evolved or progressed humanely), and uttering commonly used terms back then was not automatic evidence of any sort of snarling racism.

I remember having heard my grandparents, nearly exact contemporaries of Kate Smith, referring to blacks (the term I grew up with, still acceptable I think) with other terms we no longer consider acceptable. It shocked me, but to them, that’s simply what blacks were called (the term(s) they grew up with). Absolutely nothing in my grandparents’ character or behavior indicated a nasty, racist intent. I suspect the same was true of Kate Smith in the 1930s.

Back when I was a librarian, I also saw plenty of sheet music published before 1920 or so with the term darkie (or darkey) in the title. See for example this. The Library of Congress still uses the subject headings “negro spirituals” (is there another kind?) and “negro songs” to refer to various subgenres of American folk song that include slave songs, work songs, spirituals, minstrel music, protest songs, etc. Maybe we should cancel the Library of Congress. Some published music titles from back then even call them coon songs. That last one is totally unacceptable today, but it’s frankly part of our history, and like changing character names in Mark Twain’s Huckleberry Finn, sanitizing the past does not make it go away or any less discomfiting. But if you wanna bury your head in the sand, go ahead, ostrich.

Also, if some person or entity ever did some questionably racist, sexist, or malign thing (even something short of abominable) situated contextually in the past, does that mean he, she, or it must be cancelled irrevocably? If that be the case, then I guess we gotta cancel composer Richard Wagner, one of the most notorious anti-Semites of the 19th century. Also, stop watching Pixar, Marvel, and Star Wars films (among others), because remember that time when Walt Disney Studios (now Walt Disney Company) made a racist musical film, Song of the South (1946)? Disney’s tainted legacy (extending well beyond that one movie) is at least as awful as, say, Kevin Spacey’s, and we’re certainly not about to rehabilitate him.

/rant off

Periodically, I come across preposterously stupid arguments (in person and online) I can’t even begin to dispel. One such argument is that carbon is plant food, so we needn’t worry about greenhouse gases such as carbon dioxide, a byproduct of industrial activity. Although I’m unconvinced by such arrant capsule arguments, I’m also in a lousy position to contend with them because convincing evidence lies outside my scientific expertise. Moreover, evidence (should I bother to gather it) is too complex and involved to fit within a typical conversation or simple explanation. Plus, evidence relies on scientific literacy and critical reasoning often lacking in the lay public. Broad scientific principles work better for me than, say, trying to track the finely tuned balances Nature is constantly tinkering with — something we humans can hope to discover only partially. Yet we sally forth aggressively and heedlessly to manipulate Nature at our peril, which often results in precisely the sort of unintended consequence scientists in Brazil found when mosquitoes altered genetically (to reduce their numbers as carriers of disease) developed into mosquitoes hardier and more difficult to eradicate than if we had done nothing. The notion that trees respond favorably to increased carbon in the atmosphere has been a thorn in my side for some time. Maybe it’s even partly true; I can’t say. However, the biological and geophysical principle I adhere to is that even small changes in geochemistry (minute according to some scales, e.g., parts per million or per billion) have wildly disproportionate effects. The main effect today is climate changing so fast that many organisms can’t adapt or evolve quickly enough to keep up. Instead, they’re dying en masse and going extinct.

The best analogy is the narrow range of healthy human body temperature centered on 98.6 °F. Vary not far up (fever) or down (hypothermia) and human physiology suffers and becomes life-threatening. Indeed, even in good health, we humans expend no small effort keeping body temperature from extending far into either margin. Earth also regulates itself through a variety of blind mechanisms that are in the process of being wrecked by human activity, which has risen by now to the level of terraforming, much like a keystone species alters its environment. So as the planet develops the equivalent of a fever, weather systems and climate (not the same things) react, mostly in ways that make life on the surface much harder to sustain and survive. As a result, trees are in the process of dying. Gail Zawacki’s blog At Wit’s End (on my blogroll) explores this topic in excruciating and demoralizing detail. Those who are inclined to deny offhandedly are invited to explore her blog. The taiga (boreal forest) and the Amazonian rainforest are among the most significant ecological formations and carbon sinks on the planet. Yet both are threatened biomes. Deforestation and tree die-off are widespread, of course. For example, since 2010, an estimated 129 million trees in California have died from drought and bark beetle infestation. In Colorado, an estimated 800 million-plus dead trees still standing (called snags) are essentially firestarter. To my way of thinking, the slow, merciless death of trees is no small matter, and affected habitats may eventually be relegated to sacrifice zones like areas associated with mining and oil extraction.

Returning the bait of “carbon is plant food,” let me suggest that the trees have begun to rebel, falling over at the propitious moment to injure and/or kill hikers and campers. According to this article at Outside Magazine, the woods are not safe. So if mosquitoes, rattlesnakes, mountain lions, or bears don’t getcha first, beware of the trees. Even broken branches and dead tree trunks that haven’t fallen fully to the ground (known as hung snags, widow-makers, and foolkillers) are known to take aim at human interlopers. This is not without precedent. In The Lord of the Rings, remember that the Ents (tree herders) went to war with Isengard, while the Huorns destroyed utterly the Orcs who had laid siege to Helm’s Deep. Tolkien’s tale is but a sliver of a much larger folklore regarding the enchanted forest, where men are lost or absorbed (as with another Tolkien character, Old Man Willow). Veneration of elemental forces of nature (symbols of both life and its inverse, death) is part of our shared mythology, though muted in an era of supposed scientific sobriety. M. Night Shyamalan explores similar themes, weakly, in several of his films. Perhaps Tolkien understood at an intuitive level the silent anger and resentment of the trees, slow to manifest though it is, and their eventual rebellion over mistreatment by men. It’s happening again, right now, all around us. Go ahead: prove me wrong.

Continuing (after some delay) from part 1, Pankaj Mishra concludes chapter 4 of The Age of Anger with an overview of Iranian governments that shifted from U.S./British client state (headed by the Shah of Iran, reigned 1941–1979) to its populist replacement (headed by Ayatollah Khomeini, ruled 1979–1989), both leaders having been authoritarians. During the period discussed, Iran underwent the same modernization and infiltration by liberal, Western values and economics, which produced a backlash familiar from Mishra’s descriptions of other nations and regions that had experienced the same severed roots of place since the onset of the Enlightenment. Vacillation between two or more styles of government might be understood as a thermostatic response: too far in one direction leads to correction in the other. It’s not a binary relationship, however, between monarchy and democracy (to use just one example). Nor are the options limited to a security state headed by an installed military leader versus one headed by a popularly elected leader. Rather, it’s a question of national identity being alternately fractured and unified (though difficult to analyze and articulate) in the wake of multiple intellectual influences.

According to Lewis and Huntington, modernity has failed to take root in intransigently traditional and backward Muslim countries despite various attempts to impose it by secular leaders such as Turkey’s Atatürk, the Shah of Iran, Algeria’s Ben Bella, Egypt’s Nasser and Sadat, and Pakistan’s Ayub Khan.

Since 9/11 there have been many versions, crassly populist as well as solemnly intellectual, of the claims by Lewis and Huntington that the crisis in Muslim countries is purely self-induced, and [that] the West is resented for the magnitude of its extraordinary success as a beacon of freedom, and embodiment of the Enlightenment’s achievements … They have mutated into the apparently more sophisticated claim that the clash of civilizations occurs [primarily] within Islam, and that Western interventions are required on behalf of the ‘good Muslim’, who is rational, moderate and liberal. [p. 127]

This is history told by the putative winners. Mishra goes on:

Much of the postcolonial world … became a laboratory for Western-style social engineering, a fresh testing site for the Enlightenment ideas of secular progress. The philosophes had aimed at rationalization, or ‘uniformization’, of a range of institutions inherited from an intensely religious era. Likewise, postcolonial leaders planned to turn illiterate peasants into educated citizens, to industrialize the economy, move the rural population to cities, alchemize local communities into a singular national identity, replace the social hierarchies of the past with an egalitarian order, and promote the cults of science and technology among a pious and often superstitious population. [p. 133]

Readers may recognize this project and/or process by its more contemporary name: globalization. It’s not merely a war of competing ideas, however, because those ideas manifest in various styles of social and political organization. Moreover, the significance of migration from rural agrarian settings to primarily urban and suburban ones can scarcely be overstated. This transformation (referring to the U.S. in the course of the 20th century) is something James Howard Kunstler repeatedly characterizes rather emphatically as the greatest misallocation of resources in the history of the world. Mishra summarizes the effects of Westernization handily:

In every human case, identity turns out to be porous and inconsistent rather than fixed and discrete; and prone to get confused and lost in the play of mirrors. The cross-currents of ideas and inspirations — the Nazi reverence for Atatürk, a gay French philosopher’s denunciation of the modern West and sympathy for the Iranian Revolution, or the various ideological inspirations for Iran’s Islamic Revolution (Zionism, Existentialism, Bolshevism and revolutionary Shiism) — reveal that the picture of a planet defined by civilizations closed off from one another and defined by religion (or lack thereof) is a puerile cartoon. They break the simple axis — religious-secular, modern-medieval, spiritual-materialist — on which the contemporary world is still measured, revealing that its populations, however different their pasts, have been on converging and overlapping paths. [p. 158]

These descriptions and analyses put me in mind of a fascinating book I read some years ago and reviewed on Amazon (one of only a handful of Amazon reviews I’ve written): John Reader’s Man on Earth (1988). Reader describes and indeed celebrates incredibly diverse ways of inhabiting the Earth, each specially adapted to the landscape and based on evolving local practices. Thus, the notion of “place” is paramount. Comparison occurs only by virtue of juxtaposition. Mishra does something quite different, drawing out the connective ideas that account for “converging and overlapping paths.” Perhaps inevitably, disturbances to collective and individual identities that flow from unique styles of social organization, especially those now operating at industrial scale (i.e., industrial civilization), appear to be picking up. For instance, in the U.S., mass shootings (a preferred form of attack but not the only one) appear to be on the rise even as violent crime sits at an all-time low, yet perpetrators of violence are not limited to a few lone wolves, as the common trope goes. According to journalist Matt Agorist,

mass shootings — in which murdering psychopaths go on rampages in public spaces — have claimed the lives of 339 people since 2015 [up to mid-July 2019]. While this number is certainly shocking and far too high, during this same time frame, police in America have claimed the lives of 4,355 citizens.

And according to this article in Vox, this crazy disproportion (police violence to mass shootings) is predominantly an American thing, at least partly because of our high rate of fetishized civilian gun ownership. Thus, the self-described “land of the free, home of the brave” has transformed itself into a paranoid garrison state whose civil authority offends even more egregiously than the disenfranchised (mostly young men). Something similar occurred during the Cold War, when leaders became hypervigilant for attacks and invasions that never came. Whether a few close calls during the height of the Cold War were the result of escalating paranoia, brinkmanship, or true, maniacal, existential threats from a mustache-twirling, hand-rolling despot hellbent on the destruction of the West is a good question, probably impossible to answer convincingly. However, the result today of this mindset couldn’t be more disastrous:

It is now clear that the post-9/11 policies of pre-emptive war, massive retaliation, regime change, nation-building and reforming Islam have failed — catastrophically failed — while the dirty war against the West’s own Enlightenment [the West secretly at war with itself] — inadvertently pursued through extrajudicial murder, torture, rendition, indefinite detention and massive surveillance — has been a wild success. The uncodified and unbridled violence of the ‘war on terror’ ushered in the present era of absolute enmity in which the adversaries, scornful of all compromise, seek to annihilate each other. Malignant zealots have emerged at the very heart of the democratic West after a decade of political and economic tumult; the simple explanatory paradigm set in stone soon after the attacks of 9/11 — Islam-inspired terrorism versus modernity — lies in ruins. [pp. 124–125]

A potpourri of recent newsbits and developments. Sorry, no links or support provided. If you haven’t already heard of most of these, you must be living under a rock. On a moment’s consideration, that may not be such a bad place to dwell.

/rant on

I just made up the word of the title, but anyone could guess its origin easily. Many of today’s political and thought leaders (not quite the same thing; politics doesn’t require much thought), as well as American institutions, are busy creating outrageously preposterous legacies for themselves, though doomers like me doubt anyone will be around to recall them in a few decades. For instance, the mainstream media (MSM) garners well-deserved rebuke, its organs often attacking each other in the form of one of the memes of the day: a circular firing squad. Its brazen attempts at thought-control (different thrusts at different media organs) and pathetic abandonment of its mission to inform the public with integrity have hollowed it out. No amount of rebranding at the New York Times (or elsewhere) will overcome the fact that the public has largely moved on, swapping the ubiquitous fictions spun by the MSM and politicians for superhero fiction. The RussiaGate debacle may be the worst example, but the MSM’s failures extend well beyond that. The U.S. stock market wobbles madly around its recent all-time high, refusing to admit its value has been severely overhyped and inflated through quantitative easing, cheap credit (an artificial monetary value not unlike cryptocurrencies or fiat currency created out of nothing besides social consensus), and corporate buybacks. The next crash (already well overdue) is like the last hurricane: we might get lucky and it will miss us this season, but eventually our lottery number will come up, like those 100-year floods now occurring every few years or decades.

Public and higher education systems continue to creak along, producing a glut of dropouts and graduates ill-suited to do anything but the simplest of jobs requiring no critical thought, little training, and no actual knowledge or expertise. Robots and software will replace them anyway. Civility and empathy are cratering: most everyone is ready and willing to flip the bird, blame others, air their dirty laundry in public, and indulge in casual violence or even mayhem following only modest provocation. Who hasn’t fantasized just a little bit about acting out wildly, pointlessly like the mass killers blackening the calendar? It’s now de rigueur. Thus, the meme infiltrates and corrupts vulnerable minds regularly. Systemic failure of the U.S. healthcare and prison systems — which ought to be public institutions but are, like education, increasingly operated for profit to exploit public resources — continues to be exceptional among developed nations, as does the U.S. military and its bloated budget.

Gaffe-prone Democratic presidential candidate Joe Biden cemented his reputation as a goof years ago yet continues to build upon it. One might think that at his age enough would have been enough, but the allure of the highest office in the land is just too great, so he guilelessly applies for the job and begs the indulgence of the American public. Of course, the real prize-winner is 45, whose constant stream of idiocy and vitriol sends an entire nation scrambling daily to digest their Twitter feeds and make sense of things. Who knows (certainly I don’t) how serious was his remark that he wanted to buy Greenland? It makes a certain sense that a former real-estate developer would offhandedly recommend an entirely new land grab. After all, American history is based on colonialism and expansionism. No matter that the particular land in question is not for sale (didn’t matter for most of our history, either). Of course, everyone leapt into the news cycle with analysis or mockery, only the second of which was appropriate. Even more recent goofiness was 45’s apparent inability to read a map, resulting in the suggestion that Hurricane Dorian might strike Alabama. Just as with the Greenland remark, PR flacks went to work to manage and reconfigure public memory, revising storm maps for after-the-fact justification. Has anyone in the media commented that such blatant historical revisionism is the stuff of authoritarian leaders (monarchs, despots, and tyrants) whose underlings and functionaries, fearing loss of livelihood if not indeed life, provide cover for mistakes that really ought to lead to simple admission of error and apology? Nope, just add more goofs to the heaping pile of preposterity.

Of course, the U.S. is hardly alone in these matters. Japan and Russia are busily managing perception of their respective ongoing nuclear disasters, including a new one in Russia that has barely broken through our collective ennui. Having followed the U.S. and others into industrialization and financialization of its economy, China is running up against the same well-known ecological despoliation and limits to growth and is now circling the drain with us. The added spectacle of a trade war with the petulant president in the U.S. distracts everyone from coming scarcity. England has its own clownish supreme leader, at least for now, trying to manage an intractable but binding issue: Brexit. (Does every head of state need a weirdo hairdo?) As with climate change, there is no solution, no matter how much steadfast hoping and wishing might will one into existence, so whatever eventually happens will throw the region into chaos. Folks shooting each other for food and fresh water in the Bahamas post-Hurricane Dorian is a harbinger of violent hair-triggers in the U.S. poised to fire at anything that moves when true existential threats finally materialize. Thus, our collective human legacy is absurd and self-destroying. No more muddling through.

/rant off

Returning to Pankaj Mishra’s The Age of Anger, chapter 2 (subtitled “Progress and its Contradictions”) profiles two writers of the 18th-century Enlightenment: François-Marie Arouet (1694–1778), better known by his nom de plume Voltaire, and Jean-Jacques Rousseau (1712–1778). Voltaire was a proponent and embodiment of Enlightenment values and ethics, whereas Rousseau was among the primary critics. Both were hugely influential, and the controversy inherent in their relative perspectives is unresolved even today. First come Rousseau’s criticisms (in Mishra’s prose):

… the new commercial society, which was acquiring its main features of class divisions, inequality and callous elites during the eighteenth century, made its members corrupt, hypocritical and cruel with its prescribed values of wealth, vanity and ostentation. Human beings were good by nature until they entered such a society, exposing themselves to ceaseless and psychologically debilitating transformation and bewildering complexity. Propelled into an endless process of change, and deprived of their peace and stability, human beings failed to be either privately happy or active citizens [p. 87]

This assessment could easily be mistaken for a description of the 1980s and 90s: ceaseless change and turmoil as new technological developments (e.g., the Internet) challenged everyone to reorient and reinvent themselves, often as a brand. Cultural transformation in the 18th century, however, was about more than just emerging economic reconfigurations. New, secular, free thought and rationalism openly challenged orthodoxies formerly imposed by religious and political institutions and demanded intellectual and entrepreneurial striving to participate meaningfully in charting new paths for progressive society purportedly no longer anchored statically in the past. Mishra goes on:

It isn’t just that the strong exploit the weak; the powerless themselves are prone to enviously imitate the powerful. But people who try to make more of themselves than others end up trying to dominate others, forcing them into positions of inferiority and deference. The lucky few on top remain insecure, exposed to the envy and malice of the also-rans. The latter use all means available to them to realize their unfulfilled cravings while making sure to veil them with a show of civility, even benevolence. [p. 89]

Sounds quite contemporary, no? Driving the point home:

What makes Rousseau, and his self-described ‘history of the human heart’, so astonishingly germane and eerily resonant is that, unlike his fellow eighteenth-century writers, he described the quintessential inner experience of modernity for most people: the uprooted outsider in the commercial metropolis, aspiring for a place in it, and struggling with complex feelings of envy, fascination, revulsion and rejection. [p. 90]

While most of the chapter describes Rousseau’s rejection and critique of 18th-century ethics, Mishra at one point depicts Rousseau arguing for instead of against something:

Rousseau’s ideal society was Sparta, small, harsh, self-sufficient, fiercely patriotic and defiantly un-cosmopolitan and uncommercial. In this society at least, the corrupting urge to promote oneself over others, and the deceiving of the poor by the rich, could be counterpoised by the surrender of individuality to public service, and the desire to seek pride for community and country. [p. 92]

Notably absent from Mishra’s profile is the meme mistakenly applied to Rousseau’s diverse criticism: the noble savage. Rousseau praises provincial men (patriarchal orientation acknowledged) largely unspoilt by the corrupting influence of commercial, cosmopolitan society devoted to individual self-interest and amour propre, and his ideal (above) is uncompromising. Although Rousseau had potential to insinuate himself successfully in fashionable salons and academic posts, his real affinity was with the weak and downtrodden — the peasant underclass — who were mostly passed over by rapidly modernizing society. Others managed to raise their station in life above the peasantry to join the bourgeoisie (disambiguation needed on that term). Mishra’s description (via Rousseau) of this middle and upper middle class group provided my first real understanding of popular disdain many report toward bourgeois values using the derisive term bourgie (clearer when spoken than when written).

Profile of Voltaire to follow in part 2.