Archive for the ‘Philosophy’ Category

Ask parents what ambitions they harbor for their child or children and among the most patterned responses is “I just want them to be happy.” I find such an answer thoughtless and disingenuous, and the insertion of the hedge just to make happiness sound like a small ask is a red herring. To begin with, for most kids still in their first decade, happiness and playfulness are relatively effortless and natural so long as a secure, loving environment is provided. Certainly not a default setting, but it’s still quite commonplace. As the dreamy style of childhood cognition is gradually supplanted by supposedly more logical, rational, adult thinking, and as children become acquainted with iniquities of both history and contemporary life, innocence and optimism become impossible to retain. Cue the sullen teenager confronting the yawning chasm between desire and reality. Indeed, few people seem to make the transition into adulthood knowing with much clarity how to be happy in the midst of widespread travail and suffering. Instead, young adults frequently substitute self-destructive, nihilistic hedonism, something learned primarily (says me) from the posturing of movie characters and the celebrities who portray them. (Never understood the trope of criminals hanging at nightclubs, surrounded by drug addicts, nymphos, other unsavory types, and truly awful music, where they can indulge their assholery before everything inevitably goes sideways.)

Many philosophies recommend simplicity, naturalness, and independence as paths to happiness and moral rectitude. Transcendentalism was one such response to social and political complexities that spoil and/or corrupt. Yet two centuries on, the world has only gotten more and more complex, pressing on everyone, especially in its demands for information processing at a volume and sophistication that do not come naturally to most and are arguably not part of our evolutionary toolkit. Multiple social issues, if one is to engage them fairly, hinge on legalistic arguments and bewildering wordplay that render them fundamentally intractable. Accordingly, many wave away all nuance and adopt pro forma attitudes. Yet the airwaves, social media, the Internet, and even dinner conversations are suffused with the worst sorts of hypercomplexity and casuistry that confound even those who traffic regularly in such rhetoric. It’s a very long way from “I just want to be happy.”


A quick search revealed that over 15 years of blog posts, the word macrohistory has been used only once. On reflection, macrohistory is something in which I’ve been involved for some time — mostly as a dilettante. Several book reviews and three book-blogging series (one complete, two either on hiatus or fully abandoned) concern macrohistory, and several of my own multi-part blog posts connect disparate dots over broader topics (if not quite history in the narrow sense). My ambition, as with macrohistory, is to tease out better (if only slightly) understandings of ourselves (since humans and human culture are obviously the most captivating thing evar). Scientists direct similar fascination to the inner workings of nonhuman systems — or at least larger systems in which humans are embedded. Thus, macrohistory can be distinguished from natural history by their objects of study. Relatedly, the World-Systems Theory associated with Immanuel Wallerstein and The Fourth Turning (a 1997 book by William Strauss and Neil Howe) take similarly broad perspectives and attempt to identify historical dynamics and patterns not readily apparent. Other examples undoubtedly exist.

This is all preliminary to discussing a rather depressing article from the December 2020 issue of Harper’s Magazine: Rana Dasgupta’s disquieting (ahem) essay “The Silenced Majority” (probably behind a paywall). The subtitle poses the question, “Can America still afford democracy?” This innocuous line begs the question whether the U.S. (America and the United States of America [and its initialisms U.S. and U.S.A.] being sloppily equivalent almost everywhere, whereas useful distinctions describe the United Kingdom, Great Britain, and England) actually has or practices democracy anymore, to which many would answer flatly “nope.” The essay is an impressive exercise, short of book length, in macrohistory, though it’s limited to Western cultures, which is often the case with history told from inside the bubble. Indeed, if (as the aphorism goes) history is written/told primarily by the victors, one might expect to hear only of an ongoing series of victories and triumphs, with all the setbacks, losses, and discontinuities excised like some censored, curated Twitter or Facebook (now Meta) discussion. One might also wonder how that same history reads when told from the perspective of non-Western countries, especially those in transitional regions such as Poland, Ukraine, Turkey, and Iran, or those with histories long predating the rise of the West roughly 500 years ago, e.g., China, Japan, Egypt, and the lost cultures of Central America. The resentments of the Islamic world, which has been eclipsed by the West, are a case in point. My grasp of world history is insufficient to entertain those perspectives. I note, however, that with globalism, the histories of all regions of the world are now intimately interconnected even while perspectives differ.

Dasgupta describes foundational Enlightenment innovations that animate Western thinking, even though the ideas are often poorly contextualized or understood. To wit:

In the seventeenth century, England was an emerging superpower. Supremacy would come from its invention of a world principle of property. This principle was developed following contact with the Americas, where it became possible to conjure vast new English properties “out of nothing”—in a way that was impracticable, for instance, in the militarized, mercantile societies of India. Such properties were created by a legal definition of ownership designed so that it could be applied only to the invaders. “As much land as a man tills, plants, improves, cultivates, and can use the product of,” John Locke wrote in 1689, “so much is his property.” When combined with other new legal categories such as “the savage” and “the state of nature,” this principle of property engendered societies such as Carolina, where Locke’s patron, the first earl of Shaftesbury, was a lord proprietor.

Obvious, isn’t it, that by imposing the notion of private property on indigenous inhabitants of North America, colonialists established ownership rights over territories where none had previously existed? Many consider that straightforward theft (again, begging the question) or at least fencing the commons. (Attempts to do the same in the open oceans and in space [orbit] will pick up as technology allows, I surmise.) In addition, extension of property ownership to human trafficking, i.e., slavery and its analogues still practiced today, has an exceptionally long history and was imported to the Americas, though the indigenous populations proved to be poor candidates for subjugation. Accordingly, others were brought to North America in a slave trade that extended across four centuries.

Dasgupta goes on:

From their pitiless opposition to the will of the people, we might imagine that British elites were dogmatic and reactionary. (Period dramas depicting stuck-up aristocrats scandalized by eccentricity and innovation flatter this version of history.) The truth is that they were open-minded radicals. They had no sentimentality about the political order, cutting the head off one king and sending another into exile. They could invent financial and legal structures (such as the Bank of England, founded in 1694) capable of releasing unprecedented market energies. Even their decision to exploit American land with African labor demonstrated their world-bending pursuit of wealth. Their mines and plantations would eventually supply the capital for the first industrial revolution. They loved fashion and technology, they believed in rationality, progress, and transparency. They were the “founding fathers” of our modern world.

And yet they presided over a political system as brutal as it was exclusive. Why? The answer is simple. They could not afford democracy, but also, crucially, they did not need it. [emphasis in original]

So much for the awe and sacred respect in which Enlightenment philosophers and the Founders are held — or used to be. Statues of these dudes (always dudes, natch) are being pulled down all the time. Moreover, association of liberal democracy with the 17th century is a fundamental mistake, though neoliberalism (another poorly defined and understood term) aims to shift backwards to a former or hybrid state of human affairs some are beginning to call digital feudalism.

The article goes on to discuss the balancing act and deals struck over the course of centuries to maintain economic and political control by the ownership class. It wasn’t until the 1930s and the postwar economic boom in the U.S. that democracy as commonly understood took root significantly. The labor movement in particular was instrumental in forcing FDR’s New Deal social programs, even though populism and socialism as political movements had been successfully beaten back. Interestingly, the hallowed American nuclear family (limited in its scope racially), an ahistorical formation that enjoyed a roughly 30-year heyday from 1945 to 1975, coincides with the rise of the American middle class and now-aged democratic institutions. They’re all connected with widely distributed wealth and prosperity. But after the oil crisis and stagflation of the mid-1970s, gains enjoyed by the middle class have steadily eroded and/or been actively beaten back (again!) so that dominant themes today are austerity imposed on the masses and inequality coughing up hundy-billionaires with increasing frequency. Estimates are that 30-40% of the American citizenry lives in poverty, bumping up against failed-state territory. Inequality has returned to Gilded Age levels, if not exceeded them. Dasgupta fails to cite perhaps the major underlying cause of this shift away from affordable democracy, back toward the brutal world principle of property: falling EROI. Cheap foreign labor, productivity gains, and creation of a giant debtor society have simply not offset the disappearance of cheap energy.

Dasgupta’s further discussion of an emerging two-tier economy along with the Silicon Valley technocracy follows, but I’ll stop short here and encourage readers instead to investigate and think for themselves. Lots of guides and analyses help to illuminate the macrohistory, though I find the conclusions awful in their import. Dasgupta drives home the prognosis:

The neoliberal revolution aimed to restore the supremacy of capital after its twentieth-century subjugation by nation-states, and it has succeeded to an astonishing degree. As states compete and collude with gargantuan new private powers, a new political world arises. The principle of labor, which dominated the twentieth century—producing the industrious, democratic society we have come to regard, erroneously, as the norm—is once again being supplanted by a principle of property, the implications and consequences of which we know only too well from our history books.

“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signalling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but to human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently, albeit temporarily, live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations to funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coin. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are ineffective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. It used to be said that the Internet is forever, so wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) can be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

Among those building fame and influence via podcasting on YouTube is Michael Malice. Malice is a journalist (for what organization?) and the author of several books, so he has better preparation and content than many who (like me) offer only loose opinion. His latest book (no link) is The Anarchist Handbook (2021), which appears to be a collection of essays (written by others, curated by Malice) arguing in theoretical support of anarchism (not to be confused with chaos). I say theoretical because, as a hypersocial species of animal, humans never live in significant numbers without forming tribes and societies for the mutual benefit of their members. Malice has been making the rounds discussing his book and is undoubtedly an interesting fellow with well-rehearsed arguments. Relatedly, he argues in favor of objectivism, the philosophy of Ayn Rand that has been roundly criticized and dismissed yet continues to be attractive, especially to purportedly self-made men and women (not least duped celebrities) of significant wealth and achievement.

Thus far in life, I’ve disdained reading Rand or getting too well acquainted with arguments in favor of anarchism and/or objectivism. As an armchair social critic, my bias skews toward understanding how things work (i.e., Nassim Taleb’s power laws) in actuality rather than in some crackpot theory. As I understand it, the basic argument put forward to support any variety of radical individualism is that everyone working in his or her own rational self-interest, unencumbered by the mores and restrictions of polite society, leads to the greatest (potential?) happiness and prosperity. Self-interest is not equivalent to selfishness, but even if it were, the theorized result would still be better than any alternative. A similar argument is made with respect to economics, known as the invisible hand. In both, hidden forces (often digital or natural algorithms), left alone to perform their work, enhance conditions over time. Natural selection is one such hidden force now better understood as a component of evolutionary theory. (The term theory when used in connection with evolution is an anachronism and misnomer, as the former theory has been scientifically substantiated as a power law.) One could argue as well that human society is a self-organizing entity (disastrously so upon even casual inspection) and that, because of the structure of modernity, we are all situated within a thoroughly social context. Accordingly, the notion that one can or should go it alone is a delusion because it’s flatly impossible to escape the social surround, even in aboriginal cultures, unless one is totally isolated from other humans in what little remains of the wilderness. Of course, those few hardy individuals who retreat into isolation typically bring with them the skills, training, tools, and artifacts of society. A better example might be feral children, lost in the wilderness at an early age and deprived of human society but often taken in by a nonhuman animal (and thus socialized differently).

My preferred metaphor when someone insists on total freedom and isolation away from the maddening crowd is traffic — usually automobile traffic but foot traffic as well. Both are examples of aggregate flow arising out of individual activity, like drops of rain forming into streams, rivers, and floods. When stuck in automobile congestion or jostling for position in foot traffic, it’s worthwhile to remember that you are the traffic, a useful example of synecdoche. Those who buck the flow, cut the line, or drive along the shoulder — often just to be stuck again a little farther ahead — are essentially practicing anarchists or me-firsters, whom the rest of us simply regard as assholes. Cultures differ with respect to the orderliness of queuing, but even in those places where flow is irregular and unpredictable, a high level of coordination (lost on many American drivers who can’t figger a roundabout a/k/a traffic circle) is nonetheless evident.

As I understand it, Malice equates cooperation with tyranny because people defer to competence, which leads to hierarchy, which results in power differentials, which transforms into tyranny (petty or profound). (Sorry, can’t locate the precise formulation.) Obvious benefits (e.g., self-preservation) arising out of mutual coordination (aggregation) such as in traffic flows are obfuscated by theory distilled into nicely constructed quotes. Here’s the interesting thing: Malice has lived in Brooklyn most of his life and doesn’t know how to drive! Negotiating foot traffic has a far lower threshold for serious harm than driving. He reports that relocation to Austin, TX, is imminent, and with it, the purchase of a vehicle. My suspicion is that to stay out of harm’s way, Malice will learn quickly to obey tyrannical traffic laws, cooperate with other drivers, and perhaps even resent the growing number of dangerous assholes disrupting orderly flow like the rest of us — at least until he develops enough skill and confidence to become one of those assholes. The lesson not yet learned from Malice’s overactive theoretical perspective is that in a crowded, potentially dangerous world, others must be taken into account. Repetition of this kindergarten lesson throughout human activity may not be the most pleasant thing for bullies and assholes to accept, but refusing to do so makes one a sociopath.

For more than a decade, I’ve had in the back of my mind a blog post called “The Power of Naming” to remark that bestowing a name gives something power, substance, and in a sense, reality. That post never really came together, but its inverse did. Anyway, here’s a renewed attempt.

The period of language acquisition in early childhood is suffused with learning the names of things, most of which is passive. Names of animals (associated closely with sounds they make) are often a special focus using picture books. The kitty, doggie, and horsie eventually become the cat, dog, and horse. Similarly, the moo-cow and the tweety-bird shorten to cow and bird (though songbird may be an acceptable holdover). Words in the abstract are signifiers of the actual things, aided by the text symbols learned in literate cultures to reinforce mere categories instead of examples grounded in reality. Multiply the names of things several hundred thousand times into adulthood and indeed throughout life and one can develop a formidable vocabulary supporting expressive and nuanced thought and speech. Do you know the differences between acute, right, obtuse, straight, and reflex angles? Does it matter? Does your knowledge of barware inform when to use a flute, coupe, snifter, shot (or shooter or caballito), nosing glass (or Glencairn), tumbler, tankard, goblet, sling, and Stein? I’d say you’ve missed something by never having drunk dark beer (Ger.: Schwarzbier) from a frosted schooner. All these varieties developed for reasons that remain invisible to someone content to drink everything from the venerable red Solo cup. Funnily enough, the red Solo cup now comes in different versions, fooling precisely no one.

Returning to book blogging, Walter Ong (in Orality and Literacy) has curious comparisons between primarily oral cultures and literate cultures. For example:

Oral people commonly think of names (one kind of words) as conveying power over things. Explanations of Adam’s naming of the animals in Genesis 2:20 usually call condescending attention to this presumably quaint archaic belief. Such a belief is in fact far less quaint than it seems to unreflective chirographic and typographic folk. First of all, names do give human beings power over what they name: without learning a vast store of names, one is simply powerless to understand, for example, chemistry and to practice chemical engineering. And so with all other intellectual knowledge. Secondly, chirographic and typographic folk tend to think of names as labels, written or printed tags imaginatively affixed to an object named. Oral folk have no sense of a name as a tag, for they have no idea of a name as something that can be seen. Written or printed representations of words can be labels; real, spoken words cannot be. [p. 33]

This gets at something that has been developing over the past few decades, namely, that as otherwise literate (or functionally literate) people gather more and more information through electronic media (screens that serve broadcast and cable TV, YouTube videos, prerecorded news for streaming, podcasts, and most importantly, audiobooks — all of which speak content to listeners), the spoken word (re)gains primacy and the printed word fades into disuse. Electronic media may produce a hybrid of orality/literacy, but words are no longer silent, internal, and abstract. Indeed, words — all by themselves — are understood as being capable of violence. Gone are the days when “sticks and stones ….” Now, fighting words incite and insults sting again.

Not so long ago, it was possible to provoke a duel with an insult or gesture, such as a glove across the face. Among some people, defense of honor never really disappeared (though dueling did). History has taken a strange turn, however. Proposed legislation to criminalize deadnaming (presumably to protect a small but growing number of transgender and nonbinary people who have redefined their gender identity and accordingly adopted different names) recognizes the violence of words but then tries to transmute the offense into an abstract criminal law. It’s deeply mixed up, and I don’t have the patience to sort it out.

More to say in later blog posts, but I’ll raise the Counter-Enlightenment once more to say that the nature of modern consciousness is shifting somewhat radically in response to stimuli and pressures that grew out of an information environment (roughly 70 years old now but transformed even more fundamentally in the last 25 years) that is substantially discontinuous from centuries-old traditions. Those traditions displaced even older traditions inherited from antiquity. Such is the way of the world, I suppose, and with the benefit of Walter Ong’s insights, my appreciation of the outlines is taking better shape.

I have observed various instances of magical thinking in mainstream culture, especially here, which I find problematical. Although it’s not my ambition to disabuse anyone of magical thinking, which extends far beyond, say, religious thought, I was somewhat taken aback at the suggestion found in the comic at this link (not embedded). For those not familiar with Questionable Content (one of two online comics I read regularly), the comic presents an extended cast of characters, mostly in their early 20s, living in a contemporary New England college town. Those characters are supplemented by a few older parents and lots of AIs (in robot bodies). The AIs are not particularly futuristic but are simply accepted as a normal (if curious) part of the world of the comic. Major story arcs involve characters and AIs (the AIs are characters, I suppose) in the process of discovering and establishing themselves as they (the humans, anyway) transition into early adulthood. There are no great political themes or intrusions into life in a college town. Rather, the comic is largely about acceptance of difference. Often, that means washing away meaningful difference in the name of banal tolerance. Real existential struggle is almost entirely absent.

In the linked comic, a new character comes along and offers advice to an established character struggling with sexual attractions and orientation. The dialogue includes this exchange:

Character A: If tarot or astrology or religion halps you make sense of the world and your place in it, then why not use them?
Character B: But they’re not real. [emphasis in original]
Character A: It doesn’t matter, if you use them constructively!

There it is in a nutshell: believe whatever you want if it, um, halps. I’ve always felt that being wrong (i.e., using unreal or make-believe things) was a sufficient injunction against anchoring oneself to notions widely known to be false. Besides, isn’t it often remarked that the biggest fool is one who fools himself? (Fiction as a combination of entertainment and building a worldview is quite normative, but it’s understood as fiction, or to a lesser degree, as life imitating art and its inverse. Exceptions abound, which are regarded as psychopathy.) The instruction in that dialogue (part object lesson, part lesson in cognition) is not that it’s OK to make mistakes but that knowingly believing something false has worthwhile advantages.

Surveying examples where promulgating false beliefs has constructive and destructive effects is too large a project. Well short of that, nasty categories include fraud, gaslighting, and propaganda, which are criminal in many cases and ought to be in most others (looking at you, MSM! — or not, since I neither trust nor watch). One familiar benevolent category is expressed in the phrase fake it til you make it, often recommended to overcome a lack of confidence. Of course, a swindle is also known as a confidence game (or by its diminutive, a con), so beware overconfidence when asked by another to pay for something (e.g., tarot or astrology readings), take risks, or accept an ideology without question.

As philosophy, willful adoption of falsity for its supposed benefits is half-baked. Though impossible to quantify, my suspicion is that instances of positive outcomes are overbalanced by negative ones. Maybe living in a constructed reality or self-reinforcing fantasy is what people want. The comic discussed is certainly in line with that approach. However, while we dither and delude ourselves with happy, aspirational stories based on silliness, the actual world around us, including all the human institutions that used to serve us but no longer do, falls to tatters. Is it better to go through life, and eventually to one’s grave, refusing to see that reality? Should childlike wonder and innocence be retained in spite of what is easily observable just by poking one’s head up and dismissing comforting lies? Decide for yourself.

Once in a while, when discussing current events and their interpretations and implications, a regular interlocutor of mine will impeach me, saying “What do you know, really?” I’m always forced to reply that I know only what I’ve learned through various media sources, faulty though they may be, not through first-hand observation. (Reports of anything I have observed personally tend to differ considerably from my own experience once the news media completes its work.) How, then, can I know, to take a very contemporary instance this final week of July 2020, what’s going on in Portland from my home in Chicago other than what’s reported? Makes no sense to travel there (or much of anywhere) in the middle of a public health crisis just to see a different slice of protesting, lawbreaking, and peacekeeping [sic] activities with my own eyes. Extending the challenge to its logical extremity, everything I think I know collapses into solipsism. The endpoint of that trajectory is rather, well, pointless.

If you read my previous post, you encountered an argument, one that can’t be falsified any too handily, that what we understand about ourselves and the world we inhabit is actually a constructed reality. To which I reply: is there any other kind? That construction achieves a fair lot of consensus about basics, more than one might even guess, but that still leaves quite a lot of space for idiosyncratic and/or personal interpretations that conflict wildly. In the absence of stabilizing authority and expertise, it has become impossible to tease a coherent story out of the many voices pressing on us with their interpretations of how we ought to think and feel. Twin conspiracies foisted on us by the Deep State and MSM known as RussiaGate and BountyGate attest to this. I’ll have more to say about the inability to figure things out when I complete my post called Making Sense and Sensemaking.

In the meantime, the modern world has in effect constructed its own metaphorical Tower of Babel (borrowing from Jonathan Haidt — see below). It’s not different languages we speak so much (though it’s that, too) as the conflicting stories we tell. Democratization of media has given each of us — authorities, cranks, and everyone in between — new platforms and vehicles for promulgating pet stories, interpretations, and conspiracies. Most of it is noise, and divining the worthwhile signal portion is a daunting task even for disciplined, earnest folks trying their best to penetrate the cacophony. No wonder so many simply turn away in disgust.

I admit (again) to being bugged by things found on YouTube — a miserable proxy for the marketplace of ideas — many of which are either dumb, wrongheaded, or poorly framed. It’s not my goal to correct every mistake, but sometimes, inane utterances of intellectuals and specialists I might otherwise admire just stick in my craw. It’s hubris on my part to insist on my understandings, considering my utter lack of standing as an acknowledged authority, but I’m not without my own multiple areas of expertise (I assert immodestly).

The initial purpose for this blog was to explore the nature of consciousness. I’ve gotten badly sidetracked writing about collapse, media theory, epistemology, narrative, and cinema, so let me circle back around. This is gonna be long.

German philosopher Oswald Spengler takes a crack at defining consciousness:

Human consciousness is identical with the opposition between the soul and the world. There are gradations in consciousness, varying from a dim perception, sometimes suffused by an inner light, to an extreme sharpness of pure reason that we find in the thought of Kant, for whom soul and world have become subject and object. This elementary structure of consciousness is not capable of further analysis; both factors are always present together and appear as a unity.


In my preparations for a speech to be given in roughly two months, I stumbled across a prescient passage in an essay entitled “Jesuitism” from Latter-Day Pamphlets (1850) by Thomas Carlyle. Connect your own dots as this is offered without comment.

… this, then, is the horrible conclusion we have arrived at, in England as in all countries; and with less protest against it hitherto, and not with more, in England than in other countries? That the great body of orderly considerate men; men affecting the name of good and pious, and who, in fact, excluding certain silent exceptionary individuals one to the million, such as the Almighty Beneficence never quite withholds, are accounted our best men,–have unconsciously abnegated the sacred privilege and duty of acting or speaking the truth; and fancy that it is not truth that is to be acted, but that an amalgam of truth and falsity is the safe thing. In parliament and pulpit, in book and speech, in whatever spiritual thing men have to commune of, or to do together, this is the rule they have lapsed into, this is the pass they have arrived at. We have to report that Human Speech is not true! That it is false to a degree never witnessed in this world till lately. Such a subtle virus of falsity in the very essence of it, as far excels all open lying, or prior kinds of falsity; false with consciousness of being sincere! The heart of the world is corrupted to the core; a detestable devil’s-poison circulates in the life-blood of mankind; taints with abominable deadly malady all that mankind do. Such a curse never fell on men before.

For the falsity of speech rests on a far deeper falsity. False speech, as is inevitable when men long practise it, falsifies all things; the very thoughts, or fountains of speech and action become false. Ere long, by the appointed curse of Heaven, a man’s intellect ceases to be capable of distinguishing truth, when he permits himself to deal in speaking or acting what is false. Watch well the tongue, for out of it are the issues of life! O, the foul leprosy that heaps itself in monstrous accumulation over Human Life, and obliterates all the divine features of it into one hideous mountain of purulent disease, when Human Life parts company with truth; and fancies, taught by Ignatius or another, that lies will be the salvation of it! We of these late centuries have suffered as the sons of Adam never did before; hebetated, sunk under mountains of torpid leprosy; and studying to persuade ourselves that this is health.

And if we have awakened from the sleep of death into the Sorcerer’s Sabbath of Anarchy, is it not the chief of blessings that we are awake at all? Thanks to Transcendent Sansculottism and the long-memorable French Revolution, the one veritable and tremendous Gospel of these bad ages, divine Gospel such as we deserved, and merciful too, though preached in thunder and terror! Napoleon Campaignings, September Massacres, Reigns of Terror, Anacharsis Clootz and Pontiff Robespierre, and still more beggarly tragicalities that we have since seen, and are still to see: what frightful thing were not a little less frightful than the thing we had? Peremptory was our necessity of putting Jesuitism away, of awakening to the consciousness of Jesuitism. ‘Horrible,’ yes: how could it be other than horrible? Like the valley of Jehoshaphat, it lies round us, one nightmare wilderness, and wreck of dead-men’s bones, this false modern world; and no rapt Ezekiel in prophetic vision imaged to himself things sadder, more horrible and terrible, than the eyes of men, if they are awake, may now deliberately see. Many yet sleep; but the sleep of all, as we judge by their maundering and jargoning, their Gorham Controversies, street-barricadings, and uneasy tossings and somnambulisms, is not far from ending. Novalis says, ‘We are near awakening when we dream that we are dreaming.’ [italics in original]

Continuing (after some delay) from part 1, Pankaj Mishra concludes chapter 4 of The Age of Anger with an overview of Iranian governments that shifted from U.S./British client state (headed by the Shah of Iran, reigned 1941–1979) to its populist replacement (headed by Ayatollah Khomeini, ruled 1979–1989), both leaders having been authoritarians. During the period discussed, Iran underwent the same modernization and infiltration by liberal, Western values and economics, which produced a backlash familiar from Mishra’s descriptions of other nations and regions that had experienced the same severed roots of place since the onset of the Enlightenment. Vacillation among two or more styles of government might be understood as a thermostatic response: too far in one direction prompts a correction in the other. It’s not a binary relationship, however, between monarchy and democracy (to use just one example). Nor are the options limited to a security state headed by an installed military leader or a government headed by a popularly elected leader. Rather, it’s a question of national identity being alternately fractured and unified (though difficult to analyze and articulate) in the wake of multiple intellectual influences.

According to Lewis and Huntington, modernity has failed to take root in intransigently traditional and backward Muslim countries despite various attempts to impose it by secular leaders such as Turkey’s Atatürk, the Shah of Iran, Algeria’s Ben Bella, Egypt’s Nasser and Sadat, and Pakistan’s Ayub Khan.

Since 9/11 there have been many versions, crassly populist as well as solemnly intellectual, of the claims by Lewis and Huntington that the crisis in Muslim countries is purely self-induced, and [that] the West is resented for the magnitude of its extraordinary success as a beacon of freedom, and embodiment of the Enlightenment’s achievements … They have mutated into the apparently more sophisticated claim that the clash of civilizations occurs [primarily] within Islam, and that Western interventions are required on behalf of the ‘good Muslim’, who is rational, moderate and liberal. [p. 127]

This is history told by the putative winners. Mishra goes on:

Much of the postcolonial world … became a laboratory for Western-style social engineering, a fresh testing site for the Enlightenment ideas of secular progress. The philosophes had aimed at rationalization, or ‘uniformization’, of a range of institutions inherited from an intensely religious era. Likewise, postcolonial leaders planned to turn illiterate peasants into educated citizens, to industrialize the economy, move the rural population to cities, alchemize local communities into a singular national identity, replace the social hierarchies of the past with an egalitarian order, and promote the cults of science and technology among a pious and often superstitious population. [p. 133]

Readers may recognize this project and/or process by its more contemporary name: globalization. It’s not merely a war of competing ideas, however, because those ideas manifest in various styles of social and political organization. Moreover, the significance of migration from rural agrarian settings to primarily urban and suburban ones can scarcely be overstated. This transformation (referring to the U.S. in the course of the 20th century) is something James Howard Kunstler repeatedly characterizes rather emphatically as the greatest misallocation of resources in the history of the world. Mishra summarizes the effects of Westernization handily:

In every human case, identity turns out to be porous and inconsistent rather than fixed and discrete; and prone to get confused and lost in the play of mirrors. The cross-currents of ideas and inspirations — the Nazi reverence for Atatürk, a gay French philosopher’s denunciation of the modern West and sympathy for the Iranian Revolution, or the various ideological inspirations for Iran’s Islamic Revolution (Zionism, Existentialism, Bolshevism and revolutionary Shiism) — reveal that the picture of a planet defined by civilizations closed off from one another and defined by religion (or lack thereof) is a puerile cartoon. They break the simple axis — religious-secular, modern-medieval, spiritual-materialist — on which the contemporary world is still measured, revealing that its populations, however different their pasts, have been on converging and overlapping paths. [p. 158]

These descriptions and analyses put me in mind of a fascinating book I read some years ago and reviewed on Amazon (one of only a handful of Amazon reviews): John Reader’s Man on Earth (1988). Reader describes and indeed celebrates incredibly diverse ways of inhabiting the Earth, each specially adapted to the landscape and based on evolving local practices. Thus, the notion of “place” is paramount. Comparison occurs only by virtue of juxtaposition. Mishra does something quite different, drawing out the connective ideas that account for “converging and overlapping paths.” Perhaps inevitably, disturbances to collective and individual identities that flow from unique styles of social organization, especially those now operating at industrial scale (i.e., industrial civilization), appear to be picking up. For instance, in the U.S., mass shootings (a preferred form of attack but not the only one) appear to be on the rise even as violent crime sits at an all-time low, and perpetrators of violence are not limited to a few lone wolves, as the common trope goes. According to journalist Matt Agorist,

mass shootings — in which murdering psychopaths go on rampages in public spaces — have claimed the lives of 339 people since 2015 [up to mid-July 2019]. While this number is certainly shocking and far too high, during this same time frame, police in America have claimed the lives of 4,355 citizens.

And according to this article in Vox, this crazy disproportion (police violence to mass shootings) is predominantly an American thing at least partly because of our high rate of fetishized civilian gun ownership. Thus, the self-described “land of the free, home of the brave” has transformed itself into a paranoid garrison state, with paranoia afflicting civil authority even more egregiously than the disenfranchised (mostly young men). Something similar occurred during the Cold War, when leaders became hypervigilant for attacks and invasions that never came. Whether a few close calls during the height of the Cold War were the result of escalating paranoia, brinkmanship, or true, maniacal, existential threats from a mustache-twirling, hand-rolling despot hellbent on the destruction of the West is a good question, probably impossible to answer convincingly. However, the result today of this mindset couldn’t be more disastrous:

It is now clear that the post-9/11 policies of pre-emptive war, massive retaliation, regime change, nation-building and reforming Islam have failed — catastrophically failed — while the dirty war against the West’s own Enlightenment [the West secretly at war with itself] — inadvertently pursued through extrajudicial murder, torture, rendition, indefinite detention and massive surveillance — has been a wild success. The uncodified and unbridled violence of the ‘war on terror’ ushered in the present era of absolute enmity in which the adversaries, scornful of all compromise, seek to annihilate each other. Malignant zealots have emerged at the very heart of the democratic West after a decade of political and economic tumult; the simple explanatory paradigm set in stone soon after the attacks of 9/11 — Islam-inspired terrorism versus modernity — lies in ruins. [pp.124–125]