Posts Tagged ‘Book Blogging’

Returning to Pankaj Mishra’s The Age of Anger, chapter 3 (subtitled “Progress and its Contradictions”) profiles two writers of the 18th-century Enlightenment: François-Marie Arouet (1694–1778), better known by his nom de plume Voltaire, and Jean-Jacques Rousseau (1712–1778). Voltaire was a proponent and embodiment of Enlightenment values and ethics, whereas Rousseau was among its primary critics. Both were hugely influential, and the controversy inherent in their opposed perspectives is unresolved even today. First come Rousseau’s criticisms (in Mishra’s prose):

… the new commercial society, which was acquiring its main features of class divisions, inequality and callous elites during the eighteenth century, made its members corrupt, hypocritical and cruel with its prescribed values of wealth, vanity and ostentation. Human beings were good by nature until they entered such a society, exposing themselves to ceaseless and psychologically debilitating transformation and bewildering complexity. Propelled into an endless process of change, and deprived of their peace and stability, human beings failed to be either privately happy or active citizens [p. 87]

This assessment could easily be mistaken for a description of the 1980s and 90s: ceaseless change and turmoil as new technological developments (e.g., the Internet) challenged everyone to reorient and reinvent themselves, often as a brand. Cultural transformation in the 18th century, however, was about more than just emerging economic reconfigurations. New, secular, free thought and rationalism openly challenged orthodoxies formerly imposed by religious and political institutions and demanded intellectual and entrepreneurial striving to participate meaningfully in charting new paths for progressive society purportedly no longer anchored statically in the past. Mishra goes on:

It isn’t just that the strong exploit the weak; the powerless themselves are prone to enviously imitate the powerful. But people who try to make more of themselves than others end up trying to dominate others, forcing them into positions of inferiority and deference. The lucky few on top remain insecure, exposed to the envy and malice of the also-rans. The latter use all means available to them to realize their unfulfilled cravings while making sure to veil them with a show of civility, even benevolence. [p. 89]

Sounds quite contemporary, no? Driving the point home:

What makes Rousseau, and his self-described ‘history of the human heart’, so astonishingly germane and eerily resonant is that, unlike his fellow eighteenth-century writers, he described the quintessential inner experience of modernity for most people: the uprooted outsider in the commercial metropolis, aspiring for a place in it, and struggling with complex feelings of envy, fascination, revulsion and rejection. [p. 90]

While most of the chapter describes Rousseau’s rejection and critique of 18th-century ethics, Mishra at one point depicts Rousseau arguing for instead of against something:

Rousseau’s ideal society was Sparta, small, harsh, self-sufficient, fiercely patriotic and defiantly un-cosmopolitan and uncommercial. In this society at least, the corrupting urge to promote oneself over others, and the deceiving of the poor by the rich, could be counterpoised by the surrender of individuality to public service, and the desire to seek pride for community and country. [p. 92]

Notably absent from Mishra’s profile is the meme mistakenly applied to Rousseau’s diverse criticism: the noble savage. Rousseau praises provincial men (patriarchal orientation acknowledged) largely unspoilt by the corrupting influence of commercial, cosmopolitan society devoted to individual self-interest and amour propre, and his ideal (above) is uncompromising. Although Rousseau had the potential to insinuate himself successfully into fashionable salons and academic posts, his real affinity was with the weak and downtrodden — the peasant underclass — who were mostly passed over by rapidly modernizing society. Others managed to raise their station in life above the peasantry to join the bourgeoisie (disambiguation needed on that term). Mishra’s description (via Rousseau) of this middle and upper-middle-class group provided my first real understanding of the popular disdain many report toward bourgeois values using the derisive term bourgie (clearer when spoken than when written).

Profile of Voltaire to follow in part 2.


Apologies for this overlong blog post. I know that this much text tries the patience of most readers and is well in excess of my customary 3–4 paragraphs.

Continuing my book blogging of Pankaj Mishra’s Age of Anger, Chapter Two (subtitled “History’s Winners and Their Illusions”) focuses on the thought revolution that followed from the Enlightenment in Western Europe and its imitation in non-Western cultures, especially as manifested in the century leading to the French Revolution. Although the American Revolution (more narrowly a tax revolt with insistence on self-rule) preceded the French Revolution by slightly more than a decade, it’s really the French, whose motto liberté, égalité, fraternité came to prominence and defined an influential set of European values, who effectively challenged enthusiastic modernizers around the globe to try to catch up with the ascendant West.

However, almost as soon as this project appeared, i.e., attempting to transform ancien régime monarchies in Northern Africa, the Middle East, and Russia into something pseudo-European, critics arose who denounced the abandonment of tradition and centuries-old national identities. Perhaps they can be understood as the first wave of modern conservatism. Here is Mishra’s characterization:

Modernization, mostly along capitalist lines, became the universalist creed that glorified the autonomous rights-bearing individual and hailed his rational choice-making capacity as freedom. Economic growth was posited as the end-all of political life and the chief marker of progress worldwide, not to mention the gateway to happiness. Communism was totalitarian. Ergo its ideological opponent, American liberalism, represented freedom, which in turn was best advanced by moneymaking. [p. 48]

Aside: The phrase “rights-bearing individual” has obvious echoes with today’s SJWs and their poorly conceived demand for egalitarianism not just before the law but in social and economic outcomes. Although economic justice (totally out of whack with today’s extreme income and wealth inequality) is a worthy goal that aligns with idealized but not real-world Enlightenment values, SJW activism reinforces retrograde divisions of people based on race, gender, sexual orientation, religion, disability, etc. Calls to level out all these questionable markers of identity have resulted in intellectual confusion and invalidation of large “privileged” and/or “unoppressed” groups such as white males of European descent in favor of oppressed minorities (and majorities, e.g., women) of all categories. Never mind that many of those same white males are often every bit as disenfranchised as others whose victimhood is paraded around as some sort of virtue granting them authority and preferential treatment.

Modernization has not been evenly distributed around the globe, which accounts for countries even today being designated either First, Second, or Third World. An oft-used euphemism is “developing economy,” which translates to an invitation for wealthy First-World nations (or their corporations) to force their way in to exploit cheap labor and untapped natural resources. Indeed, as Mishra points out, the promise of joining First-World living standards (having diverged centuries ago) is markedly hollow:

… doubters of Western-style progress today include more than just marginal communities and some angry environmental activists. In 2014 The Economist said that, on the basis of IMF data, emerging economies — or, most of the human population — might have to wait for three centuries in order to catch up with the West. In this assessment, the last decade of high growth was an ‘aberration’ and ‘billions of people will be poorer for a lot longer than they might have expected just a few years ago’.

The implications are sobering: the non-West not only finds itself replicating the West’s trauma on an infinitely larger scale. While helping inflict the profoundest damage yet on the environment — manifest today in rising sea levels, erratic rainfall, drought, declining harvests, and devastating floods — the non-West also has no real prospect of catching up … [pp. 47–48]

That second paragraph is an unexpected acknowledgement that the earliest industrialized nations (France, the United Kingdom, and the U.S.) unwittingly put us on a path to self-annihilation only to be knowingly repeated and intensified by latecomers to industrialization. All those (cough) ecological disturbances are occurring right now, though the public has been lulled into complacency by temporary abundance, misinformation, under- and misreporting, and international political incompetence. Of course, ecological destruction is no longer merely the West’s trauma but a global catastrophe of the highest magnitude which is certainly in the process of catching up to us.

Late in Chapter Two, Mishra settles on the Crystal Palace exhibition space and utopian symbol, built in 1851 during the era of world’s fairs and mistaken enthusiasm regarding the myth of perpetual progress and perfectibility, as an irresistible embodiment of Western hubris to which some intellectual leaders responded with clear disdain. Although a marvelous technical feat of engineering prowess and demonstration of economic power (not unlike countries that host the Olympics — remember Beijing?), the Crystal Palace was also viewed as an expression of the sheer might of Western thought and its concomitant products. Mishra repeatedly quotes Dostoevsky, who visited the Crystal Palace in 1862 and described his visceral response to the place poignantly and powerfully:

You become aware of a colossal idea; you sense that here something has been achieved, that here there is victory and triumph. You even begin vaguely to fear something. However independent you may be, for some reason you become terrified. ‘For isn’t this the achievement of perfection?’ you think. ‘Isn’t this the ultimate?’ Could this in fact be the ‘one fold?’ Must you accept this as the final truth and forever hold your peace? It is all so solemn, triumphant, and proud that you gasp for breath. [p. 68]

And later, describing the “world-historical import” of the Crystal Palace:

Look at these hundreds of thousands, these millions of people humbly streaming here from all over the face of the earth. People come with a single thought, quietly, relentlessly, mutely thronging onto this colossal palace; and you feel that something final has taken place here, that something has come to an end. It is like a Biblical picture, something out of Babylon, a prophecy from the apocalypse coming to pass before your eyes. You sense that it would require great and everlasting spiritual denial and fortitude in order not to submit, not to capitulate before the impression, not to bow to what is, and not to deify Baal, that is not to accept the material world as your ideal. [pp. 69–70]

The prophetic finality of the Crystal Palace thus presaged twentieth-century achievements and ideas (the so-called American Century) that undoubtedly eclipsed the awesome majesty of the Crystal Palace, e.g., nuclear fission and liberal democracy’s purported victory over Soviet Communism (to name only two). Indeed, Mishra begins the chapter with a review of Americans’ declarations of the end of history, i.e., having reached final forms of political, social, and economic organization that are now the sole model for all nations to emulate. The whole point of the chapter is that such pronouncements are illusions with strong historical antecedents that might have cautioned us not to leap to unwarranted conclusions or to perpetuate a soul-destroying regime hellbent on extinguishing all alternatives. Of course, as Gore Vidal famously quipped, “Americans never learn; it’s part of our charm.”


I put aside Harari’s book from the previous blog post in favor of Pankaj Mishra’s Age of Anger: A History of the Present (2017). Mishra’s sharp cultural criticism is far more convincing than Harari’s Panglossian perspective. Perhaps some of that is due to an inescapable pessimism in my own character. Either way, I’ve found the first 35 pages dense with observations of interest to me as a blogger and armchair cultural critic. Some while back, I published a post attempting to delineate (not very well, probably) what’s missing in the modern world despite its obvious material abundance. Reinforcing my own contentions, Mishra’s thesis (as I understand it so far) is this: like others throughout the post-Enlightenment era, we today harbor an array of resentments and hatreds (Fr.: ressentiment) aimed wrongly at scapegoats for the political and social failure to deliver the promises of progressive modernity equitably. For instance, Mishra describes

… flamboyant secular radicals in the nineteenth and early twentieth centuries: the aesthetes who glorified war, misogyny and pyromania; the nationalists who accused Jews and liberals of rootless cosmopolitanism and celebrated irrational violence; and the nihilists, anarchists and terrorists who flourished in almost every continent against a background of cosy political-financial alliances, devastating economic crises and obscene inequalities. [pp. 10–11]

Contrast and/or compare his assessment of the recent past:

Beginning in the 1990s, a democratic revolution of aspiration … swept across the world, sparking longings for wealth, status and power, in addition to ordinary desires for stability and contentment, in the most unpromising circumstances. Egalitarian ambition broke free of old social hierarchies … The culture of [frantic] individualism went universal … The crises of recent years have uncovered an extensive failure to realize the ideals of endless economic expansion and private wealth creation. Most newly created ‘individuals’ toil within poorly imagined social and political communities and/or states with weakening sovereignty … individuals with very different pasts find themselves herded by capitalism and technology into a common present, where grossly unequal distributions of wealth and power have created humiliating new hierarchies. This proximity … is rendered more claustrophobic by digital communications … [S]hocks of modernity were once absorbed by inherited social structures of family and community, and the state’s welfare cushions [something mentioned here, too]. Today’s individuals are directly exposed to them in an age of accelerating competition on uneven playing fields, where it is easy to feel that there is no such thing as either society or state, and that there is only a war of all against all. [pp. 12–14]

These long quotes (the second one cut together from longer paragraphs) are here because Mishra is remarkably eloquent in his diagnosis of globalized culture. Although I’ve only read the prologue, I expect to find support for my long-held contention that disorienting disruptions of modernity (using Anthony Giddens’ sociological definition rather than the modish use of the term Postmodern to describe only the last few decades) create unique and formidable challenges to the formation of healthy self-image and personhood. Foremost among these challenges is an unexpectedly oppressive information environment: the world forced into full view and inciting comparison, jealousy, envy, and hatred stemming from routine and ubiquitous frustrations and humiliations as we each struggle in life getting our personal share of attention, renown, and reward.

Another reason Mishra provides for our collective anger is a deep human yearning not for anarchism or radical freedom but rather for belonging and absorption within a meaningful social context. This reminds me of Erich Fromm’s book Escape from Freedom (1941), which I read long ago but can’t remember so well anymore. I do remember quite vividly how counter-intuitive was the suggestion that absolute freedom is actually burdensome as distinguished from the usual programming we get about breaking free of all restraints. (Freedom! Liberty!) Indeed, Mishra provides a snapshot of multiple cultural and intellectual movements from the past two centuries where abandoning oneself to a cause, any cause, was preferable to the boredom and nothingness of everyday life absent purpose other than mere existence. The modern substitute for larger purpose — commodity culture — is a mere shadow of better ways of spending one’s life. Maybe commodity culture is better than sacrificing one’s life fighting wars (a common fate) or destroying others, but that’s a much longer, more difficult argument.

More to follow as my reading progresses.

I started reading Yuval Harari’s book Homo Deus: A Brief History of Tomorrow (2017). Had expected to read Sapiens (2014) first but its follow-up came into my possession instead. My familiarity with Harari’s theses and arguments stems from his gadfly presence on YouTube being interviewed or giving speeches promoting his books. He’s a compelling yet confounding thinker, and his distinctive voice in my mind’s ear lent to my reading the quality of an audiobook. I’ve only read the introductory chapter (“A New Human Agenda”) so far, the main argument being this:

We have managed to bring famine, plague and war under control thanks largely to our phenomenal economic growth, which provides us with abundant food, medicine, energy and raw materials. Yet this same growth destabilises the ecological equilibrium of the planet in myriad ways, which we have only begun to explore … Despite all the talk of pollution, global warming and climate change, most countries have yet to make any serious economic or political sacrifices to improve the situation … In the twenty-first century, we shall have to do better if we are to avoid catastrophe. [p. 20]

“Do better”? Harari’s bland understatement of the catastrophic implications of our historical moment is risible. Yet as a consequence of having (at least temporarily) brought three major historical pestilences (no direct mention of the fabled Four Horsemen of the Apocalypse) under administrative, managerial, and technical control (I leave that contention unchallenged), Harari states rather over-confidently — forcefully even — that humankind is now turning its attention and ambitions toward different problems, namely, mortality (the fourth of the Four Horsemen and one of the defining features of the human condition), misery, and divinity.

Harari provides statistical support for his thesis (mere measurement offered as indisputable evidence — shades of Steven Pinker in Enlightenment Now), none of which I’m in a position to refute. However, his contextualization, interpretation, and extrapolation of trends purportedly demonstrating how humans will further bend the arc of history strike me as absurd. Harari also misses the two true catalyzing factors underlying growth and trends that have caused history to go vertical: (1) a fossil-fuel energy binge of roughly two and one-half centuries that peaked more than a decade ago and (2) improved information and material flows and processing that enabled managerial and bureaucratic functions to transcend time and space or at least lessen their constraints on human activity dramatically. James Beniger addresses information flow and processing in his book The Control Revolution (1986). Many, many others have provided in-depth analyses of energy uses (or inputs) because, contrary to the familiar song lyric, it’s energy that makes the world go round. No one besides Harari (to my knowledge, though I’m confident some lamebrained economist agrees with him) leaps to the unwarranted conclusion that economic growth is the principal forcing factor of the last 2–3 centuries.

I’ve taken issue with Harari before (here and here) and will not repeat those arguments. My impression of Homo Deus, now that I’ve got 70 pages under my belt, is that Harari wants to have it both ways: vaguely optimistic (even inspirational and/or aspirational) regarding future technological developments (after all, who doesn’t want the marvels and wonders we’ve been ceaselessly teased and promised?) yet precautionary because those very developments will produce disruptive and unforeseeable side effects (black swans) we can’t possibly yet imagine. To his credit, Harari’s caveats regarding unintended consequences are plain and direct. For instance, one of the main warnings is that the way we treat nonhuman species is the best model for how we humans will in turn be treated when superhumans or strong AI appear, which Harari believes is inevitable so long as we keep tinkering. Harari also indicates that he’s not advocating for any of these anticipated developments but is merely mapping them as likely outcomes of human restlessness and continued technological progress.

Harari’s disclaimers do not convince me; his writing is decidedly Transhumanist in character. In the limited portion I’ve read, Harari comes across far more like “golly, gee willikers” at human cleverness and potential than as someone seeking to slam on the brakes before we innovate ourselves out of relevance or existence. In fact, by focusing on mortality, misery, and divinity as future projects, Harari gets to indulge in making highly controversial (and fatuous) predictions regarding one set of transformations that can happen only if the far more dire and immediate threats of runaway global warming and nonlinear climate change don’t first lead to the collapse of industrial civilization and near-term extinction of humans alongside most other species. My expectation is that this second outcome is far more likely than anything contemplated by Harari in his book.

Update: Climate chaos has produced the wettest winter, spring, and summer on record, which shows no indication of abating. A significant percentage of croplands in flooded regions around the globe went unplanted, and the crops that were planted are stunted and imperiled. Harari’s confidence that we had that famine problem licked is being sorely tested.

An ongoing conflict in sociology and anthropology exists between those who believe that human nature is competitive and brutal to the bitter end versus those who believe human nature is more cooperative and sociable, sharing resources of all types to secure the greater good. This might be recognizable to some as the perennial friction between citizen and society (alternatively, individualism and collectivism). Convincing evidence from human prehistory is difficult to uncover. Accordingly, much of the argument for competition comes from evolutionary biology, where concepts such as genetic fitness and reproductive success (and by inference, reproductive failure) are believed to motivate and justify behavior across the board. As the typical argument goes, males with inferior genes, in particular those who lack sexual access or otherwise fail to secure mates, don’t survive into the next generation. Attributes passed on to each subsequent generation thus favor fitter, Type A brutes who out-compete weaker (read: more cooperative) candidates in an endless self-reinforcing and narrowing cycle. The alternative offered by others points to a wider gene pool based on collaboration and sharing of resources (including mates) that enables populations to thrive together better than individuals who attempt to go it alone or dominate.

Not having undertaken a formal study of anthropology (or more broadly, primatology), I can’t say how well this issue is settled in the professional, academic literature. Online, I often see explanations that are really just-so stories based on logic. What that means is that an ideal or guiding principle is described, something that just “makes sense,” and supporting evidence is then assumed or projected. For instance, we now know many of the mechanisms that function at the cellular level with respect to reproduction and genetic evolution. Those mechanisms are typically spun up to the level of the organism through pure argumentation and presumed to manifest in individual behaviors. Any discontinuity between aggregate characteristics and particular instances is ignored. Questions are solved through ideation (i.e., thought experiments). However, a series of if-then statements that seems plausible at first encounter often turns out to be pure conjecture rather than evidence. That’s a just-so story.

One of the reasons we look into prehistory for evidence of our true nature (understood as biology, not sociology, handily sweeping aside the nature/nurture question) is that hunter-gatherers (HGs) lived at subsistence level for a far longer period of our evolutionary history than our comparatively brief time within the bounty of civilization. It’s only when surpluses and excesses provide something worth hoarding, monopolizing, and protecting that hierarchies arise and/or leveling mechanisms are relaxed. Leaving Babylon has a discussion of this here. Some few HG cultures survive into the 21st century, but for most of us, The Agricultural Revolution is the branching point when competition began to assert itself, displacing sharing and other egalitarian impulses. Accordingly, the dog-eat-dog competition and inequality characteristic of the modern world is regarded by many as an exaptation, not our underlying nature.


I’m currently reading Go Wild by John Ratey and Richard Manning. It has some rather astounding findings on offer. One I’ll draw out is that the human brain evolved not for thinking, as one might imagine, but for coordinating complex physiological movements:

… even the simplest of motions — a flick of a finger or a turn of the hand to pick up a pencil — is maddeningly complex and requires coordination and computational power beyond electronics’ abilities. For this you need a brain. One of our favorite quotes on this matter comes from the neuroscientist Rodolfo Llinás: “That which we call thinking is the evolutionary internalization of movement.” [p. 100]

Almost all the computation is unconscious, or maybe preconscious, and it’s learned over a period of years in infancy and early childhood (for basic locomotion) and then supplemented throughout life (for skilled motions, e.g., writing cursive or typing). Moreover, those able to move with exceptional speed, endurance, power, accuracy, and/or grace are admired and sometimes rewarded in our culture. The obvious example is sports. Whether league sports with wildly overcompensated athletes, Olympic sports with undercompensated athletes, or combat sports with a mixture of both, thrill attaches to watching someone move effectively within the rule-bound context of the sport. Other examples include dancers, musicians, circus artists, and actors who specialize in physical comedy and action. Each develops specialized movements that are graceful and beautiful, which Ratey and Manning write may also account for nonsexual appreciation and fetishization of the human body, e.g., fashion models, glammed-up actors, and nude photography.

I’m being silly saying that jocks figgered it first, of course. A stronger case could probably be made for warriors in battle, such as a skilled swordsman. But it’s jocks who are frequently rewarded out of all proportion to others who specialize in movement. True, their genetics and training enable a relatively brief career (compared to, say, surgeons or pianists) before abilities ebb away and a younger athlete eclipses them. But a fundamental lack of equivalence is clear with artisans and artists, whose value lies less in their bodies than in the outputs their movements produce.

Regarding computational burdens, consider the various mechanical arms built for grasping and moving objects, some of them quite large. Mechanisms (frame and hydraulics substituting for bone and muscle) themselves are quite complex, but they’re typically controlled by a human operator rather than automated. (Exceptions abound, but they’re highly specialized, such as circuit board manufacture or textile production.) More recently, robotics demonstrates considerable advancement in locomotion without a human operator, but robots are also narrowly focused in comparison with the flexibility of motion a human body readily possesses. Further, robots either operate in wide open space, in the case of flying drones, or, in the case of those designed to move like dogs or insects, use 4+ legs for stability. The latter are typically built to withstand quite a lot of bumping and jostling. Upright bipedal motion is still quite clumsy in comparison with humans, excepting perhaps wheeled robots that obviously don’t move like humans do.

Curiously, the movie Pacific Rim (sequel just out) takes notice of the computational or cognitive difficulty of coordinated movement. To operate the giant robots needed to fight Godzilla-like interdimensional monsters, two mind-linked humans control a battle robot. Maybe it’s a simple coincidence — a plot device to position humans in the middle of the action (and robot) rather than killing from a distance — such as via drone or clone — or maybe not. Hollywood screenwriters are quite clever at exploiting all sorts of material without necessarily divulging the source of inspiration. It’s art imitating life, knowingly or not.

Be forewarned: this is long and self-indulgent. Kinda threw everything and the kitchen sink at it.

In the August 2017 issue of Harper’s Magazine, Walter Kirn’s “Easy Chair” column called “Apocalypse Always” revealed his brief boyhood fascination with dystopian fiction. This genre has been around for a very long time, to which the Cassandra myth attests. Kirn’s column is more concerned with “high mid-twentieth-century dystopian fiction,” which in his view is now classic and canonical, an entire generation of Baby Boomers having been educated in such patterned thought. A new wave of dystopian fiction appeared in the 1990s and yet another more recently in the form of Young Adult novels (and films) that arguably serve better as triumphal coming-of-age stories albeit under dystopian circumstances. Kirn observes a perennial theme present in the genre: the twin disappearances of freedom and information:

In the classic dystopias, which concern themselves with the lack of freedom and not with surplus freedom run amok (the current and unforeseen predicament of many), society is superbly well organized, resembling a kind of hive or factory. People are sorted, classified, and ranked, their individuality suppressed through goon squads, potent narcotics, or breeding programs. Quite often, they wear uniforms, and express themselves, or fail to, in ritual utterance and gestures.

Whether Americans in 2018 resemble hollowed-out zombies suffering under either boot-heel or soft-serve oppression is a good question. Some would argue just that in homage to classic dystopias. Kirn suggests briefly that we might instead suffer from runaway anarchy, where too much freedom and licentiousness have led instead to a chaotic and disorganized society populated by citizens who can neither govern nor restrain themselves.

Disappearance of information might be understood in at least three familiar aspects of narrative framing: what happened to get us to this point (past as exposition, sometimes only hinted at), what the hell is going on (present as conflict and action), and how it gets fixed (future as resolution and denouement). Strict control over information exercised by classic dystopian despots doesn’t track to conditions under which we now find ourselves, where more disorganized, fraudulent, and degraded information than ever is available alongside small caches of wisdom and understanding buried somewhere in the heap and discoverable only with the benefit of critical thinking flatly lost on at least a couple generations of miseducated graduates. However, a coherent narrative of who and what we are and what realistic prospects the future may hold has not emerged since the stifling version of the 1950s nuclear family and middle-class consumer contentment. Kirn makes this comparison directly, where classic dystopian fiction

focus[es] on bureaucracy, coercion, propaganda, and depersonalization, overstates both the prowess of the hierarchs and the submissiveness of the masses, whom it still thinks of as the masses. It does not contemplate Trump-style charlatanism at the top, or a narcissistic populace that prizes attention over privacy. The threats to individualism are paramount; the scourge of surplus individualism, with everyone playing his own dunce king and slurping up resources until he bursts, goes unexplored.

Kirn’s further observations are worth a look. Go read for yourself.


Here’s the last interesting bit I am lifting from Anthony Giddens’ The Consequences of Modernity. Then I will be done with this particular book-blogging project. As part of Giddens’ discussion of the risk profile of modernity, he characterizes risk as either objective or perceived and further divides it into seven categories:

  1. globalization of risk (intensity)
  2. globalization of risk (frequency)
  3. environmental risk
  4. institutionalized risk
  5. knowledge gaps and uncertainty
  6. collective or shared risk
  7. limitations of expertise

Some overlap exists, and I will not distinguish them further. The first two are of primary significance today for obvious reasons. Although the specter of doomsday resulting from a nuclear exchange has been present since the 1950s, Giddens (writing in 1988) provides this snapshot of today’s issues:

The sheer number of serious risks in respect of socialised nature is quite daunting: radiation from major accidents at nuclear power-stations or from nuclear waste; chemical pollution of the seas sufficient to destroy the phytoplankton that renews much of the oxygen in the atmosphere; a “greenhouse effect” deriving from atmospheric pollutants which attack the ozone layer, melting part of the ice caps and flooding vast areas; the destruction of large areas of rain forest which are a basic source of renewable oxygen; and the exhaustion of millions of acres of topsoil as a result of widespread use of artificial fertilisers. [p. 127]

As I often point out, these dangers were known 30–40 years ago (in truth, much longer), but they have only worsened with time through political inaction and/or social inertia. After I began to investigate and better understand the issues roughly a decade ago, I came to the conclusion that the window of opportunity to address these risks and their delayed effects had already closed. In short, we’re doomed and living on borrowed time as the inevitable consequences of our actions slowly but steadily manifest in the world.

So here’s the really interesting part. The modern worldview bestows confidence born of expanding mastery of the built environment, where risk is managed and reduced through expert systems. Mechanical and engineering knowledge figure prominently and support a cause-and-effect mentality that has grown ubiquitous in the computing era, with its push-button inputs and outputs. However, the high modern outlook is marred by overconfidence in our competence to avoid disaster, often of our own making. Consider the abject failure of 20th-century institutions to handle geopolitical conflict without devolving into world war and multiple genocides. Or witness periodic crashes of financial markets, two major nuclear accidents, two space shuttles lost, and numerous rockets destroyed. Though all entail risk, high-profile failures showcase our overconfidence. Right now, engineers (software and hardware) are confident they can deliver safe self-driving vehicles yet are blithely ignoring (says me, maybe not) major ethical dilemmas regarding liability and technological unemployment. Those are apparently problems for someone else to solve.

Since the start of the Industrial Revolution, we’ve barrelled headlong into one sort of risk after another, some recognized at the time, others only apparent after the fact. Nuclear weapons are the best example, but many others exist. The one I raise frequently is the live social experiment undertaken with each new communications technology (radio, cinema, telephone, television, computer, social networks) that upsets and destabilizes social dynamics. The current ruckus fomented by the radical left (especially in the academy but now infecting other environments) regarding silencing of free speech (thus, thought policing) is arguably one concomitant.

According to Giddens, the character of modern risk contrasts with that of the premodern. The scale of risk prior to the 17th century was contained, and expectation of social continuity was strong. Risk was also transmuted through magical thinking (superstition, religion, ignorance, wishfulness) into providential fortuna or mere bad luck, which led to feelings of relative security rather than despair. Modern risk, by contrast, has grown so widespread, consequential, and soul-destroying, and sits at such considerable remove that it breeds helplessness and hopelessness. Those not numbed by the litany of potential worries afflicting daily life (existential angst or ontological insecurity) often develop depression and other psychological compulsions and disturbances. Most of us, if aware of globalized risk, set it aside so that we can function and move forward in life. Giddens says that this conjures up anew a sense of fortuna, that our fate is no longer within our control. This

relieves the individual of the burden of engagement with an existential situation which might otherwise be chronically disturbing. Fate, a feeling that things will take their own course anyway, thus reappears at the core of a world which is supposedly taking rational control of its own affairs. Moreover, this surely exacts a price on the level of the unconscious, since it essentially presumes the repression of anxiety. The sense of dread which is the antithesis of basic trust is likely to infuse unconscious sentiments about the uncertainties faced by humanity as a whole. [p. 133]

In effect, the nature of risk has come full circle (completed a revolution, thus, revolutionized risk) from fate to confidence in expert control and back to fate. Of course, a flexibility of perspective is typical as situation demands — it’s not all or nothing — but the overarching character is clear. Giddens also provides this quote by Susan Sontag that captures what he calls the low-probability, high-consequence character of modern risk:

A permanent modern scenario: apocalypse looms — and it doesn’t occur. And still it looms … Apocalypse is now a long-running serial: not ‘Apocalypse Now,’ but ‘Apocalypse from now on.’ [p. 134]

Here’s another interesting tidbit from Anthony Giddens’ book The Consequences of Modernity, which is the subject of a series of book blogs I’ve been writing. In his discussion of disembedding mechanisms, he introduces the idea of civil inattention (from Goffman, actually). This has partly to do with presence or absence (including inattention) in both public and private settings, where face-to-face contact used to be the only option but modern technologies have opened up the possibility of faceless interactions over distance, such as with the telegraph and telephone. More recently, the face has been reintroduced with videoconferencing, but nonverbal cues such as body language are largely missing; the fullness of communication remains attenuated. All manner of virtual presence and telepresence are in fact cheap facsimiles of true presence and of the social cohesion and trust enabled by what Giddens calls facework commitments. Of course, we delude ourselves that interconnectivity mediated by electronics is a reasonable substitute for presence and attention, which fellow blogger The South Roane Agrarian bemoans with this post.

Giddens’ meaning is more specific than this, though. The inattention of which Giddens writes is not the casual distraction of others with which we are all increasingly familiar. Rather, Giddens takes note of social behaviors embedded in deep culture having to do with signalling trust.

Two people approach and pass one another on a city sidewalk. What could be more trivial and uninteresting? … Yet something is going on here which links apparently minor aspects of bodily management to some of the most pervasive features of modernity. The “inattention” displayed is not indifference. Rather it is a carefully monitored demonstration of what might be called polite estrangement. As the two people approach one another, each rapidly scans the face of the other, looking away as they pass … The glance accords recognition of the other as an agent and as a potential acquaintance. Holding the gaze of the other only briefly, then looking ahead as each passes the other couples such an attitude with an implicit reassurance of lack of hostile intent. [p. 81]

It’s a remarkably subtle interaction: making eye contact to confirm awareness of another but then averting one’s eyes to establish that copresence poses no particular threat in either direction. Staring too fixedly at another communicates something else entirely, maybe fear or threat or disapprobation. By denying eye contact — by keeping one’s eyes buried in a handheld device, for instance — the opportunity to establish a modicum of trust between strangers is missed. Intent (or lack thereof) is a mystery. In practice, such modern-day inattention is mere distraction, not a sign of malevolence, but the ingrained social cue is obviated and otherwise banal happenstances become sources of irritation, discomfort, and/or unease, as with someone who doesn’t shake hands or perform other types of greeting properly.

I wrote before about my irritation with others face-planted in their phones. It is not a matter of outright offense but rather a quiet sense of affront at failure to adopt accepted social behaviors (as I once did). Giddens puts it this way:

Tact and rituals of politeness are mutual protective devices, which strangers or acquaintances knowingly use (mostly on the level of practical consciousness) as a kind of implicit social contact. Differential power, particularly where it is very marked, can breach or skew norms …. [pp. 82–83]

That those social behaviors have adapted to omnipresent mobile media, everyone pacified or hypnotized within their individual bubbles, is certainly not a salutary development. It is, however, a clear consequence of modernity.

Another modest surprise (to me at least) offered by Anthony Giddens (from The Consequences of Modernity) follows a discussion of reflexivity (what I call recursion when discussing consciousness), which is the dynamic of information and/or knowledge feeding back to influence later behavior and information/knowledge. His handy example is the populace knowing divorce rates, which has an obvious influence on those about to get married (who may then decide to defer or abjure marriage entirely). The surprise is this:

The discourse of sociology and the concepts, theories, and findings of the other social sciences continually “circulate in and out” of what it is that they are about. In so doing they reflexively restructure their subject matter, which itself has learned to think sociologically … Much that is problematic in the position of the professional sociologist, as the purveyor of expert knowledge about social life, derives from the fact that she or he is at most one step ahead of enlightened lay practitioners of the discipline. [p. 43]

I suppose “enlightened lay practitioners” are not the same as the general public, which I hold in rather low esteem as knowing (much less understanding) anything of value. Just consider electoral politics. Still, the idea that an expert in an academic field admits he is barely ahead of wannabes (like me) seems awfully damning. Whereas curious types will wade in just about anywhere, and in some cases, amateurs will indulge themselves enthusiastically in endeavors also practiced by experts (sports and music are the two principal examples that spring to mind), the distance (in both knowledge and skill) between experts and laypersons is typically quite large. I suspect those with high intellect and/or genetic gifts often bridge that gap, but then they join the ranks of the experts, so the exception leads nowhere.