Archive for the ‘Culture’ Category

The “American character,” if one can call it into being merely by virtue of naming it (the same rhetorical trick as solutionism), is diverse and ever-changing. Numerous characterizations have been offered throughout history, with Alexis de Tocqueville’s Democracy in America (1835 and 1840) being perhaps the one cited most frequently despite its outdatedness. Much in American character has changed since that time, and it’s highly questionable to think it was unified even then. However, as a means of understanding ourselves, it’s as good a place to start as any. A standard criticism of American character as seen from outside (i.e., when Americans travel abroad) is the so-called ugly American: loud, inconsiderate, boorish, and entitled. Not much to argue with there. A more contemporary assessment by Morris Berman, found throughout his “American trilogy,” is that we Americans are actually quite stupid, unaccountably proud of it, and constantly hustling (in the pejorative sense) in pursuit of material success. These descriptions don’t quite match up with familiar jingoism about how great America is (and of course, Americans), leading to non-Americans clamoring to immigrate here, or the self-worship we indulge in every national holiday celebrating political and military history (e.g., Independence Day, Veterans Day, Memorial Day).

I recently ran afoul of another ugly aspect of our national character: our tendency toward aggression and violence. In truth, this is hardly unique to Americans. Yet it came up glaringly in the context of a blog post at Pharyngula citing a Tweet comparing uneven application of law (and indignation among online chatterers?) when violence is committed by the political left vs. the political right. Degree of violence clearly matters, but obvious selection bias was deployed to present an egregiously lop-sided perspective. Radicals on both the left and right have shown little compunction about using violence to achieve their agendas. Never mind how poorly conceived those agendas may be. What really surprised me, however, was that my basic objection to violence in all forms across the spectrum was met with snark and ad hominem attack. When did reluctance to enact violence (including going to war) until extremity demands it become controversial?

My main point was that resorting to violence typically invalidates one’s objective. It’s a desperation move. Moreover, using force (e.g., intimidation, threats, physical violence — including throwing milkshakes) against ideological opponents is essentially policing others’ thoughts. But they’re fascists, right? Violence against them is justified because they don’t eschew violence. No, wrong. Mob justice and vigilantism circumvent the rule of law and criminalize any perpetrator of violence. It’s also the application of faulty instrumental logic, ceding any principled claim to moral authority. But to commentators at the blog post linked above, I’m the problem because I’m not in support of fighting fascists with full force. Guess all those masked, caped crusaders don’t recognize that they’re contributing to lawlessness and mayhem. Now even centrists come in for attack for not being radical (or aggressive, or violent) enough. Oddly silent in the comments is the blog host, P.Z. Myers, who has himself communicated approval of milkshake patrols and Nazi punching, as though the presumptive targets (identified rather haphazardly and incorrectly in many instances) have no right to their own thoughts and ideas, vile though they may be, and that violence is the right way to “teach them a lesson.” No one learns the intended lesson when they are the victim of violence. Rather, if not simply cowed into submission (not the same as agreement), tensions tend to escalate into further and increasing violence. See also reaction formation.

Puzzling over this weird exchange with these, my fellow Americans (the ideologically possessed ones anyway), caused me to backtrack. For instance, the definition of fascism at dictionary.com is “a governmental system led by a dictator having complete power, forcibly suppressing opposition and criticism, regimenting all industry, commerce, etc., and emphasizing an aggressive nationalism and often racism.” That definition sounds more like totalitarianism or dictatorship and is backward looking, specifically to Italy’s Benito Mussolini in the period 1922 to 1943. However, like national characters, political moods and mechanisms change over time, and the recent fascist thrust in American politics isn’t limited to a single leader with dictatorial power. Accordingly, the definition above has never really satisfied me.

I’ve blogged repeatedly about incipient fascism in the U.S., the imperial presidency (usually associated with George W. Bush but also characteristic of Barack Obama), James Howard Kunstler’s prediction of a cornpone fascist coming to power (the way paved by populism), and Sheldon Wolin’s idea of inverted totalitarianism. What ties these together is how power is deployed and against what targets. More specifically, centralized power (or force) is directed against domestic populations to advance social and political objectives without broad public support for the sole benefit of holders of power. That’s a more satisfactory definition of fascism to me, certainly far better than Peter Schiff’s ridiculous equation of fascism with socialism. Domination of others to achieve objectives describes the U.S. power structure (the military-industrial-corporate complex) to a tee. That doesn’t mean manufactured consent anymore; it means bringing the public into line, especially through propaganda campaigns, silencing of criticism, prosecuting whistle-blowers, and broad surveillance, all of which boil down to policing thought. The public has complied by embracing all manner of doctrine against enlightened self-interest, the very thing that was imagined to magically promote the general welfare and keep us from wrecking things or destroying ourselves unwittingly. Moreover, public support is not really obtained through propaganda and domination, only a pretense of agreement found convincing by fools. Similarly, admiration, affection, and respect are not won with a fist. Material objectives (e.g., resource reallocation, to use a familiar euphemism) achieved through force are just common theft.

So what is Antifa doing? It’s forcibly silencing others. It’s doing the work of fascist government operatives by proxy. It’s fighting fascism by becoming fascist, not unlike the Republican-led U.S. government in 2008 seeking bailouts for banks and large corporations, handily transforming our economy into a socialist experiment (e.g., crowd-funding casino capitalism through taxation). Becoming the enemy to fight the enemy is a nice trick of inversion, though many are so flummoxed by these contradictions they resort to Orwellian doublethink to reconcile the paradox. Under such conditions, there are no arguments that can convince. Battle lines are drawn, tribal affiliations are established, and the ideological war of brother against brother, American against American, intensifies until civility crumbles around us. Civil war and revolution haven’t occurred in the U.S. for 150 years, but they are popping up regularly around the globe, often at the instigation of the U.S. government (again, acting against the public interest). Is our turn coming because we Americans have been divided and conquered instead of recognizing the real source of threat?


Returning to Pankaj Mishra’s The Age of Anger, chapter 2 (subtitled “Progress and its Contradictions”) profiles two writers of the 18th-century Enlightenment: François-Marie Arouet (1694–1778), better known by his nom de plume Voltaire, and Jean-Jacques Rousseau (1712–1778). Voltaire was a proponent and embodiment of Enlightenment values and ethics, whereas Rousseau was among the primary critics. Both were hugely influential, and the controversy inherent in their relative perspectives is unresolved even today. First come Rousseau’s criticisms (in Mishra’s prose):

… the new commercial society, which was acquiring its main features of class divisions, inequality and callous elites during the eighteenth century, made its members corrupt, hypocritical and cruel with its prescribed values of wealth, vanity and ostentation. Human beings were good by nature until they entered such a society, exposing themselves to ceaseless and psychologically debilitating transformation and bewildering complexity. Propelled into an endless process of change, and deprived of their peace and stability, human beings failed to be either privately happy or active citizens [p. 87]

This assessment could easily be mistaken for a description of the 1980s and 90s: ceaseless change and turmoil as new technological developments (e.g., the Internet) challenged everyone to reorient and reinvent themselves, often as a brand. Cultural transformation in the 18th century, however, was about more than just emerging economic reconfigurations. New, secular, free thought and rationalism openly challenged orthodoxies formerly imposed by religious and political institutions and demanded intellectual and entrepreneurial striving to participate meaningfully in charting new paths for progressive society purportedly no longer anchored statically in the past. Mishra goes on:

It isn’t just that the strong exploit the weak; the powerless themselves are prone to enviously imitate the powerful. But people who try to make more of themselves than others end up trying to dominate others, forcing them into positions of inferiority and deference. The lucky few on top remain insecure, exposed to the envy and malice of the also-rans. The latter use all means available to them to realize their unfulfilled cravings while making sure to veil them with a show of civility, even benevolence. [p. 89]

Sounds quite contemporary, no? Driving the point home:

What makes Rousseau, and his self-described ‘history of the human heart’, so astonishingly germane and eerily resonant is that, unlike his fellow eighteenth-century writers, he described the quintessential inner experience of modernity for most people: the uprooted outsider in the commercial metropolis, aspiring for a place in it, and struggling with complex feelings of envy, fascination, revulsion and rejection. [p. 90]

While most of the chapter describes Rousseau’s rejection and critique of 18th-century ethics, Mishra at one point depicts Rousseau arguing for instead of against something:

Rousseau’s ideal society was Sparta, small, harsh, self-sufficient, fiercely patriotic and defiantly un-cosmopolitan and uncommercial. In this society at least, the corrupting urge to promote oneself over others, and the deceiving of the poor by the rich, could be counterpoised by the surrender of individuality to public service, and the desire to seek pride for community and country. [p. 92]

Notably absent from Mishra’s profile is the meme mistakenly applied to Rousseau’s diverse criticism: the noble savage. Rousseau praises provincial men (patriarchal orientation acknowledged) largely unspoilt by the corrupting influence of commercial, cosmopolitan society devoted to individual self-interest and amour propre, and his ideal (above) is uncompromising. Although Rousseau had potential to insinuate himself successfully in fashionable salons and academic posts, his real affinity was with the weak and downtrodden — the peasant underclass — who were mostly passed over by rapidly modernizing society. Others managed to raise their station in life above the peasantry to join the bourgeoisie (disambiguation needed on that term). Mishra’s description (via Rousseau) of this middle and upper middle class group provided my first real understanding of popular disdain many report toward bourgeois values using the derisive term bourgie (clearer when spoken than when written).

Profile of Voltaire to follow in part 2.

The Judaeo-Christian dictum “go forth, be fruitful, and multiply” (Genesis 1:28, translations vary) was taken to heart not only by Jews and Christians but by people everywhere resources allowed. Prior to the modern era, human population remained in check because, among other things, high rates of infant and child mortality, pandemics, and famine were commonplace. Now that modern medicine, hygiene, and public health deliver far more children into adulthood (and thus their breeding years) and our fossil fuel energy binge allows us to overproduce and overreproduce, population has spiked. While some herald human flourishing (mere quantity, not quality) as an unmitigated good, our massive human population begs the question: what to do with all the extra people? The American answer is already known: if they’re not productive citizens (read: labor for someone else’s profit), lock ’em up (ironically transforming them into profit centers using tax monies) or simply abandon them to live (and shit) on the streets of San Francisco or some other temperate, coastal city. If they’re foreigners competing for the same resources we (Americans) want for ourselves, well, just kill ’em (a different sort of disposal).

Those observations are really quite enough, ugly and obvious as they are. However, history isn’t yet done with us. Futurists warn that conditions will only worsen (well, duh!) as technological unemployment (robots and software soon to perform even more tasks that used to be handled by people paid money for their effort and expertise) causes more and more people to be tossed aside in venal pursuit of profit. Optimists and cheerleaders for the new technological utopia (read: dystopia) frequently offer as cold comfort that people with newfound time on their hands are free to become entrepreneurial or pursue creative endeavors. Never mind that basic needs (e.g., housing, food, clothing, and healthcare) must come first. The one thing that’s partially correct about the canard that everyone can magically transform themselves into small business owners or content creators is that we have become a nation of idlers fixated on entertainments of many varieties. That’s a real bottomless well. Some percentage (unknown by me) actually produces the content (TV shows, movies, music, books, blogs, journalism, YouTube channels, podcasts, social media feeds, video games, sports teams and competitions, etc.), all competing for attention, and those people are often rewarded handsomely if the medium produces giant subscription and advertising revenues. Most of it is just digital exhaust. I also judge that most of us are merely part of the audience or have failed to go viral or otherwise hit it big if indeed we have anything on offer in the public sphere. Of course, disposable time and income drive the whole entertainment complex. It’s doubtful that folks living in burgeoning American tent cities contribute anything to that economic sector.

It’s sometimes said that a society can be measured by how it treats its weakest members. The European social contract (much derided in the U.S.) takes that notion seriously and supports the down-and-out. The American social contract typically blames those who are weak, often by no fault of their own (e.g., medical bankruptcy), and kicks them when they’re down. Consider just one common measure of a person: intelligence. Though there are many measures of intelligence, the standard is IQ, which is computational, linguistic, and abstract. It’s taboo to dwell too much on differences, especially when mapped onto race, gender, or nationality, so I won’t go there. However, the standard, conservative distribution places most people in the average between 90 and 110. A wider average between 81 (low average) and 119 (high average) captures even more people before a small percentage of outliers are found at the extremes. Of course, almost everyone thinks him- or herself squarely in the upper half. As one descends deeper into the lower half, it’s been found that IQ deficits mean such a person is unsuitable for most types of gainful employment and some are flatly unsuitable for any employment at all. What to do with those people? With U.S. population now just under 330 million, the lower half is roughly 165 million people! How many of those “useless eaters” are abandoned to their fates is hard to know, but it’s a far bigger number and problem than the ridiculous, unhelpful advice “learn to code” would suggest. The cruelty of the American social contract is plain to see.
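For anyone who wants to check the arithmetic, here is a minimal sketch (my own, assuming only the conventional IQ model of a normal distribution with mean 100 and standard deviation 15) that reproduces the rough proportions and the 165-million figure cited above:

```python
# Back-of-the-envelope check of the IQ figures above, assuming the
# conventional model: scores normally distributed with mean 100, SD 15.
from statistics import NormalDist

iq = NormalDist(mu=100, sigma=15)

share_90_110 = iq.cdf(110) - iq.cdf(90)   # the "average" band, roughly half
share_81_119 = iq.cdf(119) - iq.cdf(81)   # the wider band, roughly four-fifths

us_population = 330_000_000               # approximate 2019 figure
lower_half = us_population // 2           # by definition, half fall below the mean

print(f"IQ 90-110: {share_90_110:.1%} of the population")
print(f"IQ 81-119: {share_81_119:.1%} of the population")
print(f"Lower half of the U.S. distribution: ~{lower_half:,} people")
```

The bands come out to roughly 50% and 80% of the population, and the lower half of a 330-million-person country is indeed about 165 million people.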

/rant on

Yet another journalist has unburdened herself (unbidden story of personal discovery masquerading as news) of her addiction to digital media and her steps to free herself from the compulsion to be always logged onto the onslaught of useless information hurled at everyone nonstop. Other breaking news offered by our intrepid late-to-the-story reporter: water is wet, sunburn stings, and the Earth is dying (actually, we humans are actively killing it for profit). Freeing oneself from the screen is variously called digital detoxification (detox for short), digital minimalism, digital disengagement, digital decoupling, and digital decluttering (really ought to be called digital denunciation) and means limiting the duration of exposure to digital media and/or deleting one’s social media accounts entirely. Naturally, there are apps (counters, timers, locks) for that. Although the article offers advice for how to disentangle from screen addictions of the duh! variety (um, just hit the power switch), the hidden-in-plain-sight objective is really how to reengage after breaking one’s compulsions but this time asserting control over the infernal devices that have taken over life. It’s a love-hate style of technophilia, chock-full of illusions embarrassing even to children. Because the article is nominally journalism, the author surveys books, articles, software, media platforms, refuseniks, gurus, and opinions galore. So she’s partially informed but still hasn’t demonstrated a basic grasp of media theory, the attention economy, or surveillance capitalism, all of which relate directly. Perhaps she should bring those investigative journalism skills to bear on Jaron Lanier, one of the more trenchant critics of living online.

I rant because the embedded assumption is that anything, everything occurring online is what truly matters — even though online media didn’t yet exist as recently as thirty years ago — and that one must (must I say! c’mon, keep up!) always be paying attention to matter in turn or suffer from FOMO. Arguments in favor of needing to be online for information and news gathering are weak and ahistorical. No doubt the twisted and manipulated results of Google searches, sometimes contentious Wikipedia entries, and various dehumanizing, self-as-brand social media platforms are crutches we all now use — some waaaay, way more than others — but they’re nowhere close to the only or best way to absorb knowledge or stay in touch with family and friends. Career networking in the gig economy might require some basic level of connection but shouldn’t need to be the all-encompassing, soul-destroying work maintaining an active public persona has become.

Thus, everyone is chasing likes and follows and retweets and reblogs and other analytics as evidence of somehow being relevant on the sea of ephemera floating around us like so much disused, discarded plastic in those infamous garbage gyres. (I don’t bother to chase and wouldn’t know how to drive traffic anyway. Screw all those solicitations for search-engine optimization. Paying for clicks is for chumps, though lots apparently do it to enhance (read: lie about) their analytics.) One’s online profile is accordingly a mirror of or even a substitute for the self — a facsimile self. Lost somewhere in my backblog (searched, couldn’t find it) is a post referencing several technophiles positively celebrating the bogus extension of the self accomplished by developing and burnishing an online profile. It’s the domain of celebrities, fame whores, narcissists, and sociopaths, not to mention a few criminals. Oh, and speaking of criminals, recent news is that OJ Simpson just opened a Twitter account, presumably to reform his disastrous public image, but he’s fundamentally outta touch with how deeply icky, distasteful, and disgusting it feels to others for him to be participating once again in the public sphere. Disgraced celebrities (and criminals) negatively associated with the Me-Too Movement (is there really such a movement or was it merely a passing hashtag?) have mostly crawled under their respective multimillion-dollar rocks and not been heard from again. Those few who have tried to reemerge are typically met with revulsion and hostility (plus some inevitable star-fuckers with short memories). Hard to say when, if at all, forgiveness and rejoining society become appropriate.

/rant off

Color-Coded Holidays

Posted: June 16, 2019 in Culture

Considering the over-seriousness of most of my blog posts, here’s something lighter from the archives of my dead group blog.

Creative Destruction

Today’s holiday (Valentine’s Day) got me thinking about how the various major holidays scattered over the calendar are associated with specific colors and behaviors. The granddaddy of ’em all, Christmas, is the Red Holiday, which is associated with spending yourself into debt to get gifts for everyone and drinking rum-spiced eggnog. (The birth of Christ is an afterthought for most of us by now.) Valentine’s Day is the Pink Holiday and is for spending money on one’s sweetheart to demonstrate the level of one’s appreciation/sacrifice. Sorry, no drinking. Easter is the Yellow Holiday, probably pastel, and is for chasing colored eggs and purchasing baskets of goodies. Oh, and the risen saviour. Again, sorry, no drinking (unless you count sacramental wine). The Green Holiday is St. Patrick’s Day, and is for drinking. Some wear some bit of green clothing, but it’s mostly about the drinking.

The Blue Holiday is…


Apologies for this overlong blog post. I know that this much text tries the patience of most readers and is well in excess of my customary 3–4 paragraphs.

Continuing my book blogging of Pankaj Mishra’s Age of Anger, Chapter Two (subtitled “History’s Winners and Their Illusions”) focuses on the thought revolution that followed from the Enlightenment in Western Europe and its imitation in non-Western cultures, especially as manifested in the century leading to the French Revolution. Although the American Revolution (more narrowly a tax revolt with insistence on self-rule) preceded the French Revolution by slightly more than a decade, it’s really the French, whose motto liberté, égalité, fraternité came to prominence and defined an influential set of European values, who effectively challenged enthusiastic modernizers around the globe to try to catch up with the ascendant West.

However, almost as soon as this project appeared, i.e., attempting to transform ancien régime monarchies in Northern Africa, the Middle East, and Russia into something pseudo-European, critics arose who denounced the abandonment of tradition and centuries-old national identities. Perhaps they can be understood as the first wave of modern conservatism. Here is Mishra’s characterization:

Modernization, mostly along capitalist lines, became the universalist creed that glorified the autonomous rights-bearing individual and hailed his rational choice-making capacity as freedom. Economic growth was posited as the end-all of political life and the chief marker of progress worldwide, not to mention the gateway to happiness. Communism was totalitarian. Ergo its ideological opponent, American liberalism, represented freedom, which in turn was best advanced by moneymaking. [p. 48]

Aside: The phrase “rights-bearing individual” has obvious echoes with today’s SJWs and their poorly conceived demand for egalitarianism not just before the law but in social and economic outcomes. Although economic justice (totally out of whack with today’s extreme income and wealth inequality) is a worthy goal that aligns with idealized but not real-world Enlightenment values, SJW activism reinforces retrograde divisions of people based on race, gender, sexual orientation, religion, disability, etc. Calls to level out all these questionable markers of identity have resulted in intellectual confusion and invalidation of large “privileged” and/or “unoppressed” groups such as white males of European descent in favor of oppressed minorities (and majorities, e.g., women) of all categories. Never mind that many of those same white males are often every bit as disenfranchised as others whose victimhood is paraded around as some sort of virtue granting them authority and preferential treatment.

Modernization has not been evenly distributed around the globe, which accounts for countries even today being designated either First, Second, or Third World. An oft-used euphemism is “developing economy,” which translates to an invitation for wealthy First-World nations (or their corporations) to force their way in to exploit cheap labor and untapped natural resources. Indeed, as Mishra points out, the promise of joining First-World living standards (having diverged centuries ago) is markedly hollow:

… doubters of Western-style progress today include more than just marginal communities and some angry environmental activists. In 2014 The Economist said that, on the basis of IMF data, emerging economies — or, most of the human population — might have to wait for three centuries in order to catch up with the West. In this assessment, the last decade of high growth was an ‘aberration’ and ‘billions of people will be poorer for a lot longer than they might have expected just a few years ago’.

The implications are sobering: the non-West not only finds itself replicating the West’s trauma on an infinitely larger scale. While helping inflict the profoundest damage yet on the environment — manifest today in rising sea levels, erratic rainfall, drought, declining harvests, and devastating floods — the non-West also has no real prospect of catching up … [pp. 47-48]

That second paragraph is an unexpected acknowledgement that the earliest industrialized nations (France, the United Kingdom, and the U.S.) unwittingly put us on a path to self-annihilation only to be knowingly repeated and intensified by latecomers to industrialization. All those (cough) ecological disturbances are occurring right now, though the public has been lulled into complacency by temporary abundance, misinformation, under- and misreporting, and international political incompetence. Of course, ecological destruction is no longer merely the West’s trauma but a global catastrophe of the highest magnitude which is certainly in the process of catching up to us.

Late in Chapter Two, Mishra settles on the Crystal Palace exhibition space and utopian symbol, built in 1851 during the era of world’s fairs and mistaken enthusiasm regarding the myth of perpetual progress and perfectibility, as an irresistible embodiment of Western hubris to which some intellectual leaders responded with clear disdain. Although a marvelous technical feat of engineering prowess and demonstration of economic power (not unlike countries that host the Olympics — remember Beijing?), the Crystal Palace was also viewed as an expression of the sheer might of Western thought and its concomitant products. Mishra repeatedly quotes Dostoevsky, who visited the Crystal Palace in 1862 and described his visceral response to the place poignantly and powerfully:

You become aware of a colossal idea; you sense that here something has been achieved, that here there is victory and triumph. You even begin vaguely to fear something. However independent you may be, for some reason you become terrified. ‘For isn’t this the achievement of perfection?’ you think. ‘Isn’t this the ultimate?’ Could this in fact be the ‘one fold?’ Must you accept this as the final truth and forever hold your peace? It is all so solemn, triumphant, and proud that you gasp for breath. [p. 68]

And later, describing the “world-historical import” of the Crystal Palace:

Look at these hundreds of thousands, these millions of people humbly streaming here from all over the face of the earth. People come with a single thought, quietly, relentlessly, mutely thronging onto this colossal palace; and you feel that something final has taken place here, that something has come to an end. It is like a Biblical picture, something out of Babylon, a prophecy from the apocalypse coming to pass before your eyes. You sense that it would require great and everlasting spiritual denial and fortitude in order not to submit, not to capitulate before the impression, not to bow to what is, and not to deify Baal, that is not to accept the material world as your ideal. [pp. 69–70]

The prophetic finality of the Crystal Palace thus presaged twentieth-century achievements and ideas (the so-called American Century) that undoubtedly eclipsed the awesome majesty of the Crystal Palace, e.g., nuclear fission and liberal democracy’s purported victory over Soviet Communism (to name only two). Indeed, Mishra begins the chapter with a review of American declarations of the end of history, i.e., having reached final forms of political, social, and economic organization that are now the sole model for all nations to emulate. The whole point of the chapter is that such pronouncements are illusions with strong historical antecedents that might have cautioned us not to leap to unwarranted conclusions or to perpetuate a soul-destroying regime hellbent on extinguishing all alternatives. Of course, as Gore Vidal famously quipped, “Americans never learn; it’s part of our charm.”

 

Third version of this topic. Whereas the previous two were about competing contemporary North American ways of knowing, this one is broader in both time and space.

The May 2019 issue of Harper’s Magazine has a fascinating review of Christina Thompson’s book Sea People: The Puzzle of Polynesia (2019). Beyond the puzzle itself — how did Polynesian people migrate to, settle, and populate the far-flung islands of the Central and South Pacific? — the review hits upon one of my recurring themes on this blog, namely, that human cognition is plastic enough to permit highly divergent ways of knowing.

The review (and book?) is laden with Eurocentric detail about the “discovery” of closely related Polynesian cultures dispersed more widely (geographically) than any other culture prior to the era of mass migration. Indeed, the reviewer chides the author at one point for transforming Polynesia from a subject in its own right into an exotic object of (Western) fascination. This distorted perspective is commonplace and follows from the earlier “discovery” and colonization of North America as though it were not already populated. Cartographers even today are guilty of this Eurocentrism, relegating “empty” expanses of the Pacific Ocean to irrelevance in maps when in fact the Pacific is “the dominant feature of the planet” and contains roughly twenty-five thousand islands (at current sea level? — noting that sea level was substantially lower during the last ice age some 13,000 years ago but is due to rise substantially by the end of this century and beyond, engulfing many of the islands now lying dangerously close to sea level). Similar distortions are needed to squash the spherical (3D) surface of the globe onto planar (2D) maps (e.g., the Mercator projection, which largely ignores the Pacific Ocean in favor of continents; other projections shown here) more easily conceptualized (for Westerners) in terms of coordinate geometry using latitude and longitude (i.e., the Cartesian plane).
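As a concrete illustration of that flattening (my own sketch, not anything from the review or book), the standard spherical Mercator equations map longitude linearly onto the plane while stretching latitude ever more as one approaches the poles, which is part of why high-latitude landmasses loom large while the equatorial Pacific recedes in apparent importance:

```python
import math

def mercator(lat_deg: float, lon_deg: float, radius: float = 1.0):
    """Project a latitude/longitude pair onto the spherical Mercator plane."""
    lam = math.radians(lon_deg)
    phi = math.radians(lat_deg)
    x = radius * lam                                        # linear in longitude
    y = radius * math.log(math.tan(math.pi / 4 + phi / 2))  # stretched in latitude
    return x, y

# One degree of latitude near the equator vs. one degree at 70 degrees north:
# the same angular distance covers roughly three times more map distance
# at the higher latitude.
dy_equator = mercator(0.5, 0)[1] - mercator(-0.5, 0)[1]
dy_arctic = mercator(70.5, 0)[1] - mercator(69.5, 0)[1]
print(f"1 degree of latitude at the equator: {dy_equator:.4f}")
print(f"1 degree of latitude at 70 N:        {dy_arctic:.4f}")
```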

The review mentions the familiar dichotomy of grouping a hammer, saw, hatchet, and log in terms of abstract categories (Western thought) vs. utility or practicality (non-Western). Exploration of how different ways of knowing manifest is, according to the review, among the more intellectually exciting parts of the book. That’s the part I’m latching onto. For instance, the review offers this:

Near the middle of Sea People, Thompson explores the ramification of Polynesia as, until contact, an oral culture with “an oral way of seeing.” While writing enables abstraction, distancing, and what we generally call objectivity, the truth of oral cultures is thoroughly subjective. Islands aren’t dots on a map seen from the sky but destinations one travels to in the water.

This is the crux of the puzzle of Polynesians fanning out across the Pacific approximately one thousand years ago. They had developed means of wayfinding in canoes and outriggers without instruments or maps roughly 500 years prior to Europeans crossing the oceans in sailing ships. Perhaps I’m reading too much into the evidence, but abstraction and objectivity as a particular way of knowing, bequeathed to Western Europe via the Enlightenment and development of the scientific method, stunted or delayed exploration of the globe precisely because explorers began with a god’s eye view of the Earth from above rather than from the surface (object vs. subject). In contrast, quoting here from the book rather than the review, Polynesians used

a system known as etak, in which they visualize a “reference island,” — which is usually a real island but may also be imaginary — off to one side of the path they are following, about midway between their starting point and their destination. As the journey progresses, this island “moves” under each of the stars in the star path [situated near the horizon rather than overhead], while the canoe in which the voyagers are traveling stays still. Of course, the navigators know that it is the canoe and not the islands that are moving, but this is the way they conceptualize the voyage.

Placing oneself at the center of the world or universe — at least for the purpose of navigation — is a conceptual pose Westerners discarded when heliocentrism gradually replaced geocentrism. (Traveling using GPS devices ironically places the traveler back at the center of the map with terrain shifting around the vehicle, but it’s a poor example of wayfinding precisely because the traveler fobs the real work onto the device and likely possesses no real understanding or skill traversing the terrain besides following mechanical instructions.) While we Westerners might congratulate ourselves for a more accurate, objective orientation to the stars, its unwitting limitations are worth noting. Recent discoveries regarding human prehistory, especially megalithic stone construction accomplished with techniques still unknown and flatly impossible with modern technology, point to the existence of other ways of knowing lost to contemporary human cultures steadily triangulating on and conforming to Western thought (through the process of globalization). Loss of diversity of ways of knowing creates yet another sort of impoverishment that can only be barely glimpsed since most of us are squarely inside the bubble. Accordingly, it’s not for nothing that some unusually sensitive critics of modernity suggest we’re entering a new Dark Age.

 

I put aside Harari’s book from the previous blog post in favor of Pankaj Mishra’s Age of Anger: A History of the Present (2017). Mishra’s sharp cultural criticism is far more convincing than Harari’s Panglossian perspective. Perhaps some of that is due to an inescapable pessimism in my own character. Either way, I’ve found the first 35 pages dense with observations of interest to me as a blogger and armchair cultural critic. Some while back, I published a post attempting to delineate (not very well, probably) what’s missing in the modern world despite its obvious material abundance. Reinforcing my own contentions, Mishra’s thesis (as I understand it so far) is this: we today share with others post-Enlightenment an array of resentments and hatreds (Fr.: ressentiment) aimed incorrectly at scapegoats for political and social failure to deliver the promises of progressive modernity equitably. For instance, Mishra describes

… flamboyant secular radicals in the nineteenth and early twentieth centuries: the aesthetes who glorified war, misogyny and pyromania; the nationalists who accused Jews and liberals of rootless cosmopolitanism and celebrated irrational violence; and the nihilists, anarchists and terrorists who flourished in almost every continent against a background of cosy political-financial alliances, devastating economic crises and obscene inequalities. [pp. 10–11]

Contrast and/or compare his assessment of the recent past:

Beginning in the 1990s, a democratic revolution of aspiration … swept across the world, sparking longings for wealth, status and power, in addition to ordinary desires for stability and contentment, in the most unpromising circumstances. Egalitarian ambition broke free of old social hierarchies … The culture of [frantic] individualism went universal … The crises of recent years have uncovered an extensive failure to realize the ideals of endless economic expansion and private wealth creation. Most newly created ‘individuals’ toil within poorly imagined social and political communities and/or states with weakening sovereignty … individuals with very different pasts find themselves herded by capitalism and technology into a common present, where grossly unequal distributions of wealth and power have created humiliating new hierarchies. This proximity … is rendered more claustrophobic by digital communications … [S]hocks of modernity were once absorbed by inherited social structures of family and community, and the state’s welfare cushions [something mentioned here, too]. Today’s individuals are directly exposed to them in an age of accelerating competition on uneven playing fields, where it is easy to feel that there is no such thing as either society or state, and that there is only a war of all against all. [pp. 12–14]

These long quotes (the second one cut together from longer paragraphs) are here because Mishra is remarkably eloquent in his diagnosis of globalized culture. Although I’ve only read the prologue, I expect to find support for my long-held contention that disorienting disruptions of modernity (using Anthony Giddens’ sociological definition rather than the modish use of the term Postmodern to describe only the last few decades) create unique and formidable challenges to the formation of healthy self-image and personhood. Foremost among these challenges is an unexpectedly oppressive information environment: the world forced into full view and inciting comparison, jealousy, envy, and hatred stemming from routine and ubiquitous frustrations and humiliations as we each struggle in life getting our personal share of attention, renown, and reward.

Another reason Mishra provides for our collective anger is a deep human yearning not for anarchism or radical freedom but rather for belonging and absorption within a meaningful social context. This reminds me of Erich Fromm’s book Escape from Freedom (1941), which I read long ago but can’t remember so well anymore. I do remember quite vividly how counter-intuitive was the suggestion that absolute freedom is actually burdensome as distinguished from the usual programming we get about breaking free of all restraints. (Freedom! Liberty!) Indeed, Mishra provides a snapshot of multiple cultural and intellectual movements from the past two centuries where abandoning oneself to a cause, any cause, was preferable to the boredom and nothingness of everyday life absent purpose other than mere existence. The modern substitute for larger purpose — commodity culture — is a mere shadow of better ways of spending one’s life. Maybe commodity culture is better than sacrificing one’s life fighting wars (a common fate) or destroying others, but that’s a much longer, more difficult argument.

More to follow as my reading progresses.

“Come with me if you want to live.” That’s among the quotable lines from the latest movie in the Terminator franchise, though it’s not nearly so succinct or iconic as “I’ll be back” from the first Terminator. Whereas the latter has the quality (in hindsight) of slow, implacable inevitability (considering the Terminator is literally a death-bringer), the former occurs within the context of a character having only just traveled back in time, not yet adequately reoriented, and forced to make a snap decision under duress. “I’ll be back” might be easy to brush off as harmless (temporary denial) since the threat recedes — except that it doesn’t, it’s merely delayed. “Come with me …” demands a leap of faith (or trust) because the danger is very real at that instant.

Which quote, I must ask, better characterizes the threat of climate change? My answer: both, but at different times. Three to four decades ago, it was the “I’ll be back” type: building slowly but inevitable given the underlying structure of industrial civilization. That structure was known even then by a narrow circle of experts (e.g., engineers for Big Oil and at the Dept. of Energy) to be a heat engine, meaning that we would ultimately cook our own goose by warming the planet, altering the climatic steady state under which our most recent civilization has flourished and producing a steady loss of biodiversity and biomass until our own human habitat (the entirety of the planet by now) becomes a hostile environment unable (unwilling if one anthropomorphizes Mother Nature) to support our swollen population. All that was if we stayed on course and took no corrective action. Despite foreknowledge and ample warning, that’s precisely what occurred (and continues today).

With the Intergovernmental Panel on Climate Change (IPCC) in particular, the threat has for roughly a decade shifted over to “Come with me ….” It’s no longer possible to put things off, yet we continue to dither well beyond the tipping point where/when we can still save ourselves from self-annihilation. Although scientists have been gathering data and evidence, forming an overwhelming consensus, and sounding the alarm, scientific illiteracy, realpolitik, journalistic malpractice, and corporate greed have all conspired to grant the illusion of time to react we simply don’t have anymore (and truth be told, probably didn’t as of the early 1980s).

I’m aware of at least three journalists (relying on the far more authoritative work of scientific consensus) who have embraced the message: Dahr Jamail, Thom Hartmann, and David Wallace-Wells. None to my knowledge has been able to bring himself to admit that humanity is now a collection of dead men walking. They can’t muster the courage to give up hope (or to report truthfully), clinging to the possibility we may still have a fleeting chance to avert disaster. I heard Ralph Nader on his webcast say something to the same effect, namely, what good is it to rob others of hope? My personal values adhere to unstinting truth rather than illusion or self-deception, so I subscribe to Guy McPherson’s assessment that we face near-term human extinction (precise date unknown but soon if, for example, this is the year we get a blue ocean event). Simply put, McPherson is professor emeritus of natural resources and ecology and evolutionary biology at the University of Arizona [note my emphasis]. I trust his scholarship (summarizing the work of other scientists and drawing necessary though unpalatable conclusions) more than I trust journalistic shaping of the story for public consumption.

The obvious metaphor for what we face is a terminal medical diagnosis, or if one has hope, perhaps a death sentence about to be carried out but with the possibility of a last-minute stay of execution via phone call from the governor. Opinions vary whether one should hope/resist up to the final moment or make peace with one’s fate. By not telling the truth, I daresay the MSM has not given the public the second option by using the “I’ll be back” characterization when it’s really “Come with me ….” Various authors on the Web offer a better approximation of the truth (such as it can be known) and form a loose doomer network (a/k/a collapsniks). This blog is (an admittedly tiny) part of that doomersphere, which gives me no pleasure.

There is something ironic and vaguely tragic about how various Internet platforms — mostly search engines and social media networks — have unwittingly been thrust into roles their creators never envisioned for themselves. Unless I’m mistaken, they launched under the same business model as broadcast media: create content, or better yet, crowd-source content, to draw in viewers and subscribers whose attention is then delivered to advertisers. Revenue is derived from advertisers while the basic services — i.e., search, job networking, encyclopedias and dictionaries, or social connection — are given away gratis. The modest inconveniences and irritations of having the screen littered and interrupted with ads are a trade-off most end users are happy to accept for free content.

Along the way, some platform operators discovered that user data itself could be both aggregated and individualized and subsequently monetized. This second step unwittingly created the so-called surveillance capitalism that Shoshana Zuboff writes about in her recently published book (previously blogged about here). Essentially, an Orwellian Big Brother (several of them, in fact) tracks one’s activity through smart phone apps and Web browsers, including GPS data revealing movement through real space, not just virtual spaces. This is also the domain of the national security state from local law enforcement to the various security branches of the Federal government: dragnet surveillance where everyone is watched continuously. Again, end users shrug off surveillance as either no big deal or too late to resist.

The most recent step is that, like the Internet itself, various platforms have been functioning for some time already as public utilities and have accordingly come under demands for regulation with regard to authenticity, truth, and community standards of allowable speech. Thus, private corporations have been thrust unexpectedly into the role of regulating content. Problem is, unlike broadcast networks that create their own content and can easily enforce restrictive standards, crowd-sourced platforms enable the general population to upload its own content, often mere commentary in text form but increasingly as video content, without any editorial review. These platforms have parried by deploying and/or modifying their preexisting surveillance algorithms in search of objectionable content normally protected as free speech and have taken steps to remove content, demonetize channels, and ban offending users indefinitely, typically without warning and without appeal.
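To make that escalation pattern concrete, here is a deliberately toy sketch (entirely hypothetical; the thresholds, strike counts, and crude keyword “classifier” are invented and describe no actual platform’s system): automated flagging feeds a strike count that triggers removal, then demonetization, then a ban, with no appeal step anywhere in the loop.

```python
from dataclasses import dataclass

# Hypothetical enforcement ladder: values are invented for illustration.
OBJECTIONABLE_THRESHOLD = 0.5
DEMONETIZE_AT = 3  # strikes before a channel loses ad revenue
BAN_AT = 5         # strikes before the account is closed outright

@dataclass
class Channel:
    name: str
    strikes: int = 0
    monetized: bool = True
    banned: bool = False

def score_content(text: str) -> float:
    """Stand-in for a real classifier: a crude keyword match."""
    flagged_terms = {"hate", "threat", "bomb"}
    return min(1.0, sum(word in flagged_terms for word in text.lower().split()))

def moderate(channel: Channel, post: str) -> str:
    """Apply escalating penalties; note there is no appeal path."""
    if score_content(post) >= OBJECTIONABLE_THRESHOLD:
        channel.strikes += 1
        if channel.strikes >= BAN_AT:
            channel.banned = True
            return "banned"
        if channel.strikes >= DEMONETIZE_AT:
            channel.monetized = False
            return "demonetized"
        return "removed"
    return "allowed"

if __name__ == "__main__":
    uncle = Channel("crazy_uncle")
    for post in ["harmless rant", "a threat", "another threat",
                 "more hate", "still more hate", "one last threat"]:
        print(post, "->", moderate(uncle, post))
```

The point of the toy is only to show how quickly purely mechanical rules, applied without review or appeal, harden into the censorship described below.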

If Internet entrepreneurs initially got into the biz to make a few (or a lot of) quick billions, which some few of them have, they have by virtue of the global reach of their platforms been transformed into censors. It’s also curious that by enabling end users to publish to their platforms, they’ve given voice to the masses in all their unwashed glory. Now, everyone’s crazy, radicalized uncle (or sibling or parent or BFF) formerly banished to obscurity railing against one thing or another at the local tavern, where he was tolerated as harmless so long as he kept his bar tab current, is proud to fly his freak flag anywhere and everywhere. Further, the anonymous coward who might issue death or bomb threats to denounce others has been given means to distribute hate across platforms and into the public sphere, where it gets picked up and maybe censored. Worst of all, the folks who monitor and decide what is allowed, functioning as modern-day thought police, are private citizens and corporations with no oversight or legal basis to act except for the fact that everything occurs on their respective platforms. This is a new aspect to the corporatocracy but not one anyone planned.