Archive for June, 2019

Much ado about nothing was made this past week regarding a technical glitch (or control room error) during the first of two televised Democratic presidential debates, where one pair of moderators’ mics was accidentally left on and extraneous, unintended speech leaked into the broadcast. It distracted the other pair of moderators enough to cause a modest procedural disruption. Big deal. This was not the modal case of a hot mic, where someone, e.g., a politician, swears (a big no-no despite the shock value being almost completely erased in today’s media landscape) or accidentally reveals callous attitudes (or worse) thinking that no one important was listening or recording. Hot mics in the past have led to public outrage and criminal investigations. One recent example that still sticks in everyone’s craw was a novice political candidate who revealed he could use his fame and impudent nerve to “grab ’em by the pussy.” Turned out not to be the career killer everyone thought it would be.

The latest minor furor over a hot mic got me thinking, however, about inadvertent revelation of matters of genuine public interest. Three genres spring to mind: documentary films, whistle-blowing, and investigative journalism, the last of which includes category outliers such as WikiLeaks. Whereas a gaffe on a hot mic usually means the leaker/speaker exposes him- or herself and thus has no one else to blame, disclosures occurring in the other three categories are often against the will of those exposed. It’s obviously in the public interest to know about corruption, misbehavior, and malfeasance in corporate and political life, but the manner in which such information is made public is controversial. Those who expose others suffer harassment and persecution. Documentarians probably fare the best with respect to being left alone following release of information. Michael Moore, for all his absurd though entertaining theatrics, is free (so far as I know) to go about his business and do as he pleases. However, gestures to protect whistle-blowers are just that: gestures. Those who have leaked classified government information in particular, because they gained access to such information through security clearances and signed nondisclosure agreements (before knowing what secrets they were obliged to keep, which is frankly the way such obligations work), are especially prone to reprisal and prosecution. Such information is literally not theirs to disclose, but when keeping others’ secrets is heinous enough, some people feel their conscience and moral duty are superior to job security and other risks involved. Opinions vary, sometimes passionately. And now even journalists who uncover or merely come into possession of evidence of wrongdoing and later publish it — again, decidedly in the public interest — are subject to (malicious?) prosecution. Julian Assange is the current test case.

The free speech aspect of revealing someone else’s amoral and criminal acts is a fraught argument. However, it’s clear that as soon as damaging information comes to light, focus shifts away from the acts and their perpetrators to those who publish the information. Shifting the focus is a miserable yet well-established precedent by now, the result being that most folks who might consider coming forward now keep things to themselves rather than suffer entirely foreseeable consequences. In that light, when someone comes forward anyway, knowing that he or she will be hounded, vilified, arrested, and worse, that person deserves more respect for courage and self-sacrifice than is generally granted in the aftermath of disclosure. The flip side — condemnation, prosecution, and death threats — is already abundant in the public sphere.

Some time after reports of torture at Guantánamo, Abu Ghraib, and Bagram went public, a handful of low-level servicemen (“bad apples” used to deflect attention down the command hierarchy) were prosecuted, but high-level officials (e.g., former U.S. presidents Bush and Obama, anyone in their respective administrations, and commanding officers on site) were essentially immunized from prosecution. That example is not quite the same as going after truth-tellers, but it’s a rather egregious instance of bad actors going unprosecuted. I’m still incensed by it. And that’s why I’m blogging about the hot mic. Lots of awful things go on behind the scenes without public knowledge or sanction. Those who commit high crimes (including war crimes) clearly know what they’re doing is wrong. Claims of national security are often invoked and gag laws are legislated into existence on behalf of private industry. When leaks do inevitably occur, those accused immediately attack the accuser, often with the aid of others in the media. Denials may also be issued (sometimes not — why bother?), but most bad actors hide successfully behind the deflecting shift of focus. When will those acting in the shadows against the public interest and in defiance of domestic and international law ever be brought to justice? I daresay the soul of the nation is at stake, and as long as officialdom escapes all but temporary public relations problems to be spun, the pride everyone wants to take as Americans eludes us. In the meantime, there’s a lot to answer for, and it keeps piling up.

/rant on

Yet another journalist has unburdened herself (unbidden story of personal discovery masquerading as news) of her addiction to digital media and her steps to free herself from the compulsion to be always logged onto the onslaught of useless information hurled at everyone nonstop. Other breaking news offered by our intrepid late-to-the-story reporter: water is wet, sunburn stings, and the Earth is dying (actually, we humans are actively killing it for profit). Freeing oneself from the screen is variously called digital detoxification (detox for short), digital minimalism, digital disengagement, digital decoupling, and digital decluttering (really ought to be called digital denunciation) and means limiting the duration of exposure to digital media and/or deleting one’s social media accounts entirely. Naturally, there are apps (counters, timers, locks) for that. Although the article offers advice for how to disentangle from screen addictions of the duh! variety (um, just hit the power switch), the hidden-in-plain-sight objective is really how to reengage after breaking one’s compulsions, this time asserting control over the infernal devices that have taken over life. It’s a love-hate style of technophilia, chock-full of illusions embarrassing even to children. Because the article is nominally journalism, the author surveys books, articles, software, media platforms, refuseniks, gurus, and opinions galore. So she’s partially informed but still hasn’t demonstrated a basic grasp of media theory, the attention economy, or surveillance capitalism, all of which relate directly. Perhaps she should bring those investigative journalism skills to bear on Jaron Lanier, one of the more trenchant critics of living online.

I rant because the embedded assumption is that anything, everything occurring online is what truly matters — even though online media didn’t yet exist as recently as thirty years ago — and that one must (must, I say! C’mon, keep up!) always be paying attention to matter in turn or suffer from FOMO. Arguments in favor of needing to be online for information and news gathering are weak and ahistorical. No doubt the twisted and manipulated results of Google searches, sometimes contentious Wikipedia entries, and various dehumanizing, self-as-brand social media platforms are crutches we all now use — some waaaay, way more than others — but they’re nowhere close to the only or best way to absorb knowledge or stay in touch with family and friends. Career networking in the gig economy might require some basic level of connection but shouldn’t need to be the all-encompassing, soul-destroying work that maintaining an active public persona has become.

Thus, everyone is chasing likes and follows and retweets and reblogs and other analytics as evidence of somehow being relevant on the sea of ephemera floating around us like so much disused, discarded plastic in those infamous garbage gyres. (I don’t bother to chase and wouldn’t know how to drive traffic anyway. Screw all those solicitations for search-engine optimization. Paying for clicks is for chumps, though lots apparently do it to “enhance” their analytics.) One’s online profile is accordingly a mirror of or even a substitute for the self — a facsimile self. Lost somewhere in my backblog (searched, couldn’t find it) is a post referencing several technophiles positively celebrating the bogus extension of the self accomplished by developing and burnishing an online profile. It’s the domain of celebrities, fame whores, narcissists, and sociopaths, not to mention a few criminals. Oh, and speaking of criminals, recent news is that OJ Simpson just opened a Twitter account to reform(?) his disastrous public image but is fundamentally outta touch with how deeply icky, distasteful, and disgusting it feels to others for him to be participating once again in the public sphere. Disgraced celebrities negatively associated with the Me-Too Movement (is there really such a movement or was it merely a passing hashtag?) have mostly crawled under their respective multimillion-dollar rocks and not been heard from again. Those few who have tried to reemerge are typically met with revulsion and hostility (plus some inevitable star-fuckers with short memories). Hard to say when, if at all, forgiveness and rejoining society become appropriate.

/rant off

Color-Coded Holidays

Posted: June 16, 2019 in Culture

Considering the over-seriousness of most of my blog posts, here’s something lighter from the archives of my dead group blog.

Creative Destruction

Today’s holiday (Valentine’s Day) got me thinking about how the various major holidays scattered over the calendar are associated with specific colors and behaviors. The granddaddy of ’em all, Christmas, is the Red Holiday, which is associated with spending yourself into debt to get gifts for everyone and drinking rum-spiced eggnog. (The birth of Christ is an afterthought for most of us by now.) Valentine’s Day is the Pink Holiday and is for spending money on one’s sweetheart to demonstrate the level of one’s appreciation/sacrifice. Sorry, no drinking. Easter is the Yellow Holiday, probably pastel, and is for chasing colored eggs and purchasing baskets of goodies. Oh, and the risen saviour. Again, sorry, no drinking (unless you count sacramental wine). The Green Holiday is St. Patrick’s Day, and is for drinking. Some wear some bit of green clothing, but it’s mostly about the drinking.

The Blue Holiday is…


Apologies for this overlong blog post. I know that this much text tries the patience of most readers and is well in excess of my customary 3–4 paragraphs.

Continuing my book blogging of Pankaj Mishra’s Age of Anger, Chapter Two (subtitled “History’s Winners and Their Illusions”) focuses on the thought revolution that followed from the Enlightenment in Western Europe and its imitation in non-Western cultures, especially as manifested in the century leading to the French Revolution. Although the American Revolution (more narrowly a tax revolt with insistence on self-rule) preceded the French Revolution by slightly more than a decade, it’s really the French, whose motto liberté, égalité, fraternité came to prominence and defined an influential set of European values, who effectively challenged enthusiastic modernizers around the globe to try to catch up with the ascendant West.

However, almost as soon as this project appeared, i.e., attempting to transform ancien régime monarchies in Northern Africa, the Middle East, and Russia into something pseudo-European, critics arose who denounced the abandonment of tradition and centuries-old national identities. Perhaps they can be understood as the first wave of modern conservatism. Here is Mishra’s characterization:

Modernization, mostly along capitalist lines, became the universalist creed that glorified the autonomous rights-bearing individual and hailed his rational choice-making capacity as freedom. Economic growth was posited as the end-all of political life and the chief marker of progress worldwide, not to mention the gateway to happiness. Communism was totalitarian. Ergo its ideological opponent, American liberalism, represented freedom, which in turn was best advanced by moneymaking. [p. 48]

Aside: The phrase “rights-bearing individual” has obvious echoes with today’s SJWs and their poorly conceived demand for egalitarianism not just before the law but in social and economic outcomes. Although economic justice (totally out of whack with today’s extreme income and wealth inequality) is a worthy goal that aligns with idealized but not real-world Enlightenment values, SJW activism reinforces retrograde divisions of people based on race, gender, sexual orientation, religion, disability, etc. Calls to level out all these questionable markers of identity have resulted in intellectual confusion and invalidation of large “privileged” and/or “unoppressed” groups such as white males of European descent in favor of oppressed minorities (and majorities, e.g., women) of all categories. Never mind that many of those same white males are often every bit as disenfranchised as others whose victimhood is paraded around as some sort of virtue granting them authority and preferential treatment.

Modernization has not been evenly distributed around the globe, which accounts for countries even today being designated First, Second, or Third World. An oft-used euphemism is “developing economy,” which translates to an invitation for wealthy First-World nations (or their corporations) to force their way in to exploit cheap labor and untapped natural resources. Indeed, as Mishra points out, the promise of joining First-World living standards (having diverged centuries ago) is markedly hollow:

… doubters of Western-style progress today include more than just marginal communities and some angry environmental activists. In 2014 The Economist said that, on the basis of IMF data, emerging economies — or, most of the human population — might have to wait for three centuries in order to catch up with the West. In this assessment, the last decade of high growth was an ‘aberration’ and ‘billions of people will be poorer for a lot longer than they might have expected just a few years ago’.

The implications are sobering: the non-West not only finds itself replicating the West’s trauma on an infinitely larger scale. While helping inflict the profoundest damage yet on the environment — manifest today in rising sea levels, erratic rainfall, drought, declining harvests, and devastating floods — the non-West also has no real prospect of catching up … [pp. 47-48]

That second paragraph is an unexpected acknowledgement that the earliest industrialized nations (France, the United Kingdom, and the U.S.) unwittingly put us on a path to self-annihilation only to be knowingly repeated and intensified by latecomers to industrialization. All those (cough) ecological disturbances are occurring right now, though the public has been lulled into complacency by temporary abundance, misinformation, under- and misreporting, and international political incompetence. Of course, ecological destruction is no longer merely the West’s trauma but a global catastrophe of the highest magnitude which is certainly in the process of catching up to us.

Late in Chapter Two, Mishra settles on the Crystal Palace exhibition space and utopian symbol, built in 1851 during the era of world’s fairs and mistaken enthusiasm regarding the myth of perpetual progress and perfectibility, as an irresistible embodiment of Western hubris to which some intellectual leaders responded with clear disdain. Although a marvelous technical feat of engineering prowess and demonstration of economic power (not unlike countries that host the Olympics — remember Beijing?), the Crystal Palace was also viewed as an expression of the sheer might of Western thought and its concomitant products. Mishra repeatedly quotes Dostoevsky, who visited the Crystal Palace in 1862 and described his visceral response to the place poignantly and powerfully:

You become aware of a colossal idea; you sense that here something has been achieved, that here there is victory and triumph. You even begin vaguely to fear something. However independent you may be, for some reason you become terrified. ‘For isn’t this the achievement of perfection?’ you think. ‘Isn’t this the ultimate?’ Could this in fact be the ‘one fold?’ Must you accept this as the final truth and forever hold your peace? It is all so solemn, triumphant, and proud that you gasp for breath. [p. 68]

And later, describing the “world-historical import” of the Crystal Palace:

Look at these hundreds of thousands, these millions of people humbly streaming here from all over the face of the earth. People come with a single thought, quietly, relentlessly, mutely thronging onto this colossal palace; and you feel that something final has taken place here, that something has come to an end. It is like a Biblical picture, something out of Babylon, a prophecy from the apocalypse coming to pass before your eyes. You sense that it would require great and everlasting spiritual denial and fortitude in order not to submit, not to capitulate before the impression, not to bow to what is, and not to deify Baal, that is not to accept the material world as your ideal. [pp. 69–70]

The prophetic finality of the Crystal Palace thus presaged twentieth-century achievements and ideas (the so-called American Century) that undoubtedly eclipsed the awesome majesty of the Crystal Palace, e.g., nuclear fission and liberal democracy’s purported victory over Soviet Communism (to name only two). Indeed, Mishra begins the chapter with a review of American declarations of the end of history, i.e., having reached final forms of political, social, and economic organization that are now the sole model for all nations to emulate. The whole point of the chapter is that such pronouncements are illusions with strong historical antecedents that might have cautioned us not to leap to unwarranted conclusions or to perpetuate a soul-destroying regime hellbent on extinguishing all alternatives. Of course, as Gore Vidal famously quipped, “Americans never learn; it’s part of our charm.”

 

Third version of this topic. Whereas the previous two were about competing contemporary North American ways of knowing, this one is broader in both time and space.

The May 2019 issue of Harper’s Magazine has a fascinating review of Christina Thompson’s book Sea People: The Puzzle of Polynesia (2019). Beyond the puzzle itself — how did Polynesian people migrate to, settle, and populate the far-flung islands of the Central and South Pacific? — the review hits upon one of my recurring themes on this blog, namely, that human cognition is plastic enough to permit highly divergent ways of knowing.

The review (and book?) is laden with Eurocentric detail about the “discovery” of closely related Polynesian cultures dispersed more widely (geographically) than any other culture prior to the era of mass migration. Indeed, the reviewer chides the author at one point for transforming Polynesia from a subject in its own right into an exotic object of (Western) fascination. This distorted perspective is commonplace and follows from the earlier “discovery” and colonization of North America as though it were not already populated. Cartographers even today are guilty of this Eurocentrism, relegating “empty” expanses of the Pacific Ocean to irrelevance in maps when in fact the Pacific is “the dominant feature of the planet” and contains roughly twenty-five thousand islands (at current sea level? — noting that sea level was substantially lower during the last ice age some 13,000 years ago but is due to rise substantially by the end of this century and beyond, engulfing many of the islands now lying dangerously close to sea level). Similar distortions are needed to squash the spherical (3D) surface of the globe onto planar (2D) maps (e.g., the Mercator projection, which largely ignores the Pacific Ocean in favor of continents; other projections shown here) more easily conceptualized (for Westerners) in terms of coordinate geometry using latitude and longitude (i.e., the Cartesian plane).
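Since I invoked the Mercator projection, the math behind its distortion is simple enough to sketch. Here’s a toy illustration of my own (the standard textbook equations, with the Earth treated as a perfect sphere — nothing from the review or book):

```python
import math

def mercator(lat_deg, lon_deg, R=6371.0):
    # Project latitude/longitude (degrees) onto the Mercator plane.
    # Returns (x, y) in the units of R (kilometers here). The y-term
    # blows up toward the poles, which is why high-latitude landmasses
    # look inflated while the equatorial Pacific shrinks to filler.
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return R * lon, R * math.log(math.tan(math.pi / 4 + lat / 2))

# East-west stretching at a given latitude is sec(latitude):
for lat in (0, 40, 60, 72):
    print(f"latitude {lat:>2} deg: stretched {1 / math.cos(math.radians(lat)):.2f}x")
# Greenland (~72N) ends up drawn about 3.2x wider than the same
# distance at the equator -- the visual Eurocentrism in a nutshell.
```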

The review mentions the familiar dichotomy of grouping a hammer, saw, hatchet, and log in terms of abstract categories (Western thought) vs. utility or practicality (non-Western). Exploration of how different ways of knowing manifest is, according to the review, among the more intellectually exciting parts of the book. That’s the part I’m latching onto. For instance, the review offers this:

Near the middle of Sea People, Thompson explores the ramification of Polynesia as, until contact, an oral culture with “an oral way of seeing.” While writing enables abstraction, distancing, and what we generally call objectivity, the truth of oral cultures is thoroughly subjective. Islands aren’t dots on a map seen from the sky but destinations one travels to in the water.

This is the crux of the puzzle of Polynesians fanning out across the Pacific approximately one thousand years ago. They had developed means of wayfinding in canoes and outriggers without instruments or maps roughly 500 years prior to Europeans crossing the oceans in sailing ships. Perhaps I’m reading too much into the evidence, but abstraction and objectivity as a particular way of knowing, bequeathed to Western Europe via the Enlightenment and development of the scientific method, stunted or delayed exploration of the globe precisely because explorers began with a god’s eye view of the Earth from above rather than from the surface (object vs. subject). In contrast, quoting here from the book rather than the review, Polynesians used

a system known as etak, in which they visualize a “reference island” — which is usually a real island but may also be imaginary — off to one side of the path they are following, about midway between their starting point and their destination. As the journey progresses, this island “moves” under each of the stars in the star path [situated near the horizon rather than overhead], while the canoe in which the voyagers are traveling stays still. Of course, the navigators know that it is the canoe and not the islands that are moving, but this is the way they conceptualize the voyage.
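The geometry at work is easy to demonstrate, even crudely. Below is a toy sketch of my own (flat plane, invented coordinates — emphatically not how navigators reckoned anything, since they carried all of this in their heads): as a canoe holds a straight course, the bearing to a fixed off-track island sweeps steadily from ahead to astern, which is precisely the apparent “movement” of the etak island under successive horizon stars.

```python
import math

# Invented coordinates for illustration only.
start = (0.0, 0.0)          # departure island
destination = (100.0, 0.0)  # target island, dead ahead on the x-axis
reference = (50.0, 30.0)    # off-track etak island

def bearing(frm, to):
    # Direction from frm to to, in degrees counterclockwise
    # from the course line (the +x axis).
    return math.degrees(math.atan2(to[1] - frm[1], to[0] - frm[0]))

for frac in (0.0, 0.25, 0.5, 0.75, 1.0):
    canoe = (frac * destination[0], 0.0)
    print(f"{frac:4.0%} of voyage: reference island bears {bearing(canoe, reference):6.1f} deg")

# The bearing sweeps from ~31 deg (ahead) through 90 deg (abeam) to
# ~149 deg (astern): the fixed island appears to drift backward beneath
# one horizon star after another while the canoe "stays still."
```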

Placing oneself at the center of the world or universe — at least for the purpose of navigation — is a conceptual pose Westerners discarded when heliocentrism gradually replaced geocentrism. (Traveling using GPS devices ironically places the traveler back at the center of the map with terrain shifting around the vehicle, but it’s a poor example of wayfinding precisely because the traveler fobs the real work onto the device and likely possesses no real understanding or skill traversing the terrain besides following mechanical instructions.) While we Westerners might congratulate ourselves for a more accurate, objective orientation to the stars, its unwitting limitations are worth noting. Recent discoveries regarding human prehistory, especially megalithic stone construction accomplished with techniques still unknown and flatly impossible with modern technology, point to the existence of other ways of knowing lost to contemporary human cultures steadily triangulating on and conforming to Western thought (through the process of globalization). Loss of diversity of ways of knowing creates yet another sort of impoverishment that can only be barely glimpsed since most of us are squarely inside the bubble. Accordingly, it’s not for nothing that some unusually sensitive critics of modernity suggest we’re entering a new Dark Age.

 

Richard Wolff gave a fascinating talk at Google offices in New York City, which is embedded below:

This talk was published nearly two years ago, demonstrating that we refuse to learn or to make the adjustments needed to order society better (and to avoid disaster and catastrophe). No surprise there. (Also shows how long it takes me to get to things.) Critics of capitalism and the democracy we pretend to have in the U.S. are many. Wolff criticizes effectively from a Marxist perspective (Karl Marx being among the foremost of those critics). For those who don’t have the patience to sit through Wolff’s 1.5-hour presentation, let me draw out a few details mixed with my own commentary (impossible to separate, sorry; sorry, too, for the profusion of links no one follows).

The most astounding thing to me is that Wolff admitted he made it through higher education to complete a Ph.D. in economics without a single professor assigning Marx to read or study. Quite the set of blinders his teachers wore. Happily, Wolff eventually educated himself on Marx. Multiple economic forms have each had their day: sharing, barter, feudalism, mercantilism, capitalism (including subcategories anarcho-capitalism and laissez-faire economics), Keynesian regulation, socialism (and its subcategory communism), etc. Except for the first, prevalent among indigent societies living close to subsistence, all involve hierarchy and coercion. Some regard those dynamics as just, others as unjust. It’s worth noting, too, that no system is pure. For instance, the U.S. has a blend of market capitalism and socialism. Philanthropy also figures in somehow. However, as social supports in the U.S. continue to be withdrawn and the masses are left to fend for themselves, what socialism existed as a hidden-in-plain-sight part of our system is being scaled down, privatized, foisted on charitable organizations, and/or driven out of existence.

The usual labor arrangement nearly all of us know — working for someone else for a wage/salary — is defined in Marxism as exploitation (not the lay understanding of the term) for one simple reason: all economic advantage from excess productivity of labor accrues to the business owner(s) (often a corporation). That’s the whole point of capitalism: to exploit (with some acknowledged risk) the differential between the costs of labor and materials (and increasingly, information) vs. the revenue they produce in order to prosper and grow. To some, exploitation is a dirty word, but understood from an analytical point of view, it’s the bedrock of all capitalist labor relationships. Wolff also points out that real wages in the U.S. (adjusted for inflation) have been flat for more than 40 years while productivity has climbed steadily. The differential profit (rather immense over time) has been pocketed handily by owners (billionaire having long since replaced millionaire as an aspiration) while the average citizen/consumer has kept pace with the rising standard of living by adding women to the workforce (two or more earners per family instead of one), racking up debt, and deferring retirement.
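The arithmetic is trivial but worth spelling out. Here’s a sketch with round numbers of my own invention (nothing below comes from Wolff’s talk or from real wage data):

```python
# Invented illustrative numbers -- not Wolff's, not real statistics.
revenue_per_worker = 120_000  # value one worker's output fetches per year
wage = 45_000                 # what that worker is paid
materials = 35_000            # non-labor costs behind the same output

surplus = revenue_per_worker - wage - materials
print(f"surplus accruing to owners: ${surplus:,}")  # $40,000

# The wage-productivity gap compounds. If output grows 2% per year
# while the real wage stays flat, the owners' take widens every year:
for year in (0, 10, 20, 30, 40):
    output = revenue_per_worker * 1.02 ** year
    print(f"year {year:>2}: output ${output:>9,.0f}, wage ${wage:,}")
```

Whether one reads that surplus as a fair return on risk or as expropriation is exactly the disagreement (just vs. unjust) Wolff is staging.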

Wolff’s antidote or cure to the dynamic of late-stage capitalism (nearly all the money being controlled by very few) is to remake corporate ownership, where a board of directors without obligation to workers makes all the important decisions and takes all the profit, into worker-owned businesses that practice direct democracy and distribute profits more equitably. How closely this resembles a coop (read: cooperative), commune, or kibbutz I cannot assess. Worker-owned businesses, no longer corporations, also differ significantly from how “socializing a business” is generally understood, i.e., a business or sector being taken over and run by the government. The U.S. Postal Service is one example. (Curiously, that last link has a .com suffix instead of .gov.) Public K–12 education operated by the states is another. As I understand it, this difference (who owns and runs an enterprise) is what lies behind democratic socialism being promoted in the progressive wing of the Democratic Party. Bernie Sanders is aligning his socialist politics with worker ownership of the means of production. Wolff also promotes this approach through his book and nonprofit organization Democracy at Work. How different these projects may be lies beyond my cursory analysis.

Another alternative to capitalist hegemony is a resource-based economy, which I admit I don’t really understand. Its rank utopianism is difficult to overlook, since it doesn’t fit at all with human history, where we muddle through without much of a plan or design except perhaps for those few who discover and devise ways to game systems for self-aggrandizement and personal benefit while leaving everyone else in the lurch. Peter Joseph, founder of The Zeitgeist Movement, is among the promoters of a resource-based economy. One of its chief attributes is the disuse of money. Considering central banks (the Federal Reserve System in the U.S.) that issue fiat currency worth increasingly little are being challenged rather effectively by cryptocurrencies based on nothing beyond social consensus, it’s interesting to contemplate an alternative to astronomical levels of wealth (and its inverse: debt) that come as a result of being trapped within the fiat monetary system that benefits so very few people.

Since this is a doom blog (not much of an admission, since it’s been obvious for years now), I can’t finish up without observing that none of these economic systems appears to take into account that we’re on a countdown to self-annihilation as we draw down the irreplaceable energy resources that make the whole shebang go. It’s possible the contemplated resource-based economy does so, but I rather doubt it. A decade or more ago, much of the discussion was about peak oil, which shortly thereafter gave way to peak everything. Shortages of materials such as helium, sand, and rare earths don’t figure strongly in public sentiment so long as party balloons, construction materials, and cell phones continue to be widely available. However, ongoing destruction of the biosphere through the primary activities of industrial civilization (e.g., mining, chemical-based agriculture, and steady expansion of human habitation into formerly wild nature) and the secondary effects of anthropogenic climate change (still hotly contested but more and more obvious with each passing season) and loss of biodiversity and biomass is catching up to us. In economics, this destruction is an externality conveniently ignored or waved away while profits can be made. The fullness of time will provide proof that we’ve enjoyed an extraordinary moment in history where we figured out how to exploit a specific sort of abundance (fossil fuels) with the ironic twist that that very exploitation leads to the collapse of the civilization it spawned and supported. No one planned it this way, really, and once the endgame came into view, nothing much could be done to forestall it. So we continue apace with self-destruction while celebrating its glamor and excess as innovation and progress. If only Wolff would incorporate that perspective, too.