Posts Tagged ‘Politics’

The “American character,” if one can call it into being merely by virtue of naming it (the same rhetorical trick as solutionism), is diverse and ever-changing. Numerous characterizations have been offered throughout history, with Alexis de Tocqueville’s Democracy in America (1835 and 1840) being perhaps the one cited most frequently despite its outdatedness. Much in American character has changed since that time, and it’s highly questionable to think it was unified even then. However, as a means of understanding ourselves, it’s as good a place to start as any. A standard criticism of American character as seen from outside (i.e., when Americans travel abroad) is the so-called ugly American: loud, inconsiderate, boorish, and entitled. Not much to argue with there. A more contemporary assessment by Morris Berman, found throughout his “American trilogy,” is that we Americans are actually quite stupid, unaccountably proud of it, and constantly hustling (in the pejorative sense) in pursuit of material success. These descriptions don’t quite match up with familiar jingoism about how great America is (and of course, Americans), leading to non-Americans clamoring to immigrate here, or the self-worship we indulge in every national holiday celebrating political and military history (e.g., Independence Day, Veterans Day, Memorial Day).

I recently ran afoul of another ugly aspect of our national character: our tendency toward aggression and violence. In truth, this is hardly unique to Americans. Yet it came up glaringly in the context of a blog post at Pharyngula citing a Tweet comparing uneven application of law (and indignation among online chatterers?) when violence is committed by the political left vs. the political right. Degree of violence clearly matters, but obvious selection bias was deployed to present an egregiously lop-sided perspective. Radicals on both the left and right have shown little compunction about using violence to achieve their agendas. Never mind how poorly conceived those agendas may be. What really surprised me, however, was that my basic objection to violence in all forms across the spectrum was met with snark and ad hominem attack. When did reluctance to enact violence (including going to war) until extremity demands it become controversial?

My main point was that resorting to violence typically invalidates one’s objective. It’s a desperation move. Moreover, using force (e.g., intimidation, threats, physical violence — including throwing milkshakes) against ideological opponents is essentially policing others’ thoughts. But they’re fascists, right? Violence against them is justified because they don’t eschew violence. No, wrong. Mob justice and vigilantism obviate the rule of law and criminalize any perpetrator of violence. It’s also the application of faulty instrumental logic, ceding any principled claim to moral authority. But to commentators at the blog post linked above, I’m the problem because I’m not in support of fighting fascists with full force. Guess all those masked, caped crusaders don’t recognize that they’re contributing to lawlessness and mayhem. Now even centrists come in for attack for not being radical (or aggressive, or violent) enough. Oddly silent in the comments is the blog host, P.Z. Myers, who has himself communicated approval of milkshake patrols and Nazi punching, as though the presumptive targets (identified rather haphazardly and incorrectly in many instances) have no right to their own thoughts and ideas, vile though they may be, and that violence is the right way to “teach them a lesson.” No one learns the intended lesson when they are the victim of violence. Rather, if not simply cowed into submission (not the same as agreement), tensions tend to escalate into further and increasing violence. See also reaction formation.

Puzzling over this weird exchange with these, my fellow Americans (the ideologically possessed ones anyway), caused me to backtrack. For instance, the definition of fascism at dictionary.com is “a governmental system led by a dictator having complete power, forcibly suppressing opposition and criticism, regimenting all industry, commerce, etc., and emphasizing an aggressive nationalism and often racism.” That definition sounds more like totalitarianism or dictatorship and is backward looking, specifically to Italy’s Benito Mussolini in the period 1922 to 1943. However, like national characters, political moods and mechanisms change over time, and the recent fascist thrust in American politics isn’t limited to a single leader with dictatorial power. Accordingly, the definition above has never really satisfied me.

I’ve blogged repeatedly about incipient fascism in the U.S., the imperial presidency (usually associated with George W. Bush but also characteristic of Barack Obama), James Howard Kunstler’s prediction of a cornpone fascist coming to power (the way paved by populism), and Sheldon Wolin’s idea of inverted totalitarianism. What ties these together is how power is deployed and against what targets. More specifically, centralized power (or force) is directed against domestic populations to advance social and political objectives without broad public support for the sole benefit of holders of power. That’s a more satisfactory definition of fascism to me, certainly far better than Peter Schiff’s ridiculous equation of fascism with socialism. Domination of others to achieve objectives describes the U.S. power structure (the military-industrial-corporate complex) to a tee. That doesn’t mean manufactured consent anymore; it means bringing the public into line, especially through propaganda campaigns, silencing of criticism, prosecuting whistle-blowers, and broad surveillance, all of which boil down to policing thought. The public has complied by embracing all manner of doctrine against enlightened self-interest, the very thing that was imagined to magically promote the general welfare and keep us from wrecking things or destroying ourselves unwittingly. Moreover, public support is not really obtained through propaganda and domination, only the pretense of agreement found convincing by fools. Similarly, admiration, affection, and respect are not won with a fist. Material objectives (e.g., resource reallocation, to use a familiar euphemism) achieved through force are just common theft.

So what is Antifa doing? It’s forcibly silencing others. It’s doing the work of fascist government operatives by proxy. It’s fighting fascism by becoming fascist, not unlike the Republican-led U.S. government in 2008 seeking bailouts for banks and large corporations, handily transforming our economy into a socialist experiment (e.g., crowd-funding casino capitalism through taxation). Becoming the enemy to fight the enemy is a nice trick of inversion, though many are so flummoxed by these contradictions they resort to Orwellian doublethink to reconcile the paradox. Under such conditions, there are no arguments that can convince. Battle lines are drawn, tribal affiliations are established, and the ideological war of brother against brother, American against American, intensifies until civility crumbles around us. Civil war and revolution haven’t occurred in the U.S. for 150 years, but they are popping up regularly around the globe, often at the instigation of the U.S. government (again, acting against the public interest). Is our turn coming because we Americans have been divided and conquered instead of recognizing the real source of threat?


Much ado over nothing was made this past week regarding a technical glitch (or control room error) during the first of two televised Democratic presidential debates where one pair of moderators’ mics was accidentally left on and extraneous, unintended speech leaked into the broadcast. It distracted the other pair of moderators enough to cause a modest procedural disruption. Big deal. This was not the modal case of a hot mic where someone, e.g., a politician, swears (a big no-no despite the shock value being almost completely erased in today’s media landscape) or accidentally reveals callous attitudes (or worse) thinking that no one important was listening or recording. Hot mics in the past have led to public outrage and criminal investigations. One recent example that still sticks in everyone’s craw was a novice political candidate who revealed he could use his fame and impudent nerve to “grab ’em by the pussy.” Turned out not to be the career killer everyone thought it would be.

The latest minor furor over a hot mic got me thinking, however, about inadvertent revelation of matters of genuine public interest. Three genres spring to mind: documentary films, whistle-blowing, and investigative journalism, that last including category outliers such as Wikileaks. Whereas a gaffe on a hot mic usually means the leaker/speaker exposes him- or herself and thus has no one else to blame, disclosures occurring in the other three categories are often against the will of those exposed. It’s obviously in the public interest to know about corruption, misbehavior, and malfeasance in corporate and political life, but the manner in which such information is made public is controversial. Those who expose others suffer harassment and persecution. Documentarians probably fare the best with respect to being left alone following release of information. Michael Moore, for all his absurd though entertaining theatrics, is free (so far as I know) to go about his business and do as he pleases. However, gestures to protect whistle-blowers are just that: gestures. Those who have leaked classified government information in particular, because they gained access to such information through security clearances and signed nondisclosure agreements (before knowing what secrets they were obliged to keep, which is frankly the way such obligations work), are especially prone to reprisal and prosecution. Such information is literally not theirs to disclose, but when keeping others’ secrets is heinous enough, some people feel their conscience and moral duty are superior to job security and other risks involved. Opinions vary, sometimes passionately. And now even journalists who uncover or merely come into possession of evidence of wrongdoing and later publish it — again, decidedly in the public interest — are subject to (malicious?) prosecution. Julian Assange is the current test case.

The free speech aspect of revealing someone else’s amoral and criminal acts is a fraught argument. However, it’s clear that as soon as damaging information comes to light, focus shifts away from the acts and their perpetrators to those who publish the information. Shifting the focus is a miserable yet well-established precedent by now, the result being that most folks who might consider coming forward to speak up now keep things to themselves rather than suffer entirely foreseeable consequences. In that light, when someone comes forward anyway, knowing that they will be hounded, vilified, arrested, and worse, they deserve more respect for courage and self-sacrifice than is generally shown in the aftermath of disclosure. The flip side — condemnation, prosecution, and death threats — is already abundant in the public sphere.

Some time after reports of torture at Guantánamo, Abu Ghraib, and Bagram went public, a handful of low-level servicemen (“bad apples” used to deflect attention down the command hierarchy) were prosecuted, but high-level officials (e.g., former U.S. presidents Bush and Obama, anyone in their respective administrations, and commanding officers on site) were essentially immunized from prosecution. That example is not quite the same as going after truth-tellers, but it’s a rather egregious instance of bad actors going unprosecuted. I’m still incensed by it. And that’s why I’m blogging about the hot mic. Lots of awful things go on behind the scenes without public knowledge or sanction. Those who commit high crimes (including war crimes) clearly know what they’re doing is wrong. Claims of national security are often invoked and gag laws are legislated into existence on behalf of private industry. When leaks do inevitably occur, those accused immediately attack the accuser, often with the aid of others in the media. Denials may also be issued (sometimes not — why bother?), but most bad actors hide successfully behind the deflecting shift of focus. When will those acting in the shadows against the public interest and in defiance of domestic and international law ever be brought to justice? I daresay the soul of the nation is at stake, and as long as officialdom escapes all but temporary public relations problems to be spun, the pride everyone wants to take as Americans eludes us. In the meantime, there’s a lot to answer for, and it keeps piling up.

Apologies for this overlong blog post. I know that this much text tries the patience of most readers and is well in excess of my customary 3–4 paragraphs.

Continuing my book blogging of Pankaj Mishra’s Age of Anger, Chapter Two (subtitled “History’s Winners and Their Illusions”) focuses on the thought revolution that followed from the Enlightenment in Western Europe and its imitation in non-Western cultures, especially as manifested in the century leading to the French Revolution. Although the American Revolution (more narrowly a tax revolt with insistence on self-rule) preceded the French Revolution by slightly more than a decade, it’s really the French, whose motto liberté, égalité, fraternité came to prominence and defined an influential set of European values, who effectively challenged enthusiastic modernizers around the globe to try to catch up with the ascendant West.

However, almost as soon as this project appeared, i.e., attempting to transform ancien régime monarchies in Northern Africa, the Middle East, and Russia into something pseudo-European, critics arose who denounced the abandonment of tradition and centuries-old national identities. Perhaps they can be understood as the first wave of modern conservatism. Here is Mishra’s characterization:

Modernization, mostly along capitalist lines, became the universalist creed that glorified the autonomous rights-bearing individual and hailed his rational choice-making capacity as freedom. Economic growth was posited as the end-all of political life and the chief marker of progress worldwide, not to mention the gateway to happiness. Communism was totalitarian. Ergo its ideological opponent, American liberalism, represented freedom, which in turn was best advanced by moneymaking. [p. 48]

Aside: The phrase “rights-bearing individual” has obvious echoes with today’s SJWs and their poorly conceived demand for egalitarianism not just before the law but in social and economic outcomes. Although economic justice (totally out of whack with today’s extreme income and wealth inequality) is a worthy goal that aligns with idealized but not real-world Enlightenment values, SJW activism reinforces retrograde divisions of people based on race, gender, sexual orientation, religion, disability, etc. Calls to level out all these questionable markers of identity have resulted in intellectual confusion and invalidation of large “privileged” and/or “unoppressed” groups such as white males of European descent in favor of oppressed minorities (and majorities, e.g., women) of all categories. Never mind that many of those same white males are often every bit as disenfranchised as others whose victimhood is paraded around as some sort of virtue granting them authority and preferential treatment.

Modernization has not been evenly distributed around the globe, which accounts for countries even today being designated either First, Second, or Third World. An oft-used euphemism is “developing economy,” which translates to an invitation for wealthy First-World nations (or their corporations) to force their way in to exploit cheap labor and untapped natural resources. Indeed, as Mishra points out, the promise of joining First-World living standards (having diverged centuries ago) is markedly hollow:

… doubters of Western-style progress today include more than just marginal communities and some angry environmental activists. In 2014 The Economist said that, on the basis of IMF data, emerging economies — or, most of the human population — might have to wait for three centuries in order to catch up with the West. In this assessment, the last decade of high growth was an ‘aberration’ and ‘billions of people will be poorer for a lot longer than they might have expected just a few years ago’.

The implications are sobering: the non-West not only finds itself replicating the West’s trauma on an infinitely larger scale. While helping inflict the profoundest damage yet on the environment — manifest today in rising sea levels, erratic rainfall, drought, declining harvests, and devastating floods — the non-West also has no real prospect of catching up … [pp. 47-48]

That second paragraph is an unexpected acknowledgement that the earliest industrialized nations (France, the United Kingdom, and the U.S.) unwittingly put us on a path to self-annihilation only to be knowingly repeated and intensified by latecomers to industrialization. All those (cough) ecological disturbances are occurring right now, though the public has been lulled into complacency by temporary abundance, misinformation, under- and misreporting, and international political incompetence. Of course, ecological destruction is no longer merely the West’s trauma but a global catastrophe of the highest magnitude which is certainly in the process of catching up to us.

Late in Chapter Two, Mishra settles on the Crystal Palace exhibition space and utopian symbol, built in 1851 during the era of world’s fairs and mistaken enthusiasm regarding the myth of perpetual progress and perfectibility, as an irresistible embodiment of Western hubris to which some intellectual leaders responded with clear disdain. Although a marvelous technical feat of engineering prowess and demonstration of economic power (not unlike countries that host the Olympics — remember Beijing?), the Crystal Palace was also viewed as an expression of the sheer might of Western thought and its concomitant products. Mishra repeatedly quotes Dostoevsky, who visited the Crystal Palace in 1862 and described his visceral response to the place poignantly and powerfully:

You become aware of a colossal idea; you sense that here something has been achieved, that here there is victory and triumph. You even begin vaguely to fear something. However independent you may be, for some reason you become terrified. ‘For isn’t this the achievement of perfection?’ you think. ‘Isn’t this the ultimate?’ Could this in fact be the ‘one fold?’ Must you accept this as the final truth and forever hold your peace? It is all so solemn, triumphant, and proud that you gasp for breath. [p. 68]

And later, describing the “world-historical import” of the Crystal Palace:

Look at these hundreds of thousands, these millions of people humbly streaming here from all over the face of the earth. People come with a single thought, quietly, relentlessly, mutely thronging onto this colossal palace; and you feel that something final has taken place here, that something has come to an end. It is like a Biblical picture, something out of Babylon, a prophecy from the apocalypse coming to pass before your eyes. You sense that it would require great and everlasting spiritual denial and fortitude in order not to submit, not to capitulate before the impression, not to bow to what is, and not to deify Baal, that is not to accept the material world as your ideal. [pp. 69–70]

The prophetic finality of the Crystal Palace thus presaged twentieth-century achievements and ideas (the so-called American Century) that undoubtedly eclipsed the awesome majesty of the Crystal Palace, e.g., nuclear fission and liberal democracy’s purported victory over Soviet Communism (to name only two). Indeed, Mishra begins the chapter with a review of Americans’ declarations of the end of history, i.e., having reached final forms of political, social, and economic organization that are now the sole model for all nations to emulate. The whole point of the chapter is that such pronouncements are illusions with strong historical antecedents that might have cautioned us not to leap to unwarranted conclusions or to perpetuate a soul-destroying regime hellbent on extinguishing all alternatives. Of course, as Gore Vidal famously quipped, “Americans never learn; it’s part of our charm.”


Richard Wolff gave a fascinating talk at Google offices in New York City, which is embedded below:

This talk was published nearly two years ago, demonstrating that we refuse to learn or make adjustments we need to order society better (and to avoid disaster and catastrophe). No surprise there. (Also shows how long it takes me to get to things.) Critics of capitalism and the democracy we pretend to have in the U.S. are many. Wolff criticizes effectively from a Marxist perspective (Karl Marx being among the foremost of those critics). For those who don’t have the patience to sit through Wolff’s 1.5-hour presentation, let me draw out a few details mixed with my own commentary (impossible to separate, sorry; sorry, too, for the profusion of links no one follows).

The most astounding thing to me is that Wolff admitted he made it through higher education to complete a Ph.D. in economics without a single professor assigning Marx to read or study. Quite the set of blinders his teachers wore. Happily, Wolff eventually educated himself on Marx. Multiple economic forms have each had their day: sharing, barter, feudalism, mercantilism, capitalism (including subcategories anarcho-capitalism and laissez-faire economics), Keynesian regulation, socialism (and its subcategory communism), etc. Except for the first, prevalent among indigent societies living close to subsistence, all involve hierarchy and coercion. Some regard those dynamics as just, others as unjust. It’s worth noting, too, that no system is pure. For instance, the U.S. has a blend of market capitalism and socialism. Philanthropy also figures in somehow. However, as social supports in the U.S. continue to be withdrawn and the masses are left to fend for themselves, what socialism existed as a hidden-in-plain-sight part of our system is being scaled down, privatized, foisted on charitable organizations, and/or driven out of existence.

The usual labor arrangement nearly all of us know — working for someone else for a wage/salary — is defined in Marxism as exploitation (not the lay understanding of the term) for one simple reason: all economic advantage from excess productivity of labor accrues to the business owner(s) (often a corporation). That’s the whole point of capitalism: to exploit (with some acknowledged risk) the differential between the costs of labor and materials (and increasingly, information) vs. the revenue they produce in order to prosper and grow. To some, exploitation is a dirty word, but understood from an analytical point of view, it’s the bedrock of all capitalist labor relationships. Wolff also points out that real wages in the U.S. (adjusted for inflation) have been flat for more than 40 years while productivity has climbed steadily. The differential profit (rather immense over time) has been pocketed handily by owners (billionaire having long-since replaced millionaire as an aspiration) while the average citizen/consumer has kept pace with the rising standard of living by adding women to the workforce (two or more earners per family instead of one), racking up debt, and deferring retirement.

Wolff’s antidote or cure to the dynamic of late-stage capitalism (nearly all the money being controlled by very few) is to remake corporate ownership, where a board of directors without obligation to workers makes all the important decisions and takes all the profit, into worker-owned businesses that practice direct democracy and distribute profits more equitably. How closely this resembles a coop (read: cooperative), commune, or kibbutz I cannot assess. Worker-owned businesses, no longer corporations, also differ significantly from how “socializing a business” is generally understood, i.e., a business or sector being taken over and run by the government. The U.S. Postal Service is one example. (Curiously, that last link has a .com suffix instead of .gov.) Public K–12 education operated by the states is another. As I understand it, this difference (who owns and runs an enterprise) is what lies behind democratic socialism being promoted in the progressive wing of the Democratic Party. Bernie Sanders is aligning his socialist politics with worker ownership of the means of production. Wolff also promotes this approach through his book and nonprofit organization Democracy at Work. How different these projects may be lies beyond my cursory analysis.

Another alternative to capitalist hegemony is a resource-based economy, which I admit I don’t really understand. Its rank utopianism is difficult to overlook, since it doesn’t fit at all with human history, where we muddle through without much of a plan or design except perhaps for those few who discover and devise ways to game systems for self-aggrandizement and personal benefit while leaving everyone else in the lurch. Peter Joseph, founder of The Zeitgeist Movement, is among the promoters of a resource-based economy. One of its chief attributes is the disuse of money. Considering that central banks (the Federal Reserve System in the U.S.), issuing fiat currency worth increasingly little, are being challenged rather effectively by cryptocurrencies based on nothing beyond social consensus, it’s interesting to contemplate an alternative to astronomical levels of wealth (and its inverse: debt) that come as a result of being trapped within the fiat monetary system that benefits so very few people.

Since this is a doom blog (not much of an admission, since it’s been obvious for years now), I can’t finish up without observing that none of these economic systems appears to take into account that we’re on a countdown to self-annihilation as we draw down the irreplaceable energy resources that make the whole shebang go. It’s possible the contemplated resource-based economy does so, but I rather doubt it. A decade or more ago, much of the discussion was about peak oil, which shortly thereafter gave way to peak everything. Shortages of materials such as helium, sand, and rare earths don’t figure strongly in public sentiment so long as party balloons, construction materials, and cell phones continue to be widely available. However, ongoing destruction of the biosphere through the primary activities of industrial civilization (e.g., mining, chemical-based agriculture, and steady expansion of human habitation into formerly wild nature) and the secondary effects of anthropogenic climate change (still hotly contested but more and more obvious with each passing season) and loss of biodiversity and biomass are catching up to us. In economics, this destruction is an externality conveniently ignored or waved away while profits can be made. The fullness of time will provide proof that we’ve enjoyed an extraordinary moment in history where we figured out how to exploit a specific sort of abundance (fossil fuels) with the ironic twist that that very exploitation led to the collapse of the civilization it spawned and supported. No one planned it this way, really, and once the endgame came into view, nothing much could be done to forestall it. So we continue apace with self-destruction while celebrating its glamor and excess as innovation and progress. If only Wolff would incorporate that perspective, too.

Renewed twin memes Universal Basic Income (UBI) and Debt Jubilees (DJ) have been in the news recently. I write renewed because the two ideas are quite literally ancient, unlearnt lessons that are enjoying revitalized interest in the 21st century. Both are capable of sophisticated support from historical and contemporary study, which I admit I haven’t undertaken. However, others have done the work and make their recommendations with considerable authority. For instance, Andrew Yang, interviewed repeatedly as a 2020 U.S. presidential candidate, has made UBI the centerpiece of his policy proposals, whereas Michael Hudson has a new book out called … and forgive them their debts: Lending, Foreclosure and Redemption — From Bronze Age Finance to the Jubilee Year that offers a forgotten history of DJ.

Whenever UBI or DJ comes up in conversation, the most obvious, predictable response I hear (containing a kernel of truth) is that either proposal would reward the losers in today’s capitalist regime: those who earn too little or those who carry too much debt (often a combination of both). Never mind that quality education and economic opportunities have been steadily withdrawn over the past half century. UBI and DJ would thus be giveaways, and I daresay nothing offends a sense of fairness more than others getting something for nothing. Typical resentment goes, “I worked hard, played by the rules, and met my responsibilities; why should others who slacked, failed, or cheated get the benefit of my hard work?” It’s a commonplace “othering” response, failing to recognize that as societies we are completely interconnected and interdependent. Granting the winners in the capitalist contest a pass on fair play is also a major assumption. The most iconic supreme winners are all characterized by shark-like business practices: taking advantage of tax loopholes, devouring everything, and shrewdly understanding their predatory behavior not in terms of producing value but rather as gobbling or destroying competition to gain market share. More than a few companies these days are content to operate for years on venture capital, reporting one quarterly loss after another until rivals are vanquished. Amazon.com is the test case, though how many times its success can be repeated is unknown.

With my relative lack of economic study and sophistication, I take my lessons instead from the children’s game Monopoly. As an oversimplification of the dynamics of capital formation and ownership, Monopoly even for children reaches its logical conclusion well before its actual end, where one person “wins” everything. The balancing point when the game is no longer worth playing is debatable, but some have found through experience the answer is “before it starts.” It’s just no fun bankrupting other players utterly through rent seeking. The no-longer-fun point is analogous to late-stage capitalism, where the conclusion has not yet been fully reached but is nonetheless clear. The endgame is, in a word, monopoly — the significant element being “mono,” as in there can be only one winner. (Be careful what you wish for: it’s lonely and resentful at the top.) Others take a different, aspirational lesson from Monopoly, which is to figure out game dynamics, or game the game, so that the world can be taken by force. One’s growing stranglehold on others disallows fair negotiation and cooperation (social rather than capitalist values) precisely because one party holds all the advantages, leading to exploitation of the many for the benefit of a few (or one).

Another unlearnt ancient lesson is that nothing corrupts so easily or so much as success, power, fame, wealth. Many accept that corruption willingly; few take the lesson to heart. (Disclosure: I’ve sometimes embarked on the easy path to wealth by buying lottery tickets. Haven’t won, so I’m not yet corrupted. Another case of something for nearly nothing, or for those gambling away their rent and grocery money, nothing for something.) Considering that money makes the world go around, especially in the modern age, the dynamics of capitalism are inescapable and the internal contradictions of capitalism are well acknowledged. The ancient idea of DJ is essentially a reset button depressed before the endgame leads to rebellion and destruction of the ownership class. Franklin D. Roosevelt is credited in some accounts of history as having saved capitalism from that near endgame by transferring wealth back to the people through the New Deal and the war economy. Thus, progressives are calling for a Green New Deal, though it’s not clear they are aware that propping up capitalism only delays its eventual collapse through another couple cycles (reversals) of capital flow. Availability of cheap, plentiful energy that allowed economies (and populations) to balloon over the past two and one-half centuries cannot continue for much longer, so even if we get UBI or DJ, the endgame remains unchanged.

There is something ironic and vaguely tragic about how various Internet platforms — mostly search engines and social media networks — have unwittingly been thrust into roles their creators never envisioned for themselves. Unless I’m mistaken, they launched under the same business model as broadcast media: create content, or better yet, crowd-source content, to draw in viewers and subscribers whose attention is then delivered to advertisers. Revenue is derived from advertisers while the basic services — i.e., search, job networking, encyclopedias and dictionaries, or social connection — are given away gratis. The modest inconveniences and irritations of having the screen littered and interrupted with ads are a trade-off most end users are happy to accept for free content.

Along the way, some platform operators discovered that user data itself could be both aggregated and individualized and subsequently monetized. This second step unwittingly created so-called surveillance capitalism that Shoshana Zuboff writes about in her recently published book (previously blogged about here). Essentially, an Orwellian Big Brother (several of them, in fact) tracks one’s activity through smart phone apps and Web browsers, including GPS data revealing movement through real space, not just virtual spaces. This is also the domain of the national security state from local law enforcement to the various security branches of the Federal government: dragnet surveillance where everyone is watched continuously. Again, end users shrug off surveillance as either no big deal or too late to resist.

The most recent step is that, like the Internet itself, various platforms have been functioning for some time already as public utilities and have accordingly come under demands for regulation with regard to authenticity, truth, and community standards of allowable speech. Thus, private corporations have been thrust unexpectedly into the role of regulating content. Problem is, unlike broadcast networks that create their own content and can easily enforce restrictive standards, crowd-sourced platforms enable the general population to upload its own content, often mere commentary in text form but increasingly as video content, without any editorial review. These platforms have parried by deploying and/or modifying their preexisting surveillance algorithms in search of objectionable content normally protected as free speech, taking steps to remove content, demonetize channels, and ban offending users indefinitely, typically without warning and without appeal.

If Internet entrepreneurs initially got into the biz to make a few (or a lot of) quick billions, which some few of them have, they have by virtue of the global reach of their platforms been transformed into censors. It’s also curious that by enabling end users to publish to their platforms, they’ve given voice to the masses in all their unwashed glory. Now, everyone’s crazy, radicalized uncle (or sibling or parent or BFF) formerly banished to obscurity railing against one thing or another at the local tavern, where he was tolerated as harmless so long as he kept his bar tab current, is proud to fly his freak flag anywhere and everywhere. Further, the anonymous coward who might issue death or bomb threats to denounce others has been given means to distribute hate across platforms and into the public sphere, where it gets picked up and maybe censored. Worst of all, the folks who monitor and decide what is allowed, functioning as modern-day thought police, are private citizens and corporations with no oversight or legal basis to act except for the fact that everything occurs on their respective platforms. This is a new aspect to the corporatocracy but not one anyone planned.

Throughout human history, the question “who should rule?” has been answered myriad ways. The most enduring answer is simple: he who can muster and deploy the most force of arms and then maintain control over those forces. Genghis Khan is probably the most outrageously successful example and is regarded by the West as a barbarian. Only slightly removed from barbarians is the so-called Big Man, who perhaps adds a layer of diplomacy by running a protection racket while selectively providing and distributing spoils. As societies move further away from subsistence and immediacy, various absolute rulers are established, often through hereditary title. Call them Caesar, chief, dear leader, emir, emperor (or empress), kaiser, king (or queen), pharaoh, premier, el presidente, sultan, suzerain, or tsar, they typically acquire power through the accident of birth and are dynastic. Some are female but most are male, and they typically extract tribute and sometimes demand loyalty oaths.

Post-Enlightenment, rulers are frequently democratically elected administrators (e.g., legislators, technocrats, autocrats, plutocrats, kleptocrats, and former military) ideally meant to be representative of common folks. In the U.S., members of Congress (and of course the President) are almost wholly drawn from the ranks of the wealthy (insufficient wealth being a de facto bar to office) and are accordingly estranged from American life in the many different ways most of us experience it. Below the top level of visible, elected leaders is a large, hidden apparatus of high-level bureaucratic functionaries (often appointees), the so-called Deep State, that is relatively stable and made up primarily of well-educated, white-collar careerists whose ambitions for themselves and the country are often at odds with the citizenry.

I began to think about this in response to a rather irrational reply to an observation I made here. Actually, it wasn’t even originally my observation but that of Thomas Frank, namely, that the Deep State is largely made up of the liberal professional class. The reply reinforced the notion: who better to rule than the “pros”? History makes the alternatives unthinkable. Thus, the Deep State’s response to the veritable one-man barbarian invasion of the Oval Office has been to seek removal of the interloper by hook or by crook. (High office in this case was won unexpectedly and with unnamed precedent by rhetorical force — base populism — rather than by military coup, making the current occupant a quasi-cult leader; similarly, extracted tribute is merely gawking attention rather than riches.)

History also reveals that all forms of political organization suffer endemic incompetence and corruption, lending truth to Winston Churchill’s witticism “Democracy is the worst form of government, except for all the others.” Indeed, recent rule by technocrats has been supremely awful, leading to periodic market crashes, extreme wealth inequality, social stigmatization, and forever wars. Life under such rule is arguably better than under various other political styles; after all, we gots our vaunted freedoms and enviable material comforts. But the exercise of those freedoms does not reliably deliver either ontological security or psychological certainty we humans crave. In truth, our current form of self-governance has let nothing get in the way of plundering the planet for short-term profit. That ongoing priority is making Earth uninhabitable not just for other species but for humans, too. In light of this fact, liberal technocratic democracy could be a far worse failure than most: it will have killed billions (an inevitability now under delayed effect).

Two new grassroots movements (to my knowledge) have appeared that openly question who should rule: the Sunrise Movement (SM) and the Extinction Rebellion (ER). SM is a youth political movement in the U.S. that acknowledges climate change and supports the Green New Deal as a way of prioritizing the desperate existential threat modern politics and society have become. For now at least, SM appears to be content with working within the system, replacing incumbents with candidates it supports. More intensely, ER is a global movement centered in the U.K. that also acknowledges that familiar modern forms of social and political organization (there are several) no longer function but in fact threaten all of us with, well, extinction. One of its unique demands is that legislatures be drawn via sortition from the general population to be more representative of the people. Further, sortition avoids the established pattern whereby those elected to lead representative governments are corrupted by the very process of seeking and attaining office.
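For the curious, here is a minimal sketch of what sortition looks like mechanically, namely, drawing an assembly by lot rather than by election. It is purely illustrative and not ER’s actual procedure; the regions, roll sizes, and seat count are invented, and real proposals typically stratify across many more demographic variables.

```python
import random
from collections import Counter

# Hypothetical voter roll: (voter_id, region) pairs. A real exercise would
# start from an electoral register; these regions and counts are made up.
voter_roll = [(f"{region}_{i:05d}", region)
              for region, count in [("north", 40_000), ("south", 35_000), ("coast", 25_000)]
              for i in range(count)]

def draw_assembly(roll, seats=100, seed=None):
    """Draw a citizen assembly by stratified random lot (sortition).

    Seats are apportioned to each region in proportion to its share of the
    roll, then filled by uniform random sampling within that region.
    """
    rng = random.Random(seed)
    by_region = {}
    for voter_id, region in roll:
        by_region.setdefault(region, []).append(voter_id)

    total = len(roll)
    assembly = []
    for region, voters in by_region.items():
        region_seats = round(seats * len(voters) / total)  # rounding can shift a seat or two
        assembly.extend((v, region) for v in rng.sample(voters, region_seats))
    return assembly

if __name__ == "__main__":
    assembly = draw_assembly(voter_roll, seats=100, seed=42)
    print(len(assembly), "members drawn")
    print(Counter(region for _, region in assembly))  # roughly mirrors the roll: 40/35/25
```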

I surmise attrition and/or replacement (the SM path) are too slow and leave candidates vulnerable to corruption. In addition, since no one relinquishes power willingly, current leaders will have to be forced out via open rebellion (the ER path). I’m willing to entertain either path but must sadly conclude that both are too little, too late to address climate change and near-term extinction effectively. Though difficult to establish convincingly, I suspect the time to act was in the 1970s (or even before) when the Ecology Movement arose in recognition that we cannot continue to despoil our own habitat without consequence. That message (social, political, economic, and scientific all at once) was as inert then as it is now. However, fatalism acknowledged, some other path forward is better than our current systems of rule.

I observed way back here that it was no longer a thing to have a black man portray the U.S. president in film. Such casting might draw a modest bit of attention, but it no longer raises a particularly arched eyebrow. These depictions in cinema were only slightly ahead of the actuality of the first black president. Moreover, we’ve gotten used to female heads of state elsewhere, and we now have in the U.S. a burgeoning field of presidential wannabes from all sorts of diverse backgrounds. Near as I can tell, no one really cares anymore that a candidate is a woman, or black, or a black woman. (Could be that I’m isolated and/or misreading the issue.)

In Chicago where I live, the recent mayoral election offered a choice among some ten candidates to succeed the current mayor who is not seeking reelection. None of them got the required majority of votes, so a runoff between the top two candidates is about to occur. Both also happen to be black women. Although my exposure to the mainstream media and all the talking heads offering analysis is limited, I’ve yet to hear anyone remark disparagingly that Chicago will soon have its first black female mayor. This is as it should be: the field is open to all comers and no one can (or should) claim advantage or disadvantage based on identitarian politics.

Admittedly, extremists on both ends of the bogus left/right political spectrum still pay quite a lot of attention to identifiers. Academia in particular is currently destroying itself with bizarre claims and demands for equity — a nebulous doctrine that divides rather than unites people. Further, some conservatives can’t yet countenance a black, female, gay, atheist, or <insert other> politician, especially in the big chair. I’m nonetheless pleased to see that irrelevant markers matter less and less to many voters. Perhaps it’s a transition by sheer attrition and will take more time, but the current Zeitgeist outside of academia bodes well.

Everyone is familiar with the convention in entertainment media where characters speak without the use of recognizable language. (Not related really to the convention of talking animals.) The first instance I can recall (someone correct me if earlier examples are to be found) is the happy-go-lucky bird Woodstock from the old Peanuts cartoons (do kids still recognize that cast of characters?), whose dialog was shown graphically as a series of vertical lines.

When the cartoon made its way onto TV for holiday specials, its creator Charles Schulz used the same convention to depict adults, never shown onscreen but with dialogue voiced by a Harmon-muted trombone. Roughly a decade later, two characters from the Star Wars franchise “spoke” in languages only other Star Wars characters could understand, namely, Chewbacca (Chewie) and R2-D2. More recently, the character Groot from Guardians of the Galaxy (known to me only through the Marvel movie franchise, not through comic books) speaks only one line of dialogue, “I am Groot,” which is understood as full speech by other Guardians characters. When behemoths larger than a school bus (King Kong, Godzilla, Jurassic dinosaurs, Cloverfield, Kaiju, etc.) appear, the characters are typically denied the power of speech beyond the equivalent of a lion’s roar. (True villains talk little or not at all as they go about their machinations — no monologuing! unless it’s a James Bond film. An exception notable for its failure to charm audiences is Ultron, who wouldn’t STFU. You can decide for yourself which is the worse kind of villainy.)

This convention works well enough for storytelling and has the advantage of allowing the reader/viewer to project onto otherwise blank speech. However, when imported into the real world, especially in politics, the convention founders. There is no Babelfish universal translator inserted in the ear to transform nonsense into coherence. The obvious example of babblespeech is 45, whose speech when off the teleprompter is a series of rambling non sequiturs, free associations, slogans, and sales pitches. Transcripts of anyone’s extemporaneous speech reveal lots of restarts and blind alleys; we all interrupt ourselves to redirect. However, word salad that substitutes for meaningful content in 45’s case is tragicomic: alternately entirely frustrating or comically entertaining depending on one’s objective. Satirical news shows fall into the second category.

45 is certainly not the first. Sarah Palin in her time as a media darling (driver of ratings and butt of jokes — sound familiar?) had a knack for crazy speech combinations that were utter horseshit yet oddly effective for some credulous voters. She was even a hero to some (nearly a heartbeat away from being the very first PILF). We’ve also now been treated to a series of public interrogations where a candidate for a cabinet post or an accused criminal offers testimony before a congressional panel. Secretary of Education Betsy DeVos famously evaded simple yes/no questions during her confirmation hearing, and Supreme Court Justice Brett Kavanaugh similarly refused to provide direct answers to direct questions. Unexpectedly, sacrificial lamb Michael Cohen did give direct answers to many questions, but his interlocutors then didn’t quite know how to respond considering their experience and expectation that no one answers appropriately.

What all this demonstrates is that there is often a wide gulf between what is said and what is heard. In the absence of what might be understood as effective communication (honest, truthful, and forthright), audiences and voters fill in the blanks. Ironically, we also can’t handle hearing too much truth when confronted by its awfulness. None of this is a problem in storytelling, but when found in political narratives, it’s emblematic of how dysfunctional our communications have become, and with them, the clear thought and principled activity of governance.

First, a bit of history. The U.S. Constitution was ratified in 1788 and superseded the Articles of Confederation. The first ten Amendments, ratified in 1791 (rather quickly after the initial drafting and adoption of the main document — oops, forgot these obvious assumptions), are known as the Bill of Rights. The final amendment to date, the 27th Amendment, though proposed in 1789 along with others, was not ratified until 1992. A half dozen additional amendments approved by Congress have not yet been ratified, and a large number of other unapproved amendments have been proposed.

The received wisdom is that, by virtue of its lengthy service as the supreme law of the land, the U.S. Constitution has become sacrosanct and invulnerable to significant criticism and further amendment. That wisdom has begun to be questioned actively as a result of (at least) two factors: (1) recognition that the Federal government serves the common good and citizenry rather poorly, having become corrupt and dysfunctional, and (2) the fact that the Electoral College, an anachronism from the Revolutionary Era that skews voting power away from cities, handed two recent presidential elections to candidates who failed to win the popular vote yet won in the Electoral College. For a numerical analysis of how electoral politics is gamed to subvert public opinion, resulting in more government seats held by Republicans than voting (expressing the will of the people) would indicate, see this article by the Brookings Institution.
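As a toy illustration of the mechanism (entirely hypothetical figures, not actual election data, and not the Brookings analysis itself), the following sketch shows how winner-take-all allocation lets narrow wins in several small states outweigh a landslide loss in one large state, producing an Electoral College winner who loses the popular vote.

```python
# Toy illustration of a popular-vote loser winning the Electoral College.
# All states, elector counts, and vote totals below are invented; each
# "state" awards its electors winner-take-all.

# (state, electors, votes_for_A, votes_for_B)
states = [
    ("Bigstate",  28, 2_000_000, 4_000_000),  # B wins here in a landslide
    ("Smallton",   5,   510_000,   490_000),  # A wins the rest narrowly
    ("Midville",   5,   505_000,   495_000),
    ("Plainsia",   5,   520_000,   480_000),
    ("Lakeshire",  5,   515_000,   485_000),
    ("Riveria",    5,   512_000,   488_000),
    ("Hillmark",   5,   508_000,   492_000),
]

electors_a = sum(e for _, e, a, b in states if a > b)
electors_b = sum(e for _, e, a, b in states if b > a)
popular_a = sum(a for _, _, a, _ in states)
popular_b = sum(b for _, _, _, b in states)

print(f"Candidate A: {electors_a} electors, {popular_a:,} popular votes")
print(f"Candidate B: {electors_b} electors, {popular_b:,} popular votes")
# A takes the Electoral College 30-28 despite trailing by roughly 1.9 million votes.
```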

These are issues of political philosophy and ongoing public debate, spurred by dissatisfaction over periodic Federal shutdowns, power struggles between the executive and legislative branches that are tantamount to holding each other hostage, and income inequality that pools wealth and power in the hands of ever fewer people. The judicial branch (especially the U.S. Supreme Court) is also a significant point of contention; its newly appointed members are increasingly right wing but have not (yet) taken openly activist roles (e.g., reversing Roe v. Wade). As philosophy, questioning the wisdom of the U.S. Constitution requires considerable knowledge of history and comparative government to undertake with equanimity (as opposed to emotionalism). I don’t possess such expert knowledge but will observe that the U.S. is an outlier among nations in relying on a centuries-old constitution, which may not have been the expectation or intent of the drafters.

It might be too strong to suggest just yet that the public feels betrayed by its institutions. Better to say that, for instance, the U.S. Constitution is now regarded as a flawed document — not for its day (with limited Federal powers) but for the needs of today (where the Federal apparatus, including the giant military, has grown into a leviathan). This would explain renewed interest in direct democracy (as opposed to representative government), flirtations with socialism (expanded over the blended system we already have), and open calls for revolution to remove a de facto corporatocracy. Whether the U.S. Constitution can or should survive these challenges is the question.