Archive for the ‘Corporatism’ Category

From They Rule: The 1% vs. Democracy (2014) by Paul Street, which I’m just starting to read:

The contemporary United States, I find in this volume, is neither a dictatorship nor a democracy. It is something in between or perhaps different altogether: a corporate-managed state-capitalist pseudo-democracy that sells the narrow interests of the wealthy business and financial elite as the public interest, closes off critical and independent thought, and subjects culture, politics, policy, institutions, the environment, daily life, and individual minds to the often hidden and unseen authoritarian dictates of money and profit. It is a corporate and financial plutocracy whose managers generally prefer to rule through outwardly democratic and noncoercive means since leading American corporations and their servants have worked effectively at draining and disabling democracy’s radical and progressive potential by propagandizing, dulling, pacifying, deadening, overextending, overstressing, atomizing, and demobilizing the citizenry. At the same time, American state and capitalist elites remain ready, willing, and able to maintain their power with the help from ever more sinister and sophisticated methods and tools of repression, brutality, and coercive control.

Continuing (after some delay) from part 1, Pankaj Mishra concludes chapter 4 of The Age of Anger with an overview of Iranian governments that shifted from U.S./British client state (headed by the Shah of Iran, reigned 1941–1979) to its populist replacement (headed by Ayatollah Khomeini, ruled 1979–1989), both leaders having been authoritarians. During the period discussed, Iran underwent the same modernization and infiltration by liberal, Western values and economics that produced a familiar backlash in Mishra’s descriptions of other nations and regions whose roots in place had been similarly severed since the onset of the Enlightenment. Vacillation between two or more styles of government might be understood as a thermostatic response: running too hot or too cold in one direction triggers correction in the other. It’s not a binary relationship, however, between monarchy and democracy (to use just one example), nor are the options limited to a security state headed by an installed military leader versus a leader elected by popular vote. Rather, it’s a question of national identity being alternately fractured and unified (though difficult to analyze and articulate) in the wake of multiple intellectual influences.

According to Lewis and Huntington, modernity has failed to take root in intransigently traditional and backward Muslim countries despite various attempts to impose it by secular leaders such as Turkey’s Atatürk, the Shah of Iran, Algeria’s Ben Bella, Egypt’s Nasser and Sadat, and Pakistan’s Ayub Khan.

Since 9/11 there have been many versions, crassly populist as well as solemnly intellectual, of the claims by Lewis and Huntington that the crisis in Muslim countries is purely self-induced, and [that] the West is resented for the magnitude of its extraordinary success as a beacon of freedom, and embodiment of the Enlightenment’s achievements … They have mutated into the apparently more sophisticated claim that the clash of civilizations occurs [primarily] within Islam, and that Western interventions are required on behalf of the ‘good Muslim’, who is rational, moderate and liberal. [p. 127]

This is history told by the putative winners. Mishra goes on:

Much of the postcolonial world … became a laboratory for Western-style social engineering, a fresh testing site for the Enlightenment ideas of secular progress. The philosophes had aimed at rationalization, or ‘uniformization’, of a range of institutions inherited from an intensely religious era. Likewise, postcolonial leaders planned to turn illiterate peasants into educated citizens, to industrialize the economy, move the rural population to cities, alchemize local communities into a singular national identity, replace the social hierarchies of the past with an egalitarian order, and promote the cults of science and technology among a pious and often superstitious population. [p. 133]

Readers may recognize this project and/or process by its more contemporary name: globalization. It’s not merely a war of competing ideas, however, because those ideas manifest in various styles of social and political organization. Moreover, the significance of migration from rural agrarian settings to primarily urban and suburban ones can scarcely be overstated. This transformation (referring to the U.S. in the course of the 20th century) is something James Howard Kunstler repeatedly characterizes rather emphatically as the greatest misallocation of resources in the history of the world. Mishra summarizes the effects of Westernization handily:

In every human case, identity turns out to be porous and inconsistent rather than fixed and discrete; and prone to get confused and lost in the play of mirrors. The cross-currents of ideas and inspirations — the Nazi reverence for Atatürk, a gay French philosopher’s denunciation of the modern West and sympathy for the Iranian Revolution, or the various ideological inspirations for Iran’s Islamic Revolution (Zionism, Existentialism, Bolshevism and revolutionary Shiism) — reveal that the picture of a planet defined by civilizations closed off from one another and defined by religion (or lack thereof) is a puerile cartoon. They break the simple axis — religious-secular, modern-medieval, spiritual-materialist — on which the contemporary world is still measured, revealing that its populations, however different their pasts, have been on converging and overlapping paths. [p. 158]

These descriptions and analyses put me in mind of a fascinating book I read some years ago and reviewed on Amazon (one of only a handful of Amazon reviews): John Reader’s Man on Earth (1988). Reader describes and indeed celebrates incredibly diverse ways of inhabiting the Earth specially adapted to the landscape and based on evolving local practices. Thus, the notion of “place” is paramount. Comparison occurs only by virtue of juxtaposition. Mishra does something quite different, drawing out the connective ideas that account for “converging and overlapping paths.” Perhaps inevitably, disturbances to collective and individual identities that flow from unique styles of social organization, especially those now operating at industrial scale (i.e., industrial civilization), appear to be intensifying. For instance, in the U.S., mass shootings (a preferred form of attack but not the only one) appear to be on the rise even as violent crime sits at an all-time low, and perpetrators of violence are not limited to a few lone wolves, as the common trope goes. According to journalist Matt Agorist,

mass shootings — in which murdering psychopaths go on rampages in public spaces — have claimed the lives of 339 people since 2015 [up to mid-July 2019]. While this number is certainly shocking and far too high, during this same time frame, police in America have claimed the lives of 4,355 citizens.

And according to this article in Vox, this crazy disproportion (police violence to mass shootings) is predominantly an American thing, at least partly because of our high rate of fetishized civilian gun ownership. Thus, the self-described “land of the free, home of the brave” has transformed itself into a paranoid garrison state, a mindset afflicting civil authority even more egregiously than the disenfranchised (mostly young men). Something similar occurred during the Cold War, when leaders became hypervigilant for attacks and invasions that never came. Whether a few close calls during the height of the Cold War were the result of escalating paranoia, brinkmanship, or true, maniacal, existential threats from a mustache-twirling, hand-rubbing despot hellbent on the destruction of the West is a good question, probably impossible to answer convincingly. However, the result today of this mindset couldn’t be more disastrous:

It is now clear that the post-9/11 policies of pre-emptive war, massive retaliation, regime change, nation-building and reforming Islam have failed — catastrophically failed — while the dirty war against the West’s own Enlightenment [the West secretly at war with itself] — inadvertently pursued through extrajudicial murder, torture, rendition, indefinite detention and massive surveillance — has been a wild success. The uncodified and unbridled violence of the ‘war on terror’ ushered in the present era of absolute enmity in which the adversaries, scornful of all compromise, seek to annihilate each other. Malignant zealots have emerged at the very heart of the democratic West after a decade of political and economic tumult; the simple explanatory paradigm set in stone soon after the attacks of 9/11 — Islam-inspired terrorism versus modernity — lies in ruins. [pp.124–125]

Richard Wolff gave a fascinating talk at Google offices in New York City, which is embedded below:

This talk was published nearly two years ago, demonstrating that we refuse to learn or to make the adjustments needed to order society better (and to avoid disaster and catastrophe). No surprise there. (It also shows how long it takes me to get to things.) Critics of capitalism and the democracy we pretend to have in the U.S. are many. Wolff criticizes effectively from a Marxist perspective (Karl Marx being among the foremost of those critics). For those who don’t have the patience to sit through Wolff’s 1.5-hour presentation, let me draw out a few details mixed with my own commentary (impossible to separate, sorry; sorry, too, for the profusion of links no one follows).

The most astounding thing to me is that Wolff admitted he made it through higher education to complete a Ph.D. in economics without a single professor assigning Marx to read or study. Quite the set of blinders his teachers wore. Happily, Wolff eventually educated himself on Marx. Multiple economic forms have each had their day: sharing, barter, feudalism, mercantilism, capitalism (including subcategories anarcho-capitalism and laissez-faire economics), Keynesian regulation, socialism (and its subcategory communism), etc. Except for the first, prevalent among indigenous societies living close to subsistence, all involve hierarchy and coercion. Some regard those dynamics as just, others as unjust. It’s worth noting, too, that no system is pure. For instance, the U.S. has a blend of market capitalism and socialism. Philanthropy also figures in somehow. However, as social supports in the U.S. continue to be withdrawn and the masses are left to fend for themselves, what socialism existed as a hidden-in-plain-sight part of our system is being scaled down, privatized, foisted on charitable organizations, and/or driven out of existence.

The usual labor arrangement nearly all of us know — working for someone else for a wage/salary — is defined in Marxism as exploitation (not the lay understanding of the term) for one simple reason: all economic advantage from excess productivity of labor accrues to the business owner(s) (often a corporation). That’s the whole point of capitalism: to exploit (with some acknowledged risk) the differential between the costs of labor and materials (and increasingly, information) vs. the revenue they produce in order to prosper and grow. To some, exploitation is a dirty word, but understood from an analytical point of view, it’s the bedrock of all capitalist labor relationships. Wolff also points out that real wages in the U.S. (adjusted for inflation) have been flat for more than 40 years while productivity has climbed steadily. The differential profit (rather immense over time) has been pocketed handily by owners (billionaire having long since replaced millionaire as an aspiration) while the average citizen/consumer has maintained the appearance of a rising standard of living by adding women to the workforce (two or more earners per family instead of one), racking up debt, and deferring retirement.

Wolff’s antidote or cure to the dynamic of late-stage capitalism (nearly all the money being controlled by very few) is to remake corporate ownership, where a board of directors without obligation to workers makes all the important decisions and takes all the profit, into worker-owned businesses that practice direct democracy and distribute profits more equitably. How closely this resembles a coop (read: cooperative), commune, or kibbutz I cannot assess. Worker-owned businesses, no longer corporations, also differ significantly from how “socializing a business” is generally understood, i.e., a business or sector being taken over and run by the government. The U.S. Postal Service is one example. (Curiously, that last link has a .com suffix instead of .gov.) Public K–12 education operated by the states is another. As I understand it, this difference (who owns and runs an enterprise) is what lies behind democratic socialism being promoted in the progressive wing of the Democratic Party. Bernie Sanders is aligning his socialist politics with worker ownership of the means of production. Wolff also promotes this approach through his book and nonprofit organization Democracy at Work. How different these projects may be lies beyond my cursory analysis.

Another alternative to capitalist hegemony is a resource-based economy, which I admit I don’t really understand. Its rank utopianism is difficult to overlook, since it doesn’t fit at all with human history, where we muddle through without much of a plan or design except perhaps for those few who discover and devise ways to game systems for self-aggrandizement and personal benefit while leaving everyone else in the lurch. Peter Joseph, founder of The Zeitgeist Movement, is among the promoters of a resource-based economy. One of its chief attributes is the disuse of money. Considering that central banks (the Federal Reserve System in the U.S.) issue fiat currency worth increasingly little and are being challenged rather effectively by cryptocurrencies based on nothing beyond social consensus, it’s interesting to contemplate an alternative to the astronomical levels of wealth (and its inverse: debt) that come as a result of being trapped within a fiat monetary system that benefits so very few people.

Since this is a doom blog (not much of an admission, since it’s been obvious for years now), I can’t finish up without observing that none of these economic systems appears to take into account that we’re on a countdown to self-annihilation as we draw down the irreplaceable energy resources that make the whole shebang go. It’s possible the contemplated resource-based economy does so, but I rather doubt it. A decade or more ago, much of the discussion was about peak oil, which shortly thereafter gave way to peak everything. Shortages of materials such as helium, sand, and rare earths don’t figure strongly in public sentiment so long as party balloons, construction materials, and cell phones continue to be widely available. However, ongoing destruction of the biosphere through the primary activities of industrial civilization (e.g., mining, chemical-based agriculture, and steady expansion of human habitation into formerly wild nature) and the secondary effects of anthropogenic climate change (still hotly contested but more and more obvious with each passing season) and loss of biodiversity and biomass are catching up to us. In economics, this destruction is an externality conveniently ignored or waved away while profits can be made. The fullness of time will provide proof that we’ve enjoyed an extraordinary moment in history where we figured out how to exploit a specific sort of abundance (fossil fuels) with the ironic twist that that very exploitation led to the collapse of the civilization it spawned and supported. No one planned it this way, really, and once the endgame came into view, nothing much could be done to forestall it. So we continue apace with self-destruction while celebrating its glamor and excess as innovation and progress. If only Wolff would incorporate that perspective, too.

“Come with me if you want to live.” That’s among the quotable lines from the latest movie in the Terminator franchise, though it’s not nearly so succinct or iconic as “I’ll be back” from the first Terminator. Whereas the latter has the quality (in hindsight) of slow, implacable inevitability (considering the Terminator is literally a death-bringer), the former occurs within the context of a character having only just traveled back in time, not yet adequately reoriented, and forced to make a snap decision under duress. “I’ll be back” might be easy to brush off as harmless (temporary denial) since the threat recedes — except that it doesn’t, it’s merely delayed. “Come with me …” demands a leap of faith (or trust) because the danger is very real at that instant.

Which quote, I must ask, better characterizes the threat of climate change? My answer: both, but at different times. Three to four decades ago, it was the “I’ll be back” type: building slowly but inevitable given the underlying structure of industrial civilization. That structure was known even then by a narrow circle of experts (e.g., engineers for Big Oil and at the Dept. of Energy) to be a heat engine, meaning that we would ultimately cook our own goose by warming the planet, altering the climatic steady state under which our most recent civilization has flourished and producing a steady loss of biodiversity and biomass until our own human habitat (the entirety of the planet by now) becomes a hostile environment unable (unwilling if one anthropomorphizes Mother Nature) to support our swollen population. All that was if we stayed on course and took no corrective action. Despite foreknowledge and ample warning, that’s precisely what occurred (and continues today).

With the Intergovernmental Panel on Climate Change (IPCC) in particular, the threat has for roughly a decade shifted over to “Come with me ….” It’s no longer possible to put things off, yet we continue to dither well beyond the tipping point where/when we can still save ourselves from self-annihilation. Although scientists have been gathering data and evidence, forming an overwhelming consensus, and sounding the alarm, scientific illiteracy, realpolitik, journalistic malpractice, and corporate greed have all conspired to grant the illusion of time to react that we simply no longer have (and truth be told, probably didn’t as of the early 1980s).

I’m aware of at least three journalists (relying on the far more authoritative work of scientific consensus) who have embraced the message: Dahr Jamail, Thom Hartmann, and David Wallace-Wells. None to my knowledge has been able to bring himself to admit that humanity is now a collection of dead men walking. They can’t muster the courage to give up hope (or to report truthfully), clinging to the possibility we may still have a fleeting chance to avert disaster. I heard Ralph Nader on his webcast say something to the same effect, namely, what good is it to rob others of hope? My personal values adhere to unstinting truth rather than illusion or self-deception, so I subscribe to Guy McPherson‘s assessment that we face near-term human extinction (precise date unknown but soon if, for example, this is the year we get a blue ocean event). Simply put, McPherson is professor emeritus of natural resources and ecology and evolutionary biology at the University of Arizona [note my emphasis]. I trust his scholarship (summarizing the work of other scientists and drawing necessary though unpalatable conclusions) more than I trust journalistic shaping of the story for public consumption.

The obvious metaphor for what we face is a terminal medical diagnosis, or if one has hope, perhaps a death sentence about to be carried out but with the possibility of a last-minute stay of execution via phone call from the governor. Opinions vary whether one should hope/resist up to the final moment or make peace with one’s fate. By not telling the truth, I daresay the MSM has not given the public the second option by using the “I’ll be back” characterization when it’s really “Come with me ….” Various authors on the Web offer a better approximation of the truth (such as it can be known) and form a loose doomer network (a/k/a collapsniks). This blog is (an admittedly tiny) part of that doomersphere, which gives me no pleasure.

Renewed twin memes Universal Basic Income (UBI) and Debt Jubilees (DJ) have been in the news recently. I write renewed because the two ideas are quite literally ancient, unlearnt lessons that are enjoying revitalized interest in the 21st century. Both are capable of sophisticated support from historical and contemporary study, which I admit I haven’t undertaken. However, others have done the work and make their recommendations with considerable authority. For instance, Andrew Yang, interviewed repeatedly as a 2020 U.S. presidential candidate, has made UBI the centerpiece of his policy proposals, whereas Michael Hudson has a new book out called … and forgive them their debts: Lending, Foreclosure and Redemption — From Bronze Age Finance to the Jubilee Year that offers a forgotten history of DJ.

Whenever UBI or DJ comes up in conversation, the most obvious, predictable response I hear (containing a kernel of truth) is that either proposal would reward the losers in today’s capitalist regime: those who earn too little or those who carry too much debt (often a combination of both). Never mind that quality education and economic opportunities have been steadily withdrawn over the past half century. UBI and DJ would thus be giveaways, and I daresay nothing offends a sense of fairness more than others getting something for nothing. Typical resentment goes, “I worked hard, played by the rules, and met my responsibilities; why should others who slacked, failed, or cheated get the benefit of my hard work?” It’s a commonplace “othering” response, failing to recognize that as societies we are completely interconnected and interdependent. Granting the winners in the capitalist contest a pass on fair play is also a major assumption. The most iconic winners are all characterized by shark-like business practices: taking advantage of tax loopholes, devouring everything, and shrewdly understanding their predatory behavior not in terms of producing value but rather as gobbling or destroying competition to gain market share. More than a few companies these days are content to operate for years on venture capital, reporting one quarterly loss after another until rivals are vanquished. Amazon.com is the test case, though how many times its success can be repeated is unknown.

With my relative lack of economic study and sophistication, I take my lessons instead from the children’s game Monopoly. As an oversimplification of the dynamics of capital formation and ownership, Monopoly even for children reaches its logical conclusion well before its actual end, where one person “wins” everything. The balancing point when the game is no longer worth playing is debatable, but some have found through experience the answer is “before it starts.” It’s just no fun bankrupting other players utterly through rent seeking. The no-longer-fun point is analogous to late-stage capitalism, where the conclusion has not yet been fully reached but is nonetheless clear. The endgame is, in a word, monopoly — the significant element being “mono,” as in there can be only one winner. (Be careful what you wish for: it’s lonely and resentful at the top.) Others take a different, aspirational lesson from Monopoly, which is to figure out game dynamics, or game the game, so that the world can be taken by force. One’s growing stranglehold on others disallows fair negotiation and cooperation (social rather than capitalist values) precisely because one party holds all the advantages, leading to exploitation of the many for the benefit of a few (or one).

Another unlearnt ancient lesson is that nothing corrupts so easily or so much as success, power, fame, and wealth. Many accept that corruption willingly; few take the lesson to heart. (Disclosure: I’ve sometimes embarked on the easy path to wealth by buying lottery tickets. Haven’t won, so I’m corruptible but not yet corrupted. Another case of something for nearly nothing, or for those gambling away their rent and grocery money, nothing for something.) Considering that money makes the world go around, especially in the modern age, the dynamics of capitalism are inescapable and the internal contradictions of capitalism are well acknowledged. The ancient idea of DJ is essentially a reset button depressed before the endgame leads to rebellion and destruction of the ownership class. Franklin D. Roosevelt is credited in some accounts of history as having saved capitalism from that near endgame by transferring wealth back to the people through the New Deal and the war economy. Thus, progressives are calling for a Green New Deal, though it’s not clear they are aware that propping up capitalism only delays its eventual collapse through another couple cycles (reversals) of capital flow. Availability of the cheap, plentiful energy that allowed economies (and populations) to balloon over the past two and one-half centuries cannot continue for much longer, so even if we get UBI or DJ, the endgame remains unchanged.

There is something ironic and vaguely tragic about how various Internet platforms — mostly search engines and social media networks — have unwittingly been thrust into roles their creators never envisioned for themselves. Unless I’m mistaken, they launched under the same business model as broadcast media: create content, or better yet, crowd-source content, to draw in viewers and subscribers whose attention is then delivered to advertisers. Revenue is derived from advertisers while the basic services — i.e., search, job networking, encyclopedias and dictionaries, or social connection — are given away gratis. The modest inconveniences and irritations of having the screen littered and interrupted with ads are a trade-off most end users are happy to accept for free content.

Along the way, some platform operators discovered that user data itself could be both aggregated and individualized and subsequently monetized. This second step unwittingly created the so-called surveillance capitalism that Shoshana Zuboff writes about in her recently published book (previously blogged about here). Essentially, an Orwellian Big Brother (several of them, in fact) tracks one’s activity through smartphone apps and Web browsers, including GPS data revealing movement through real space, not just virtual spaces. This is also the domain of the national security state, from local law enforcement to the various security branches of the Federal government: dragnet surveillance where everyone is watched continuously. Again, end users shrug off surveillance as either no big deal or too late to resist.

The most recent step is that, like the Internet itself, various platforms have been functioning for some time already as public utilities and have accordingly fallen under demands for regulation with regard to authenticity, truth, and community standards of allowable speech. Thus, private corporations have been thrust unexpectedly into the role of regulating content. Problem is, unlike broadcast networks that create their own content and can easily enforce restrictive standards, crowd-sourced platforms enable the general population to upload its own content, often mere commentary in text form but increasingly as video, without any editorial review. These platforms have parried by deploying and/or modifying their preexisting surveillance algorithms in search of objectionable content normally protected as free speech, and they have taken steps to remove content, demonetize channels, and ban offending users indefinitely, typically without warning and without appeal.

If Internet entrepreneurs initially got into the biz to make a few (or a lot of) quick billions, which some few of them have, they have by virtue of the global reach of their platforms been transformed into censors. It’s also curious that by enabling end users to publish to their platforms, they’ve given voice to the masses in all their unwashed glory. Now, everyone’s crazy, radicalized uncle (or sibling or parent or BFF), formerly banished to obscurity railing against one thing or another at the local tavern, where he was tolerated as harmless so long as he kept his bar tab current, is proud to fly his freak flag anywhere and everywhere. Further, the anonymous coward who might once have issued death or bomb threats to denounce others has been given means to distribute hate across platforms and into the public sphere, where it gets picked up and maybe censored. Worst of all, the folks who monitor and decide what is allowed, functioning as modern-day thought police, are private citizens and corporations with no oversight or legal basis to act except for the fact that everything occurs on their respective platforms. This is a new aspect of the corporatocracy, but not one anyone planned.

I’ve been on the sidelines of the Chicago Symphony Orchestra (CSO) musicians’ union labor action — a strike now extending into its second month with no apparent resolution in sight — and reluctant to take a strong position. This might be surprising considering that I’m a natural ally of the musicians in at least two respects: (1) my support for the labor movement in general, and (2) my sustained interest in classical music as both a listener and practitioner. On balance, I have two objections that hold me back: (1) difficulty empathizing with anyone already well compensated for his or her work (CSO base salary is more than $160K per year; many make considerably more), and (2) the implicit argument that, as a premier arts institution, the organization should take no heed of economic effects being felt universally and visited on many who actually suffer deprivations beyond lost prestige.

To buttress their position, the Musicians of the CSO (why do the musicians operate a website distinct from the organization as a whole?) issued a press release in late March 2019 (PDF link). I’ve no desire to analyze it paragraph-by-paragraph, but I want to bring a few bits forward:

For more than 50 years, the Chicago Symphony Orchestra has been touted as the nation’s finest – able to draw talent from across the globe. [emphasis added]

Music is not a championship endeavor despite the plethora of televised lip-syncing singing contests. No one orchestra can lay reasonable claim to being the best. Smacks of hubris. Simply change that to “as among the nation’s finest” and I’m OK with it.

In the last seven years the Orchestra’s salary has not kept up with inflation. Further, the Orchestra’s benefit package has fallen behind that of Los Angeles and San Francisco. Now, the Association is attempting to change a fundamental tenet of the security of the Orchestra – and American life – our pension plan.

Well boo hoo for you. Many of the fundamental tenets of American life have been steadily stripped away from the population over the past 40 years or so. The very existence of a pension plan is exceptional for many in the labor force, not to mention the handsome salary and other benefits, including a 20-hour workweek, that CSO musicians enjoy. (Admittedly, a lot of outside preparation is necessary to participate effectively.) I understand that comparison with sister institutions in LA, SF, and NYC provides context, but cost-of-living differences at the coasts ought to be part of that context, too. Keeping up with the Joneses in this instance is a fool’s errand. And besides, those three cities suffer considerably from homeless and destitute populations that line the sidewalks and alleys. Chicago has somehow managed to displace most of its homeless population (mostly through harassment, not humanitarian aid), though one cannot avoid a phalanx of panhandlers outside Chicago Symphony Center on concert nights. Still, it’s nothing compared to conditions in downtown SF, which have gotten so bad with people living, peeing, and shitting in the street that an infamous poop map is available to help pedestrians avoid the worst of it. (I’ve no idea what the sidewalk outside Davies Symphony Hall in SF is like, but the location appears to be in the area of greatest poop concentration.) LA’s skid row is another district straight out of hell.

With many of the musicians already vested, our concern is truly about the future of the Orchestra – its ability to retain and attract great talent – a concern shared by Maestro Muti, Daniel Barenboim, and many of the world’s other finest orchestras and leaders.

This is not a concern of mine in the slightest. Sure, musicians play musical chairs, swapping around from orchestra to orchestra as opportunities arise, just like other workers traipse from job to job throughout their working lives. So what? A performing position with the CSO has long been a terminal position from which many players retire after more than 50 years of service (if they’re so fortunate as to be hired by the orchestra in their 20s). I cannot estimate how many top-tier musicians forego auditions for the CSO due to perceived inadequacies in compensation or working conditions. Maybe that explains the years-long inability to hire and/or retain personnel for certain principal chairs. Still, I’m not at all convinced by “we’re the best yet we can’t compete without excessive compensation” (or shouldn’t have to). Similar arguments for ridiculously inflated CEO pay to attract qualified individuals fall on deaf ears.

An overview of the musicians’ strike was published by Lawrence A. Johnson at Chicago Classical Review, which provides details regarding the musicians’ demands. According to Johnson, the public’s initial support of the strike has turned sour. Comments I’ve been reading and my own reaction have followed exactly this trajectory. Johnson also uses the term tone deaf to describe the musicians, though he’s diplomatic enough to avoid saying it himself, noting that the charge comes from commentators. I won’t be nearly so diplomatic. Musicians, stop this nonsense now! Demands far in excess of need, far in excess of typical workers’ compensation, and far in excess of your bargaining position do you no credit. In addition, although season ticket holders may express dismay at lost opportunities to hear certain concerts, soloists, and repertoire due to the work stoppage, the CSO is not a public utility that must keep working to maintain public wellbeing. Alternatives in greater Chicagoland can easily take up your slack for those in need of a classical music fix. Indeed, I haven’t been to a CSO concert in years because they’ve become anodyne. My CSO love affair is with the recorded legacy of the 1970s and 80s.

By striking, you’re creating a public relations nightmare that will drive people away, just as the baseball strike and take-a-knee controversy in football (and elsewhere) sent sports fans scrambling for the exits. You’re tone deaf regarding the actual workplace and contract insufficiencies many others confront regularly, as well as the economic realities of Chicago, Illinois, the U.S., and indeed the globe. Get over yourselves.

Throughout human history, the question “who should rule?” has been answered myriad ways. The most enduring answer is simple: he who can muster and deploy the most force of arms and then maintain control over those forces. Genghis Khan is probably the most outrageously successful example and is regarded by the West as a barbarian. Only slightly removed from barbarians is the so-called Big Man, who perhaps adds a layer of diplomacy by running a protection racket while selectively providing and distributing spoils. As societies move further away from subsistence and immediacy, various absolute rulers are established, often through hereditary title. Call them Caesar, chief, dear leader, emir, emperor (or empress), kaiser, king (or queen), pharaoh, premier, el presidente, sultan, suzerain, or tsar: they typically acquire power through the accident of birth and are dynastic. Some are female but most are male, and they typically extract tribute and sometimes demand loyalty oaths.

Post-Enlightenment, rulers are frequently democratically elected administrators (e.g., legislators, technocrats, autocrats, plutocrats, kleptocrats, and former military) ideally meant to be representative of common folks. In the U.S., members of Congress (and of course the President) are almost wholly drawn from the ranks of the wealthy (insufficient wealth being a de facto bar to office) and are accordingly estranged from American life as most of us experience it. Below the top level of visible, elected leaders is a large, hidden apparatus of high-level bureaucratic functionaries (often appointees), the so-called Deep State, that is relatively stable and made up primarily of well-educated, white-collar careerists whose ambitions for themselves and the country are often at odds with those of the citizenry.

I began to think about this in response to a rather irrational reply to an observation I made here. Actually, it wasn’t even originally my observation but that of Thomas Frank, namely, that the Deep State is largely made up of the liberal professional class. The reply reinforced the notion: who better to rule than the “pros”? History makes the alternatives unthinkable. Thus, the Deep State’s response to the veritable one-man barbarian invasion of the Oval Office has been to seek removal of the interloper by hook or by crook. (High office in this case was won unexpectedly and without clear precedent by rhetorical force — base populism — rather than by military coup, making the current occupant a quasi-cult leader; similarly, extracted tribute is merely gawking attention rather than riches.)

History also reveals that all forms of political organization suffer endemic incompetence and corruption, lending truth to Winston Churchill’s witticism “Democracy is the worst form of government, except for all the others.” Indeed, recent rule by technocrats has been supremely awful, leading to periodic market crashes, extreme wealth inequality, social stigmatization, and forever wars. Life under such rule is arguably better than under various other political styles; after all, we gots our vaunted freedoms and enviable material comforts. But the exercise of those freedoms does not reliably deliver either the ontological security or the psychological certainty we humans crave. In truth, our current form of self-governance has let nothing get in the way of plundering the planet for short-term profit. That ongoing priority is making Earth uninhabitable not just for other species but for humans, too. In light of this fact, liberal technocratic democracy could be a far worse failure than most: it will have killed billions (an inevitability now under delayed effect).

Two new grassroots movements (to my knowledge) have appeared that openly question who should rule: the Sunrise Movement (SM) and the Extinction Rebellion (ER). SM is a youth political movement in the U.S. that acknowledges climate change and supports the Green New Deal as a way of confronting the desperate existential threat modern politics and society have become. For now at least, SM appears to be content with working within the system, replacing incumbents with candidates it supports. More intensely, ER is a global movement centered in the U.K. that also acknowledges that familiar modern forms of social and political organization (there are several) no longer function but in fact threaten all of us with, well, extinction. One of its unique demands is that legislatures be drawn via sortition from the general population to be more representative of the people. Further, sortition avoids the established pattern whereby those elected to lead representative governments are corrupted by the very process of seeking and attaining office.

I surmise attrition and/or replacement (the SM path) are too slow and leave candidates vulnerable to corruption. In addition, since no one relinquishes power willingly, current leaders will have to be forced out via open rebellion (the ER path). I’m willing to entertain either path but must sadly conclude that both are too little, too late to address climate change and near-term extinction effectively. Though difficult to establish convincingly, I suspect the time to act was in the 1970s (or even before) when the Ecology Movement arose in recognition that we cannot continue to despoil our own habitat without consequence. That message (social, political, economic, and scientific all at once) was as inert then as it is now. However, fatalism acknowledged, some other path forward is better than our current systems of rule.

First, a bit of history. The U.S. Constitution was ratified in 1788 and superseded the Articles of Confederation. The first ten Amendments, ratified in 1791 (rather quickly after the initial drafting and adoption of the main document — oops, forgot these obvious assumptions), are known as the Bill of Rights. The final amendment to date, the 27th Amendment, though proposed in 1789 along with others, was not ratified until 1992. A half dozen additional amendments approved by Congress have not yet been ratified, and a large number of other unapproved amendments have been proposed.

The received wisdom is that, by virtue of its lengthy service as the supreme law of the land, the U.S. Constitution has become sacrosanct and invulnerable to significant criticism and further amendment. That wisdom has begun to be questioned actively as a result of (at least) two factors: (1) recognition that the Federal government serves the common good and citizenry rather poorly, having become corrupt and dysfunctional, and (2) the Electoral College, an anachronism from the Revolutionary Era that skews voting power away from cities, handed two recent presidential elections to candidates who failed to win the popular vote yet won in the Electoral College. For a numerical analysis of how electoral politics is gamed to subvert public opinion, resulting in more government seats held by Republicans than voting (expressing the will of the people) would indicate, see this article by the Brookings Institution.

These are issues of political philosophy and ongoing public debate, spurred by dissatisfaction over periodic Federal shutdowns, power struggles between the executive and legislative branches that are tantamount to holding each other hostage, and income inequality that pools wealth and power in the hands of ever fewer people. The judicial branch (especially the U.S. Supreme Court) is also a significant point of contention; its newly appointed members are increasingly right wing but have not (yet) taken openly activist roles (e.g., reversing Roe v. Wade). As philosophy, questioning the wisdom of the U.S. Constitution requires considerable knowledge of history and comparative government to undertake with equanimity (as opposed to emotionalism). I don’t possess such expert knowledge but will observe that the U.S. is an outlier among nations in relying on a centuries-old constitution, which may not have been the expectation or intent of the drafters.

It might be too strong to suggest just yet that the public feels betrayed by its institutions. Better to say that, for instance, the U.S. Constitution is now regarded as a flawed document — not for its day (with limited Federal powers) but for the needs of today (where the Federal apparatus, including the giant military, has grown into a leviathan). This would explain renewed interest in direct democracy (as opposed to representative government), flirtations with socialism (expanded over the blended system we already have), and open calls for revolution to remove a de facto corporatocracy. Whether the U.S. Constitution can or should survive these challenges is the question.

Update

Seems I was roughly half a year early. Harper’s Magazine has as its feature for the October 2019 issue a serendipitous article: “Constitution in Crisis” (not behind a paywall, I believe). The cover of the issue, however, poses a more provocative question: “Do We Need the Constitution?” Decide for yourself, I suppose, if you’re aligned with the revolutionary spirit.

For a time after the 2008 financial collapse, skyscraper projects in Chicago came to a dead halt, mostly due to dried-up financing. My guess (since I don’t know with any reliability) is that much the same obtained worldwide. However, the game appears to be back on, especially in New York City, one of few cities around the globe where so-called “real money” tends to pool and collect. Visual Capitalist has an interesting infographic depicting changes to the NYC skyline every 20 years. The number of supertalls topping 1,000 feet expected by 2020 is quite striking.

Courtesy of Visual Capitalist

The accompanying text admits that NYC is left in the dust by China, specifically, the Pearl River Delta Megacity, which includes Hong Kong, Shenzhen, Macau, and others. As I’ve written before, the mad rush to build (earning ridiculous, absurd, imaginary prestige points awarded by and to exactly no one) takes no apparent notice of a slo-mo crack-up in the way modern societies organize and fund themselves. The new bear market might give one … um, pause.

Also left in the dust is Chicago, home of the original skyscraper. Since the 2008 collapse, Chicago’s most ambitious project, the ill-fated Chicago Spire (a/k/a the Fordham Spire), was abandoned despite a big hole dug in the ground and some foundation work completed. An absence of completed prestige projects since 2008 means Chicago has been lapped several times over by NYC, not that anyone is counting. The proposed site of the Chicago Spire is too enticing, however — just inside Lake Shore Drive at the mouth of the Chicago River — for it to be dormant for long. Indeed, a press release last year (which escaped my attention at the time) announced redevelopment of the site, and a slick website is operating for now (linked in the past to similar sites that went abandoned along with their subject projects). Also reported late last year, Chicago appears to have rejoined the game in earnest, with multiple projects already under construction and others in the planning/approval phases.

So if hiatus was called the last time we crashed financially (a regular occurrence, I note), it seems we’ve called hiatus on the hiatus and are back in a mad, futile race to remake modernity into gleaming vertical cities dotting the globe. Such hubris and exuberance might be intoxicating to technophiles, but I’m reminded of an observation (can’t locate a quote, sorry) to the effect that civilizations’ most extravagant projects are undertaken just before their collapses. Our global civilization is no different.