Archive for the ‘Corporatism’ Category

I pull in my share of information about current events and geopolitics despite a practiced inattention to mainstream media and its noisome nonsense. (See here for another who turned off the MSM.) I read or heard somewhere (can’t remember where) that most news outlets and indeed most other media, to drive traffic, now function as outrage engines, generating no small amount of righteousness, indignation, anger, and frustration at all the things so egregiously wrong in our neighborhoods, communities, regions, and across the world. These are all negative emotions, though legitimate responses to various scourges plaguing us currently, many of which are self-inflicted. It’s enough aggregate awfulness to draw people into the street again in principled protest, dissent, and resistance; it’s not yet enough to effect change. Alan Jacobs comments about outrage engines, noting that sharing via retweets is not the same as caring. In the Age of Irony, a decontextualized “yo, check this out!” is nearly as likely to be interpreted as support as condemnation (or mere gawking for entertainment value). Moreover, pointing, linking, and retweeting are each costless versions of virtue signaling. True virtue makes no object of publicity.

So where do I get my outrage quotient satisfied? Here is a modest linkfest, in no particular order, of sites not already on my blogroll. I don’t frequent these sites daily, but I drop in, often skimming, enough to keep abreast of themes and events of importance.

This past Thursday was an occasion of protest for many immigrant laborers who did not show up to work. Presumably, this action was a response to recent executive attacks on immigrants, intended to demonstrate how businesses would suffer without immigrant labor doing jobs Americans frequently do not want. Tensions between the ownership and laboring classes have a long, tawdry history I cannot begin to summarize. As with other contextual failures, I daresay the general public believes incorrectly that such conflicts date from the 19th century, when formal sociopolitical theories such as Marxism, which intersect heavily with labor economics, were published. An only slightly better understanding is that the labor movement commenced in the United Kingdom some fifty years after the Industrial Revolution began, such as with the Luddites. I pause to remind readers that the most basic, enduring, and abhorrent labor relationship, extending back millennia, is slavery, which ended in the U.S. only 152 years ago but continues even today in slightly revised forms around the globe.

Thursday’s work stoppage was a faint echo of general strikes and unionism from the middle of the 20th century. Gains in wages and benefits, working conditions, and negotiating position transferred some power from owners to laborers during that period, but today, laborers must sense they are back on their heels, defending conditions fought for by their grandparents but ultimately losing considerable ground. Of course, I’m sympathetic to labor, considering I’m not in the ownership class. (It’s all about perspective.) I must also admit, however, to once quitting a job after only one day because it was simply too, well, laborious. I had that option at the time, though it ultimately led me nearly to bankruptcy — a life lesson that continues to inform my attitudes. As I survey the scene today, however, I suspect many laborers — immigrants and native-born Americans alike — have the unenviable choice of accepting difficult, strenuous labor for low pay or being unemployed. Gradual reduction of demand for labor has two main causes: globalization and automation.


I pause periodically to contemplate deep time, ancient history, and other subjects that lie beyond most human conceptual abilities. Sure, we sorta get the idea of a very long ago past out there in the recesses or on the margins, just like we get the idea of U.S. sovereign debt now approaching $20 trillion. Problem is, numbers lose coherence when they mount up too high. Scales differ widely with respect to time and currency. Thus, we can still think reasonably about human history back to roughly 6,000 years ago, but 20,000 years ago or more draws a blank. We can also think about how $1 million might have utility, but $1 billion and $1 trillion are phantoms that appear only on ledgers and contracts and in the news (typically mergers and acquisitions). If deep time or deep debt feel like they don’t exist except as conceptual categories, try wrapping your head around the deep state, which in the U.S. is understood to be a surprisingly large rogues’ gallery of plutocrats, kleptocrats, and oligarchs drawn from the military-industrial-corporate complex, the intelligence community, and Wall Street. It exists but does so far enough outside the frame of reference most of us share that it effectively functions in the shadow of daylight where it can’t be seen for all the glare. Players are plain enough to the eye as they board their private jets to attend annual meetings of the World Economic Forum in Davos-Klosters, Switzerland, or two years ago the Jackson Hole [Economic] Summit in Jackson Hole, WY, in connection with the American Principles Project, whatever that is. They also enjoy plausible deniability precisely because most of us don’t really believe self-appointed masters of the universe can or should exist.

Another example of a really bad trip down the rabbit hole, what I might call deep cynicism (and a place I rarely allow myself to go), appeared earlier this month at Gin and Tacos (on my blogroll):

The way they [conservatives] see it, half the kids coming out of public schools today are basically illiterate. To them, this is fine. We have enough competition for the kinds of jobs a college degree is supposed to qualify one for as it is. Our options are to pump a ton of money into public schools and maybe see some incremental improvement in outcomes, or we can just create a system that selects out the half-decent students for a real education and future and then warehouse the rest until they’re no longer minors and they’re ready for the prison-poverty-violence cycle [add military] to Hoover them up. Vouchers and Charter Schools are not, to the conservative mind, a better way to educate kids well. They are a cheaper way to educate them poorly. What matters is that it costs less to people like six-figure income earners and home owners. Those people can afford to send their kids to a decent school anyway. Public education, to their way of thinking, used to be about educating people just enough that they could provide blue collar or service industry labor. Now that we have too much of that, a public high school is just a waiting room for prison. So why throw money into it? They don’t think education “works” anyway; people are born Good or Bad, Talented or Useless. So it only makes sense to find the cheapest possible way to process the students who were written off before they reached middle school. If charter schools manage to save 1% of them, great. If not, well, then they’re no worse than public schools. And they’re cheaper! Did I mention that they’re cheaper?

There’s more. I provided only the main paragraph. I wish I could reveal that the author is being arch or ironic, but there is no evidence of that. I also wish I could refute him, but there is similarly no useful evidence for that. Rather, the explanation he provides is a reality check that fits the experience of wide swaths of the American public, namely, that “public high school is just a waiting room for prison” (soon and again, debtor’s prison) and that it’s designed to be just that because it’s cheaper than actually educating people. Those truly interested in being educated will take care of it themselves. Plus, there’s additional money to be made operating prisons.

Deep cynicism is a sort of radical awareness that stares balefully at the truth and refuses to blink or pretend. A psychologist might call it the reality principle; a scientist might aver that it relies unflinchingly on objective evidence; a philosopher might call it strict epistemology. To get through life, however, most of us deny abundant evidence presented to us daily in favor of dreams and fantasies that assemble into the dominant paradigm. That paradigm includes the notions that evil doesn’t really exist, that we’re basically good people who care about each other, and that our opportunities and fates are not, on the whole, established long before we begin the journey.

I attended a fundraiser a short while back. It’s familiar territory for me, filled with gifts culled from local businesses and corporations to be resold at auction, portable kitchens and bars to feed and libate guests to break down their inhibitions to giving, and lots of high heels and party dresses (with ample cleavage). Men rarely strut and parade the way the women do; tuxedos are the rare exception. Secondary and tertiary activities are typical, often a DJ or live band that plays so loudly sensible people would flee the room rather than slowly go deaf. But monstrous volume in the era of amplified everything has dulled that reflex to nothingness. Or people are by now already deaf from habitual exposure to arena-rock volume filtered down to small venues. Folks simply, stupidly tough it out, ending the night with their ears ringing and their voices hoarse from screaming over the noise just to be heard.

Beneficiaries of fundraisers usually fall into two categories that are poorly served by American institutions: those seeking quality educations (including public schools that ought to be better funded through taxation) and folks suffering from catastrophic illness or disease that is ideally meant to be covered by health insurance but in practice is not. Auctioneers do pretty well enticing people to part with their money. It’s a true skill. But then, who goes to a fundraiser determined to hold tightly to their hard-earned cash? (Don’t answer that question.) Silent auctions may accompany the live auction, but the group vibe definitely contributes to some competition to outbid the next person (a wallet- or dick-measuring exercise?). Auction items are mostly luxury items, things normal Americans wouldn’t consider buying except when associated with charitable giving. Getting something for one’s charity (bought under or over its presumed market value) also shifts some portion of the philanthropic burden to those entities donating gifts.

All this is preliminary to the most appallingly tone-deaf item offered for auction: a 4-person safari to a game preserve in South Africa to hunt and kill a wildebeest. When the auctioneer described the item, everyone in my vicinity looked at each other as if to say “what the fuck?” Certainly, humans have a long history of hunting game purely for sport (which is to say, not for food), and from the perspective of a South African safari outfitter, wild animals are a natural resource to be exploited the same way that, for instance, mining and agriculture are conducted throughout the world. But the last few years have seen a notable change of heart about killing animals, especially so-called romance animals (mostly large mammals, including whales, less so large fish), without need or admirable purpose. The outcry over an American dentist killing Cecil the Lion was an expression of that sentiment. So, too, was the killing of a gorilla at the Cincinnati Zoo after a child fell into the enclosure. (Personally, considering how few of them exist, I would privilege the life of the gorilla over the child, but that’s a minefield.) Pictures of Donald Trump’s sons standing over their trophy kills have also elicited significant disapproval. We are now acutely aware that wild animals are not an inexhaustible resource (and never were — consider the passenger pigeon).

I judged that bidding on the safari was no more or less robust than other auction items, but I mentioned aloud that if I were to bid on it, I would probably go on the safari but would also insist on merely paintballing the poor wildebeest, a relatively harmless proxy for killing it needlessly. Admittedly, the wildebeest would experience the same existential terror as if it were being hunted to death, but at least it would live. Or it would live until the next safari came round. Hunting and killing a wildebeest or other large game has never been on my bucket list, and its appearance at auction would not suddenly inspire me to add it to the list. That is the province of a class of fools rich and insulated enough to still regard the world as their playground, with no thought of responsibility, stewardship, or consequences.

Continuing from my previous post, Brian Phillips has an article, writing for MTV News, entitled “Shirtless Trump Saves Drowning Kitten: Facebook’s fake-news problem and the rise of the postmodern right.” (Funny title, that.) I navigated to the article via Alan Jacobs’ post at Text Patterns (on my blogroll). Let me consider each in turn.

After chuckling that Phillips is directing his analysis to the wrong audience, an admittedly elitist response on my part, I must further admit that the article is awfully well-written and nails the blithe attitude accompanying the epistemological destruction carried out by developers of social media (as distinguished from traditional news media), perhaps unwittingly but by now too well established to ignore. Which would be considered more mainstream today is up for debate. Maybe Phillips has the right audience after all. He certainly gets the importance of controlling the narrative:

Confusion is an authoritarian tool; life under a strongman means not simply being lied to but being beset by contradiction and uncertainty until the line between truth and falsehood blurs and a kind of exhaustion settles over questions of fact. Politically speaking, precision is freedom. It’s telling, in that regard, that Trump supporters, the voters most furiously suspicious of journalism, also proved to be the most receptive audience for fictions that looked journalism-like. Authoritarianism doesn’t really want to convince its supporters that their fantasies are true, because truth claims are subject to verification, and thus to the possible discrediting of authority. Authoritarianism wants to convince its supporters that nothing is true, that the whole machinery of truth is an intolerable imposition on their psyches, and thus that they might as well give free rein to their fantasies.

But Phillips is too clever by half, burying the issue in scholarly style that speaks successfully only to a narrow class of academics and intellectuals, much like the language and memes employed by the alt-right are said to be dog whistles perceptible only to rabid, mouth-breathing bigots. Both charges are probably unfair reductions, though with kernels of truth. Here’s some of Phillips’ overripe language:

Often the battleground for this idea [virtue and respect] was the integrity of language itself. The conservative idea, at that time [20 years ago], was that liberalism had gone insane for political correctness and continental theory, and that the way to resist the encroachment of Derrida was through fortifying summaries of Emerson … What had really happened was that the left had become sensitized to the ways in which conventional moral language tended to shore up existing privilege and power, and had embarked on a critique of this tendency that the right interpreted, with some justification, as an attack on the very concept of meaning.

More plainly, Phillips’ suggestion is that the radical right learned the lessons of Postmodernism (PoMo) even better than did the avant-garde left, the latter having outwitted themselves by giving the right subtle tools used later to outmaneuver everyone. Like other mildly irritating analyses I have read, it’s a statement of inversion: an idea bringing into existence its antithesis, which unironically both proves and undermines the original, though with a dose of Schadenfreude. This was (partially) the subject of a 4-part blog series I wrote called “Dissolving Reality” back in Aug. and Sept. 2015. (Maybe half a dozen read the series; almost no one commented.)

So what does Alan Jacobs add to the discussion? He exhibits his own scholarly flourishes. Indeed, I admire the writing but find myself distracted by the writerly nature, which ejects readers from the flow of ideas to contemplate the writing itself. For instance, this:

It turns out that the children of the ruling classes learned their lessons well, so when they inherited positions in their fathers’ law firms they had some extra, and very useful, weapons in their rhetorical armory.

In precisely the same way, when, somewhat later, academic leftists preached that race and gender were the determinative categories of social analysis, members of the future alt-right were slouching in the back rows of their classrooms, baseball caps pulled down over their eyes, making no external motions but in their dark little hearts twitching with fervent agreement.

Terrific capture of the classroom culture in which teachers are steeped. Drawing identity politics more manifestly into the mix is a fairly obvious extrapolation over Phillips and may reflect the results of the presidential election, where pundits, wheeling around to reinterpret results that should not have so surprised them, now suggest Republican victories are a repudiation of leftist moral instruction. The depth of Phillips’ and Jacobs’ remarks is not so typical of most pundits, however, and their follow-up analysis at some point becomes just more PoMo flagellation. Here, Jacobs is even more clearly having some fun:

No longer did we have to fear being brought before the bar of Rational Evidence, that hanging judge of the Enlightenment who had sent so many believers to the gallows! You have your constructs and we have our constructs, and who’s to say which are better, right? O brave new world that hath such a sociology of knowledge in it!

This goes back to the heart of the issue, our epistemological crisis, but I dispute that race and gender are the determinative categories of social analysis, no matter how fashionable they may be in the academy. A simpler and more obvious big picture controls: it’s about life and death. My previous post was about geopolitics, where death is rained down upon foreign peoples and justifying rhetoric is spread domestically. Motivations may be complex and varied, but the destruction of people and truth affects everyone, albeit unevenly, without regard to race, gender, religion, nationality, etc. All are caught in the dragnet.

Moreover, with the advent of Western civilization, intellectuals have always been sensitive to the sociology of knowledge. It’s a foundation of philosophy. That it’s grown sclerotic long precedes PoMo theory. In fact, gradual breaking apart and dismantling of meaning is visible across all expressive genres, not just literature. In painting, it was Impressionism, Cubism, Dada and Surrealism, and Abstract Expressionism. In architecture, it was Art Deco, the International Style, Modernism, Brutalism, and Deconstructivism. In music, it was the Post-Romantic, the Second Viennese School, Modernism, Serialism, and Minimalism. In scientific paradigms, it was electromagnetism, relativity, quantum mechanics, the Nuclear Era, and semiconductors. The most essential characteristics in each case are increasingly dogmatic abstraction and drilling down to minutia that betray meaningful essences. Factoring in economic and political perversions, we arrive at our current epistemological phase where truth and consequences matter little (though death and destruction still do) so long as deceits, projections, and distractions hold minds in thrall. In effect, gravity is turned off and historical narratives levitate until reality finally, inevitably comes crashing down in a monstrous Jenga pile, as it does periodically.

In the meantime, I suppose Phillips and Jacobs can issue more gaseous noise into the fog bank the information environment has become. They can’t get much traction (nor can I) considering how most of the affluent West thinks at the level of a TV sitcom. In addition, steps being considered to rein in the worst excesses of fake news would have corporations and traditional news media appointed as watchers and censors. Beyond any free speech objections, which are significant, expecting culprits to police themselves only awards them greater power to dominate, much like bailouts rewarded the banks. More fog, more lies, more levitation.

Stray links build up over time without my being able to handle them adequately, so I have for some time wanted a way of purging them. I am aware of other bloggers who curate and aggregate links with short commentaries quite well, but I have difficulty making my remarks pithy and punchy. That said, here is a first attempt to dispose of a few links from my backlog.

Skyfarm Fantasies

Futurists have offered myriad visions of technologies that have no hope of being implemented, from flying cars to 5-hour workweeks to space elevators. The newest pipe dream is the Urban Skyfarm, a roughly 30-story tree-like structure with 24 acres of space using solar panels and hydroponics to grow food close to the point of consumption. Utopian engineering such as this crops up frequently (pun intended) and may be fun to contemplate, but in the U.S. at least, we can’t even build high-speed rail, and that technology is already well established elsewhere. I suppose that’s why cities such as Seoul and Singapore, straining to make everything vertical for lack of horizontal space, are the logical test sites.

Leaving Nashville

The City of Nashville is using public funds to buy homeless people bus tickets to leave town and go be poor somewhere else. Media spin is that the city is “helping people in need,” but it’s obviously a NIMBY response to a social problem city officials and residents (not everyone, but enough) would rather not have to address more humanely. How long before cities begin competing with each other in the number of people they can ship off to other cities? Call it the circle of life when the homeless start gaming the programs, revisiting multiple cities in an endless circuit.


Over at Rough Type, Nick Carr points to an article in The Nation entitled “Instagram and the Fantasy of Mastery,” which argues that a variety of technologies now give “artists” the illusion of skill, merit, and vision by enabling work to be easily executed using prefab templates and stylistic filters. For instance, in pop music, the industry standard is to auto-tune everyone’s singing to hide imperfections. Carr’s summary probably is better than the article itself and shows us the logical endpoint of production art in various media undertaken without the difficult work necessary to develop true mastery.

Too Poor to Shop

The NY Post reported over the summer that many Americans are too poor to shop except for necessities. Here are the first two paragraphs:

Retailers have blamed the weather, slow job growth and millennials for their poor results this past year, but a new study claims that more than 20 percent of Americans are simply too poor to shop.

These 26 million Americans are juggling two to three jobs, earning just around $27,000 a year and supporting two to four children — and exist largely under the radar, according to America’s Research Group, which has been tracking consumer shopping trends since 1979.

Current population in the U.S. is around 325 million. Twenty percent of that number is 65 million; twenty-six million is 8 percent. Pretty basic math, but I guess the NY Post is not to be trusted to report even simple things accurately. Maybe it’s 20% of U.S. households. I dunno and can’t be bothered to check. Either way, that’s a pretty damning statistic considering the U.S. stock market continues to set new all-time highs — an economic recovery not shared with average Americans. Indeed, here are a few additional newsbits and links stolen ruthlessly from theeconomiccollapseblog:

  • The number of Americans that are living in concentrated areas of high poverty has doubled since the year 2000.
  • In 2007, about one out of every eight children in America was on food stamps. Today, that number is one out of every five.
  • 46 million Americans use food banks each year, and lines start forming at some U.S. food banks as early as 6:30 in the morning because people want to get something before the food supplies run out.
  • The number of homeless children in the U.S. has increased by 60 percent over the past six years.
  • According to Poverty USA, 1.6 million American children slept in a homeless shelter or some other form of emergency housing last year.
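The percentage discrepancy called out a couple of paragraphs back takes only a few lines to verify. A minimal sketch (the population and survey figures are the approximate ones cited in the post, not authoritative data):

```python
# Sanity-check the NY Post claim: 26 million people is nowhere
# near 20 percent of a population of roughly 325 million.
population = 325_000_000    # approximate U.S. population
poor_shoppers = 26_000_000  # figure attributed to America's Research Group

share = poor_shoppers / population
print(f"{share:.0%} of the population")      # prints "8% of the population"

# Twenty percent of the population would instead be:
print(f"{int(0.20 * population):,} people")  # prints "65,000,000 people"
```

As noted, the claim only pencils out if the 20 percent figure refers to households rather than individuals.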

For further context, theeconomiccollapseblog also points to “The Secret Shame of Middle Class Americans” in The Atlantic, which reports, among other things, that fully 47% of Americans would struggle to scrape together a mere $400 in an emergency.

How do such folks respond to the national shopping frenzy kicking off in a few days with Black Friday, Small Business Saturday, Charitable Sunday, and Cyber Monday? I suggest everyone stay home.

rant on/

Monastic pursuit of a singular objective, away from the maddening and distracting rush of modern life, is a character attribute that receives more than its rightful share of attention. In its salutary forms, monastic pursuit is understood as admirable, visionary, iconic (or iconoclastic), and heroic. In creative endeavors, seclusion and disengagement from feedback are preconditions for finding one’s true voice and achieving one’s vision. In sports, the image of the athlete devoted to training for the big event — race, match, tournament — to the exclusion of all else is by now a tired trope. Indeed, in this Olympics season, athlete profiles — puff pieces of extraordinary predictability — typically depict competitors in isolation, absolutely no one else at the gym, in the pool, on the track, etc., as though everyone goes it alone without the support or presence of coaches or teammates. Over-specialization and -achievement are such that spectators are conditioned to expect successful individuals, champions, to bleed (quite literally) as a mark of devotion to their respective fields.

At some point, however, monastic pursuit morphs into something more recognizably maniacal. The author retreating to his cabin in the woods to write the great American novel becomes the revolutionary hermit composing his political manifesto. Healthy competition among rivals turns into decidedly unsportsmanlike conduct. (Lance Armstrong is the poster boy not just for doping but also for the sociopathy he displayed mistreating teammates and perpetuating the lie as vehemently and as long as he did. Further examples compound quickly in sports.) Business leaders, discontented with (sometimes obscene) profitability, target others in their market sector with the intent of driving them out of business and establishing monopolies. (This contrasts markedly with the ideology of self-correcting markets many CEOs falsely espouse.) In politics, high-minded campaigns and elected politicians formed around sound policy and good governance lose out to such dirty tricks as character assassination, rigged and stolen elections, partisanship, and reflexive obstructionism of projects that enjoy popular support. In journalism, fair and balanced reporting inverts to constant harping on preferred talking points to control narratives through sheer force of repetition. You get the idea.

It’s difficult to say from where this intemperate impulse arises, but we’re undoubtedly in a phase of history where nearly every field of endeavor manifests its own version of the arms race. Some might argue that in a cost-benefit analysis, we’re all better off because we enjoy fruits not obtainable without (some folks at least) taking a scorched-earth approach, raising the bar, and driving everyone to greater heights. The willingness of some to distort and disgrace themselves hideously may be a high price to pay, especially when it’s for simple entertainment, but so long as we aren’t paying the price personally, we’re willing spectators to whatever glory and train wrecks occur. I would argue that, ultimately, we’re all paying the price. Routine competition and conflict resolution have grown so unhinged that, just to be in the game, competitors must be prepared to go all in (poker lingo) at even modest provocation. As a result, for just one example, the spirit of America’s erstwhile pastime (baseball) has been so corrupted that balanced players and fans (!) stay away and are replaced by goons. A true level playing field probably never existed. Now, however, whoever can muster the most force (financial, rhetorical, criminal) wins the trophy, and we’re each in turn encouraged to risk all in our own monastic pursuit.

rant off/

An enduring trope of science fiction is the naming of newly imagined gadgets and technologies (often called technobabble with a mixture of humor and derision), as well as naming segments of human and alien societies. In practice, that means renaming already familiar things to add a quasi-futuristic gleam, and it’s a challenge faced by every story that adopts an alternative or futuristic setting: describing the operating rules of the fictional world but with reference to recognizable human characteristics and institutions. A variety of recent Young Adult (YA) fiction has indulged in this naming and renaming, some of which have been made into movies, mostly dystopic in tone, e.g., the Hunger Games tetralogy, the Twilight saga, the Harry Potter series, the Maze Runner, and the Divergent trilogy. (I cite these because, as multipart series, they are stronger cultural touchstones, e.g., Star Wars, than similar once-and-done adult cinematic dystopias, e.g., Interstellar and Elysium. Star Trek is a separate case, considering how it has devolved after being rebooted from its utopian though militaristic origins into a pointless series of action thrillers set in space.) Some exposition rises to the level of lore but is mostly mere scene-setting removed slightly from our own reality. Similar naming schemes are used in cinematic universes born out of comic books, especially character names, powers, and origins. Because comic book source material is extensive, almost all of it becomes lore, which is enjoyed by longtime initiates, inducted as children into the alternate universes created by the writers and illustrators, but mildly irritating to adult moviegoers like me.

History also has names for eras and events sufficiently far back in time for hindsight to provide a clear vantage point. In the U.S., we had the Colonial Era, the Revolutionary Period, the Frontier Era and Wild West, the Industrial/Mechanical Age, Modernism, and Postmodernism, to name a few but by no means all. Postmodernism is already roughly 40 years old, but we have not yet named the era in which we now live. Indeed, because we’re the proverbial fish inside the fishbowl, unable to recognize the water in which we swim, the contemporary moment may have no need of naming, now or at any given time. That task awaits those who follow. We have, however, given names to the succession of generations following the Baby Boom. How well their signature characteristics fit their members is the subject of considerable debate.

As regular readers of this blog already know, I sense that we’re on the cusp of something quite remarkable, most likely a hard, discontinuous break from our recent past. Being one of the fish in the bowl, I probably possess no better understanding of our current phase of history than the next. Still, if I had to choose one word to describe the moment, it would be dissolution. My 4-part blog post about dissolving reality is one attempt to provide an outline. A much older post called aged institutions considers the time-limited effectiveness of products of human social organization. The grand question of our time might be whether we are on the verge of breaking apart or simply transitioning into something new — will it be catastrophe or progress?

News this past week of Britain’s vote to exit the European Union may be only one example of break-up vs. unity, but the drive toward secession and separatism (tribal and ideological, typically based on bogus and xenophobic identity groups constantly thrown in our faces) has been gaining momentum even in the face of economic globalization (collectivism). Scotland very nearly seceded from the United Kingdom in 2014; Quebec has held multiple referenda on seceding from Canada, none yet successful; and Vermont, Texas, and California have all flirted with secession from the United States. No doubt some would argue that such examples of dissolution, actual or prospective, are actually transitional, meaning progressive. And perhaps they do in fact fulfill the need for smaller, finer, localized levels of social organization that many have argued are precisely what an era of anticipated resource scarcity demands. Whether what actually manifests is catastrophe (as I expect) is, of course, something history and future historians will eventually judge and name.

The first Gilded Age in the U.S. and the Roaring Twenties were periods that produced an overabundance of wealth for a handful of people. Some of them became synonymous with the term robber baron precisely for their ability to extract and accumulate wealth, often using tactics that, to say the least, lacked scruples when they were not downright criminal. The names include Rockefeller, Carnegie, Astor, Mellon, Stanford, Vanderbilt, Duke, Morgan, and Schwab. All have their names associated in posterity with famous institutions. Some are colleges and universities, others are banking and investment behemoths, yet others are place names and commercial establishments. Perhaps the philanthropy they practiced was not entirely generous, as captains of industry (then and today) seem to enjoy burnishing their legacies with a high level of name permanence. Still, one can observe that most of the institutions bearing their names are infrastructure useful to the general public, making them public goods. This may be partly because the early 20th century was still a time of nation building, whereas today is arguably a time of decaying empire.

The second Gilded Age in the U.S. commenced in the 1980s and is still going strong as measured by wealth inequality. However, the fortunes of today’s tycoons appear to be directed less toward public enrichment than toward self-aggrandizement. The very nature of philanthropy has shifted. Two modern philanthropists appear to be transitional: Bill Gates and Ted Turner. The Gates Foundation has a range of missions, including healthcare, education, and access to information technology. Ted Turner’s well-publicized $1 billion gift to the United Nations Foundation in 1997 was an open dare to encourage similar philanthropy among the ultrarich. The Turner Foundation website’s byline is “protecting & restoring the natural world.” Not to be ungrateful or uncharitable, but both figureheads are renowned for highhandedness in the fashion in which they gathered up their considerable fortunes and are redirecting some portion of their wealth toward pet projects that can be interpreted as a little self-serving. Philanthropic efforts by Warren Buffett appear to be less about giving away his own fortune to charities or establishing institutions bearing his name than about using his renown to raise charitable funds from other sources and thus stimulate giving by others. The old saying applies especially to Buffett: “no one gets rich by giving it away.” More galling, perhaps, is another group of philanthropists, who seem to be more interested in building shrines to themselves. Two entries stand out: The Lucas Museum (currently seeking a controversial site in Chicago) and The Walmart Museum. Neither resembles a public good, though their press packages may try to convince otherwise.

Charity has also shifted toward celebrity giving, with this website providing philanthropic news and profiles of celebrities complete with their causes and beneficiaries. With such a wide range of people under consideration, it’s impossible to make any sweeping statements about the use or misuse of celebrity, the way entertainers are overcompensated for their talents, or even how individuals such as Richard Branson and Elon Musk have been elevated to celebrity status primarily for being rich. (They undoubtedly have other legitimate claims to fame, but they’re overshadowed in a culture that celebrates wealth before any other attribute.) And then there are the wealthy contributors to political campaigns, such as the Koch brothers, George Soros, and Sheldon Adelson, just to name a few. It’s fair to say that every contributor wants some bang for their buck, but I daresay that political contributors (not strictly charity givers) expect a higher quotient of influence, or in terms more consistent with their thinking, a greater return on investment.

None of this takes into account the charitable work and political contributions stemming from corporations and unions, or indeed the umbrella corporations that exist solely to raise funds from the general public, taking a sizeable share in administrative fees before passing some portion on to the eventual beneficiary. Topical charities and scams also spring up in response to whatever is the latest natural disaster or atrocity. What’s the average citizen to do when the pittance they can donate pales in comparison to that offered by the 1% (which would be over 3 million people in the U.S. alone)? Or indeed how does one guard against being manipulated by scammers (including the burgeoning number of street panhandlers) and political candidates into throwing money at fundamentally insoluble problems? Are monetary gifts really the best way of demonstrating charity toward the needy? Answers to these questions are not forthcoming.

Update: Closure has been achieved on the question of the Lucas Museum coming to Chicago. After two years of litigation blocking any building on his proposed site on the lakefront, George Lucas has decided to seek a site in California instead. Both sides had to put their idiotic PR spin on the result, but most people I know are relieved not to have George Lucas making inroads into Chicago architecture. Now if only we could turn back time and stop Donald Trump.

While I’m revisiting old posts, the term digital exhaust came up again in a very interesting article by Shoshana Zuboff called “The Secrets of Surveillance Capitalism,” published in Frankfurter Allgemeine in March 2016. I no longer remember how I came upon it, but despite its publication in an obscure (to Americans) German newspaper (German and English versions available), it was easily located online with a simple title search. The article has certainly not gone viral the way social media trends do, but someone obviously picked it up, promoted it, and raised it to the awareness of lots of folks, including me.

My earlier remarks about digital exhaust were that the sheer volume of information produced and exchanged across digital media quickly becomes unmanageable, with the result that much of it becomes pointless ephemera — the useless part of the signal-to-noise ratio. Further, I warned that by turning our attention to digital sources of information of dubious value, quality, and authority, we face an epistemological shift that could take considerable hindsight to describe accurately. That was 2007. It may not yet be long enough to understand the effect(s) fully, or to crystallize the moment (to reuse my pet phrase yet again), but the picture is already clarifying, somewhat terrifyingly.