Archive for the ‘Ethics’ Category

“Language is dynamic” is a phrase invoked in praise or derision of shifts in usage. Corollaries include “the only constant is change” and “time’s arrow points in only one direction” — both signalling that stasis is an invalid and ultimately futile conservative value. The flip side might well be the myth of progress, understood in reference not to technological advancement but to human nature’s failure to rise above its base (animal) origins. This failure is especially grotesque considering that humans currently, albeit temporarily, live in an age of material abundance that would provide amply for everyone if that largesse were justly and equitably produced and distributed. However, resources (including labor) are being systematically exploited, diverted, and hoarded by a small, unethical elite (what some call “alpha chimps”) who often use state power to subjugate vulnerable populations to funnel further tribute to the already obscenely wealthy top of the socioeconomic hierarchy. But that’s a different diatribe.

Although I’m sensitive to the dynamism of language — especially terms for broad ideas in need of short, snappy neologisms — I’m resistant to adopting most new coin. For instance, multiple colors of pill (red, blue, white, and black to my knowledge) refer to certain narrative complexes that people, in effect, swallow. Similarly, the “blue church” is used to refer to legacy media struggling desperately (and failing) to retain its last shreds of legitimacy and authority. (Dignity is long gone.) Does language really need these terms, or are hipsters just being clever? That question probably lacks a definitive answer.

My real interest with this blog post, however, is how the modern digital mediascape has given rise to a curious phenomenon associated with cancel culture: deletion of tweets and social media posts to scrub one’s past of impropriety as though the tweet or post never happened. (I’ve never deleted a post nor have any plans to.) Silicon Valley hegemons can’t resist getting their piece of the action, too, by applying deeply flawed algorithms to everyone’s content to demonetize, restrict, and/or remove (i.e., censor) offensive opinion that runs counter to (shifting) consensus narratives decided upon in their sole discretion as water carriers for officialdom. Algorithmic dragnets are ineffective kludges precisely because thoughts are not synonymous with their online expression; one merely points to the other. It used to be said that the Internet is forever, so wait a reasonable duration before posting or tweeting so that irresponsible behavior (opinion and trolling, mostly) can be tempered. Who knows who possesses technical expertise and access to tweet and video archives other than, say, the Wayback Machine? When a public figure says or does something dumb, a search-and-destroy mission is often launched to resurrect offending and damning past utterances. Of course, scrub-a-dub erasure or deletion is merely another attempt to manage narrative and isn’t a plea for forgiveness, which doesn’t exist in the public sphere anyway except for rehabilitated monsters such as past U.S. presidents a/k/a war criminals. And the Internet isn’t in fact forever; ask an archivist.

Shifting language, shifting records, shifting sentiment, shifting intellectual history are all aspects of culture that develop naturally and inevitably over time. We no longer believe, for instance, in the four elements or geocentrism (a/k/a the Aristotelian-Ptolemaic system; never mind the intransigent Flat Earthers who need not be silenced). Darker aspects of these shifts, however, include the remarkable Orwellian insight that “Who controls the past controls the future: who controls the present controls the past” from the 1949 novel Nineteen Eighty-Four. Here’s the passage for context:

Who controls the past, controls the future: who controls the present, controls the past … The mutability of the past is the central tenet of Ingsoc. Past events, it is argued, have no objective existence, but survive only in written records and in human memories. The past is whatever the records and the memories agree upon. And since the Party is in full control of all records, and in equally full control of the minds of its members, it follows that the past is whatever the Party chooses to make it.

In 2021, the awful lesson is taken to heart by multiple parties (not the Party in the novel but wannabes) who have latched maniacally onto Orwellian mechanisms of thought control, specifically through the manipulation of records, history, and language. But as mentioned above, policing mere expression is not the same as policing thought itself, at least among those who retain critical thinking skills and independence of mind. I withhold judgment on how effective attempted brainwashing is with the masses but will at least mention that Yeonmi Park, who escaped from North Korea in 2007 before settling in the U.S. in 2014, describes the chilling totalitarian thought control exercised by the North Korean government — the stuff of nightmare dystopianism. The template is by now well established, and despots everywhere are only too happy to implement it repeatedly, following an evil trajectory that should be resisted at every turn while still possible.

Among those building fame and influence via podcasting on YouTube is Michael Malice. Malice is a journalist (for what organization?) and the author of several books, so he has better preparation and content than many who (like me) offer only loose opinion. His latest book (no link) is The Anarchist Handbook (2021), which appears to be a collection of essays (written by others, curated by Malice) arguing in theoretical support of anarchism (not to be confused with chaos). I say theoretical because, as a hypersocial species of animal, humans never live in significant numbers without forming tribes and societies for the mutual benefit of their members. Malice has been making the rounds discussing his book and is undoubtedly an interesting fellow with well-rehearsed arguments. Relatedly, he argues in favor of objectivism, the philosophy of Ayn Rand that has been roundly criticized and dismissed yet continues to be attractive especially to purportedly self-made men and women (especially duped celebrities) of significant wealth and achievement.

Thus far in life, I’ve disdained reading Rand or getting too well acquainted with arguments in favor of anarchism and/or objectivism. As an armchair culture critic, my bias skews toward understanding how things work (e.g., Nassim Taleb’s power laws) in actuality rather than in some crackpot theory. As I understand it, the basic argument put forward to support any variety of radical individualism is that everyone working in his or her own rational self-interest, unencumbered by the mores and restrictions of polite society, leads to the greatest (potential?) happiness and prosperity. Self-interest is not equivalent to selfishness, but even if it were, the theorized result would still be better than any alternative. A similar argument is made with respect to economics, known as the invisible hand. In both, hidden forces (often digital or natural algorithms), left alone to perform their work, enhance conditions over time. Natural selection is one such hidden force now better understood as a component of evolutionary theory. (The term theory when used in connection with evolution is an anachronism and misnomer, as the former theory has long since been scientifically substantiated.) One could argue as well that human society is a self-organizing entity (disastrously so upon even casual inspection) and that, because of the structure of modernity, we are all situated within a thoroughly social context. (Guy McPherson used to quip that we’re all born into captivity precisely because there is no escape.) Accordingly, the notion that one can or should go it alone is a delusion because it’s flatly impossible to escape the social surround, even in aboriginal cultures, unless one is totally isolated from other humans in what little remains of the wilderness. Of course, those few hardy individuals who retreat into isolation typically bring with them the skills, training, tools, and artifacts of society. A better example might be feral children, lost in the wilderness at an early age and deprived of human society but often taken in by a nonhuman animal (and thus socialized differently).

My preferred metaphor when someone insists on total freedom and isolation away from the madding crowd is traffic — usually automobile traffic but foot traffic as well. Both are examples of aggregate flow arising out of individual activity, like drops of rain forming into streams, rivers, and floods. When stuck in automobile congestion or jostling for position in foot traffic, it’s worthwhile to remember that you are the traffic, a useful example of synecdoche. Those who buck the flow, cut the line, or drive along the shoulder — often just to be stuck again a little farther ahead — are essentially practicing anarchists or me-firsters, whom the rest of us simply regard as assholes. Cultures differ with respect to the orderliness of queuing, but even in those places where flow is irregular and unpredictable, a high level of coordination (lost on many American drivers who can’t figger a roundabout a/k/a traffic circle) is nonetheless evident.

As I understand it, Malice equates cooperation with tyranny because people defer to competence, which leads to hierarchy, which results in power differentials, which transforms into tyranny (petty or profound). (Sorry, can’t locate the precise formulation.) Obvious benefits (e.g., self-preservation) arising out of mutual coordination (aggregation) such as in traffic flows are obfuscated by theory distilled into nicely constructed quotes. Here’s the interesting thing: Malice has lived in Brooklyn most of his life and doesn’t know how to drive! Negotiating foot traffic has a far lower threshold for serious harm than driving. He reports that relocation to Austin, TX, is imminent, and with it, the purchase of a vehicle. My suspicion is that to stay out of harm’s way, Malice will learn quickly to obey tyrannical traffic laws, cooperate with other drivers, and perhaps even resent the growing number of dangerous assholes disrupting orderly flow like the rest of us — at least until he develops enough skill and confidence to become one of those assholes. The lesson not yet learned from Malice’s overactive theoretical perspective is that in a crowded, potentially dangerous world, others must be taken into account. Repetition of this kindergarten lesson throughout human activity may not be the most pleasant thing for bullies and assholes to accept, but refusing to do so makes one a sociopath.

In my neighborhood of Chicago, it’s commonplace to see vehicles driving on the road with a giant Puerto Rican flag flying from a pole wedged in each of the rear windows. Often, one of the two flags’ traditional colors (red, white, and blue) is changed to black and white — a symbol of resistance. Puerto Rican politics is a complicated nest of issues I don’t know enough about to say more. However, the zeal of my neighbors is notable. Indeed, as I visited a local farmers’ market last weekend, I couldn’t help but notice quite a welcome diversity on display and folks entirely untroubled by the presence of others who didn’t look just like them (tattoos, unnatural hair colors, long beards and shaved heads, nonstandard attire and accoutrements, etc.). I’m actually pleased to see a level of comfort and freedom to present oneself in such manner as one wishes, and not just because of the buzz phrase “diversity and inclusion.” So go ahead: fly your freak flag high! (This same value applies to viewpoint diversity.)

In contrast, when I venture to some far-flung suburb for sundry activities now that lockdowns and restrictions have been lifted, I encounter mostly white, middle-aged, middle-class suburbanites who admittedly look just like me. It’s unclear whether folks in those locales are xenophobic in any way, having withdrawn from city life in all its messiness for a cozy, upscale, crime-free subdivision indistinguishable from the next one over. Maybe that’s an artifact of mid-20th-century white flight, where uniformity of presentation and opinion is the norm. Still, it feels a little weird. (Since the 1980s, some rather well-put-together people have returned to the city center, but that usually requires a king-sized income to purchase a luxury condo in some 50-plus-story tower. After last summer’s BLM riots, that influx turned again to outflux.) One might guess that, as a visible minority within city confines, I would be more comfortable among my own cohort elsewhere, but that’s not the case. I rather like rubbing elbows with others of diverse backgrounds and plurality of perspectives.

I’ve also grown especially weary of critical race theory being shoved in my face at every turn, as though race is (or should be) the primary lens through which all human relations must be filtered. Such slavish categorization, dropping everyone into giant, ill-fitted voting blocs, is the hallmark of ideologues unable to break out of the pseudo-intellectual silos they created for themselves and seek to impose on others. Yet I haven’t joined the growing backlash and instead feel increasingly ill at ease in social situations that appear (on the surface at least) to be too white bread. Shows, perhaps, how notions of race that were irrelevant for most of my life have now crept in and invaded my conscience. Rather than solving or resolving longstanding issues, relentless focus on race instead spreads resentment and discomfort. The melting pot isn’t boiling, but summer is not yet over.

Happy to report that humans have finally outgrown their adolescent fixation, obsession, and infatuation surrounding technology and gadgetry, especially those that blow up things (and people), part of a maladaptive desire to watch the world burn (like a disturbed 14-year-old playing with fire to test the boundaries of control while hoping for the boundary to be breached). We are now in the process of correcting priorities and fixing the damage done. We’re also free from the psychological prison in which we trapped ourselves through status seeking and insistence on rigid ego consciousness by recognizing instead that, as artifacts of a hypersocial species, human cognition is fundamentally distributed among us as each of us is for all intents and purposes a storyteller retelling, reinforcing, and embellishing stories told elsewhere — even though it’s not quite accurate to call it mass mind or collective consciousness — and that indeed all men are brothers (an admitted anachronism, since that phrase encompasses women/sisters, too). More broadly, humans also now understand that we are only one species among many (a relative late-comer in evolutionary time, as it happens) that coexist in a dynamic balance with each other and with the larger entity some call Mother Earth or Gaia. Accordingly, we have determined that our relationship can no longer be that of abuser (us) and abused (everything not us) if the dynamism built into that system is not to take us out (read: trigger human extinction, like most species suffered throughout evolutionary time). If these pronouncements sound too rosy, well, get a clue, fool!

Let me draw your attention to the long YouTube video embedded below. These folks have gotten the clues, though my commentary follows anyway, because SWOTI.

After processing all the hand-waving and calls to immediate action (with inevitable nods to fundraising), I was struck by two things in particular. First, XR’s co-founder Roger Hallam gets pretty much everything right despite an off-putting combination of alarm, desperation, exasperation, and blame. He argues that to achieve the global awakening needed to alter humanity’s course toward (self-)extinction, we actually need charismatic speakers and heightened emotionalism. Scientific dispassion and neutered, measured political discourse (such as that of the Intergovernmental Panel on Climate Change (IPCC), or what Al Gore attempted for decades before going Hollywood already fifteen years ago now) have simply failed to accomplish anything. (On inspection, what history has actually delivered is not characterized by the lofty rhetoric of statesmen and boosters of Enlightenment philosophy but rather resembles a sociologist’s nightmare of dysfunctional social organization, where anything that could possibly go wrong pretty much has.) That abysmal failure is dawning quite strongly on people under the age of 30 or so, whose futures have been not so much imperiled as actively robbed. (HOW DARE YOU!? You slimy, venal, incompetent cretins above the age of 30 or so!) So it’s not for nothing that Roger Hallam insists that the XR movement ought to be powered and led by young people, with old people stepping aside, relinquishing positions of power and influence they’ve already squandered.

Second, Chris Hedges, easily the most erudite and prepared speaker/contributor, describes his first-hand experience reporting on rebellion in Europe leading to (1) the collapse of governments and (2) the disintegration of societies. He seems to believe that the first is worthwhile, necessary, and/or inevitable even though the immediate result is the second. Civil wars, purges, and genocides are not uncommon throughout history in the often extended periods preceding and following social collapse. The rapidity of governmental collapse once the spark of citizen rebellion becomes inflamed is, in his experience, evidence that driving irresponsible leaders from power is still possible. Hedges’ catchphrase is “I fight fascists because they’re fascists,” which as an act of conscience allows him to sleep at night. A corollary is that fighting may not be effective, at least in the short term, and may not be undertaken without significant sacrifice, but it needs to be done anyway to imbue life with purpose and meaning, as opposed to anomie. Although Hedges may entertain the possibility that social disintegration and collapse will be far, far more serious and widespread once the armed-to-the-teeth American empire cracks up fully (already under way to many observers) than they were in the Balkan countries, conscientious resistance and rebellion are still recommended.

Much as my attitudes are aligned with XR, Hallam, and Hedges, I’m less convinced that we should all go down swinging. That industrial civilization is going down and all of us with it no matter what we do is to me an inescapable conclusion. I’ve blogged about this quite a bit. Does ethical behavior demand fighting to the bitter end? Or can we fiddle while Rome burns, so to speak? There’s a lot of middle ground between those extremes, including nihilistic mischief (euphemism alert) and a bottomless well of anticipated suffering to alleviate somehow. Rather than trying to alter the inevitable, I’m more inclined to focus on forestalling eleventh-hour evil and finding some grace in how we ultimately, collectively meet species death.

On the heels of a series of snowstorms, ice storms, and deep freezes (mid-Feb. 2021) that have inundated North America and knocked out power to millions of households and businesses, I couldn’t help but notice inane remarks and single-panel comics to the effect of “wish we had some global warming now!” Definitely, things are looking distinctly apocalyptic as folks struggle with deprivation, hardship, and existential threats. However, the common mistake here is to substitute one thing for another, failing to distinguish weather from climate.

National attention is focused on Texas, expected to be declared a disaster zone by Pres. Biden once he visits (a flyover, one suspects) to survey and assess the damage. It’s impossible to say that current events are without precedent. Texas has been in the cross-hairs for decades, suffering repeated droughts, floods, fires, and hurricanes that used to be prefixed by 50-year or 100-year. One or another is now occurring practically every year, which is exactly what climate chaos delivers. And in case the deep freeze and busted water pipes all over Texas appear to have been unpredictable, this very thing happened in Arizona in 2011. Might have been a shot across the bow for Texas to learn from and prepare, but its self-reliant, gun-totin’, freedom-lovin’ (fuck, yeah!), secessionist character is instead demonstrated by having its own electrical grid covering most of the state, separated from other North American power grids, ostensibly to skirt federal regulations. Whether that makes Texas’ grid more or less vulnerable to catastrophic failure is an open question, but events of the past week tested it sorely. It failed badly. People literally froze to death as a result. Some reports indicate Texas was mere moments away from an even greater failure that would have meant months to rebuild and reestablish electrical service. A substantial diaspora would have ensued, essentially meaning more climate refugees.

So where’s the evil in this? Well, let me tell you. Knowledge that we humans are on track to extirpate ourselves via ongoing industrial activity has been reported and ignored for generations. Guy McPherson’s essay “Extinction Foretold, Extinction Ignored” has this to say at the outset:

The warnings I will mention in this short essay were hardly the first ones about climate catastrophe likely to result from burning fossil fuels. A little time with your favorite online search engine will take you to George Perkins Marsh sounding the alarm in 1847, Svante Arrhenius’s relevant journal article in 1896, Richard Nixon’s knowledge in 1969, and young versions of Al Gore, Carl Sagan, and James Hansen testifying before the United States Congress in the 1980s. There is more, of course, all ignored for a few dollars in a few pockets. [links in original]

My personal acquaintance with this large body of knowledge began accumulating in 2007 or so. Others with decision-making capacity have known for much, much longer. Yet short-term motivations shoved aside the responsible planning and preparation that are precisely the warrant of governments at all levels, especially, say, the U.S. Department of Energy. Sure, climate change is reported as controversy, or worse, as conspiracy, but in my experience, only a few individuals are willing to speak the obvious truth. They are often branded kooks. Institutions dither, distract, and even issue gag orders to, oh, I dunno, prop up real estate values in south Florida soon to be underwater. I’ve suggested repeatedly that U.S. leaders and institutions should be acting to manage contraction and alleviate suffering as best as possible, knowing that civilization will fail anyway. To pretend otherwise and guarantee — no — drive us toward worst-case scenarios is just plain evil. Of course, the megalomania of a few tech billionaires who mistakenly believe they can engineer around society’s biggest problems is just as bad.

Writ small (there’s a phrase no one uses denoting narrowing scope), meaning at a scale less than anthropogenic climate change (a/k/a unwitting geoengineering), American society has struggled to prioritize guns vs. butter for over a century. The profiteering military-industrial complex has clearly won that debate, leaving infrastructure projects, such as bridge and road systems and public utilities, woefully underfunded and extremely vulnerable to market forces. Refusal to recognize public health as a right or public good demanding a national health system (like other developed countries have) qualifies as well. As inflated Pentagon budgets reveal, the U.S. never lacks money to oppress, fight, and kill those outside the U.S. Inside the U.S., however, cities and states fall into ruin, and American society is allowed to slowly unwind for lack of support. Should we withdraw militarily from the world stage and focus on domestic needs, such as homelessness and joblessness? Undoubtedly. Would that leave us open to attack or invasion (other than the demographic invasion of immigrants seeking refuge in the U.S.)? Highly doubtful. Other countries have their own domestic issues to manage and would probably appreciate a cessation of interference and intervention from the U.S. One might accuse me of substituting one thing for another, as I accused others at top, but the guns-vs.-butter debate is well established. Should be obvious that it’s preferable to prioritize caring for our own society rather than devoting so much of our limited time and resources to destroying others.

Evil exists in the world. History and current events both bear this out amply. Pseudo-philosophers might argue that, like emotions and other immaterial sensations, good and evil are merely reified concepts, meaning they are human constructs with no palpable external reality. Go tell that to victims of evildoers. Human suffering can’t be anonymized, rationalized, or philosophized away quite so handily.

It was sort of refreshing, back in the day, when Google’s motto and/or corporate code of conduct was simple: “Don’t Be Evil.” It acknowledged the potential for being or becoming evil (like any of the Bigs: Big Tobacco, Big Soda, Big Oil, Big Pharma, Big Media, Big Agriculture, etc.) and presumably aspired to resist obvious temptations. That was then (from 2000 to 2018), this is now (2021 until death take us — soon enough, I fear). But like all entities possessed of absurd levels of wealth and power, Google (now reorganized as a subsidiary of Alphabet, but who actually refers to it that way?) and its Silicon Valley brethren have succumbed to temptation and become straight-up evil.

One might charitably assess this development as something unbidden, unanticipated, and unexpected, but that’s no excuse, really. I certainly don’t envy celebrity executives experiencing difficulty resulting from having created unmanageable behemoths loosed on both public and polity unable to recognize beastly fangs until already clamped on their necks. As often occurs, dystopian extrapolations are explored in fiction, sometimes satirically. The dénouement of the HBO show Silicon Valley depicts tech mogul wannabes succeeding in creating an AI (or merely a sophisticated algorithm? doesn’t matter …) that would in time become far too powerful in blind execution of its inner imperative. In the show, characters recognize what they had done and kill their own project rather than allow it to destroy the world. In reality, multiple developers of computer tech platforms (and their embedded dynamic, including the wildly unhelpful albeit accurate term algorithm) lacked the foresight to anticipate awful downstream effects of their brainchildren. Yet now that those effects are manifesting recognizably, these corporations continue to operate and wreak havoc.

Silicon Valley shows an extended software development period of bungling ineptitude punctuated by brilliant though momentary breakthroughs. Characters are smart, flawed people laughably unable to get out of the way of their own success. The pièce de résistance was yoking one so-called “learning machine” to another and initiating what would become a runaway doomsday process (either like ecological collapse, building slowly then making the biosphere uninhabitable all at once, or like the gray goo problem, progressively “processing” biomass at the molecular level until all that remains is lifeless goo). It was a final act of bumbling that demanded the characters’ principled, ethical response before the window of opportunity closed. Real Silicon Valley tech platforms are in the (ongoing) process of rending the social fabric, which is no laughing matter. The issue du jour surrounds free speech and its inverse, censorship. More broadly, real Silicon Valley succeeded in gaming human psychology for profit in at least two aspects (could be more as yet unrecognized): (1) mining behavioral data as an exploitable resource, and (2) delivering inexhaustible streams of extremely divisive content (not its own) to drive persistent engagement with its platforms. Yoked together, they operate to drive society mad, and yet mounting evidence of this development has not produced even an inkling that maybe the damned doomsday devices ought to be shut off. As with the environment, we operate with freedom enough to destroy ourselves. Instead, politicians issue stunningly ineffectual calls for regulation or break-up of monopolies. In the meantime, ever more absurd wealth and power are concentrated in the hands of a few executives who have clearly punted and decided “let’s be evil.” No restraints on their behavioral experimentation across whole societies exist.

Much more to say on this topic in additional parts to come.

Already widely reported but only just having come to my awareness is an initiative by Rolling Stone to establish a Culture Council: “an Invitation-Only Community of Influencers, Innovatives, and Creatives.” The flattering terms tastemakers and thought leaders are also used. One must presume that submissions will be promotional and propaganda pieces masquerading as news articles. Selling advertising disguised as news is an old practice, but the ad usually has the notation “advertisement” somewhere on the page. Who knows whether submissions will be subject to editorial review?

To be considered for membership, candidates must sit in a senior-level position at a company generating at least $500K in annual revenue or have obtained at least $1M in total institutional funding.

Rolling Stone’s website doesn’t say it anywhere I can locate, but third-party reports indicate that members pay either a $1,500 annual fee and $500 submission fee (one-time? repeat?) or a flat $2,000 submission fee. Not certain which. Just to be abundantly clear, fees would be paid by the submitter to the magazine, reversing how published content is normally acquired (i.e., by paying staff writers and freelancers). I’d say this move by Rolling Stone is unprecedented, but of course, it’s not. However, it is a more brazen pay-to-play scheme than most and may be a harbinger of even worse developments to come.

Without describing fully how creative content (arts and news) was supported in the past, I will at least observe that prior to the rise of full-time creative professions in the 18th and 19th centuries (those able to scratch out a living on commissions and royalties), creative work was either a labor of love/dedication, typically remunerated very poorly if at all, or was undertaken through the patronage of wealthy European monarchs, aristocrats, and religious institutions (at least in the developing West). Unless I’m mistaken, self-sustaining news organizations and magazines came later. More recent developments include video news releases and crowdsourcing, the latter of which is sometimes accomplished under the pretense of running contests. The creative commons is how many now operate (including me — I’ve refused to monetize my blog), which is exploited ruthlessly by HuffPost (an infotainment source I ignore entirely), which (correct me if wrong) doesn’t pay for content but offers exposure as an inducement to journalists trying to develop a byline and/or audience. Podcasts, YouTube channels, and news sites also offer a variety of subscription, membership, and voluntary patronage (tipping) schemes to pay the bills (or hit it big if an outlier). Thus, business models have changed considerably over time and are in the midst of another major transformation, especially for news-gathering organizations and the music recording industry, both in marked retreat from their former positions.

Rolling Stone had always been a niche publication specializing in content that falls outside my usual scope of interest. I read Matt Taibbi’s reporting that appeared in Rolling Stone, but the magazine’s imprint (read: reputation) was not the draw. Now that Rolling Stone is openly soliciting content through paid membership in the Culture Council, well, the magazine sinks past irrelevance to active avoidance.

It’s always been difficult to separate advertising and propaganda from reliable news, and some don’t find it important to keep these categories discrete, but this new initiative is begging to be gamed by motivated PR hacks and self-promoters with sufficient cash to burn. It’s essentially Rolling Stone whoring itself out. Perhaps more worrying is that others will inevitably follow Rolling Stone’s example and sell their journalistic integrity with similar programs, effectively putting the final nails in their own coffins (via brand self-destruction). The models in this respect are the cozy, incestuous relationships between PACs, lobbying groups, think tanks, and political campaigns. One might assume that legacy publications such as Rolling Stone would have the good sense to retain as much of their valuable brand identity as possible, but the relentless force of corporate/capitalist dynamics is corrupting even the incorruptible.

Something in an online discussion brought me back to my days as a Boy Scout. (No, not that, with your nasty, nasty assumptions.) It was one of the first merit badges I earned: Citizenship in the Community (link to PDF). I can’t remember any of the content anymore (haven’t yet consulted the PDF), and indeed, looking back with the advantage of several decades of hindsight, I have a hard time imagining any of the (morality? ethics?) lessons learned back then having had much durable impact despite remembering an emerging confidence and awareness (a commonplace delusion of youth) of my position within the community. Still, I appreciate having had many Boy Scout character-building experiences, which led to simple and enduring understandings of ideals such as honor, duty, preparedness, service, forbearance, shouldering hardships, and perhaps most of all, accepting responsibility for others, particularly those younger and weaker. (I’m not claiming to be any sort of paragon of virtue. Cynicism and misanthropy may have wrecked that aspiration.) I never served in the military, but I surmise others learn similar lessons slightly later in life when more readily absorbed and not so easily forgotten. In the past decade plus, some may seek these lessons through participation in endurance sports or martial arts (if not distorted by bad instruction like in Cobra Kai), though the focus outward (i.e., toward community and mutual reliance) may not be as strong.

The subject came up in a discussion of participants in small-scale democracy, something I’ve always known is messy, unrewarding, thankless, and sometimes costly yet still necessary to be a good citizen contributing to one’s community. Many adults get their first taste of local democratic groups (read: self-governing) through parent groups like the Parent-Teacher Association (PTA). Or maybe it’s a performing arts organization, homeowners’ association, church council, social work hotline, self-help group, or cooperative. Doesn’t matter which. (Political activism and organizing might be something quite different. Hard to say.) Groups run on the good will and dedication of volunteered time and skills for the benefit of members of the community. As with any population, there are always free riders: those who contribute nothing but enjoy and/or extract benefits. In fact, if everyone were integrally involved, organizational complexity would become unmanageable. If activities of such groups seem like a piece of cake or vaguely utopian, just join one and see how different character types behave. Lotta dead wood in such organizations. Moreover, power mongers and self-aggrandizers often take over small-scale democracies and run them like private fiefdoms. Or difficult policy and finance discussions divide otherwise like-minded groups into antagonists. As I said, it’s a decidedly messy undertaking.

Members of the community outside of the executive group (typically a board of directors) also have legitimate interests. Maybe community members attend meetings to keep informed or weigh in online with unconstructive complaints and criticisms (or even mockery and trolling) but then refuse to contribute anything worthwhile. Indeed, boards often have difficulty recruiting new officers or participants because no one wants to take on responsibility and face potential criticism. I’ve also seen boards settle into the same few folks year after year, whose opinions and leadership grow stale and calcify.

Writ large, leadership skills learned through citizenship in the community rise to the equivalents of Boy Scout merit badges Citizenship in the Nation and Citizenship in the World (no links but searchable). Skills deployed at those strata would arguably require even greater wherewithal and wisdom, with stakes potentially being much higher. Regrettably, having just passed through an election cycle and change of leadership in the U.S., my dour assessment is that leadership has failed miserably on multiple issues. The two most significant involve how we fail to organize society for the benefit of all, namely, economic equality and resource sustainability. Once market forces came to bear on social organization and corporate entities grew too large to be rooted in community service anymore, greed and corruption destroyed high-minded ideals. More self-aggrandizers and careerists than ever (no names, fill in the blanks, they’re all famous — or infamous) rose to the tops of organizations and administrations, especially politics, news media, and the punditry. Their logical antidotes are routinely and ruthlessly disenfranchised and/or ignored. The lasting results are financial inequality run amok and unsustainable resource addictions (energy mostly) that are toxifying the environment and reducing the landscape to ruin and uninhabitability. (Perpetual war is a third institutional failure that could be halted almost immediately if moral clarity were somehow to appear.) It’s all out there, plain to see, yet continues to mount because of execrable leadership. Some argue it’s really a problem with human nature, a kind of original stain on our souls that can never be erased and so should be forgiven or at least understood (and rationalized away) within a larger context. I’m not yet ready to excuse national and world leaders. Their culpability is criminal.

The end of every U.S. presidential administration is preceded by a spate of pardons and commutations — the equivalents of a get-out-of-jail-free card offered routinely to conspirators and collaborators with the outgoing executive and general-purpose crony capitalists. This practice, along with diplomatic immunity and supranational elevation of people (and corporations-as-people) beyond the reach of prosecution, is a deplorable workaround obviating the rule of law. Whose brilliant idea it was to offer special indulgence to miscreants is unknown to me, but it’s pretty clear that, with the right connections and/or with enough wealth, you can essentially be as bad as you wanna be with little fear of real consequence (a/k/a too big to fail a/k/a too big to jail). Similarly, politicians, whose very job it is to manage the affairs of society, are free to be incompetent and destructive in their brazen disregard for needs of the citizenry. Only modest effort (typically a lot of jawing directed to the wrong things) is necessary to enjoy the advantages of incumbency.

In this moment of year-end summaries, I could choose from among an array of insane, destructive, counter-productive, and ultimately self-defeating nominees (behaviors exhibited by elite powers that be) as the very worst, the baddest of the bad. For me, in the largest sense, that would be the abject failure of the rule of law (read: restraints), which has (so far) seen only a handful of high-office criminals prosecuted successfully (special investigations leading nowhere and failed impeachments don’t count) for their misdeeds and malfeasance. I prefer to be more specific. Given my indignation over the use of torture, that would seem an obvious choice. However, those news stories, including the ongoing torture of Julian Assange for essentially revealing truths cynics like me already suspected and now know to be accurate, have been shoved to the back burner, where they generate little heat. Instead, I choose war as the very worst, an example of the U.S. (via its leadership) being as bad as it can possibly be. The recent election cycle offered a few candidates who bucked the consensus that U.S. involvement in every unnecessary, undeclared war since WWII is justified. They were effectively shut out by the military-industrial complex. And as the incoming executive tweeted on November 24, 2020, America’s back, baby! Ready to do our worst again (read: some more, since we [the U.S. military] never stopped [making war]). A sizeable portion of the American public is aligned with this approach, too.

So rule of law has failed and we [Americans] are infested with crime and incompetence at the highest levels. Requirements, rights, and protections found in the U.S. Constitution are handily ignored. That means every administration since Truman has been full of war criminals, because torture and elective war are crimes. The insult to my sensibilities is far worse than the unaffordability of war, the failure to win or end conflicts, or the lack of righteousness in our supposed cause. It’s that we [America, as viewed from outside] are belligerent, bellicose aggressors. We [Americans] are predators. And we [Americans, but really all humans] are stuck in an adolescent concept of conduct in the world shared with animals that must kill just to eat. We [humans] make no humanitarian progress at all. But the increasing scale of our [human] destructiveness is progress if drones, robots, and other DARPA-developed weaponry impress.


/rant on

Remember all those folks in the weeks and days preceding election day on November 3, 2020, who were buying guns, ammo, and other provisions in preparation for civil breakdown? (No one known personally, of course, and gawd no not actually any of us, either; just them other others who don’t read blogs or anything else.) Well, maybe they were correct to adopt the precautionary principle (notably absent from a host of other perils besetting us). But as of this writing, nothing remotely resembling widespread disruption — feared by some, hotly anticipated by others — has developed. But wait! There’s still time. Considering Americans were set up by both political parties to distrust the outcome of the presidential race no matter which candidate claimed to have prevailed, we now face weeks or months of legal challenges and impatient formations of agitators (again, both sides) demanding their candidate be declared the winner (now, dammit!) by the courts instead of either official ballot-counters or the liberal-biased MSM. To say our institutions have failed us, and further, that political operatives all the way up to the sitting president have been openly fomenting violence in the streets, is a statement of the obvious.

Among my concerns more pressing than who gets to sit in the big chair, however, is the whipsawing stock market. Although no longer an accurate proxy of overall economic health or asset valuation, the stock market reacts with thorough irrationality to every rumor of, say, a vaccine for the raging coronavirus or a resumption of full economic activity and profitability, despite widespread joblessness, renewed lockdowns, and a massive wave of homelessness in the offing due to bankruptcies, evictions, and foreclosures. None of this bodes well for the short-term future and maintenance of, oh, I dunno, supply lines to grocery stores. Indeed, I suspect we are rapidly approaching our very own Minsky Moment, which Wikipedia describes as “a sudden, major collapse of asset values which marks the end of the growth phase of a cycle in credit markets or business activity” [underlying links omitted]. This is another prospective event (overdue, actually) for which the set-up has long been prepared. Conspiratorial types call it “the great reset” — something quite different from a debt jubilee.

For lazy thinkers, rhyming comparisons with the past frequently resort to calling someone a Nazi (or the new Hitler) or reminding everyone of U.S. chattel slavery. At the risk of being accused of similar stupidity, I suggest that we’re not on the eve of a 1929-style market crash and ensuing second great depression (though those could well happen, too, bread lines having already formed in 2020) but are instead poised at the precipice of hyperinflation and intense humiliation akin to the Weimar Republic in 1923 or so. American humiliation will result from recognition that the U.S. is now a failed state and doesn’t even pretend anymore to look after its citizens or the commonweal. Look no further than the two preposterous presidential candidates, neither of whom made any campaign promises to improve the lives of average Americans. Rather, the state has been captured by kleptocrats. Accordingly, no more American exceptionalism and no more lying to ourselves about how we’re the model for the rest of the world to admire and emulate.

Like Germany in the 1930s, the U.S. has also suffered military defeats and stagnation (perhaps by design) and currently demonstrates a marked inability to manage itself economically, politically, or culturally. Indeed, the American people may well be ungovernable at this point, nourished on a thin gruel of rugged individualism that forestalls our coming together to address adversity effectively. The possibility of another faux-populist savior arising out of necessity only to lead us over the edge (see the Great Man Theory of history) seems eerily likely, though the specific form that descent into madness would take is unclear. Recent history already indicates a deeply divided American citizenry having lost its collective mind but not yet having gone fully apeshit, flinging feces and destroying what remains of economically ravaged communities for the sheer sport of it. (I’ve never understood vandalism.) That’s what everyone was preparing for with emergency guns, ammo, and provisions. How narrowly we escaped catastrophe (or merely delayed it) should be clear in the fullness of time.

/rant off