
Continuing from part 1.

So here’s the dilemma: knowing a little bit about media theory and how the medium shapes the message, I’m spectacularly unconvinced that the cheerleaders are correct and that an entirely new mediascape (a word I thought maybe I had just made up, but alas, no) offers to correct the flaws of the older, inherited mediascape. It’s clearly not journalists leading the charge. Rather, comedians, gadflies, and a few academics (behaving as public intellectuals) command disproportionate attention among the digital chattering classes as regular folks seek entertainment and stimulation superior to the modal TikTok video. No doubt a significant number of news junkies still dote on their favorite journalists, but almost no journalist has escaped the self-imposed limitations of the chosen media to offer serious reporting. Rather, they offer “commentary” and half-assed observations on human nature (much like comedians who believe themselves especially insightful — armchair social critics like me probably fit that bill, too). If the sheer count of aggregate followers and subscribers across social media platforms is any indication (it isn’t …), athletes, musicians (mostly teenyboppers and former pop tarts, as I call them), and the irritatingly ubiquitous Kardashian/Jenner clan are the most influential, especially among Millennials and Gen Z, whose tastes skew toward the frivolous. Good luck getting insightful analysis out of those folks. Maybe in time they’ll mature into thoughtful, engaged citizens. After all, Kim Kardashian apparently completed a law degree (but has yet to pass the bar). Don’t quite know what to think of her three failed marriages (so far). Actually, I try not to.

I’ve heard arguments that the public is voting with its attention and financial support for new media and increasingly disregarding the so-called prestige media (no such thing anymore, though legacy media is still acceptable). That may well be, but it seems vaguely ungrateful for established journalists and comedians, having enjoyed the opportunity to apprentice under seasoned professionals, to take acquired skills to emerging platforms. Good information gathering and shaping — even for jokes — doesn’t happen in a vacuum, and responsible journalism in particular can’t simply be repackaging information gathered by others (e.g., Reuters, the Associated Press, and Al Jazeera) with the aforementioned “commentary.” A frequent reason cited for jumping ship is the desire to escape editorial control and institutional attempts to distort the news itself according to some corporate agenda or ideology. Just maybe new platforms have made that possible in a serious way. However, the related desire to take a larger portion of the financial reward for one’s own work (typically as celebrities seeking to extend their 15 minutes of fame — ugh) is a surefire way to introduce subtle, new biases and distortions. The plethora of metrics available online, for instance, allows content creators to see what “hits” or goes viral, inviting service to public interest that is decidedly less than wholesome (like so much rubbernecking).

It’s also curious that, despite all the talk about engaging with one’s audience, new media is mired in broadcast mode, meaning that most content is presented to be read or heard or viewed with minimal or no audience participation. It’s all telling, and because comments sections quickly run off the rails, successful media personalities ignore them wholesale. One weird feature some have adopted during livestreams is to display viewer donations accompanied by brief comments and questions, the donation being a means of separating and promoting one’s question to the top of an otherwise undifferentiated heap. To my knowledge, none has yet tried the established talk radio gambit of taking live telephone calls, giving the public a chance to make a few (unpurchased) remarks before the host resumes control. Though I’ve never been invited (an invitation is required) and would likely decline to participate, the Clubhouse smartphone app appears to offer regular folks a venue to discuss and debate topics of the day. However, reports on the platform dynamics suggest that the number of eager participants quickly rises to an impossible number for realistic group discussion (the classroom, or better yet, graduate seminar establishes better limitations). A workable moderation mechanism has yet to emerge. Instead, participants must “raise their hand” to be called upon to speak (i.e., be unmuted) and can be kicked out of the “room” arbitrarily if the moderator(s) so decide. This is decidedly not how conversation flows face-to-face.

What strikes me is that while different broadcast modes target and/or capture different demographics, they all still organize content around the same principle: purporting to have obtained information and expertise to be shared with or taught to audiences. Whether subject matter is news, science, psychology, comedy, politics, etc., they have something ostensibly worth telling you (and me), hopefully while enhancing fame, fortune, and influence. So it frankly doesn’t matter that much whether the package is a 3-minute news segment, a brief celebrity interview on a late night talk show, an article published in print or online, a blog post, a YouTube video of varying duration, a private subscription to a Discord server, a Subreddit, or an Instagram or Twitter feed; they are all lures for one’s attention. Long-form conversations hosted by Jordan Peterson, Joe Rogan, and Lex Fridman break out of self-imposed time limitations of the typical news segment and flow more naturally, but they also meander and get seriously overlong for anyone but long-haul truckers. (How many times have I tuned out partway into Paul VanderKlay’s podcast commentary or given up on Matt Taibbi’s Substack (tl;dr)? Yeah, lost count.) Yet these folks enthusiastically embrace the shifting mediascape. The digital communications era is already mature enough that several generations of platforms have come and gone as well-developed media are eventually coopted or turned commercial and innovators drive out weaker competitors. Remember MySpace, Google Plus, or America Online? The list of defunct social media is actually quite long. Because public attention is a perpetually moving target, I’m confident that those now enjoying their moment in the sun will face new challenges until it all eventually goes away amidst societal collapse. What then?

Coming back to this topic after some time (pt. 1 here). My intention was to expand upon demands for compliance, and unsurprisingly, relevant tidbits continuously pop up in the news. The dystopia American society is building for itself doesn’t disappoint — not that anyone is hoping for such a development (one would guess). It’s merely that certain influential elements of society reliably move toward consolidation of power and credulous citizens predictably forfeit their freedom and autonomy with little or no hesitation. The two main examples to discuss are Black Lives Matter (BLM) and the response to the global pandemic, which have occurred simultaneously but are not particularly related.

The BLM movement began in summer 2013 but boiled over in summer 2020 on the heels of the George Floyd killing, with protests spilling over into straightforward looting, mayhem, and lawlessness. That fit of high emotional pique found many protesters accosting random strangers in public and demanding a raised fist in support of the movement, which was always ideologically disorganized but became irrational and power-hungry as Wokedom discovered its ability to submit others to its will. In response, many businesses erected what I’ve heard called don’t-hurt-me walls in apparent support of BLM and celebration of black culture so that windows would not be smashed and stores ransacked. Roving protests in numerous cities demanded shows of support, though with what exactly was never clear, from anyone encountered. Ultimately, protests morphed into a sort of protection racket, and agitators learned to enjoy making others acquiesce to arbitrary demands. Many schools and corporations now conduct mandatory training to, among other things, identify unconscious bias, which has the distinct aroma of original sin that can never be assuaged or forgiven. It’s entirely understandable that many individuals, under considerable pressure to conform as moral panic seized the country, play along to keep the peace or keep their jobs. Backlash is building, of course.

The much larger example affecting everyone, nationwide and globally, is the response to the pandemic. Although quarantines have been used in the past to limit regional outbreaks of infectious disease, the global lockdown of business and travel was something entirely new. Despite a lack of evidence of efficacy, the precautionary principle prevailed and nearly everyone was forced into home sequestration and later, after an embarrassingly stupid scandal (in the U.S.), made to don masks when venturing out in public. As waves of viral infection and death rolled across the globe, political leaders learned to enjoy making citizens acquiesce to capricious and often contradictory demands. Like BLM, a loose consensus emerged about the “correct” way to handle the needs of the moment, but the science and demographics of the virus produced widely variant interpretations of such correctness. A truly coordinated national response in the U.S. never coalesced, and hindsight has judged the whole morass a fundamentally botched job of maintaining public health in most countries.

But political leaders weren’t done demanding compliance. An entirely novel vaccine protocol was rushed into production after emergency use authorization was obtained and indemnification (against what?) was granted to the pharma companies that developed competing vaccines. Whether this historical moment will turn out to be something akin to the thalidomide scandal remains to be seen, but at the very least, the citizenry is being driven heavily toward participation in a global medical experiment. Some states even offer million-dollar lotteries to incentivize individuals to comply and take the jab. Open discussion of risks associated with the new vaccines has been largely off limits, and a two-tier society is already emerging: the vaccinated and the unclean (which is ironic, since many of the unclean have never been sick).

Worse yet (and like the don’t-hurt-me walls), many organizations are adopting as-yet-unproven protocols and requiring vaccination for participants in their activities (e.g., schools, sports, concerts) or simply to keep one’s job. The mask mandate was a tolerable discomfort (though not without many principled refusals), but forcing others to be experimental test subjects is well beyond the pale. Considering how the narrative continues to evolve and transform, thoughtful individuals trying to evaluate competing truth claims for themselves are unable to get clear, authoritative answers. Indeed, it’s hard to imagine a situation where authorities in politics, medicine, science, and journalism could have worked so assiduously to undermine their own credibility. Predictably, heads (or boards of directors) of many organizations are learning to enjoy the newly discovered power to transform their organizations into petty fiefdoms and demand compliance from individuals — usually under the claim of public safety (“for the children” being unavailable this time). Considering how little efficacy has yet been truly demonstrated with any of the various regimes erected to contain or stall the pandemic, the notion that precautions undertaken have been worth giving injudicious authority to people up and down various power hierarchies to compel individuals remains just that: a notion.

Tyrants and bullies never seem to tire of watching others do the submission dance. In the next round, be ready to hop on one leg and/or bark like a dog when someone flexes on you. Land of the free and home of the brave no longer.

Addendum

The CDC just announced an emergency meeting to be held (virtually) June 18 to investigate reports (800+ via the Vaccine Adverse Event Reporting System (VAERS), which almost no one had heard of only a month ago) of heart inflammation in adolescents following vaccination against the covid virus. Significant underreporting is anticipated following the circular logic that since authorities declared the vaccines safe prematurely (without standard scientific evidence to support such a statement), the effects cannot be due to the vaccine. What will be the effect of over 140 million people having been assured that vaccination is entirely safe, taken the jab, and then discovered “wait! maybe not so much …”? Will the complete erosion of trust in what we’re told by officialdom and its mouthpieces in journalism spark widespread, organized, grassroots defiance once the bedrock truth is laid bare? Should it?

On the heels of a series of snowstorms, ice storms, and deep freezes (mid-Feb. 2021) that have inundated North America and knocked out power to millions of households and businesses, I couldn’t help but notice inane remarks and single-pane comics to the effect “wish we had some global warming now!” Definitely, things are looking distinctly apocalyptic as folks struggle with deprivation, hardship, and existential threats. However, the common mistake here is to substitute one thing for another, failing to distinguish weather from climate.

National attention is focused on Texas, expected to be declared a disaster zone by Pres. Biden once he visits (a flyover, one suspects) to survey and assess the damage. It’s impossible to say that current events are without precedent. Texas has been in the cross-hairs for decades, suffering repeated droughts, floods, fires, and hurricanes that used to be prefixed by 50-year or 100-year. One or another is now occurring practically every year, which is exactly what climate chaos delivers. And in case the deep freeze and busted water pipes all over Texas appear to have been unpredictable, this very thing happened in Arizona in 2011. Might have been a shot across the bow for Texas to learn from and prepare, but its self-reliant, gun-totin’, freedom-lovin’ (fuck, yeah!), secessionist character is instead demonstrated by having its own electrical grid covering most of the state, separated from other North American power grids, ostensibly to skirt federal regulations. Whether that makes Texas’ grid more or less vulnerable to catastrophic failure is an open question, but events of the past week tested it sorely. It failed badly. People literally froze to death as a result. Some reports indicate Texas was mere moments away from an even greater failure that would have meant months to rebuild and reestablish electrical service. A substantial diaspora would have ensued, essentially meaning more climate refugees.

So where’s the evil in this? Well, let me tell you. Knowledge that we humans are on track to extirpate ourselves via ongoing industrial activity has been reported and ignored for generations. Guy McPherson’s essay “Extinction Foretold, Extinction Ignored” has this to say at the outset:

The warnings I will mention in this short essay were hardly the first ones about climate catastrophe likely to result from burning fossil fuels. A little time with your favorite online search engine will take you to George Perkins Marsh sounding the alarm in 1847, Svante Arrhenius’s relevant journal article in 1896, Richard Nixon’s knowledge in 1969, and young versions of Al Gore, Carl Sagan, and James Hansen testifying before the United States Congress in the 1980s. There is more, of course, all ignored for a few dollars in a few pockets. [links in original]

My personal acquaintance with this large body of knowledge began accumulating in 2007 or so. Others with decision-making capacity have known for much, much longer. Yet short-term motivations shoved aside responsible planning and preparation that are precisely the warrant of governments at all levels, especially, say, the U.S. Department of Energy. Sure, climate change is reported as controversy, or worse, as conspiracy, but in my experience, only a few individuals are willing to speak the obvious truth. They are often branded kooks. Institutions dither, distract, and even issue gag orders to, oh, I dunno, prop up real estate values in south Florida soon to be underwater. I’ve suggested repeatedly that U.S. leaders and institutions should be acting to manage contraction and alleviate suffering as best as possible, knowing that civilization will fail anyway. To pretend otherwise and guarantee — no — drive us toward worst-case scenarios is just plain evil. Of course, the megalomania of a few tech billionaires who mistakenly believe they can engineer around society’s biggest problems is just as bad.

Writ small (there’s a phrase no one uses denoting narrowing scope), meaning at a scale less than anthropogenic climate change (a/k/a unwitting geoengineering), American society has struggled to prioritize guns vs. butter for over a century. The profiteering military-industrial complex has clearly won that debate, leaving infrastructure projects, such as bridge and road systems and public utilities, woefully underfunded and extremely vulnerable to market forces. Refusal to recognize public health as a right or public good demanding a national health system (like other developed countries have) qualifies as well. As inflated Pentagon budgets reveal, the U.S. never lacks money to oppress, fight, and kill those outside the U.S. Inside the U.S., however, cities and states fall into ruin, and American society is allowed to slowly unwind for lack of support. Should we withdraw militarily from the world stage and focus on domestic needs, such as homelessness and joblessness? Undoubtedly. Would that leave us open to attack or invasion (other than the demographic invasion of immigrants seeking refuge in the U.S.)? Highly doubtful. Other countries have their own domestic issues to manage and would probably appreciate a cessation of interference and intervention from the U.S. One might accuse me of substituting one thing for another, as I accused others at top, but the guns-vs.-butter debate is well established. Should be obvious that it’s preferable to prioritize caring for our own society rather than devoting so much of our limited time and resources to destroying others.

So far, this multipart blog post has trafficked in principles and generalities. Let me try now to be more specific, starting with an excerpt from Barry Lynn’s article in Harper’s Magazine titled “The Big Tech Extortion Racket” (Sept. 2020):

… around the middle of the nineteenth century, Americans began to develop technologies that could not be broken into component pieces. This was especially true of the railroad and the telegraph … Such corporations [railroad and telegraph companies] posed one overarching challenge: they charged some people more than others to get to market. They exploited their control over an essential service in order to extort money, and sometimes political favors … Americans found the answer to this problem in common law. For centuries, the owners of ferries, stagecoaches, and inns had been required to serve all customers for the same price and in the order in which they arrived. In the late nineteenth century, versions of such “common carrier” rules were applied to the new middleman corporations.

Today we rightly celebrate the Sherman Antitrust Act of 1890, which gave Americans the power to break apart private corporations. But in many respects, the Interstate Commerce Act of 1887 was the more important document. This act was based on the understanding that monopoly networks like the railroad and the telegraph could be used to influence the actions of people who depend on them, and hence their power must be carefully restricted …

For a century and a half, Americans used common carrier policies to ensure the rule of law in activities that depended on privately held monopolies … regulations freed Americans to take full advantage of every important network technology introduced during these years, including telephones, water and electrical services, energy pipelines, and even large, logistics-powered retailers. Citizens did not have to worry that the men who controlled the technologies involved would exploit their middleman position to steal other people’s business or disrupt balances of power.

I appreciate that Barry Lynn brings up the Interstate Commerce Act. If this legal doctrine appeared in the net neutrality debate a few years ago, it must have escaped my notice. While Internet Service Providers (ISPs) enable network access and connectivity, those utilities have not yet exhibited let’s-be-evil characteristics. Similarly, phone companies (including cell phones) and public libraries may well be eavesdropping and/or monitoring activities of the citizenry, but the real action lies elsewhere, namely, on social media networks and with online retailers. Evil is arguably concentrated in the FANG (or FAANG) corporations but has now grown to be ubiquitous in all social networks (e.g., Twitter) operating as common carriers (Zoom? Slack?) and across academe, nearly all of which have succumbed to moral panic. They are interpreting correctly, sad to observe, demands to censor and sanitize others’ no-longer-free speech appearing on their networks or within their realms. How much deeper it goes toward shaping politics and social engineering is quasi-conspiratorial and impossible for me to assess.

Much as I would prefer to believe that individuals possess the good sense to shift their activities away from social networks or turn their attention from discomfiting information sources, that does not appear to be the case. Demands for trigger warnings and safe spaces commonplace a few years ago on college campuses have instead morphed into censorious removal, deplatforming, and cancellation from the entire public sphere. Those are wrong responses in free societies, but modern institutions and technologies have gotten out of hand and outstripped the limits of normal human cognition. In short, we’re a society gone mad. So rather than accept responsibility to sort out information overflow oneself, many are demanding that others do it for them, and evil private corporations are complying (after a fashion). Moreover, calls for creation of an Orwellian Ministry of Truth, rebranded as a Truth Commission and Reality Czar, could hardly be any more chillingly and fascistically bizarre. People really need someone to decide for them what is real? Has anyone at the New York Times actually read Orwell’s dystopian novel 1984 and taken to heart its lessons?

Considering the acceleration of practically everything in the late-modern world (postmodern refers to something quite different), which makes planning one’s higher education somewhat fraught if the subject matter studied is rendered flatly out-of-date or moribund by the time of either graduation or entry into the workforce, I’ve heard it recommended that expertise in any particular subject area may be less important than developing expertise in at least one subject that takes a systems approach. That system might be language and communications, mathematics (or any other hard science), history, economics and finance, business administration, computer coding, law and governance, etc. So long as a rigorous understanding of procedures and rules is developed, a structuralist mindset can be repeated and transferred into other subject areas. Be careful, however, not to conflate this approach with a liberal arts education, which is sometimes described as learning how to learn and is widely applicable across disciplines. The liberal arts have fallen distinctly out of favor in the highly technological and technocratic world, which cares little for human values resistant to quantification. Problem is, Western societies in particular are based on liberal democratic institutions now straining due to their sclerotic old age. And because a liberal arts education is scarcely undertaken anymore, civics and citizenship are no longer taught. Even the study of English has now been corrupted (postmodern does apply here) to the point that the basic liberal arts skill of critical thinking is being lost through attrition. Nowhere is that more abundantly clear than in bristling debate over free speech and censorship.

Aside. Although society tinkers and refines itself (sometimes declines) over time, a great body of cultural inheritance informs how things are done properly within an ideology or system. When tinkering and refinement become outright intransigence and defiance of an established order, it’s commonplace to hear the objection “but that’s not how _______ works.” For instance, debate over climate science or the utility of vaccines often has one party proclaiming “trust [or believe] the science.” However, that’s not how science works (i.e., through unquestioning trust or belief). The scientific method properly understood includes verification, falsification, and revision when results and assertions fail to establish reasonable certainty (not the same as consensus). Similarly, critical thinking includes a robust falsification check before “facts” can be accepted at face value. So-called “critical studies” (a/k/a grievance studies), like religious faith, typically position bald assertions beyond the reach of falsification. Well, sorry, that’s not how critical thinking works.

Being older and educated before critical studies were fully legitimized (or gave rise to things as risible as feminist glaciology), my understanding has always been that free speech and other rights are absolutes that cannot be sliced and diced into bits. That way lies casuistry, where law founders frequently. Thus, if one wishes, say, to trample or burn the U.S. flag in protest, no law can be passed or constitutional amendment enacted to carve out an exception disallowing that instance of dissenting free speech. A lesser example is kneeling silently rather than participating in singing the national anthem before a sporting event. Though offensive to certain individuals’ sensibilities, silencing speech is far worse according to liberal democratic values. Whatever our ideological or political differences are, we cannot work them out when one party has the power to place topics out of bounds or remove others from discussion entirely. The point at which spirited debate crosses over into inciting violence or fomenting insurrection is a large gray area, which is the subject of the second impeachment of 45. Civil law covers such contingencies, so abridging free speech, deplatforming, and adopting the formulation “language is violence” are highly improper responses under the liberal form of government codified in the U.S. Constitution, which includes the Bill of Rights (originally omitted but quickly added to articulate those rights fully).

Liberal democratic ideology arose in mercantile, substantially agrarian Western societies before scientific, industrial, and capitalist revolutions built a full head of steam, so to speak. Considering just how much America has developed since the Colonial Period, it’s no surprise society has outgrown its own founding documents. More pointedly, the intellectual commons was a much smaller environment, often restricted to a soapbox in the town square and the availability of books, periodicals, and broadsides. Today, the public square has moved online to a bewildering array of social media platforms that enables publication of one’s ideas well beyond the sound of one’s voice over a crowd or the bottleneck of a publisher’s printing press. It’s an entirely new development, and civil law has not kept pace. Whether Internet communications are regulated like the airwaves or nationalized like the U.S. military, it’s clear that the Wild West uber-democratic approach (where anyone can basically say anything) has failed. Demands for regulation (restrictions on free speech) are being taken seriously and acted upon by the private corporations that run social media platforms. During this interim phase, it’s easy for me, as a subscriber to liberal democratic values, to insist reflexively on free speech absolutism. The apparent mood of the public lies elsewhere.

Evil exists in the world. History and current events both bear this out amply. Pseudo-philosophers might argue that, like emotions and other immaterial sensations, good and evil are merely reified concepts, meaning they are human constructs with no palpable external reality. Go tell that to victims of evildoers. Human suffering can’t be anonymized, rationalized, or philosophized away quite so handily.

It was sort of refreshing, back in the day, when Google’s motto and/or corporate code of conduct was simple: “Don’t Be Evil.” It acknowledged the potential for being or becoming evil (like any of the Bigs: Big Tobacco, Big Soda, Big Oil, Big Pharma, Big Media, Big Agriculture, etc.) and presumably aspired to resist obvious temptations. That was then (from 2000 to 2018); this is now (2021 until death take us — soon enough, I fear). But like all entities possessed of absurd levels of wealth and power, Google (now reorganized as a subsidiary of Alphabet, but who actually refers to it that way?) and its Silicon Valley brethren have succumbed to temptation and become straight-up evil.

One might charitably assess this development as something unbidden, unanticipated, and unexpected, but that’s no excuse, really. I certainly don’t envy celebrity executives experiencing difficulty resulting from having created unmanageable behemoths loosed on a public and polity unable to recognize the beast’s fangs until already clamped on their necks. As often occurs, dystopian extrapolations are explored in fiction, sometimes satirically. The dénouement of the HBO show Silicon Valley depicts tech mogul wannabes succeeding in creating an AI (or merely a sophisticated algorithm? doesn’t matter …) that would in time become far too powerful in blind execution of its inner imperative. In the show, characters recognize what they had done and kill their own project rather than allow it to destroy the world. In reality, multiple developers of computer tech platforms (and their embedded dynamic, including the wildly unhelpful albeit accurate term algorithm) lacked the foresight to anticipate awful downstream effects of their brainchildren. Yet now that those effects are manifesting recognizably, these corporations continue to operate and wreak havoc.

Silicon Valley shows an extended software development period of bungling ineptitude punctuated by brilliant though momentary breakthroughs. Characters are smart, flawed people laughably unable to get out of the way of their own success. The pièce de résistance was yoking one so-called “learning machine” to another and initiating what would become a runaway doomsday process (either like ecological collapse, building slowly then making the biosphere uninhabitable all at once, or like the gray goo problem, progressively “processing” biomass at the molecular level until all that remains is lifeless goo). It was a final act of bumbling that demanded the characters’ principled, ethical response before the window of opportunity closed. Real Silicon Valley tech platforms are in the (ongoing) process of rending the social fabric, which is no laughing matter. The issue du jour surrounds free speech and its inverse, censorship. More broadly, real Silicon Valley succeeded in gaming human psychology for profit in at least two aspects (could be more as yet unrecognized): (1) mining behavioral data as an exploitable resource, and (2) delivering inexhaustible streams of extremely divisive content (not its own) to drive persistent engagement with its platforms. Yoked together, they operate to drive society mad, and yet, mounting evidence of this development has not produced even an inkling that maybe the damned doomsday devices ought to be shut off. As with the environment, we operate with freedom enough to destroy ourselves. Instead, politicians issue stunningly ineffectual calls for regulation or break-up of monopolies. In the meantime, ever more absurd wealth and power are concentrated in the hands of a few executives who have clearly punted and decided “let’s be evil.” No restraints on their behavioral experimentation across whole societies exist.

Much more to say on this topic in additional parts to come.

Already widely reported but only just having come to my awareness is an initiative by Rolling Stone to establish a Culture Council: “an Invitation-Only Community of Influencers, Innovators, and Creatives.” The flattering terms tastemakers and thought leaders are also used. One must presume that submissions will be promotional and propaganda pieces masquerading as news articles. Selling advertising disguised as news is an old practice, but the ad usually has the notation “advertisement” somewhere on the page. Who knows whether submissions will be subject to editorial review?

To be considered for membership, candidates must sit in a senior-level position at a company generating at least $500K in annual revenue or have obtained at least $1M in total institutional funding.

Rolling Stone’s website doesn’t say it anywhere I can locate, but third-party reports indicate that members pay either a $1,500 annual fee and $500 submission fee (one-time? repeat?) or a flat $2,000 submission fee. Not certain which. Just to be abundantly clear, fees would be paid by the submitter to the magazine, reversing how published content is normally acquired (i.e., by paying staff writers and freelancers). I’d say this move by Rolling Stone is unprecedented, but of course, it’s not. However, it is a more brazen pay-to-play scheme than most and may be a harbinger of even worse developments to come.

Without describing fully how creative content (arts and news) was supported in the past, I will at least observe that prior to the rise of full-time creative professions in the 18th and 19th centuries (those able to earn a living on commissions and royalties), creative work was either a labor of love/dedication, typically remunerated very poorly if at all, or was undertaken through the patronage of wealthy European monarchs, aristocrats, and religious institutions (at least in the developing West). Unless I’m mistaken, self-sustaining news organizations and magazines came later. More recent developments include video news releases and crowdsourcing, the latter sometimes accomplished under the pretense of running contests. The creative commons is how many now operate (including me — I’ve refused to monetize my blog), which is exploited ruthlessly by HuffPost (an infotainment source I ignore entirely), which (correct me if wrong) doesn’t pay for content but offers exposure as an inducement to journalists trying to develop a byline and/or audience. Podcasts, YouTube channels, and news sites also offer a variety of subscription, membership, and voluntary patronage (tipping) schemes to pay the bills (or hit it big if an outlier). Thus, business models have changed considerably over time and are in the midst of another major transformation, especially for news-gathering organizations and the music recording industry in marked retreat from their former positions.

Rolling Stone had always been a niche publication specializing in content that falls outside my usual scope of interest. I read Matt Taibbi’s reporting that appeared in Rolling Stone, but the magazine’s imprint (read: reputation) was not the draw. Now that Rolling Stone is openly soliciting content through paid membership in the Culture Council, well, the magazine sinks past irrelevance to active avoidance.

It’s always been difficult to separate advertising and propaganda from reliable news, and some don’t find it important to keep these categories discrete, but this new initiative is begging to be gamed by motivated PR hacks and self-promoters with sufficient cash to burn. It’s essentially Rolling Stone whoring itself out. Perhaps more worrying is that others will inevitably follow Rolling Stone’s example and sell their journalistic integrity with similar programs, effectively putting the final nails in their own coffins (via brand self-destruction). The models in this respect are cozy, incestuous relationships between PACs, lobbying groups, think tanks, and political campaigns. One might assume that legacy publications such as Rolling Stone would have the good sense to retain as much of their valuable brand identity as possible, but the relentless force of corporate/capitalist dynamics is corrupting even the incorruptible.

Something in an online discussion brought me back to my days as a Boy Scout. (No, not that, with your nasty, nasty assumptions.) It was one of the first merit badges I earned: Citizenship in the Community (link to PDF). I can’t remember any of the content anymore (haven’t yet consulted the PDF), and indeed, looking back with the advantage of several decades of hindsight, I have a hard time imagining any of the (morality? ethics?) lessons learned back then having had much durable impact despite remembering an emerging confidence and awareness (a commonplace delusion of youth) of my position within the community. Still, I appreciate having had many Boy Scout character-building experiences, which led to simple and enduring understandings of ideals such as honor, duty, preparedness, service, forbearance, shouldering hardships, and perhaps most of all, accepting responsibility for others, particularly those younger and weaker. (I’m not claiming to be any sort of paragon of virtue. Cynicism and misanthropy may have wrecked that aspiration.) I never served in the military, but I surmise others learn similar lessons slightly later in life when more readily absorbed and not so easily forgotten. In the past decade plus, some may seek these lessons through participation in endurance sports or martial arts (if not distorted by bad instruction like in Cobra Kai), though the focus outward (i.e., toward community and mutual reliance) may not be as strong.

The subject came up in a discussion of participants in small-scale democracy, something I’ve always known is messy, unrewarding, thankless, and sometimes costly yet still necessary to be a good citizen contributing to one’s community. Many adults get their first taste of local democratic groups (read: self-governing) through parent groups like the Parent-Teacher Association (PTA). Or maybe it’s a performing arts organization, homeowners’ association, church council, social work hotline, self-help group, or cooperative. Doesn’t matter which. (Political activism and organizing might be something quite different. Hard to say.) Groups run on the good will and dedication of volunteered time and skills for the benefit of members of the community. As with any population, there are always free riders: those who contribute nothing but enjoy and/or extract benefits. In fact, if everyone were integrally involved, organizational complexity would become unmanageable. If activities of such groups seem like a piece of cake or vaguely utopian, just join one and see how different character types behave. Lotta dead wood in such organizations. Moreover, power mongers and self-aggrandizers often take over small-scale democracies and run them like private fiefdoms. Or difficult policy and finance discussions divide otherwise like-minded groups into antagonists. As I said, it’s a decidedly messy undertaking.

Members of the community outside of the executive group (typically a board of directors) also have legitimate interests. Maybe community members attend meetings to keep informed or weigh in online with unconstructive complaints and criticisms (or even mockery and trolling) but then refuse to contribute anything worthwhile. Indeed, boards often have difficulty recruiting new officers or participants because no one wants to take on responsibility and face potential criticism directed at them. I’ve also seen boards settle into the same few folks year after year whose opinions and leadership grow stale and calcify.

Writ large, leadership skills learned through citizenship in the community rise to the equivalents of Boy Scout merit badges Citizenship in the Nation and Citizenship in the World (no links but searchable). Skills deployed at those strata would arguably require even greater wherewithal and wisdom, with stakes potentially being much higher. Regrettably, having just passed through an election cycle and change of leadership in the U.S., my dour assessment is that leadership has failed miserably at multiple issues. The two most significant involve how we fail to organize society for the benefit of all, namely, economic equality and resource sustainability. Once market forces came to bear on social organization and corporate entities grew too large to be rooted in community service anymore, greed and corruption destroyed high-minded ideals. More self-aggrandizers and careerists than ever (no names, fill in the blanks, they’re all famous — or infamous) rose to the tops of organizations and administrations, especially politics, news media, and the punditry. Their logical antidotes are routinely and ruthlessly disenfranchised and/or ignored. The lasting results are financial inequality run amok and unsustainable resource addictions (energy mostly) that are toxifying the environment and reducing the landscape to ruin and uninhabitability. (Perpetual war is a third institutional failure that could be halted almost immediately if moral clarity were somehow to appear.) It’s all out there, plain to see, yet continues to mount because of execrable leadership. Some argue it’s really a problem with human nature, a kind of original stain on our souls that can never be erased and so should be forgiven or at least understood (and rationalized away) within a large context. I’m not yet ready to excuse national and world leaders. Their culpability is criminal.

I simply can’t keep up with all the reading, viewing, and listening in my queue. Waking hours are too few, and concentration dissipates long before sleep overtakes. Accordingly, it’s much easier to settle into couch-potato mode and watch some mindless drivel, such as the Netflix hit Bridgerton binged in two sittings. (Unlike cinema critics, I’m not bothered especially by continuity errors, plot holes, clunky dialogue, weak character motivation, gaps of logic, or glossy decadence of the fictional worlds. I am bothered by the Kafka trap sprung on anyone who notices casting decisions that defy time and place — an ill-advised but now commonplace historical revisionism like editing Mark Twain.) As a result, blog posts are less frequent than they might otherwise be as I pronounce upon American (or more broadly, Western) culture, trying vainly to absorb it as a continuously moving target. Calls to mind the phrase Après moi, le déluge, except that there is no need to wait. A deluge of entertainment, news, analysis, punditry, and trolling has buried everyone already. So rather than the more careful consideration I prefer to post, here are some hot takes.

The Irregular Aphorist. Caitlin Johnstone offers many trenchant observations in the form of aphorisms (some of which I’ve quoted before), all gathered under the subtitle Notes From The Edge Of The Narrative Matrix. The modifier irregular only means that aphorisms are a regular but not constant feature. Her site doesn’t have a tag to that effect but probably ought to. Here’s one in particular that caught my attention:

Everything our species has tried has led us to a dying world and a society that is stark raving mad, so nobody is in any position to tell you that you are wrong.

Twin truths here are (1) the dying world and (2) societal madness, both of which I’ve been describing for some time. Glad when others recognize them, too.

Piling on. Though few still are willing to admit it, nonpharmaceutical interventions (NPIs, e.g., distancing, masks, and lockdowns) to stall or reduce the spread of the virus failed to achieve their objectives according to this study. Instead, NPIs piled on suffering no one could forestall. I read somewhere (no link) that the world is approaching half of total, cumulative deaths/infections predicted had nothing been done to impede the pandemic running its course. Adding in deaths of despair (numbers not entirely up to date), we’re using the wrong tools to fight the wrong battle. Of course, interventions opened up giant opportunities for power grabs and vulture capitalism, so the cynic in me shrugs and wonders half aloud “what did you expect, really?”

Growth of the Managerial Bureaucracy. A blog called Easily Distracted by Timothy Burke (never on my blogroll) publishes only a few times per year, but his analysis is terrific — at least when it doesn’t wind up being overlong and inconclusive. Since a student debt jubilee is back in the news (plenty of arguments pro and con), unintended consequences are anticipated in this quote:

When you set out to create elaborate tiers that segregate the deserving poor from the comfortable middle-class and the truly wealthy, you create a system that requires a massive bureaucracy to administer and a process that forces people into petitionary humiliation in order to verify their eligibility. You create byzantine cutoff points that become business opportunities for predatory rentiers.

Something similar may well be occurring with stimulus checks being issued pro rata (has anyone actually gotten one?), but at least we’re spared any petitionary humiliations. We get whatever the algorithms (byzantine cutoff points) dictate. How those funds will be gamed and attached is not yet clear. Stay alert.

No Defense of Free Speech. Alan Jacobs often recommends deleting, unsubscribing, and/or ignoring social media accounts (after his own long love-hate relationship with them) considering how they have become wholly toxic to a balanced psyche as well as principal enablers of surveillance capitalism and narrative control. However, in an article about the manorial elite, he’s completely lost the plot: absolutism is required in defense of free speech. It’s not sufficient to be blasé or even relieved when 45 is kicked off Twitter permanently or when multiple parties conspire to kill Parler. Establishing your own turf beyond the reach of Silicon Valley censors is a nice idea but frankly impractical. Isn’t that what whoever ran Parler (or posted there) must have thought? And besides, fencing off the digital commons these very entities created has catapulted them into the unenviable position of undemocratic, unelected wielders of monopolistic power and co-conspirators to boot. That’s what needs to be curtailed, not free speech.

The Taxonomic Apocalypse. Although drawn from fiction and thus largely hypothetical, a new book (coming late 2021) by Adam Roberts called It’s the End of the World: But What Are We Really Afraid Of? surveys doomsday stories and categorizes different versions of how it all ends. Alan Jacobs (yeah, him again — must have an advance copy of the manuscript) recommends it as “a delightful and provocative little book” but fails to grok two things: (1) these stories are rehearsals-cum-preparations for the real thing, and (2) the real thing really is bearing down on us implacably and so is no longer a mere hypothetical to contemplate and categorize for shits and grins. Despite acceptance of the eventualities that await all of us, reading Roberts’ taxonomy is not something I would expect to find delightful. Skip.

Narrative Collapse. Ran Prieur (no link) sometimes makes statements revealing an unexpected god’s-eye view:

[45] is a mean rich kid who figured out that if he does a good Archie Bunker impression, every lost soul with an authoritarian father will think he’s the messiah. We’re lucky that he cares only about himself, instead of having some crazy utopian agenda. But the power, and the agency, is with the disaffected citizens of a declining empire, tasting barbarism.

This is all about people wanting to be part of a group that’s part of a story. Lately, some of the big group-stories have been dying: sky father religion, American supremacy, the conquest of nature, the virtue of wealth-seeking. In their place, young and clumsy group-stories struggle and rise.

Collapse of certain fundamental stories that animate our thinking is at the core of The Spiral Staircase (see About Brutus at top), though it’s often couched in terms of consciousness in transition. Getting through the transition (only temporarily, see previous item in list) probably means completion of the Counter-Enlightenment historical arc, which necessarily includes further descent into barbarism.

Hail Mary for Individualism. I always take special notice when someone cites Allan Bloom. Alan Jacobs (um, yeah, he’s prolific and I’m using his ideas again — sue me) cites Bloom to argue that individualism or the sovereign self, a product of the Enlightenment, is already dead. No doubt, the thought-world described so ably by Bloom no longer exists, but individualism has not yet died out by attrition or been fully dissolved in nonduality. Many of us born before the advent of the Internet retain selfhood and authenticity not yet coopted by or incorporated into mass mind. Moreover, ongoing struggles over identity (e.g., gender, sexual orientation, and race that are often used improperly to define the self) result from an inchoate sense that individualism is eroding precipitously, not that it’s already passé. Defiant attempts to (re)establish an authentic self (contravening all logic and becoming critical theory of one sort or another) in the face of this loss may well be a last-ditch effort to save the self, but it’s failing.

Unlike turtles, humans do not have protective shells into which we can withdraw when danger presents. Nor can we lift off, fly away, and elude danger the way birds do. These days, we’re sorely beset by an invisible pandemic spread by exposure to carriers (read: other people) and so asked or forced to submit to being locked down and socially distanced. Thus, we are withdrawn into the protective shell of the home in cycles of varying intensity and obeisance to maintain health and safety. Yet life goes on, and with it, numerous physical requirements (ignoring psychological needs) that can’t be met virtually demand we venture out into the public sphere to gather resources, risking exposure to the scourge. Accordingly, the conduct of business has adapted to enable folks to remain in the protective shells of their vehicles, taking delivery through the car window and rarely if ever entering a brick-and-mortar establishment except in defiance or at the option of acceptable risk. In effect, we’re being driven into our cars ever more, and the vehicle is readily understood as a proxy for its inhabitant(s). Take note of pictures of people in bread lines during the Great Depression having been replaced by pictures of cars lined up for miles during the pandemic to get packaged meals from charitable organizations.

Reflecting on this aspect of modern life, I realized that it’s not exactly novel. The widespread adoption of the individual vehicle in the 1940s and 50s, as distinguished from mass transit, and the construction of the interstate highway system promised (and delivered) flexibility and freedom of tremendous appeal. While the shift into cars (along with air travel) doomed now-moribund passenger rail (except intracity in the few American cities with effective rail systems), it enabled the buildout of suburbs and exurbs now recognized as urban sprawl. And like all those packages now clogging delivery systems as we shift even more heavily during the holiday season to online shopping, a loss of efficiency was inevitable. All those individual cars and boxes create congestion that cries out for solutions.

Among the solutions (really a nonsolution) were the first drive-through banks of the 1970s. Is doing one’s banking without leaving the vehicle’s protective shell really an efficiency? Or is it merely an early acknowledgement and enabling of antisocial individualism? Pneumatic tubes that permitted drive-through banking did not speed up transactions appreciably, but the novel mechanism undoubtedly reinforced the psychological attachment Americans felt with their cars. That growing attachment was already apparent in the 1950s, with two bits of Americana from that decade still resonating: the drive-in theater and the drive-in restaurant. The drive-in theater was a low-fidelity efficiency and alternative to the grand movie houses built in the 1920s and 30s seating a few thousand people in one cavernous space. (A different sort of efficiency enabling choice later transformed most cinema establishments into multiplexes able to show 8–10 titles instead of one, handily diminishing audiences of thousands to hundreds or even tens and robbing the group experience of much of its inherent power. Now that premium streaming content is delivered to screens at home and we are disallowed assembly into large audiences, we have instead become something far more inert — viewers — with fully anticipatable degradation of the entertainment experience notwithstanding the handsome technologies found within the comforts of the home.) I’ve heard that drive-ins are experiencing a renaissance of sorts in 2020, with Walmart parking lots converted into showplaces, at least temporarily, to resemble (poorly) group experience and social connection. The drive-in restaurant of the 1950s, with its iconic carhops (sometimes on roller skates), is a further example of enabling car culture to proliferate. Never mind that eating in the car is actually kinda sad and maybe a little disgusting as odors and refuse collect in that confined space. One might suspect that drive-ins were directed toward teenyboppers and cruisers of the 1950s exploring newfound freedom, mobility, and the illusion of privacy in their cars, parked in neat rows at drive-ins (and Lookout Points for smooch sessions) all across the country. However, my childhood memory was that it was also a family affair.

Inevitably, fast food restaurants followed the banks in the 1970s and quickly established drive-through lanes, reinforcing the degradation of the food experience into mere feeding (often on one’s lonesome) rather than dining in community. Curiously, the pandemic has made every restaurant still operating, even the upscale ones, a drive-through and forced those with and without dedicated drive-through lanes to bring back the anachronistic carhop to serve the congestion. A trip to a local burger joint in Chicago last week revealed 40+ cars in queue and a dozen or so carhops on the exterior directing traffic and making deliveries through the car window (briefly penetrating the protective shell) so that no one would have to enter the building and expose oneself to virus carriers. I’ve yet to see a 2020 carhop wearing roller skates (now roller blades) or a poodle skirt.

Such arrangements are probably effective at minimizing pandemic risk and have become one of several new normals (discussion of political dysfunction deferred). Who can say how long they will persist? Still, it’s strange to observe the psychology of our response, even if only superficially and preliminarily. Car culture has been a curious phenomenon since at least the middle of the 20th century. New dynamics reinforcing our commitment to cars are surprising, perhaps, but a little unsurprising, too, considering how we made ourselves so dependent on them as the foundation of personal transportation infrastructure. As a doomer, I had rather expected that Peak Oil occurring around 2006 or so would spell the gradual (or sudden) end of happy motoring as prices at the pump, refusal to elevate standard fuel efficiency above 50 mpg, and climbing average cost of new vehicles placed individual options beyond the reach of average folks. However, I’ve been genuinely surprised by fuel costs sinking to new lows (below the cost of production, even bizarrely inverting to the point that producers paid buyers to take inventory) and continued attempts to engineer (only partially) around the limitations of Peak Oil, if not indeed Peak Energy. I continue to believe these are mirages, like the record-setting bull market of 2020 occurring in the midst of simultaneous economic, social, and health crises.