Posts Tagged ‘Communications’

This unwritten blog post has been sitting in my drafts folder since October 2019. The genesis, the kernel, is that beyond the ongoing collapse of the ecosystem, the natural world that provides all the resources upon which we humans and other organisms rely for life and survival, all other concerns are secondary. Now 5–6 months later, we’re faced with a short- to mid-term crisis that has transfixed and paralyzed us, riveting all attention on immediate pressures, not least of which are ample supplies of paper with which to wipe our asses. Every day brings news of worsening conditions: rising numbers of infections; growing incidence of death; sequestering and quarantining of entire cities, states, and countries; business shutdowns; financial catastrophe; and the awful foreknowledge that we have a long way to go before we emerge (if ever) back into daylight and normalcy. The Age of Abundance (shared unequally) may be gone forever.

Are we mobilizing fully enough to stop or at least ameliorate the pandemic? Are our democratically elected leaders [sic] up to the task of marshaling us through the (arguably) worst global crisis in living memory? Are regular folks rising to the occasion, shouldering loss and being decent toward one another in the face of extraordinary difficulties? So far, my assessment would indicate that the answers are no, no, and somewhat. (OK, some municipal and state leaders have responded late but admirably; I’m really thinking of the early executive response that wasn’t). But let me remind: as serious as the immediate health crisis may be, the even larger civilizational collapse underway (alongside the current extinction process) has not yet been addressed. Sure, lots of ink and pixels have been devoted to studies, reports, books, articles, speeches, and blog posts about collapse, but we have blithely and intransigently continued to inhabit the planet as though strong continuity of our living arrangements will persist until — oh, I dunno — the end of the century or so. Long enough away that very few of us now alive (besides Greta Thunberg) care enough what happens then to forestall much of anything. Certainly not any of the real decision-makers. Collapse remains hypothetical, contingent, theoretical, postulated, and suppositional until … well … it isn’t anymore.

While we occupy ourselves indoors at a social distance for some weeks or months to avoid exposure to the scourge, I’d like to believe that we have the intelligence to recognize that, even in the face of a small (by percentage) reduction of global human population, all other concerns are still secondary to dealing with the prospect (or certainty, depending on one’s perspective) of collapse. However, we’re not disciplined or wise enough to adopt that view. Moreover, it’s unclear what can or should be done, muddying the issue sufficiently to further delay action being taken. Fellow blogger The Compulsive Explainer summarizes handily:

We have been in an emergency mode for some time, and are now just recognizing it. This time it is a virus that is killing us, but we have been dying for a long time, from many causes. So many causes, they cannot be enumerated individually.

So for the near term, life goes on; for the farther term, maybe not.

In educational philosophy, learning is often categorized in three domains: the cognitive, the affective, and the psychomotor (called Bloom’s Taxonomy). Although formal education admittedly concentrates primarily on the cognitive domain, a well-rounded person gives attention to all three. The psychomotor domain typically relates to tool use and manipulation, but if one considers the body itself a tool, then athletics and physical workouts are part of a balanced approach. The affective domain is addressed through a variety of mechanisms, not least of which is narrative, much of it entirely fictional. We learn how to process emotions through vicarious experience as a safe way to prepare for the real thing. Indeed, dream life is described as the unconscious mind’s mechanism for consolidating memory and experience as well as rehearsing prospective events (strategizing) in advance. Nightmares are, in effect, worst-case scenarios dreamt up for the purpose of avoiding the real thing (e.g., falling from a great height or venturing too far into the dark — a proxy for the unknown). Intellectual workouts address the cognitive domain. While some are happy to remain unbalanced, focusing on strengths found exclusively in a single domain (gym rats, eggheads, actors) and thus remaining physically, emotionally, or intellectually stunted or immature, most understand that workouts in all domains are worth seeking out as elements of healthy development.

One form of intellectual workout is debate, now offered by various media and educational institutions. Debate is quite old but has been embraced with renewed gusto in a quest to develop content (using new media) capable of drawing in viewers, which mixes educational objectives with commercial interests. The time-honored political debate used to be good for determining where to cast one’s vote but has become nearly useless in the last few decades as neither the sponsoring organizations, the moderators, nor the candidates seem to understand anymore how to run a debate or behave properly. Instead, candidates use the opportunity to attack each other, ignore questions and glaring issues at hand, and generally refuse to offer meaningful responses to the needs of voters. Indeed, this last was among the principal innovations of Bill Clinton: roll out some appealing bit of vacuous rhetoric yet offer little to no guidance what policies will actually be pursued once in office. Two presidential administrations later, Barack Obama did much the same, which I consider a most egregious betrayal or bait-and-switch. Opinions vary.

In a recent Munk Debate, the proposition under consideration was whether humankind’s best days lie ahead or behind. Optimists won the debate by a narrow margin (determined by audience vote); however, debate on the issue is not binding truth, nor does debate really resolve the question satisfactorily. The humor and personalities of the debaters probably had more influence than their arguments. Admitting that I possess biases, I found myself inclined favorably toward the most entertaining character, though what I find entertaining is itself a further bias not especially shared by many others. In addition, I suspect the audience did not include many working-class folks or others who see their prospects for better lives diminishing rapidly, which skews the resulting vote. The age-old parental desire to leave one’s children a better future than their own is imperiled according to this poll (polls may vary considerably — do your own search). How one understands “better off” is highly variable, but the usual way that’s understood is in terms of material wellbeing.

Folks on my radar (names withheld) range widely in their enthusiasm or disdain for debate. The poles appear to range from default refusal to accept invitations to debate (often couched as open challenges to professed opinions) as a complete waste of time to earnest desire to participate in, host, and/or moderate debates as a means of informing the public by providing the benefit of expert argumentation. As an intellectual workout, I appreciate the opportunity to hear debates (at least when I’m not exasperated by a speaker’s lack of discipline or end-around arguments), but readers can guess from the title of this post that I expect nothing to be resolved by debate. Were I ever to be offered an opportunity to participate, I can well imagine accepting the invitation and having some fun flexing my intellectual muscles, but I would enter into the event with utterly no expectation of being able to convince anyone of anything. Minds are already too well made up on most issues. If I were offered a spot on some bogus news-and-opinion show to be a talking head, shot from the shoulders up and forced to shout and interrupt to get a brief comment or soundbite in edgewise, that I would decline handily as a total waste of time.

Didn’t expect to come back to this one so soon, but an alternative meaning behind my title just appeared. Whereas the first post was about cancel culture, this redux is about finding people willing and able to act as mouthpieces for whatever narrative the powers that be wish to foist on the public, as in “Where do they dig up these people?”

Wide-ranging opinion is not difficult to obtain in large populations, so although plenty of folks are willing to be paid handsomely to mouth whatever words are provided to them (e.g., public relations hacks, social media managers, promoters, spokespersons, actors, and straight-up shills in advertisements of all sorts), a better approach is simply to find people who honestly believe the chosen narrative so that they can do others’ bidding guilelessly, which is to say, without any need of selling their souls. This idea first came to my attention in an interview (can’t remember the source) given by Noam Chomsky, where he chided the interviewer, who had protested that no one was telling him what to say, by observing that if he didn’t already share the desired opinion, he wouldn’t have the job. The interviewer was hired and retained precisely because he was already on board. Those who depart from the prescribed organizational perspective are simply not hired, or if their opinions evolve away from the party line, they are fired. No need to name names, but many have discovered that journalistic objectivity (or at least a pose of objectivity) and independent thought are not high values in the modern media landscape.

Here’s a good example: 19-year-old climate change denier/skeptic Naomi Seibt is being billed as the anti-Greta Thunberg. No doubt Seibt believes the opinions she will be presenting at the Heartland Institute later this week. All the more authenticity if she does. But it’s a little suspicious, brazen and clumsy even, that another European teenage girl is being raised up to dispel Time Magazine’s 2019 Person of the Year, Greta Thunberg. Maybe it’s even true, as conspiracists suggest, that Thunberg herself is being used to drive someone else’s agenda. The MSM is certainly using her to drive ratings. These questions are all ways to distract from the main point, which is that we’re driving ourselves to extinction (alongside most of the rest of the living world) by virtue of the way we inhabit the planet and consume its finite resources.

Here’s a second example: a “debate” on the subject of socialism between economists Paul Krugman and Richard Wolff on PBS’s show Democracy Now!

Let me disclose my biases up front. I’ve never liked economists as analysts of culture, sociology, or electoral politics. Krugman in particular has always read like more of an apologist for economic policies that support the dysfunctional status quo, so I pay him little attention. On the other hand, Wolff has engaged his public as a respectable teacher/explainer of the renewed socialist movement of which he is a part, and I give him my attention regularly. In truth, neither of these fellows needed to be “dug up” from obscurity. Both are heavily covered in the media, and they did a good job not attacking each other while making their cases in the debate.

The weird thing was how Krugman is so clearly triggered by the word socialism, even though he acknowledges that the U.S. has many robust examples of socialism already. He was clearly the one designated to object to socialism as an ideology and described socialism as an electoral kiss of death. Maybe he has too many childhood memories of ducking, covering, and cowering during those Atomic Era air raid drills, and so socialism and communism were imprinted on him as evils never to be entertained. At least three generations after him lack those memories, however, and are not traumatized by the prospect of socialism. In fact, that’s what the Democratic primaries are demonstrating: no fear but rather enthusiastic support for the avowed Democratic Socialist on the ballots. Who are the fearful ones? Capitalists. They would be wise to learn sooner rather than later that the public, as Wolff says plainly, is ready for change. Change is coming for them.

One of the victims of cancel culture, coming to my attention only days ago, is Kate Smith (1907–1986), a singer of American popular song. Though Smith had a singing career spanning five decades, she is best remembered for her version(s) of Irving Berlin’s God Bless America, which justifiably became a bit of Americana. The decades of Smith’s peak activity were the 1930s and 40s.

/rant on

I dunno what goes through people’s heads, performing purity rituals or character excavation on folks long dead. The controversy stems from Smith having a couple other songs in her discography: That’s Why Darkies Were Born (1931) and Pickaninny Heaven from the movie Hello, Everybody! (1933). Hate to break it to anyone still living under a rock, but these dates are not far removed from minstrelsy, blackface, and The Birth of a Nation (1915) — a time when typical Americans referred to blacks with a variety of terms we now consider slurs. Such references were still used during the American civil rights movement (1960s) and are in use among some virulent white supremacists even today. I don’t know the full context of Kate Smith having sung those songs, but I suspect I don’t need to. In that era, popular entertainment had few of the sensibilities regarding race we now have (culture may have moved on, but it’s hard to say with a straight face it’s evolved or progressed humanely), and uttering commonly used terms back then was not automatic evidence of any sort of snarling racism.

I remember having heard my grandparents, nearly exact contemporaries of Kate Smith, referring to blacks (the term I grew up with, still acceptable I think) with other terms we no longer consider acceptable. It shocked me, but to them, that’s simply what blacks were called (the term(s) they grew up with). Absolutely nothing in my grandparents’ character or behavior indicated a nasty, racist intent. I suspect the same was true of Kate Smith in the 1930s.

Back when I was a librarian, I also saw plenty of sheet music published before 1920 or so with the term darkie (or darkey) in the title. See for example this. The Library of Congress still uses the subject headings “negro spirituals” (is there another kind?) and “negro songs” to refer to various subgenres of American folk song that includes slave songs, work songs, spirituals, minstrel music, protest songs, etc. Maybe we should cancel the Library of Congress. Some published music titles from back then even call them coon songs. That last one is totally unacceptable today, but it’s frankly part of our history, and like changing character names in Mark Twain’s Huckleberry Finn, sanitizing the past does not make it go away or any less discomfiting. But if you wanna bury your head in the sand, go ahead, ostrich.

Also, if some person or entity ever does some questionably racist, sexist, or malign thing (even something short of abominable) situated contextually in the past, does that mean he, she, or it must be cancelled irrevocably? If that be the case, then I guess we gotta cancel composer Richard Wagner, one of the most notorious anti-Semites of the 19th century. Also, stop watching Pixar, Marvel, and Star Wars films (among others), because remember that time when Walt Disney Studios (now Walt Disney Company) made a racist musical film, Song of the South (1946)? Disney’s tainted legacy (extending well beyond that one movie) is at least as awful as, say, Kevin Spacey, and we’re certainly not about to rehabilitate him.

/rant off

Nicholas Carr has a pair of thoughtful new posts at his blog Rough Type (see blogroll) under the tag “infinite media.” The second of the two is about context collapse, restoration, and content collapse. I won’t review that particular post; I’m merely pointing to it for you to read. Carr is a journalist and media theorist whose work is especially interesting to me as a partial antidote to what I’ve been calling our epistemological crisis. In short, he offers primers on how to think about stuff, that stuff being the primary medium through which most people now gather information: via screens.

Relatedly, the other media theorist to whom I pay attention is Alan Jacobs, who has a recent book (which I read but didn’t review or blog about) called more simply How to Think. It’s about recognizing and avoiding cognitive biases on the way to more disciplined, clear thinking. I mention these two fellows together because I’ve been reading their blogs and books for over a decade now and have been curious to observe how their public interactions have changed over time. They have each embraced and abandoned various new media (particularly social media) and adopted a more stringent media ecology. Carr posts occasionally now and has closed comments at his blog (a shame, since his commentariat was valuable, quite unlike the troll mob at most sites). Jacobs is even more aggressive, starting and abandoning one blog after another (he was active at multiple URLs, one formerly on my blogroll) and deleting his Twitter account entirely. Whatever goings-on occur at Facebook I can’t say; I never go there. These aren’t criticisms. We all evolve our associations and activities. But these two are unusual, perhaps, in that they evaluate and recommend with varying vehemence how to interact with electronic media tools.

The wide-open Web available to Americans (but restricted in some countries) used to be valorized as a wholly democratic, organic, grass-roots, decentralized force for good where information yearned to breathe free. Though pioneered by academic institutions, it wasn’t long before the porn industry became the first to monetize it effectively (cuz duh! that’s where the money was — at least initially) and then the whole thing was eventually overwhelmed by others with unique agendas and mechanisms, including commerce, surveillance, and propaganda. The surfeit of information demanded curation, and social media with algorithmic feeds became the default for folks either too lazy or just untrained (or uninterested) in how to think for themselves. Along the way, since a surprisingly large portion of human activity diverted to online media, that activity turned into a resource mined, harvested, and in turn monetized, much like the voting public has become a resource tracked, polled, channeled, activated, disenfranchised, corrupted, and analyzed to death.

An earlier media theorist I read with enthusiasm, Neil Postman, recommended that curricula include the study of semantics as applied to media. (Use of a word like semantics sends nonacademics running for the hills, but the recommendation is basically about thinking critically, even skeptically, regarding information, its sources, and its means of distribution.) The rise of handheld omnimedia postdates Postman, so I can only surmise that the bewildering array of information we absorb every day, which I liken to drinking from a fire hose, only compounds Postman’s concern that students are severely overmatched by media (especially advertising) intent on colonizing and controlling their minds. Thus, today’s information environment is a far cry from the stately slowness of earlier eras when teaching and learning (to say nothing of entertainment) were conducted primarily through reading, lecture, and discussion.

A comment came in on this blog chiding me for still blogging after 14 years. I admit hardly anyone reads anymore; they watch (or listen, as with audio-only podcasts). Preferred forms of media consumption have moved on from printed text, something USA Today recognized decades ago when it designed its print publication and sidewalk distribution boxes to look more like TVs. Nonetheless, the modest reproach reminded me of a cry in the wilderness by Timothy Burke: why he still blogs, though quite infrequently. (There’s a brokeback can’t-quit-you joke in there somewhere I’ll leave unformulated.) So this blog may indeed be past its proper expiration date, yet it remains for me one of the best means for organizing how I think about stuff. Without it, I’m afraid thoughts would be rattling loose inside my head, disorganized, only to be displaced by the next slurp from the fire hose.

Cenk Uygur is running for U.S. Congress in California. Good for him … I guess. Racehorse politics don’t actually interest me, at least as a topic for a blog post, but his decision to enter the electoral fray poses some curious issues. What follows is some context and unsolicited advice, the latter exceptional for me since I’m not a political advocate and don’t reside in Cenk’s district (or even state).

Unlike many who heap unwarranted praise on our interrelated systems of government and economics, or who subscribe to some version of Churchill’s quip that democracy is the worst form of government yet preferred over all the others, I regard representative democracy and capitalism both as dumpster fires in the process of burning out. Good ideas while they lasted, perhaps, but they consumed nearly all their available fuel and are now sputtering, leaving behind useless ash and detritus. As a journalist and political commentator, Cenk Uygur may be sensing his Hindenburg moment has arrived to jump onto the sinking RMS Titanic (mixing metaphors of doomed ships), meaning that a serendipitous right-time-right-place opportunity presented itself. Omigawd, the humanity! Others who had their unique Hindenburg moments and made good include Rudy Giuliani in the immediate aftermath of 9/11 (only to spiral down ignominiously) and Alexandria Ocasio-Cortez (AOC, elected to the U.S. Congress in 2018). Dunno about Cenk Uygur. His campaign website linked above rather conspicuously omits his surname (couldn’t find it anywhere). Maybe, as with AOC and Pete Buttigieg, the surname is just too challenging for folks. Curious choice.

I have mostly disregarded Cenk Uygur and The Young Turks (TYT) for some time now. They are part of the new media formed and distributed (primarily?) on the Web, though I’m doubtful they (and others) have yet established a viable rival to supplant traditional broadcast journalism (TV and radio), which has become sclerotic. How their business models (and the inevitable distortions those models introduce) differ is unclear. The main reason I ignore him/them is that TYT adopted a breezy, chatty, unscripted style that is less about reporting than interpreting mostly political news on the fly. They are essentially programming their viewers/subscribers with progressive talking points and orthodoxy, a form of narrowcasting. Onscreen “reporters” have come and gone, but none are as boorish as Cenk Uygur, who labors under the impression that he can outwit others with logic traps but really comes across as incoherent, unfocused, and ideological. TYT has also aired their dirty laundry in the form of beefs with former “correspondents.” None of this serves my political and/or intellectual interests.

The tone of TYT puzzles me, too, considering the utter seriousness of political dysfunction. Commentators appear to enjoy being in front of the camera for verbal jousting matches with each other and guests or simply to riff on the news. Another journalist clearly in love with being on-camera is Rachel Maddow, who has been pilloried for promulgating the Russiagate story relentlessly. Maybe anchors relishing (a little too much) being in the public eye is a collateral effect of news bureaus having been folded into the entertainment divisions of media conglomerates and being told to put forward a smiling face no matter what horrors are reported. If I want to see politics served up as jokes, I watch Jimmy Dore (who provides an alarming level of insight). If I want to watch people having entertaining fun, I watch movies or stream TV. I do not watch ideological news shows or political debates (if I watch at all) to be entertained but rather to be informed. While TYT commentators endeavor to be scrupulously factually correct in their opinions, they offer too little signal alongside the noise.

So here are a few recommendations for Cenk’s campaign, worth a couple cents at most:

  • Recognize that political decisions voters now face are no longer merely left/right, progressive/conservative, who-gets-to-hold-office binaries. Rather, it’s now whether we should track further down the path of authoritarian rule (e.g., a fascist national security state) masking itself as populism (but instead serving the plutocracy) under any political banner or instead serve the interests of the American people (best as able) as empire and industrial civilization sputter out.
  • Recognize that logic and reason are poor substitutes for good character and clarity of vision when the public (i.e., the great unwashed masses) responds more readily to jingoism, emotionalism, and empty rhetoric.
  • Zingers, gotchas, and takedowns are gladiatorial exploits that require more than mere accuracy to hit their marks and inflict damage. Take care not to indulge without considerable preparation and nuance. Some are obviously better at this than others.
  • When answering questions and/or giving interviews, do not mistake the exchange as a speech opportunity and dominate from one side (boorishness). Riffing, having fun, and sucking all the air out of the room are the attributes of TYT but wear thin in campaigning. Listening is just as important, maybe more.
  • Align your tone with the gravity of others’ suffering rather than your enjoyment of the applause and limelight. Your personal circumstances are not the proper locus of emotion.
  • Politics is deeply intertwined with wealth, power, and corruption and accordingly creates distortion fields that threaten to undo even the purest of hearts when compromise and/or betrayal are offered as lures. It’s an inevitability, a basic feature rather than a bug. Know that it’s coming. No one is incorruptible.

Admittedly, I’m not a campaign strategist and have no access to polling data. Accordingly, this post will likely go unread and its recommendations unheeded; I’m not a political playah. Think of this as the undesired Christmas gift so valueless it can’t even be returned for store credit.

Much ado over nothing was made this past week regarding a technical glitch (or control room error) during the first of two televised Democratic presidential debates where one pair of moderators’ mics was accidentally left on and extraneous, unintended speech leaked into the broadcast. It distracted the other pair of moderators enough to cause a modest procedural disruption. Big deal. This was not the modal case of a hot mic where someone, e.g., a politician, swears (a big no-no despite the shock value being almost completely erased in today’s media landscape) or accidentally reveals callous attitudes (or worse) thinking that no one important was listening or recording. Hot mics in the past have led to public outrage and criminal investigations. One recent example that still sticks in everyone’s craw was a novice political candidate who revealed he could use his fame and impudent nerve to “grab ’em by the pussy.” Turned out not to be the career killer everyone thought it would be.

The latest minor furor over a hot mic got me thinking, however, about inadvertent revelation of matters of genuine public interest. Three genres spring to mind: documentary films, whistle-blowing, and investigative journalism, that last including category outliers such as Wikileaks. Whereas a gaffe on a hot mic usually means the leaker/speaker exposes him- or herself and thus has no one else to blame, disclosures occurring in the other three categories are often against the will of those exposed. It’s obviously in the public interest to know about corruption, misbehavior, and malfeasance in corporate and political life, but the manner in which such information is made public is controversial. Those who expose others suffer harassment and persecution. Documentarians probably fare the best with respect to being left alone following release of information. Michael Moore, for all his absurd though entertaining theatrics, is free (so far as I know) to go about his business and do as he pleases. However, gestures to protect whistle-blowers are just that: gestures. Those who have leaked classified government information in particular, because they gained access to such information through security clearances and signed nondisclosure agreements (before knowing what secrets they were obliged to keep, which is frankly the way such obligations work), are especially prone to reprisal and prosecution. Such information is literally not theirs to disclose, but when keeping others’ secrets is heinous enough, some people feel their conscience and moral duty are superior to job security and other risks involved. Opinions vary, sometimes passionately. And now even journalists who uncover or merely come into possession of evidence of wrongdoing and later publish it — again, decidedly in the public interest — are subject to (malicious?) prosecution. Julian Assange is the current test case.

The free speech aspect of revealing someone else’s amoral and criminal acts is a fraught argument. However, it’s clear that as soon as damaging information comes to light, focus shifts away from the acts and their perpetrators to those who publish the information. Shifting the focus is a miserable yet well-established precedent by now, the result being that most folks who might consider coming forward to speak up now keep things to themselves rather than suffer entirely foreseeable consequences. In that light, when someone comes forward anyway, knowing that he or she will be hounded, vilified, arrested, and worse, that person deserves more respect for courage and self-sacrifice than is generally accorded in the aftermath of disclosure. The flip side — condemnation, prosecution, and death threats — is already abundant in the public sphere.

Some time after reports of torture at Guantánamo, Abu Ghraib, and Bagram went public, a handful of low-level servicemen (“bad apples” used to deflect attention down the command hierarchy) were prosecuted, but high-level officials (e.g., former U.S. presidents Bush and Obama, anyone in their respective administrations, and commanding officers on site) were essentially immunized from prosecution. That example is not quite the same as going after truth-tellers, but it’s a rather egregious instance of bad actors going unprosecuted. I’m still incensed by it. And that’s why I’m blogging about the hot mic. Lots of awful things go on behind the scenes without public knowledge or sanction. Those who commit high crimes (including war crimes) clearly know what they’re doing is wrong. Claims of national security are often invoked and gag laws are legislated into existence on behalf of private industry. When leaks do inevitably occur, those accused immediately attack the accuser, often with the aid of others in the media. Denials may also be issued (sometimes not — why bother?), but most bad actors hide successfully behind the deflecting shift of focus. When will those acting in the shadows against the public interest and in defiance of domestic and international law ever be brought to justice? I daresay the soul of the nation is at stake, and as long as officialdom escapes all but temporary public relations problems to be spun, the pride everyone wants to take as Americans eludes us. In the meantime, there’s a lot to answer for, and it keeps piling up.

/rant on

Yet another journalist has unburdened herself (unbidden story of personal discovery masquerading as news) of her addiction to digital media and her steps to free herself from the compulsion to be always logged onto the onslaught of useless information hurled at everyone nonstop. Other breaking news offered by our intrepid late-to-the-story reporter: water is wet, sunburn stings, and the Earth is dying (actually, we humans are actively killing it for profit). Freeing oneself from the screen is variously called digital detoxification (detox for short), digital minimalism, digital disengagement, digital decoupling, and digital decluttering (really ought to be called digital denunciation) and means limiting the duration of exposure to digital media and/or deleting one’s social media accounts entirely. Naturally, there are apps (counters, timers, locks) for that. Although the article offers advice for how to disentangle from screen addictions of the duh! variety (um, just hit the power switch), the hidden-in-plain-sight objective is really how to reengage after breaking one’s compulsions but this time asserting control over the infernal devices that have taken over life. It’s a love-hate style of technophilia and chock full of illusions embarrassing even to children. Because the article is nominally journalism, the author surveys books, articles, software, media platforms, refuseniks, gurus, and opinions galore. So she’s partially informed but still hasn’t demonstrated a basic grasp of media theory, the attention economy, or surveillance capitalism, all of which relate directly. Perhaps she should bring those investigative journalism skills to bear on Jaron Lanier, one of the more trenchant critics of living online.

I rant because the embedded assumption is that anything, everything occurring online is what truly matters — even though online media didn’t yet exist as recently as thirty years ago — and that one must (must, I say! c’mon, keep up!) always be paying attention to matter in turn or suffer from FOMO. Arguments in favor of needing to be online for information and news gathering are weak and ahistorical. No doubt the twisted and manipulated results of Google searches, sometimes contentious Wikipedia entries, and various dehumanizing, self-as-brand social media platforms are crutches we all now use — some waaaay, way more than others — but they’re nowhere close to the only or best way to absorb knowledge or stay in touch with family and friends. Career networking in the gig economy might require some basic level of connection but shouldn’t demand the all-encompassing, soul-destroying work that maintaining an active public persona has become.

Thus, everyone is chasing likes and follows and retweets and reblogs and other analytics as evidence of somehow being relevant on the sea of ephemera floating around us like so much disused, discarded plastic in those infamous garbage gyres. (I don’t bother to chase and wouldn’t know how to drive traffic anyway. Screw all those solicitations for search-engine optimization. Paying for clicks is for chumps, though lots apparently do it to enhance their analytics.) One’s online profile is accordingly a mirror of or even a substitute for the self — a facsimile self. Lost somewhere in my backblog (searched, couldn’t find it) is a post referencing several technophiles positively celebrating the bogus extension of the self accomplished by developing and burnishing an online profile. It’s the domain of celebrities, fame whores, narcissists, and sociopaths, not to mention a few criminals. Oh, and speaking of criminals, recent news is that OJ Simpson just opened a Twitter account to reform his disastrous public image, but he is fundamentally outta touch with how deeply icky, distasteful, and disgusting it feels to others for him to be participating once again in the public sphere. Disgraced celebrities negatively associated with the Me-Too Movement (is there really such a movement or was it merely a passing hashtag?) have mostly crawled under their respective multimillion-dollar rocks and not been heard from again. Those few who have tried to reemerge are typically met with revulsion and hostility (plus some inevitable star-fuckers with short memories). Hard to say when, if at all, forgiveness and rejoining society become appropriate.

/rant off

“Come with me if you want to live.” That’s among the quotable lines from the latest movie in the Terminator franchise, though it’s not nearly so succinct or iconic as “I’ll be back” from the first Terminator. Whereas the latter has the quality (in hindsight) of slow, implacable inevitability (considering the Terminator is literally a death-bringer), the former occurs within the context of a character having only just traveled back in time, not yet adequately reoriented, and forced to make a snap decision under duress. “I’ll be back” might be easy to brush off as harmless (temporary denial) since the threat recedes — except that it doesn’t, it’s merely delayed. “Come with me …” demands a leap of faith (or trust) because the danger is very real at that instant.

Which quote, I must ask, better characterizes the threat of climate change? My answer: both, but at different times. Three to four decades ago, it was the “I’ll be back” type: building slowly but inevitable given the underlying structure of industrial civilization. That structure was known even then by a narrow circle of experts (e.g., engineers for Big Oil and at the Dept. of Energy) to be a heat engine, meaning that we would ultimately cook our own goose by warming the planet. Warming alters the climatic steady state under which our most recent civilization has flourished, producing a steady loss of biodiversity and biomass until our own human habitat (the entirety of the planet by now) becomes a hostile environment unable (unwilling, if one anthropomorphizes Mother Nature) to support our swollen population. All that was if we stayed on course and took no corrective action. Despite foreknowledge and ample warning, that’s precisely what occurred (and continues today).

With the Intergovernmental Panel on Climate Change (IPCC) in particular, the threat has for roughly a decade shifted over to “Come with me ….” It’s no longer possible to put things off, yet we continue to dither well beyond the tipping point where/when we can still save ourselves from self-annihilation. Although scientists have been gathering data and evidence, forming an overwhelming consensus, and sounding the alarm, scientific illiteracy, realpolitik, journalistic malpractice, and corporate greed have all conspired to grant the illusion of time to react that we simply don’t have anymore (and truth be told, probably didn’t as of the early 1980s).

I’m aware of at least three journalists (relying on the far more authoritative work of scientific consensus) who have embraced the message: Dahr Jamail, Thom Hartmann, and David Wallace-Wells. None to my knowledge has been able to bring himself to admit that humanity is now a collection of dead men walking. They can’t muster the courage to give up hope (or to report truthfully), clinging to the possibility we may still have a fleeting chance to avert disaster. I heard Ralph Nader on his webcast say something to the same effect, namely, what good is it to rob others of hope? My personal values adhere to unstinting truth rather than illusion or self-deception, so I subscribe to Guy McPherson’s assessment that we face near-term human extinction (precise date unknown but soon if, for example, this is the year we get a blue ocean event). Simply put, McPherson is professor emeritus of natural resources and ecology and evolutionary biology at the University of Arizona [note my emphasis]. I trust his scholarship (summarizing the work of other scientists and drawing necessary though unpalatable conclusions) more than I trust journalistic shaping of the story for public consumption.

The obvious metaphor for what we face is a terminal medical diagnosis, or if one has hope, perhaps a death sentence about to be carried out but with the possibility of a last-minute stay of execution via phone call from the governor. Opinions vary whether one should hope/resist up to the final moment or make peace with one’s fate. By not telling the truth — by using the “I’ll be back” characterization when it’s really “Come with me …” — the MSM has, I daresay, denied the public the second option. Various authors on the Web offer a better approximation of the truth (such as it can be known) and form a loose doomer network (a/k/a collapsniks). This blog is (an admittedly tiny) part of that doomersphere, which gives me no pleasure.

There is something ironic and vaguely tragic about how various Internet platforms — mostly search engines and social media networks — have unwittingly been thrust into roles their creators never envisioned for themselves. Unless I’m mistaken, they launched under the same business model as broadcast media: create content, or better yet, crowd-source content, to draw in viewers and subscribers whose attention is then delivered to advertisers. Revenue is derived from advertisers while the basic services — i.e., search, job networking, encyclopedias and dictionaries, or social connection — are given away gratis. The modest inconveniences and irritations of having the screen littered and interrupted with ads are a trade-off most end users are happy to accept for free content.

Along the way, some platform operators discovered that user data itself could be both aggregated and individualized and subsequently monetized. This second step unwittingly created the so-called surveillance capitalism that Shoshana Zuboff writes about in her recently published book (which I previously blogged about here). Essentially, an Orwellian Big Brother (several of them, in fact) tracks one’s activity through smart phone apps and Web browsers, including GPS data revealing movement through real space, not just virtual spaces. This is also the domain of the national security state, from local law enforcement to the various security branches of the Federal government: dragnet surveillance where everyone is watched continuously. Again, end users shrug off surveillance as either no big deal or too late to resist.

The most recent step is that, like the Internet itself, various platforms have been functioning for some time already as public utilities and have accordingly come under demands for regulation with regard to authenticity, truth, and community standards of allowable speech. Thus, private corporations have been thrust unexpectedly into the role of regulating content. Problem is, unlike broadcast networks that create their own content and can easily enforce restrictive standards, crowd-sourced platforms enable the general population to upload its own content, often mere commentary in text form but increasingly as video content, without any editorial review. These platforms have parried by deploying and/or modifying their preexisting surveillance algorithms in search of objectionable content normally protected as free speech and have taken steps to remove content, demonetize channels, and ban offending users indefinitely, typically without warning and without appeal.

If Internet entrepreneurs initially got into the biz to make a few (or a lot of) quick billions, which some few of them have, they have by virtue of the global reach of their platforms been transformed into censors. It’s also curious that by enabling end users to publish to their platforms, they’ve given voice to the masses in all their unwashed glory. Now, everyone’s crazy, radicalized uncle (or sibling or parent or BFF) formerly banished to obscurity railing against one thing or another at the local tavern, where he was tolerated as harmless so long as he kept his bar tab current, is proud to fly his freak flag anywhere and everywhere. Further, the anonymous coward who might issue death or bomb threats to denounce others has been given means to distribute hate across platforms and into the public sphere, where it gets picked up and maybe censored. Worst of all, the folks who monitor and decide what is allowed, functioning as modern-day thought police, are private citizens and corporations with no oversight or legal basis to act except for the fact that everything occurs on their respective platforms. This is a new aspect to the corporatocracy, but not one anyone planned.