Archive for the ‘Blogosphere’ Category

I simply can’t keep up with all the reading, viewing, and listening in my queue. Waking hours are too few, and concentration dissipates long before sleep overtakes. Accordingly, it’s much easier to settle into couch-potato mode and watch some mindless drivel, such as the Netflix hit Bridgerton, binged in two sittings. (Unlike cinema critics, I’m not especially bothered by continuity errors, plot holes, clunky dialogue, weak character motivation, gaps of logic, or the glossy decadence of the fictional worlds. I am bothered by the Kafka trap sprung on anyone who notices casting decisions that defy time and place — an ill-advised but now commonplace historical revisionism akin to editing Mark Twain.) As a result, blog posts are less frequent than they might otherwise be as I pronounce upon American (or more broadly, Western) culture, trying vainly to absorb it as a continuously moving target. Calls to mind the phrase Après moi, le déluge, except that there is no need to wait: a deluge of entertainment, news, analysis, punditry, and trolling has buried everyone already. So rather than the more careful consideration I prefer to post, here are some hot takes.

The Irregular Aphorist. Caitlin Johnstone offers many trenchant observations in the form of aphorisms (some of which I’ve quoted before), all gathered under the subtitle Notes From The Edge Of The Narrative Matrix. The modifier irregular only means that aphorisms are a regular but not constant feature. Her site doesn’t have a tag to that effect but probably ought to. Here’s one in particular that caught my attention:

Everything our species has tried has led us to a dying world and a society that is stark raving mad, so nobody is in any position to tell you that you are wrong.

Twin truths here are (1) the dying world and (2) societal madness, both of which I’ve been describing for some time. Glad when others recognize them, too.

Piling on. Though few are still willing to admit it, nonpharmaceutical interventions (NPIs, e.g., distancing, masks, and lockdowns) meant to stall or reduce the spread of the virus failed to achieve their objectives according to this study. Instead, NPIs piled on suffering no one could forestall. I read somewhere (no link) that the world is approaching half of the total, cumulative deaths/infections predicted had nothing been done to impede the pandemic running its course. Add in deaths of despair (numbers not entirely up to date) and it’s plain we’re using the wrong tools to fight the wrong battle. Of course, interventions opened up giant opportunities for power grabs and vulture capitalism, so the cynic in me shrugs and wonders half aloud, “what did you expect, really?”

Growth of the Managerial Bureaucracy. A blog called Easily Distracted by Timothy Burke (never on my blogroll) publishes only a few times per year, but his analysis is terrific — at least when it doesn’t wind up being overlong and inconclusive. Since a student debt jubilee is back in the news (plenty of arguments pro and con), unintended consequences are anticipated in this quote:

When you set out to create elaborate tiers that segregate the deserving poor from the comfortable middle-class and the truly wealthy, you create a system that requires a massive bureaucracy to administer and a process that forces people into petitionary humiliation in order to verify their eligibility. You create byzantine cutoff points that become business opportunities for predatory rentiers.

Something similar may well be occurring with stimulus checks being issued pro rata (has anyone actually gotten one?), but at least we’re spared any petitionary humiliations. We get whatever the algorithms (byzantine cutoff points) dictate. How those funds will be gamed and attached is not yet clear. Stay alert.

No Defense of Free Speech. Alan Jacobs often recommends deleting, unsubscribing from, and/or ignoring social media accounts (after his own long love-hate relationship with them) considering how they have become wholly toxic to a balanced psyche as well as principal enablers of surveillance capitalism and narrative control. However, in an article about the manorial elite, he’s completely lost the plot: absolutism is required in defense of free speech. It’s not sufficient to be blasé or even relieved when 45 is kicked off Twitter permanently or when multiple parties conspire to kill Parler. Establishing your own turf beyond the reach of Silicon Valley censors is a nice idea but frankly impractical. Isn’t that what whoever ran Parler (or posted there) must have thought? And besides, fencing off the digital commons these very entities created has catapulted them into the unenviable position of undemocratic, unelected wielders of monopolistic power and co-conspirators to boot. That’s what needs to be curtailed, not free speech.

The Taxonomic Apocalypse. Although drawn from fiction and thus largely hypothetical, a new book (coming late 2021) by Adam Roberts called It’s the End of the World: But What Are We Really Afraid Of? surveys doomsday stories and categorizes different versions of how it all ends. Alan Jacobs (yeah, him again — must have an advance copy of the manuscript) recommends it as “a delightful and provocative little book” but fails to grok two things: (1) these stories are rehearsals-cum-preparations for the real thing, and (2) the real thing really is bearing down on us implacably and so is no longer a mere hypothetical to contemplate and categorize for shits and grins. Despite acceptance of the eventualities that await all of us, reading Roberts’ taxonomy is not something I would expect to find delightful. Skip.

Narrative Collapse. Ran Prieur (no link) sometimes makes statements revealing an unexpected god’s-eye view:

[45] is a mean rich kid who figured out that if he does a good Archie Bunker impression, every lost soul with an authoritarian father will think he’s the messiah. We’re lucky that he cares only about himself, instead of having some crazy utopian agenda. But the power, and the agency, is with the disaffected citizens of a declining empire, tasting barbarism.

This is all about people wanting to be part of a group that’s part of a story. Lately, some of the big group-stories have been dying: sky father religion, American supremacy, the conquest of nature, the virtue of wealth-seeking. In their place, young and clumsy group-stories struggle and rise.

Collapse of certain fundamental stories that animate our thinking is at the core of The Spiral Staircase (see About Brutus at top), though it’s often couched in terms of consciousness in transition. Getting through the transition (only temporarily, see previous item in list) probably means completion of the Counter-Enlightenment historical arc, which necessarily includes further descent into barbarism.

Hail Mary for Individualism. I always take special notice when someone cites Allan Bloom. Alan Jacobs (um, yeah, he’s prolific and I’m using his ideas again — sue me) cites Bloom to argue that individualism or the sovereign self, a product of the Enlightenment, is already dead. No doubt, the thought-world described so ably by Bloom no longer exists, but individualism has not yet died out by attrition or been fully dissolved in nonduality. Many of us born before the advent of the Internet retain selfhood and authenticity not yet coopted by or incorporated into mass mind. Moreover, ongoing struggles over identity (e.g., gender, sexual orientation, and race that are often used improperly to define the self) result from an inchoate sense that individualism is eroding precipitously, not that it’s already passé. Defiant attempts to (re)establish an authentic self (contravening all logic and becoming critical theory of one sort or another) in the face of this loss may well be a last-ditch effort to save the self, but it’s failing.

I’ve reached another crossroads. Chalk it up to pandemic exhaustion at being mostly cooped up for the better part of a year. Of course, this state is on top of other sources of exhaustion (politics, doom, the grinding news cycle) that drained my enthusiasm for things I used to do before meaningful (to me) endeavors were all cancelled and everyone was forced to search for meaning staring at surfaces (e.g., the walls, pages, and screens — especially screens for most Americans, I daresay). So as the year and decade draw to a close, I anticipate a spate of lists and summaries as we move into 2021 with the hope it won’t be worse than 2020 — a faint hope, I might add, since nothing has been resolved except perhaps (!) which listless septuagenarian gets to sit in the Oval Office. The jury is still out whether vaccines will have the intended effect.

Aside: The calendar is not a timer or odometer. So although we change the calendar to 2021, the new year is the first year of the new decade (the third decade of the 21st century, obviously). We struggled with this issue at the end of the previous century/millennium, which turned when 2000 became 2001, not (as more popularly celebrated) when 1999 became 2000. The discrepancy arises because calendars begin counting each month, year, etc. with 1, not 0. So the first ten counting numbers are 1–10, not 0–9, and all decades run from xx01 to xx10. However, timers and odometers begin counting at 0 and show elapsed intervals, so the first ten minutes or miles run from the start (at 0) to the end of 9, at which point the odometer in particular rolls to 10 and begins a new sequence. I realize I’m being a pointy-headed snoot about this, but it’s a relatively easy concept to understand. Innumeracy evident among the public is a microcosm for all the other easy concepts so badly misunderstood.
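For the code-minded, here’s a minimal sketch of the two counting schemes (Python; the function names are mine, purely illustrative, nothing official):

def calendar_decade(year):
    # Calendars count from 1, so decades run xx01 to xx10 (e.g., 2021-2030).
    start = (year - 1) // 10 * 10 + 1
    return range(start, start + 10)

def odometer_interval(miles):
    # Odometers count from 0, so the first ten miles are 0-9.
    start = miles // 10 * 10
    return range(start, start + 10)

d = calendar_decade(2021)
print(d[0], d[-1])  # 2021 2030 (first and last years of the new decade)

o = odometer_interval(0)
print(o[0], o[-1])  # 0 9 (the odometer rolls to 10 only afterward)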

I’ve admitted to feelings of exhaustion and defeat numerous times, and indeed, hope eludes me whilst a narrow group of things still produces enjoyment. But my blogroll is no longer one of those things. I recently wrote the following to an acquaintance of mine:

Now that collapse narratives have matured, over two decades old for some (about 14 years for me), I notice that existential threats are still too remote and contingent for most to do more than signal some vague level of awareness and/or concern before returning to normal life. A few who sank into it deeply recognized that nothing positive comes out of it and have retreated from public life, or at least ongoing tracking and reporting. Several of the sites I used to frequent for news and perspective have dried up, and my finding is that adding more awfulness to the pile doesn’t enhance my understanding anymore, so I’ve also largely stopped gathering information. I still cite collapse frequently at my doom blog, but I have other things to write about.

I’m one of those who sank into the collapse narrative rather deeply and blogged about it consistently. By now, the sole available positive outcome has manifested: the recognition (and with it, resignation) that nothing will or can be done to avert disaster. So I’m dumping the doom and inactive blogs from my blogroll. I’ll continue to blog about and bear witness to the gathering storm: the cascade failure of industrial civilization. It’s proven to be a more protracted process than expected (at least by me), but no promises that it will stall until the conclusion of the century (2100) before sea levels rise and flora and fauna expire. Human habitat will continue to diminish decade by decade, and at some point, so will human population — already shown to be rather precariously perched on an illusory safety and security we take as business as usual. I’ll keep a couple of the respectable truth-telling blogs just to have something to which to link. I have no links to add at this point.

Caveat: this post is uncharacteristically long and perhaps a bit disjointed. Or perhaps an emerging blogging style is being forged. Be forewarned.

Sam Harris has been the subject of or mentioned in numerous previous blog posts. His podcast Making Sense (formerly, Waking Up), partially behind a paywall but generously offered for free (no questions asked) to those claiming financial hardship, used to be among those I tuned into regularly. Like the Joe Rogan Experience (soon moving to Spotify — does that mean its disappearance from YouTube?), it has been attractive for its diversity of guests and reliable intellectual stimulation. Calling his podcast Making Sense aligns with my earnest concern over actually making sense of things as the world spins out of control and our epistemological crisis deepens. Yet Harris has been a controversial figure since coming to prominence as a militant atheist. I really want to like what Harris offers, but regrettably, he has lost (most of) my attention. Others reaching the same conclusion have written or vlogged their reasons, e.g., “Why I’m no longer a fan of ….” Do a search.

Having already ranted over specific issues Harris has raised, let me instead register three general complaints. First, once a subject is open for discussion, it’s flogged to death, often without reaching any sort of conclusion, or frankly, helping to make sense. For instance, Harris’ solo discussion (no link) regarding facets of the killing of George Floyd in May 2020, which event sparked still unabated civil unrest, did more to confuse than clarify. It was as though Harris were trying the court case by himself, without a judge, jury, or opposing counsel. My second complaint is that Harris’ verbosity, while impressive in many respects, leads to interviews marred by long-winded, one-sided speeches where the thread is hopelessly lost, blocking an interlocutor from tracking and responding effectively. Whether Harris intends to bury others under an avalanche of argument or does so uncontrollably doesn’t matter. It’s still a Gish gallop. Third is his over-emphasis on hypotheticals and thought experiments. Extrapolation is a useful but limited rhetorical technique, as is distillation. However, treating prospective events as certainties is tantamount to building arguments on poor foundations, namely, abstractions. Much as I admire Harris’ ambition to carve out a space within the public sphere to get paid for thinking and discussing topics of significant political and philosophical currency, he frustrates me enough that I rarely tune in anymore.


Most of us are familiar with a grandpa, uncle, or father who eventually turns into a cranky old man during late middle age or in his dotage. (Why is it a mostly male phenomenon?) In the last three decades, Clint Eastwood typecast himself as a cranky old man, building on lone-wolf characters (mostly cops, criminals, and cowboys) established earlier in his career. In real life, these guys spout talking points absorbed from mainstream media and narrative managers, or if they are truly lazy and/or can’t articulate anything coherently on their own, merely forward agitprop via e-mail like chain mail of yore. They also demonstrate remarkably forgivable racism, sexism, and bigotry, such as Eastwood’s rather enjoyable and ultimately redeemed character in the film Gran Torino. If interaction with such a fellow is limited to Thanksgiving gatherings once per year, crankiness can be tolerated fairly easily. If interactions are ongoing, then a typical reaction is simply to delete e-mail messages unread, or in the case of unavoidable face-to-face interaction, to chalk it up: Well, that’s just Grandpa Joe or Uncle Bill or Dad. Let him rant; he’s basically harmless now that he’s so old he creaks.

Except that not all of them are so harmless. Only a handful of the so-called Greatest Generation (I tire of the term but it’s solidly established) remain in positions of influence. However, lots of Boomers still wield considerable power despite their advancing age, looming retirement (and death), and basic out-of-touchness with a culture that has left them behind. Nor are their rants and bluster necessarily wrong. See, for instance, this rant by Tom Engelhardt, which begins with these two paragraphs:

Let me rant for a moment. I don’t do it often, maybe ever. I’m not Donald Trump. Though I’m only two years older than him, I don’t even know how to tweet and that tells you everything you really need to know about Tom Engelhardt in a world clearly passing me by. Still, after years in which America’s streets were essentially empty, they’ve suddenly filled, day after day, with youthful protesters, bringing back a version of a moment I remember from my youth and that’s a hopeful (if also, given Covid-19, a scary) thing, even if I’m an old man in isolation in this never-ending pandemic moment of ours.

In such isolation, no wonder I have the urge to rant. Our present American world, after all, was both deeply unimaginable — before 2016, no one could have conjured up President Donald Trump as anything but a joke — and yet in some sense, all too imaginable …

If my own father (who doesn’t read this blog) could articulate ideas as well as Engelhardt, maybe I would stop deleting unread the idiocy he forwards via e-mail. Admittedly, I could well be following in my father’s footsteps, as the tag rants on this blog indicates, but at least I write my own screed. I’m far less accomplished at it than, say, Engelhardt, Andy Rooney (in his day), Ralph Nader, or Dave Barry, but then, I’m only a curmudgeon-in-training, not having fully aged (or elevated?) yet to cranky old manhood.

As the fall presidential election draws near (assuming that it goes forward), the choice in the limited U.S. two-party system is between two cranky old men, neither of whom is remotely capable of guiding the country through this rough patch at the doomer-anticipated end of human history. Oh, and BTW, echoing Engelhardt’s remark above, 45 has been a joke all of my life — a dark parody of success — and remains so despite occupying the Oval Office. The primary opponent of one of them, until only a couple months ago, was Bernie Sanders, himself a cranky old man but far more endearing at it. This is what passes for the best leadership on offer?

Many Americans are ready to move on to someone younger and more vibrant, able to articulate a vision for something, well, different from the past. Let’s skip right on past candidates (names withheld) who parrot the same worn-out ideas as our fathers and grandfathers. Indeed, a meme emerged recently to the effect that the Greatest Generation saved us from various early 20th-century scourges (e.g., Nazis and Reds) only for the Boomers to proceed in their turn to mess up the planet so badly nothing will survive new scourges already appearing. It may not be fair to hang such labels uniformly around the necks of either generation (or subsequent ones); each possesses unique characteristics and opportunities (some achieved, others squandered) borne out of their particular moment in history. But this much is clear: whatever happens with the election and whichever generational cohort assumes power, the future is gonna be remarkably different.

This is an infrequent feature of this blog: additions to and deletions from my blogroll. Other bloggers attract my attention for various reasons, mostly the quality of writing and ideas (interrelated), but over time, some start to repel me. This update has several in both categories.

At Wit’s End, Three-Pound Brain, and Bracing Views were all added a while back. The first two have new posts very infrequently, but the quality is very high (IMO). The last is far more active and solicits commentary openly. Subject matter at these blogs varies widely, and only the third could be accused of being an outrage engine. It’s a worthwhile read nonetheless if political dysfunction doesn’t ignite in you a firestorm of rage and indignation.

Dropping Creative Destruction, Gin & Tacos, and Pharyngula. The first has been dead for a long time; nothing there to see anymore besides the backblog. I thought it might eventually revive, but alas, no. Updates to the second have dropped off significantly as authorial attention shifted to podcasting. The commentariat there was especially worthwhile, but with so few new posts, the disappearance of whimsical history lessons, and an irritating focus on horse-race politics, the blog has lost my recommendation. The third used to be a fun read, especially for being well argued. The tone shifted at some point toward smug, woke fellation in service of an in-group, by definition excluding everyone else. Like another unmentioned blog dropped from my blogroll some years ago, the author behaves like an omniscient bully: absolutely correct about everything all the time. The lack of humility or tolerance for ambiguity — or even the very human admission once in a while “I dunno …” — is exhausting.

Final admission: traffic to and from this blog is chronically low, so no blogger cares about being added to or removed from my blogroll. No illusions about that on my part. However, respectable curation is a value worth maintaining with periodic updates.

Nicholas Carr has a pair of thoughtful new posts at his blog Rough Type (see blogroll) under the tag “infinite media.” The second of the two is about context collapse, restoration, and content collapse. I won’t review that particular post; I’m merely pointing to it for you to read. Carr is a journalist and media theorist whose work is especially interesting to me as a partial antidote to what I’ve been calling our epistemological crisis. In short, he offers primers on how to think about stuff, that stuff being the primary medium through which most people now gather information: via screens.

Relatedly, the other media theorist to whom I pay attention is Alan Jacobs, who has a recent book (which I read but didn’t review or blog about) called more simply How to Think. It’s about recognizing and avoiding cognitive biases on the way to more disciplined, clear thinking. I mention these two fellows together because I’ve been reading their blogs and books for over a decade now and have been curious to observe how their public interactions have changed over time. They have each embraced and abandoned various new media (particularly social media) and adopted a more stringent media ecology. Carr posts occasionally now and has closed comments at his blog (a shame, since his commentariat was valuable, quite unlike the troll mob at most sites). Jacobs is even more aggressive, starting and abandoning one blog after another (he was active at multiple URLs, one formerly on my blogroll) and deleting his Twitter account entirely. Whatever goings-on occur at Facebook I can’t say; I never go there. These aren’t criticisms. We all evolve our associations and activities. But these two are unusual, perhaps, in that they evaluate and recommend with varying vehemence how to interact with electronic media tools.

The wide-open Web available to Americans (but restricted in some countries) used to be valorized as a wholly democratic, organic, grass-roots, decentralized force for good where information yearned to breathe free. Though pioneered by academic institutions, it wasn’t long before the porn industry became the first to monetize it effectively (cuz duh! that’s where the money was — at least initially) and then the whole thing was eventually overwhelmed by others with unique agendas and mechanisms, including commerce, surveillance, and propaganda. The surfeit of information demanded curation, and social media with algorithmic feeds became the default for folks either too lazy or just untrained (or uninterested) in how to think for themselves. Along the way, since a surprisingly large portion of human activity diverted to online media, that activity turned into a resource mined, harvested, and in turn monetized, much like the voting public has become a resource tracked, polled, channeled, activated, disenfranchised, corrupted, and analyzed to death.

An earlier media theorist I read with enthusiasm, Neil Postman, recommended that curricula include the study of semantics as applied to media. (Use of a word like semantics sends nonacademics running for the hills, but the recommendation is basically about thinking critically, even skeptically, regarding information, its sources, and its means of distribution.) The rise of handheld omnimedia postdates Postman, so I can only surmise that the bewildering array of information we confront and absorb every day, which I liken to drinking from a fire hose, only compounds Postman’s concern that students are severely overmatched by media (especially advertising) intent on colonizing and controlling their minds. Thus, today’s information environment is a far cry from the stately slowness of earlier eras when teaching and learning (to say nothing of entertainment) were conducted primarily through reading, lecture, and discussion.

A comment came in on this blog chiding me for still blogging after 14 years. I admit hardly anyone reads anymore; they watch (or listen, as with audio-only podcasts). Preferred forms of media consumption have moved on from printed text, something USA Today recognized decades ago when it designed its print publication and sidewalk distribution boxes to look more like TVs. Nonetheless, the modest reproach reminded me of a cry in the wilderness by Timothy Burke: why he still blogs, though quite infrequently. (There’s a brokeback can’t-quit-you joke in there somewhere I’ll leave unformulated.) So this blog may indeed be past its proper expiration date, yet it remains for me one of the best means for organizing how I think about stuff. Without it, I’m afraid thoughts would be rattling loose inside my head, disorganized, only to be displaced by the next slurp from the fire hose.

The “American character,” if one can call it into being merely by virtue of naming it (the same rhetorical trick as solutionism), is diverse and ever-changing. Numerous characterizations have been offered throughout history, with Alexis de Tocqueville’s Democracy in America (1835 and 1840) being perhaps the one cited most frequently despite its outdatedness. Much in American character has changed since that time, and it’s highly questionable to think it was unified even then. However, as a means of understanding ourselves, it’s as good a place to start as any. A standard criticism of American character as seen from outside (i.e., when Americans travel abroad) is the so-called ugly American: loud, inconsiderate, boorish, and entitled. Not much to argue with there. A more contemporary assessment by Morris Berman, found throughout his “American trilogy,” is that we Americans are actually quite stupid, unaccountably proud of it, and constantly hustling (in the pejorative sense) in pursuit of material success. These descriptions don’t quite match up with familiar jingoism about how great America is (and of course, Americans), leading non-Americans to clamor to immigrate here, or with the self-worship we indulge in every national holiday celebrating political and military history (e.g., Independence Day, Veterans Day, Memorial Day).

I recently ran afoul of another ugly aspect of our national character: our tendency toward aggression and violence. In truth, this is hardly unique to Americans. Yet it came up glaringly in the context of a blog post at Pharyngula citing a tweet that compared uneven application of the law (and indignation among online chatterers?) when violence is committed by the political left vs. the political right. Degree of violence clearly matters, but obvious selection bias was deployed to present an egregiously lopsided perspective. Radicals on both the left and right have shown little compunction about using violence to achieve their agendas. Never mind how poorly conceived those agendas may be. What really surprised me, however, was that my basic objection to violence in all forms across the spectrum was met with snark and ad hominem attack. When did reluctance to enact violence (including going to war) until extremity demands it become controversial?

My main point was that resorting to violence typically invalidates one’s objective. It’s a desperation move. Moreover, using force (e.g., intimidation, threats, physical violence — including throwing milkshakes) against ideological opponents is essentially policing others’ thoughts. But they’re fascists, right? Violence against them is justified because they don’t eschew violence. No, wrong. Mob justice and vigilantism obviate the rule of law and criminalize any perpetrator of violence. It’s also the application of faulty instrumental logic, ceding any principled claim to moral authority. But to commentators at the blog post linked above, I’m the problem because I’m not in support of fighting fascists with full force. Guess all those masked, caped crusaders don’t recognize that they’re contributing to lawlessness and mayhem. Now even centrists come in for attack for not being radical (or aggressive, or violent) enough. Oddly silent in the comments is the blog host, P.Z. Myers, who has himself communicated approval of milkshake patrols and Nazi punching, as though the presumptive targets (identified rather haphazardly and incorrectly in many instances) have no right to their own thoughts and ideas, vile though they may be, and that violence is the right way to “teach them a lesson.” No one learns the intended lesson when made the victim of violence. Rather, if not simply cowed into submission (not the same as agreement), tensions tend to escalate into further and increasing violence. See also reaction formation.

Puzzling over this weird exchange with these, my fellow Americans (the ideologically possessed ones anyway), caused me to backtrack. For instance, the definition of fascism at dictionary.com is “a governmental system led by a dictator having complete power, forcibly suppressing opposition and criticism, regimenting all industry, commerce, etc., and emphasizing an aggressive nationalism and often racism.” That definition sounds more like totalitarianism or dictatorship and is backward looking, specifically to Italy’s Benito Mussolini in the period 1922 to 1943. However, like national characters, political moods and mechanisms change over time, and the recent fascist thrust in American politics isn’t limited to a single leader with dictatorial power. Accordingly, the definition above has never really satisfied me.

I’ve blogged repeatedly about incipient fascism in the U.S., the imperial presidency (usually associated with George W. Bush but also characteristic of Barack Obama), James Howard Kunstler’s prediction of a cornpone fascist coming to power (the way paved by populism), and Sheldon Wolin’s idea of inverted totalitarianism. What ties these together is how power is deployed and against what targets. More specifically, centralized power (or force) is directed against domestic populations to advance social and political objectives without broad public support for the sole benefit of holders of power. That’s a more satisfactory definition of fascism to me, certainly far better than Peter Schiff’s ridiculous equation of fascism with socialism. Domination of others to achieve objectives describes the U.S. power structure (the military-industrial-corporate complex) to a tee. That doesn’t mean manufactured consent anymore; it means bringing the public into line, especially through propaganda campaigns, silencing of criticism, prosecuting whistle-blowers, and broad surveillance, all of which boil down to policing thought. The public has complied by embracing all manner of doctrine against enlightened self-interest, the very thing that was imagined to magically promote the general welfare and keep us from wrecking things or destroying ourselves unwittingly. Moreover, public support is not really obtained through propaganda and domination, only the pretense of agreement found convincing by fools. Similarly, admiration, affection, and respect are not won with a fist. Material objectives (e.g., resource reallocation, to use a familiar euphemism) achieved through force are just common theft.

So what is Antifa doing? It’s forcibly silencing others. It’s doing the work of fascist government operatives by proxy. It’s fighting fascism by becoming fascist, not unlike the Republican-led U.S. government in 2008 seeking bailouts for banks and large corporations, handily transforming our economy into a socialist experiment (e.g., crowd-funding casino capitalism through taxation). Becoming the enemy to fight the enemy is a nice trick of inversion, and many are so flummoxed by these contradictions they resort to Orwellian doublethink to reconcile the paradox. Under such conditions, there are no arguments that can convince. Battle lines are drawn, tribal affiliations are established, and the ideological war of brother against brother, American against American, intensifies until civility crumbles around us. Civil war and revolution haven’t occurred in the U.S. for 150 years, but they are popping up regularly around the globe, often at the instigation of the U.S. government (again, acting against the public interest). Is our turn coming because we Americans have been divided and conquered instead of recognizing the real source of threat?

“Come with me if you want to live.” That’s among the quotable lines from the latest movie in the Terminator franchise, though it’s not nearly so succinct or iconic as “I’ll be back” from the first Terminator. Whereas the latter has the quality (in hindsight) of slow, implacable inevitability (considering the Terminator is literally a death-bringer), the former occurs within the context of a character having only just traveled back in time, not yet adequately reoriented, and forced to make a snap decision under duress. “I’ll be back” might be easy to brush off as harmless (temporary denial) since the threat recedes — except that it doesn’t, it’s merely delayed. “Come with me …” demands a leap of faith (or trust) because the danger is very real at that instant.

Which quote, I must ask, better characterizes the threat of climate change? My answer: both, but at different times. Three to four decades ago, it was the “I’ll be back” type: building slowly but inevitable given the underlying structure of industrial civilization. That structure was known even then by a narrow circle of experts (e.g., engineers for Big Oil and at the Dept. of Energy) to be a heat engine, meaning that we would ultimately cook our own goose by warming the planet, altering the climatic steady state under which our most recent civilization has flourished and producing a steady loss of biodiversity and biomass until our own human habitat (the entirety of the planet by now) becomes a hostile environment unable (unwilling if one anthropomorphizes Mother Nature) to support our swollen population. All that was if we stayed on course and took no corrective action. Despite foreknowledge and ample warning, that’s precisely what occurred (and continues today).

With the Intergovernmental Panel on Climate Change (IPCC) in particular, the threat has for roughly a decade shifted over to “Come with me ….” It’s no longer possible to put things off, yet we continue to dither well beyond the tipping point where/when we could still save ourselves from self-annihilation. Although scientists have been gathering data and evidence, forming an overwhelming consensus, and sounding the alarm, scientific illiteracy, realpolitik, journalistic malpractice, and corporate greed have all conspired to grant the illusion of time to react that we simply don’t have anymore (and truth be told, probably didn’t as of the early 1980s).

I’m aware of at least three journalists (relying on the far more authoritative work of scientific consensus) who have embraced the message: Dahr Jamail, Thom Hartmann, and David Wallace-Wells. None to my knowledge has been able to bring himself to admit that humanity is now a collection of dead men walking. They can’t muster the courage to give up hope (or to report truthfully), clinging to the possibility we may still have a fleeting chance to avert disaster. I heard Ralph Nader on his webcast say something to the same effect, namely, what good is it to rob others of hope? My personal values adhere to unstinting truth rather than illusion or self-deception, so I subscribe to Guy McPherson’s assessment that we face near-term human extinction (precise date unknown but soon if, for example, this is the year we get a blue ocean event). Simply put, McPherson is professor emeritus of natural resources and ecology and evolutionary biology at the University of Arizona [note my emphasis]. I trust his scholarship (summarizing the work of other scientists and drawing necessary though unpalatable conclusions) more than I trust journalistic shaping of the story for public consumption.

The obvious metaphor for what we face is a terminal medical diagnosis, or if one has hope, perhaps a death sentence about to be carried out but with the possibility of a last-minute stay of execution via phone call from the governor. Opinions vary whether one should hope/resist up to the final moment or make peace with one’s fate. By using the “I’ll be back” characterization when it’s really “Come with me …,” the MSM has not told the truth and, I daresay, has denied the public the second option. Various authors on the Web offer a better approximation of the truth (such as it can be known) and form a loose doomer network (a/k/a collapsniks). This blog is (an admittedly tiny) part of that doomersphere, which gives me no pleasure.

Haven’t purged my bookmarks in a long time. I’ve been collecting material about technological dystopia, already operating but expected to worsen. Lots of treatments out there and lots of jargon. My comments are limited.

Commandeering attention. James Williams discusses his recognition that interference media (all modern media now) keep people attuned to their feeds and erode free will, ultimately threatening democratic ideals by estranging people from reality. An inversion has occurred: information scarcity and attention abundance have become information abundance and attention scarcity.

Outrage against the machines. Ran Prieur (no link) takes a bit of the discussion above (probably where I got it) to illustrate how personal responsibility about media habits is confused, specifically, the idea that it’s okay for technology to be adversarial.

In the Terminator movies, Skynet is a global networked AI hostile to humanity. Now imagine if a human said, “It’s okay for Skynet to try to kill us; we just have to try harder to not be killed, and if you fail, it’s your own fault.” But that’s exactly what people are saying about an actual global computer network that seeks to control human behavior, on levels we’re not aware of, for its own benefit. Not only has the hostile AI taken over — a lot of people are taking its side against their fellow humans. And their advice is to suppress your biological impulses and maximize future utility like a machine algorithm.

Big Data is Big Brother. Here’s a good TED Talk by Zeynep Tufekci on how proprietary machine-learning algorithms we no longer control or understand, ostensibly used to serve targeted advertising, possess the power to influence elections and radicalize people. I call the latter down-the-rabbit-hole syndrome, where one innocuous video or news story is followed by another of increasing extremity until the viewer or reader reaches a level of outrage and indignation activating an irrational response.


I pull in my share of information about current events and geopolitics despite a practiced inattention to mainstream media and its noisome nonsense. (See here for another who turned off the MSM.) I read or heard somewhere (can’t remember where) that most news outlets and indeed most other media, to drive traffic, now function as outrage engines, generating no small amount of righteousness, indignation, anger, and frustration at all the things so egregiously wrong in our neighborhoods, communities, regions, and across the world. These are all negative emotions, though legitimate responses to various scourges plaguing us currently, many of which are self-inflicted. It’s enough aggregate awfulness to draw people into the street again in principled protest, dissent, and resistance; it’s not yet enough to effect change. Alan Jacobs comments about outrage engines, noting that sharing via retweets is not the same as caring. In the Age of Irony, a decontextualized “yo, check this out!” is nearly as likely to be interpreted as support as condemnation (or mere gawking for entertainment value). Moreover, pointing, linking, and retweeting are each costless versions of virtue signaling. True virtue makes no object of publicity.

So where do I get my outrage quotient satisfied? Here is a modest linkfest, in no particular order, of sites not already on my blogroll. I don’t habituate these sites daily, but I drop in, often skimming, enough to keep abreast of themes and events of importance.