Archive for the ‘Blogosphere’ Category

The digital information ecosystem known as the Internet used to have numerous distinct interfaces but has now converged primarily on just two: electronic mail (e-mail) and the World Wide Web. (Aside: since cellphones are barely even phones anymore but are instead handheld computers and video cameras that connect to the Internet, text platforms such as X (formerly Twitter) are essentially e-mail accessed through a browser. Neither X nor TikTok is a fundamentally new interface despite dedicated apps and crowd-sourced content.) Sure, lots of protocols and programming tools operate below the surface, but how humans (at least nonprogrammers) interact with computers is governed by the graphical user interface (GUI), which delivers text, images, sound, and video. How much of that is still just straightforward pornography is unknown, but I’ve got suspicions.

When new platforms appear, they are routinely subordinated to the dynamics of how people connect digitally and are eventually coopted by sellers and advertisers. Accordingly, and with reliable predictability, most of those who manage to gather an audience succumb to inclusion of paid advertising, which tends to snowball into an avalanche. They often jump on the censorship bandwagon as well, hiding behind the flimsy argument that they are privately held companies at liberty to craft their user agreements and operate in bad faith. Who wants to participate in that ecosystem, essentially offering oneself up for capture by advertisers (just as governments suffer from corporate capture) and opinion channeling should one have an actual independent idea in conflict with the dominant culture? Media saps, attention whores, and narcissists, that’s who. Extending one’s proverbial fifteen minutes of fame into a lifetime of shilling and influencing between segments of spurious creative content (ever see the TikTok Toilet?) is a familiar path. However, turning around to offer effusive praise for new platforms that enrich the cheerleader is among the most obvious examples of solipsism. Sure, a few have been able to parlay overexposure into earning a living and/or expanding into a media empire, but even the so-called “king of all media” regrets having made a sustained ass of himself essentially for audience guffaws.

On the user/audience side of media, however, the dynamic is quite different (click a few times to get the picture) and thoroughly exasperating. The term attention economy covers all the lures, gimmicks, teases, and enticements to draw in and keep audiences engaged. Many have observed that with mass media (including TV, radio, and newspapers), you are the product being delivered to advertisers; whatever valuable news and entertainment are presented is secondary. And as with journalism, the most outrageous, salacious content drives the most traffic (note comment on porn above) in an inevitable race to the bottom of the brain stem. With this in mind, perhaps it would be instructive to imagine what a hypothetical Spiral Staircase Webcast (SSW) might be like had I not already promised never to monetize my blog.


My inquiries into media theory long ago led me to Alan Jacobs and his abandoned, reactivated, then reabandoned blog Text Patterns. Jacobs is a promiscuous thinker and even more promiscuous technologist in that he has adopted and abandoned quite a few computer apps and publishing venues over time, offering explanations each time. Always looking for better tools, perhaps, but this roving public intellectual requires persistent attention lest one lose track of him. His current blog (for now) is The Homebound Symphony (not on my ruthlessly short blogroll), which is updated roughly daily, sometimes with linkfests or simply an image, other times with thoughtful analysis. Since I’m not as available as most academics to spend all day reading and synthesizing what I’ve read to put into a blog post, college class, or book, I am not on any sort of schedule and only publish new blog posts when I’m ready. Discovered in my latest visit to The Homebound Symphony was a plethora of super-interesting subject matter, which I daresay is relevant to the more literate and literary among us. Let me draw out the one that most piqued my interest. (That was the long way of tipping my hat to Jacobs for the link.)

In an old (by Internet standards) yet fascinating book review by Michael Walzer of Siep Stuurman’s The Invention of Humanity: Equality and Cultural Difference in World History (2017), Walzer describes the four inequalities that have persisted throughout human history, adding a fifth identified by Stuurman:

  • geographic inequality
  • racial inequality
  • hierarchical inequality
  • economic inequality
  • temporal inequality

I won’t unpack what each means if they’re not apparent on their face. Read for yourself. Intersections and overlapping are common in taxonomies of this sort, so don’t expect categories to be completely separate and distinct. The question of equality (or its inverse, inequality) is a fairly recent development, part of a stew of 18th-century thought in the West that was ultimately distilled into one famous phrase: “all men are created equal.” Seems obvious, but the phrase is fraught, and we’ve never really been equal, have we? So is it equal before god? Equal before the law? Equal in all opportunities and outcomes as social justice warriors now insist? On a moment’s inspection, no one can possibly believe we’re all equal despite aspirations that everyone be treated fairly. The very existence of perennial inequalities puts the lie to any notion of equality trucked in with the invention of humanity during the Enlightenment.

To those inequalities I would add a sixth: genetic inequality. Again, overlap with the others is acknowledged, but it might be worth observing that divergent inherited characteristics (other than wealth) appear quite early in life among siblings and peers, before most others manifest. By that, I certainly don’t mean race or sex, though differences clearly exist there as well. Think instead of intelligence, height, beauty, athletic ability, charisma, health and constitution, and even longevity (life span). Each of us has a mixture of characteristics that are plainly different from those of others and which either provide springboards or produce disadvantages. Just as it’s unusual to find someone in possession of all positive characteristics at once — the equivalent of rolling an 18 for each attribute of a new D&D character — few possess all negatives (straight 3s), either. Also, there’s probably no good way to rank best to worst, strongest to weakest, or most to least successful. Bean counters from one discipline or another might try, but that runs counter to the mythology “all men are created equal” and thus becomes a taboo to acknowledge, much less scrutinize.

What to do with the knowledge that all men are not in fact created equal and never will be? That some are stronger; more charming; smarter; taller with good teeth (or these days, dentists), hair, vision, and square jaws; luckier in the genetic lottery? Well, chalk it up, buddy. We all lack some things and possess others.

While working, I half listen to a variety of podcasts via YouTube, usually minimizing the window so that I don’t see the video. Some report that long-haul truckers are also avid podcast listeners (presumably discarding AM radio); who knows? At any rate, I find it dispiriting that nearly every podcast has attracted sponsors and now features unavoidable, in-your-face advertising on top of ubiquitous exhortations to like, subscribe, ring the bell, and buy merch. Ads are sometimes read live, no longer being prerecorded bits during regular commercial breaks. Segues into ad reads are often tortured, with tastelessness being an inverted badge of honor somehow.

I get that for those who have made podcasting their primary income, opining on anything and everything ad nauseam (sorta like me, actually), sponsorship is what keeps them stocked with peanut butter. Why do I still tune in? Well, some are actually entertaining, while others are exceptional clearinghouses for information I wouldn’t otherwise gather — at least when not pedantic and irritating. Good thing I’m only half listening. Case in point: a few weeks back, the DarkHorse Podcast (no link) announced it would begin doing ads, but to make the bitter pill easier to swallow, free endorsements (unpaid ads) would also be presented. Right … more of what I don’t want. In characteristic fashion, the two hosts beat that damn horse well into the afterlife, softening none of the irksome content (at least for me). Although legacy media (e.g., radio, TV, magazines, newsprint) has always required forfeiting some part of one’s time and attention to ignoring or filtering out ads, streaming services and online blockers have done away with much of the unwanted marketing. Perhaps that’s why I’m exasperated at it now being unavoidable again.

With this in mind, here’s my promise to you, dear reader: I will never monetize this blog or put it behind a paywall. I won’t even put up a tip jar or coffee mug to entice micropayments. The blog will also never connect to Facebook or Twitter or any other platform. This blog is totally free and unencumbered (except the ads WordPress puts in, which are relatively easy to dismiss and/or circumvent). Maybe I’m fortunate that I earn my living elsewhere and disavow any desire to be a pundit, influencer, or media figure. Those folks are uniformly unenviable, especially when distorted by their own celebrity so that they forget who they are. Instead, this blog will remain what it’s always been: a venue for me to work out my ideas and secondarily share them.

I simply can’t keep up with all the reading, viewing, and listening in my queue. Waking hours are too few, and concentration dissipates long before sleep overtakes. Accordingly, it’s much easier to settle into couch-potato mode and watch some mindless drivel, such as the Netflix hit Bridgerton binged in two sittings. (Unlike cinema critics, I’m not bothered especially by continuity errors, plot holes, clunky dialogue, weak character motivation, gaps of logic, or glossy decadence of the fictional worlds. I am bothered by the Kafka trap sprung on anyone who notices casting decisions that defy time and place — an ill-advised but now commonplace historical revisionism like editing Mark Twain.) As a result, blog posts are less frequent than they might otherwise be as I pronounce upon American (or more broadly, Western) culture, trying vainly to absorb it as a continuously moving target. Calls to mind the phrase Après moi, le déluge, except that there is no need to wait. A deluge of entertainment, news, analysis, punditry, and trolling has buried everyone already. So rather than the more careful consideration I prefer to post, here are some hot takes.

The Irregular Aphorist. Caitlin Johnstone offers many trenchant observations in the form of aphorisms (some of which I’ve quoted before), all gathered under the subtitle Notes From The Edge Of The Narrative Matrix. The modifier irregular only means that aphorisms are a regular but not constant feature. Her site doesn’t have a tag to that effect but probably ought to. Here’s one in particular that caught my attention:

Everything our species has tried has led us to a dying world and a society that is stark raving mad, so nobody is in any position to tell you that you are wrong.

Twin truths here are (1) the dying world and (2) societal madness, both of which I’ve been describing for some time. Glad when others recognize them, too.

Piling on. Though few still are willing to admit it, nonpharmaceutical interventions (NPIs, e.g., distancing, masks, and lockdowns) to stall or reduce the spread of the virus failed to achieve their objectives according to this study. Instead, NPIs piled on suffering no one could forestall. I read somewhere (no link) that the world is approaching half of total, cumulative deaths/infections predicted had nothing been done to impede the pandemic running its course. Adding in deaths of despair (numbers not entirely up to date), we’re using the wrong tools to fight the wrong battle. Of course, interventions opened up giant opportunities for power grabs and vulture capitalism, so the cynic in me shrugs and wonders half aloud “what did you expect, really?”

Growth of the Managerial Bureaucracy. A blog called Easily Distracted by Timothy Burke (never on my blogroll) publishes only a few times per year, but his analysis is terrific — at least when it doesn’t wind up being overlong and inconclusive. Since a student debt jubilee is back in the news (plenty of arguments pro and con), unintended consequences are anticipated in this quote:

When you set out to create elaborate tiers that segregate the deserving poor from the comfortable middle-class and the truly wealthy, you create a system that requires a massive bureaucracy to administer and a process that forces people into petitionary humiliation in order to verify their eligibility. You create byzantine cutoff points that become business opportunities for predatory rentiers.

Something similar may well be occurring with stimulus checks being issued pro rata (has anyone actually gotten one?), but at least we’re spared any petitionary humiliations. We get whatever the algorithms (byzantine cutoff points) dictate. How those funds will be gamed and attached is not yet clear. Stay alert.

No Defense of Free Speech. Alan Jacobs often recommends deleting, unsubscribing, and/or ignoring social media accounts (after his own long love-hate relationship with them) considering how they have become wholly toxic to a balanced psyche as well as principal enablers of surveillance capitalism and narrative control. However, in an article about the manorial elite, he’s completely lost the plot: absolutism is required in defense of free speech. It’s not sufficient to be blasé or even relieved when 45 is kicked off Twitter permanently or when multiple parties conspire to kill Parler. Establishing your own turf beyond the reach of Silicon Valley censors is a nice idea but frankly impractical. Isn’t that what whoever ran Parler (or posted there) must have thought? And besides, fencing off the digital commons these very entities created has catapulted them into the unenviable position of undemocratic, unelected wielders of monopolistic power and co-conspirators to boot. That’s what needs to be curtailed, not free speech.

The Taxonomic Apocalypse. Although drawn from fiction and thus largely hypothetical, a new book (coming late 2021) by Adam Roberts called It’s the End of the World: But What Are We Really Afraid Of? surveys doomsday stories and categorizes different versions of how it all ends. Alan Jacobs (yeah, him again — must have an advance copy of the manuscript) recommends it as “a delightful and provocative little book” but fails to grok two things: (1) these stories are rehearsals-cum-preparations for the real thing, and (2) the real thing really is bearing down on us implacably and so is no longer a mere hypothetical to contemplate and categorize for shits and grins. Despite acceptance of the eventualities that await all of us, reading Roberts’ taxonomy is not something I would expect to find delightful. Skip.

Narrative Collapse. Ran Prieur (no link) sometimes makes statements revealing an unexpected god’s-eye view:

[45] is a mean rich kid who figured out that if he does a good Archie Bunker impression, every lost soul with an authoritarian father will think he’s the messiah. We’re lucky that he cares only about himself, instead of having some crazy utopian agenda. But the power, and the agency, is with the disaffected citizens of a declining empire, tasting barbarism.

This is all about people wanting to be part of a group that’s part of a story. Lately, some of the big group-stories have been dying: sky father religion, American supremacy, the conquest of nature, the virtue of wealth-seeking. In their place, young and clumsy group-stories struggle and rise.

Collapse of certain fundamental stories that animate our thinking is at the core of The Spiral Staircase (see About Brutus at top), though it’s often couched in terms of consciousness in transition. Getting through the transition (only temporarily, see previous item in list) probably means completion of the Counter-Enlightenment historical arc, which necessarily includes further descent into barbarism.

Hail Mary for Individualism. I always take special notice when someone cites Allan Bloom. Alan Jacobs (um, yeah, he’s prolific and I’m using his ideas again — sue me) cites Bloom to argue that individualism or the sovereign self, a product of the Enlightenment, is already dead. No doubt, the thought-world described so ably by Bloom no longer exists, but individualism has not yet died out by attrition or been fully dissolved in nonduality. Many of us born before the advent of the Internet retain selfhood and authenticity not yet coopted by or incorporated into mass mind. Moreover, ongoing struggles over identity (e.g., gender, sexual orientation, and race that are often used improperly to define the self) result from an inchoate sense that individualism is eroding precipitously, not that it’s already passé. Defiant attempts to (re)establish an authentic self (contravening all logic and becoming critical theory of one sort or another) in the face of this loss may well be a last-ditch effort to save the self, but it’s failing.

I’ve reached another crossroads. Chalk it up to pandemic exhaustion at being mostly cooped up for the better part of a year. Of course, this state is on top of other sources of exhaustion (politics, doom, the news grind cycle) that drained my enthusiasm for things I used to do before meaningful (to me) endeavors were all cancelled and everyone was forced to search for meaning staring at surfaces (e.g., the walls, pages, and screens — especially screens for most Americans, I daresay). So as the year and decade draw to a close, I anticipate a spate of lists and summaries as we move into 2021 with the hope it won’t be worse than 2020 — a faint hope, I might add, since nothing has been resolved except perhaps (!) which listless septuagenarian gets to sit in the Oval Office. The jury is still out whether vaccines will have the intended effect.

Aside: The calendar is not a timer or odometer. So although we change the calendar to 2021, the new year is the first year of the new decade (third decade of the 21st century, obviously). We struggled with this issue at the end of the previous century/millennium, which turned when 2000 became 2001, not, as popularly celebrated, when 1999 became 2000. This discrepancy arises because calendars begin counting each month, year, etc. with 1, not 0. So the first ten counting numbers are 1–10, not 0–9, and all decades run from xx01 to xx10. However, timers and odometers begin counting at 0 and show elapsed intervals, so the first ten minutes or miles run from the start (at 0) through the end of 9, at which point the odometer in particular rolls to 10 and begins a new sequence. I realize I’m being a pointy-headed snoot about this, but it’s a relatively easy concept to understand. Innumeracy evident among the public is a microcosm for all the other easy concepts so badly misunderstood.
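For the incorrigibly literal, the two counting schemes are easy to sketch in a few lines of Python (a toy illustration of my own; the function names are made up for the occasion):

```python
def calendar_decade(year):
    """Calendar counting starts at 1: the Nth decade spans years 10*(N-1)+1 .. 10*N."""
    n = (year - 1) // 10          # zero-based index of the decade containing `year`
    return (10 * n + 1, 10 * n + 10)

def odometer_block(count):
    """Odometer/timer counting starts at 0: each block of ten spans 10*N .. 10*N+9."""
    n = count // 10
    return (10 * n, 10 * n + 9)

print(calendar_decade(2020))  # (2011, 2020) -- 2020 closes the old decade
print(calendar_decade(2021))  # (2021, 2030) -- 2021 opens the new one
print(odometer_block(9))      # (0, 9)       -- mile 9 still belongs to the first block
print(odometer_block(10))     # (10, 19)     -- the odometer rolls over at 10
```

The off-by-one between the two schemes is the entire disagreement: counting numbers begin at 1, elapsed intervals at 0.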

I’ve admitted to feelings of exhaustion and defeat numerous times, and indeed, hope eludes me whilst a narrow group of things still produces enjoyment. But my blogroll is no longer one of those things. I recently wrote the following to an acquaintance of mine:

Now that collapse narratives have matured, over two decades old for some (about 14 years for me), I notice that existential threats are still too remote and contingent for most to do more than signal some vague level of awareness and/or concern before returning to normal life. A few who sank into it deeply recognized that nothing positive comes out of it and have retreated from public life, or at least ongoing tracking and reporting. Several of the sites I used to frequent for news and perspective have dried up, and my finding is that adding more awfulness to the pile doesn’t enhance my understandings anymore, so I’ve also largely stopped gathering information. I still cite collapse frequently at my doom blog, but I have other things to write about.

I’m one of those who sank into the collapse narrative rather deeply and blogged about it consistently. By now, the sole available positive outcome has manifested: the recognition (and with it, resignation) that nothing will or can be done to avert disaster. So I’m dumping the doom and inactive blogs from my blogroll. I’ll continue to blog about and bear witness to the gathering storm: the cascade failure of industrial civilization. It’s proven to be a more protracted process than expected (at least by me), but no promises that it will stall until the end of the century (2100) for sea level to rise and flora and fauna to expire. Human habitat will continue to diminish decade by decade, and at some point, so will human population — already shown to be rather precariously perched on an illusory safety and security we take as business as usual. I’ll keep a couple of the respectable truth-telling blogs just to have something to which to link. I have no links to add at this point.

Caveat: this post is uncharacteristically long and perhaps a bit disjointed. Or perhaps an emerging blogging style is being forged. Be forewarned.

Sam Harris has been the subject of or mentioned in numerous previous blog posts. His podcast Making Sense (formerly Waking Up), partially behind a paywall but generously offered for free (no questions asked) to those claiming financial hardship, used to be among those I would tune in to regularly. As with the Joe Rogan Experience (soon moving to Spotify — does that mean its disappearance from YouTube?), the diversity of guests and reliable intellectual stimulation have been attractive. Calling his podcast Making Sense aligns with my earnest concern over actually making sense of things as the world spins out of control and the epistemological crisis deepens. Yet Harris has been a controversial figure since coming to prominence as a militant atheist. I really want to like what Harris offers, but regrettably, he has lost (most of) my attention. Others reaching the same conclusion have written or vlogged their reasons, e.g., “Why I’m no longer a fan of ….” Do a search.

Having already ranted over specific issues Harris has raised, let me instead register three general complaints. First, once a subject is open for discussion, it’s flogged to death, often without reaching any sort of conclusion, or frankly, helping to make sense. For instance, Harris’ solo discussion (no link) regarding facets of the killing of George Floyd in May 2020, which event sparked still unabated civil unrest, did more to confuse than clarify. It was as though Harris were trying the court case by himself, without a judge, jury, or opposing counsel. My second complaint is that Harris’ verbosity, while impressive in many respects, leads to interviews marred by long-winded, one-sided speeches where the thread is hopelessly lost, blocking an interlocutor from tracking and responding effectively. Whether Harris intends to bury others under an avalanche of argument or does so uncontrollably doesn’t matter. It’s still a Gish gallop. Third is his over-emphasis on hypotheticals and thought experiments. Extrapolation is a useful but limited rhetorical technique, as is distillation. However, treating prospective events as certainties is tantamount to building arguments on poor foundations, namely, abstractions. Much as I admire Harris’ ambition to carve out a space within the public sphere to get paid for thinking and discussing topics of significant political and philosophical currency, he frustrates me enough that I rarely tune in anymore.

In contrast, the Rebel Wisdom channel on YouTube offers considerably more useful content, which includes a series on sensemaking. The face of Rebel Wisdom is documentarian David Fuller, who asks informed questions but avoids positioning himself in the expository center. Quite a change from the too-familiar news-anchor-as-opinion-maker approach taken by most media stars. If there were a blog, I would add it to my blogroll. However, the offer of memberships ranging from $5 to $500 per month irks me. Paid-for VIP status too closely resembles the selling of empty cachet or Catholic indulgences, especially those with guarantees of “special access.”

I became especially interested in Daniel Schmachtenberger’s appearances on Rebel Wisdom and his approach to sensemaking. Lots of exciting ideas; clearly the fellow has developed an impressive framework for the dynamics involved. But to make it really useful, as opposed to purely theoretical, formal study akin to taking a philosophy course is needed. Maybe there’s written material available, but without a clear text resource, the prospect of sifting unguided through a growing collection of YouTube videos caused me to retreat (out of frustration? laziness?). At some later point, I learned that Schmachtenberger was a participant among a loose collection of under-the-radar intellectuals (not yet having elevated themselves to thought leaders) working on an alternative to politics-and-civilization-as-usual called Game B (for lack of a better name). A good article about Schmachtenberger and what’s called “The War on Sensemaking” (numerous Internet locations) is found here.

While the Game B gang seems to have imploded over disagreements and impasses (though there may well be Internet subcultures still carrying the torch), its main thrust has been picked up by Bret Weinstein and his DarkHorse Podcast (var.: Dark Horse) co-hosted by his wife Heather Heying. Together, they analyze contemporary political and cultural trends through the twin perspectives of evolutionary biology and game theory. They also live in Portland, Oregon, home to the most radical leftist civil unrest currently under way this summer of 2020. They further warn unambiguously that we Americans are at grave risk of losing the grand melting pot experiment the U.S. represents as the self-anointed leader of the free world and standard-bearer of liberal democratic values sprung from the Enlightenment. What protesters intend to install in place of the current regime in this proto-revolutionary moment is wildly unclear, but it looks to be decidedly fascist in character. Accordingly, Weinstein and Heying are actively promoting Unity 2020 (var.: Unity2020 and Un1ty2020) to select and nominate an independent U.S. presidential candidate — “Not Trump. Not Biden.” Unless you’re jacked into the Internet and political discussions avidly, it’s quite easy to overlook this emergent political reform. I was vaguely aware of Articles of Unity and its “Plan to Save the Republic” yet still had trouble locating it via Web searches. Weinstein’s penchant (shared with his brother Eric) for coining new terms with flexible spelling is no aid.

Like Rebel Wisdom, Weinstein and Heying, each on their individual Patreon pages, offer multiple levels of membership and access: $2 to $250 per month for him, $5 to $17 per month for her. Why such high divergence, I wonder? I raise paid memberships repeatedly because, while acknowledging the need to fund worthwhile endeavor and to earn a living, there is something tacky and unseemly about enabling concentric inner circles exclusively through paid access — no other apparent qualification needed. More pointedly, an article called “The Corrupting Power Of The Inner Ring” by Rod Dreher at The American Conservative discusses David Brooks’ column about Alan Jacobs’ book How to Think (2017) where Jacobs cites C.S. Lewis’ concept of the inner ring — something to be distrusted. (Sorry about that long string of names.) That chain also demonstrates how ideas are highly derivative of antecedents found throughout culture and history.

Anyway, the DarkHorse Podcast provides some of the best analysis (not to be confused with news reporting or journalism, neither of which is even remotely successful at sensemaking anymore) to be found among those inserting themselves into the public conversation (if such a thing can be said to exist). Willingness to transform oneself into a pundit and then opine freely about anything and everything is a shared attribute of the people profiled above. (I specifically disclaimed punditry as a goal of mine when launching this blog.) That most of them have embraced podcasting (not blogging — I’m so unhip, committed to a legacy medium that both came and went with surprising celerity) as the primary medium of information exchange is quite contemporary. I surmise it’s silent acknowledgement that Americans (on the whole) no longer read and that text has fallen out of favor compared to speech, especially the eavesdropped conversational type. Podcasting doesn’t complete the information gathering and sensemaking shift from text and newsprint to TV and video begun many decades ago but certainly intensifies it. Podcasting has also demonstrated real money-making potential if one succeeds in attracting a sufficient audience (driving ad revenue) and/or a cadre of subscribers and contributors. Potential for real political engagement is unproven as yet.

Another public intellectual I cited briefly a couple years ago, Thomas Sowell, crossed my browsing path yet again. And yet again, I found myself somewhat credulously led down the primrose path set by his reckless (or savvy?) juxtaposition of facts and details until a seemingly logical conclusion appeared magically without his ever having made it manifest. In the two-year-old interview I watched (no link), Sowell states cause-and-effect (or substitutes one combo for another) confidently while simultaneously exuding false humility. He basically serves up a series of small sells leading to the big sell, except that the small sells don’t combine convincingly unless one is swept unawares into their momentum. But the small sells work individually, and I found myself agreeing repeatedly before having to recognize and refuse the final sale. I also recognize in Sowell’s reliance on facts and numerical data my own adherence to evidence. That’s an epistemological foundation we should all share. Moreover, my willingness to consider Sowell’s remarks is a weak stab at heterodoxy. But as the modern information environment has made abundantly clear, lying with numbers and distortion of facts (or more simply, fake news and narrative spin) are precisely what makes sensemaking so difficult yet critical. For instance, I have echoed Sowell recently in suggesting that inequality and structural violence may be less rooted in snarling, overt racism (at least since the Civil Rights Era) than in simple greed and opportunism while acknowledging that virulent white supremacism does still exist. Yet others insist that everything is political, or racist, or owing to class conflict, or subsumed entirely by biology, chemistry, or physics (or religion). Take your pick of interpretations of reality a/k/a sensemaking.
I had halfway expected someone to take me to task for failing to voice the approved radical leftist orthodoxy or try to cancel me for publishing something nominally conservative or Sowellesque. But no one cares what I blog about; I have succeeded in avoiding punditry.

With such a sprawling survey of sensemakers good and bad, successful and unsuccessful (according to me), there is no handy conclusion. Instead, let me point back to the launching point for this post: my earlier blog post “Mad World Preamble.” Even before that, I blogged about Iain McGilchrist’s book The Master and His Emissary (2010), drawing particular attention to chap. 12 as his diagnosis of how and when the modern world went mad. Perhaps we have indeed managed to step back from the atomic brink (MAD) only to totter and stumble through a few extra decades as PoMo madness overtook us completely in the latter half of the 20th century; and maybe the madness is not yet the hallucinatory type fully evident at a glance. However, look no further than the two gibbering fools foisted upon the voting public in the upcoming U.S. presidential election. Neither is remotely capable of serving responsibly. Every presidential election in the 21st century has been accompanied by breathless analysis prophesying the implosion of either political party following an electoral loss. Well, they both imploded and can’t field a proper candidate for high office anymore. There is probably no stronger test case for societal and institutional madness than the charade we’re now witnessing. Maybe Unity 2020 is onto something.

Most of us are familiar with a grandpa, uncle, or father who eventually turns into a cranky old man during late middle age or in his dotage. (Why is it a mostly male phenomenon?) In the last three decades, Clint Eastwood typecast himself as a cranky old man, building on lone-wolf characters (mostly cops, criminals, and cowboys) established earlier in his career. In real life, these guys spout talking points absorbed from mainstream media and narrative managers, or if they are truly lazy and/or can’t articulate anything coherently on their own, merely forward agitprop via e-mail like chain mail of yore. They also demonstrate remarkably forgivable racism, sexism, and bigotry, such as Eastwood’s rather enjoyable and ultimately redeemed character in the film Gran Torino. If interaction with such a fellow is limited to Thanksgiving gatherings once per year, crankiness can be tolerated fairly easily. If interactions are ongoing, then a typical reaction is simply to delete e-mail messages unread, or in the case of unavoidable face-to-face interaction, to chalk it up: Well, that’s just Grandpa Joe or Uncle Bill or Dad. Let him rant; he’s basically harmless now that he’s so old he creaks.

Except that not all of them are so harmless. Only a handful of the so-called Greatest Generation (I tire of the term but it’s solidly established) remain in positions of influence. However, lots of Boomers still wield considerable power despite their advancing age, looming retirement (and death), and basic out-of-touchness with a culture that has left them behind. Nor are their rants and bluster necessarily wrong. See, for instance, this rant by Tom Engelhardt, which begins with these two paragraphs:

Let me rant for a moment. I don’t do it often, maybe ever. I’m not Donald Trump. Though I’m only two years older than him, I don’t even know how to tweet and that tells you everything you really need to know about Tom Engelhardt in a world clearly passing me by. Still, after years in which America’s streets were essentially empty, they’ve suddenly filled, day after day, with youthful protesters, bringing back a version of a moment I remember from my youth and that’s a hopeful (if also, given Covid-19, a scary) thing, even if I’m an old man in isolation in this never-ending pandemic moment of ours.

In such isolation, no wonder I have the urge to rant. Our present American world, after all, was both deeply unimaginable — before 2016, no one could have conjured up President Donald Trump as anything but a joke — and yet in some sense, all too imaginable …

If my own father (who doesn’t read this blog) could articulate ideas as well as Engelhardt, maybe I would stop deleting unread the idiocy he forwards via e-mail. Admittedly, I could well be following in my father’s footsteps, as the tag rants on this blog indicates, but at least I write my own screed. I’m far less accomplished at it than, say, Engelhardt, Andy Rooney (in his day), Ralph Nader, or Dave Barry, but then, I’m only a curmudgeon-in-training, not having fully aged (or elevated?) yet to cranky old manhood.

As the fall presidential election draws near (assuming that it goes forward), the choice in the limited U.S. two-party system is between two cranky old men, neither of whom is remotely capable of guiding the country through this rough patch at the doomer-anticipated end of human history. Oh, and BTW, echoing Engelhardt’s remark above, 45 has been a joke all of my life — a dark parody of success — and remains so despite occupying the Oval Office. Their primary opponent up to only a couple months ago was Bernie Sanders, himself a cranky old man but far more endearing at it. This is what passes for the best leadership on offer?

Many Americans are ready to move on to someone younger and more vibrant, able to articulate a vision for something, well, different from the past. Let’s skip right on past candidates (names withheld) who parrot the same worn-out ideas as our fathers and grandfathers. Indeed, a meme emerged recently to the effect that the Greatest Generation saved us from various early 20th-century scourges (e.g., Nazis and Reds) only for the Boomers to proceed in their turn to mess up the planet so badly nothing will survive new scourges already appearing. It may not be fair to hang such labels uniformly around the necks of either generation (or subsequent ones); each possesses unique characteristics and opportunities (some achieved, others squandered) borne out of their particular moment in history. But this much is clear: whatever happens with the election and whichever generational cohort assumes power, the future is gonna be remarkably different.

This is an infrequent feature of this blog: additions to and deletions from my blogroll. Other bloggers attract my attention for various reasons, mostly the quality of writing and ideas (interrelated), but over time, some start to repel me. This update has several in both categories.

At Wit’s End, Three-Pound Brain, and Bracing Views were all added a while back. The first two have new posts very infrequently, but the quality is very high (IMO). The last is far more active and solicits commentary openly. Subject matter at these blogs varies widely, and only the third could be accused of being an outrage engine. It’s a worthwhile read nonetheless if political dysfunction doesn’t ignite in you a firestorm of rage and indignation.

Dropping Creative Destruction, Gin & Tacos, and Pharyngula. The first has been dead for a long time; nothing there to see anymore besides the backblog. I thought it might eventually revive, but alas, no. Updates to the second have dropped significantly as authorial attention shifted to podcasting. The commentariat there was especially worthwhile, but with so few new posts, the disappearance of whimsical history lessons, and an irritating focus on horse-race politics, the blog has lost my recommendation. The third used to be a fun read, especially for being well argued. The tone shifted at some point toward smug, woke fellation of an in-group, by definition excluding everyone else. Like another unmentioned blog dropped from my blogroll some years ago, the author behaves like an omniscient bully: being absolutely correct about everything all the time. The lack of humility or tolerance for ambiguity — or even the very human admission once in a while “I dunno …” — is exhausting.

Final admission: traffic to and from this blog is chronically low, so no blogger cares about being added to or removed from my blogroll. No illusions about that on my part. However, respectable curation is a value worth maintaining with periodic updates.

Nicholas Carr has a pair of thoughtful new posts at his blog Rough Type (see blogroll) under the tag “infinite media.” The second of the two is about context collapse, restoration, and content collapse. I won’t review that particular post; I’m merely pointing to it for you to read. Carr is a journalist and media theorist whose work is especially interesting to me as a partial antidote to what I’ve been calling our epistemological crisis. In short, he offers primers on how to think about stuff, that stuff being the primary medium through which most people now gather information: via screens.

Relatedly, the other media theorist to whom I pay attention is Alan Jacobs, who has a recent book (which I read but didn’t review or blog about) called more simply How to Think. It’s about recognizing and avoiding cognitive biases on the way to more disciplined, clear thinking. I mention these two fellows together because I’ve been reading their blogs and books for over a decade now and have been curious to observe how their public interactions have changed over time. They have each embraced and abandoned various new media (particularly social media) and adopted a more stringent media ecology. Carr posts occasionally now and has closed comments at his blog (a shame, since his commentariat was valuable, quite unlike the troll mob at most sites). Jacobs is even more aggressive, starting and abandoning one blog after another (he was active at multiple URLs, one formerly on my blogroll) and deleting his Twitter account entirely. Whatever goings-on occur at Facebook, I can’t say; I never go there. These aren’t criticisms. We all evolve our associations and activities. But these two are unusual, perhaps, in that they evaluate and recommend with varying vehemence how to interact with electronic media tools.

The wide-open Web available to Americans (but restricted in some countries) used to be valorized as a wholly democratic, organic, grass-roots, decentralized force for good where information yearned to breathe free. Though the Web was pioneered by academic institutions, it wasn’t long before the porn industry became the first to monetize it effectively (cuz duh! that’s where the money was — at least initially), and the whole thing was eventually overwhelmed by others with unique agendas and mechanisms, including commerce, surveillance, and propaganda. The surfeit of information demanded curation, and social media with algorithmic feeds became the default for folks either too lazy or just untrained (or uninterested) in how to think for themselves. Along the way, since a surprisingly large portion of human activity diverted to online media, that activity turned into a resource mined, harvested, and in turn monetized, much like the voting public has become a resource tracked, polled, channeled, activated, disenfranchised, corrupted, and analyzed to death.

An earlier media theorist I read with enthusiasm, Neil Postman, recommended that curricula include the study of semantics as applied to media. (Use of a word like semantics sends nonacademics running for the hills, but the recommendation is basically about thinking critically, even skeptically, regarding information, its sources, and its means of distribution.) The rise of handheld omnimedia postdates Postman, so I can only surmise that the bewildering array of information we confront and absorb every day, which I liken to drinking from a fire hose, only compounds Postman’s concern that students are severely overmatched by media (especially advertising) intent on colonizing and controlling their minds. Thus, today’s information environment is a far cry from the stately slowness of earlier eras when teaching and learning (to say nothing of entertainment) were conducted primarily through reading, lecture, and discussion.

A comment came in on this blog chiding me for still blogging after 14 years. I admit hardly anyone reads anymore; they watch (or listen, as with audio-only podcasts). Preferred forms of media consumption have moved on from printed text, something USA Today recognized decades ago when it designed its print publication and sidewalk distribution boxes to look more like TVs. Nonetheless, the modest reproach reminded me of a cry in the wilderness by Timothy Burke: why he still blogs, though quite infrequently. (There’s a brokeback can’t-quit-you joke in there somewhere I’ll leave unformulated.) So this blog may indeed be past its proper expiration date, yet it remains for me one of the best means for organizing how I think about stuff. Without it, I’m afraid thoughts would be rattling loose inside my head, disorganized, only to be displaced by the next slurp from the fire hose.

The “American character,” if one can call it into being merely by virtue of naming it (the same rhetorical trick as solutionism), is diverse and ever-changing. Numerous characterizations have been offered throughout history, with Alexis de Tocqueville’s Democracy in America (1835 and 1840) being perhaps the one cited most frequently despite its outdatedness. Much in American character has changed since that time, and it’s highly questionable to think it was unified even then. However, as a means of understanding ourselves, it’s as good a place to start as any. A standard criticism of American character as seen from outside (i.e., when Americans travel abroad) is the so-called ugly American: loud, inconsiderate, boorish, and entitled. Not much to argue with there. A more contemporary assessment by Morris Berman, found throughout his “American trilogy,” is that we Americans are actually quite stupid, unaccountably proud of it, and constantly hustling (in the pejorative sense) in pursuit of material success. These descriptions don’t quite match up with familiar jingoism about how great America is (and of course, Americans), leading to non-Americans clamoring to emigrate here, or the self-worship we indulge in every national holiday celebrating political and military history (e.g., Independence Day, Veterans Day, Memorial Day).

I recently ran afoul of another ugly aspect of our national character: our tendency toward aggression and violence. In truth, this is hardly unique to Americans. Yet it came up glaringly in the context of a blog post at Pharyngula citing a tweet comparing uneven application of law (and indignation among online chatterers?) when violence is committed by the political left vs. the political right. Degree of violence clearly matters, but obvious selection bias was deployed to present an egregiously lopsided perspective. Radicals on both the left and right have shown little compunction about using violence to achieve their agendas. Never mind how poorly conceived those agendas may be. What really surprised me, however, was that my basic objection to violence in all forms across the spectrum was met with snark and ad hominem attack. When did reluctance to enact violence (including going to war) until extremity demands it become controversial?

My main point was that resorting to violence typically invalidates one’s objective. It’s a desperation move. Moreover, using force (e.g., intimidation, threats, physical violence — including throwing milkshakes) against ideological opponents is essentially policing others’ thoughts. But they’re fascists, right? Violence against them is justified because they don’t eschew violence. No, wrong. Mob justice and vigilantism obviate the rule of law and criminalize any perpetrator of violence. It’s also the application of faulty instrumental logic, ceding any principled claim to moral authority. But to commentators at the blog post linked above, I’m the problem because I’m not in support of fighting fascists with full force. Guess all those masked, caped crusaders don’t recognize that they’re contributing to lawlessness and mayhem. Now even centrists come in for attack for not being radical (or aggressive, or violent) enough. Oddly silent in the comments is the blog host, P.Z. Myers, who has himself communicated approval of milkshake patrols and Nazi punching, as though the presumptive targets (identified rather haphazardly and incorrectly in many instances) have no right to their own thoughts and ideas, vile though they may be, and that violence is the right way to “teach them a lesson.” No one learns the intended lesson when the victim of violence. Rather, if not simply cowed into submission (not the same as agreement), tensions tend to escalate into further and increasing violence. See also reaction formation.

Puzzling over this weird exchange with these, my fellow Americans (the ideologically possessed ones anyway), caused me to backtrack. For instance, the definition of fascism at dictionary.com is “a governmental system led by a dictator having complete power, forcibly suppressing opposition and criticism, regimenting all industry, commerce, etc., and emphasizing an aggressive nationalism and often racism.” That definition sounds more like totalitarianism or dictatorship and is backward looking, specifically to Italy’s Benito Mussolini in the period 1922 to 1943. However, like national characters, political moods and mechanisms change over time, and the recent fascist thrust in American politics isn’t limited to a single leader with dictatorial power. Accordingly, the definition above has never really satisfied me.

I’ve blogged repeatedly about incipient fascism in the U.S., the imperial presidency (usually associated with George W. Bush but also characteristic of Barack Obama), James Howard Kunstler’s prediction of a cornpone fascist coming to power (the way paved by populism), and Sheldon Wolin’s idea of inverted totalitarianism. What ties these together is how power is deployed and against what targets. More specifically, centralized power (or force) is directed against domestic populations to advance social and political objectives without broad public support for the sole benefit of holders of power. That’s a more satisfactory definition of fascism to me, certainly far better than Peter Schiff’s ridiculous equation of fascism with socialism. Domination of others to achieve objectives describes the U.S. power structure (the military-industrial-corporate complex) to a tee. That doesn’t mean manufactured consent anymore; it means bringing the public into line, especially through propaganda campaigns, silencing of criticism, prosecuting whistle-blowers, and broad surveillance, all of which boil down to policing thought. The public has complied by embracing all manner of doctrine against enlightened self-interest, the very thing that was imagined to magically promote the general welfare and keep us from wrecking things or destroying ourselves unwittingly. Moreover, public support is not really obtained through propaganda and domination, only the pretense of agreement found convincing by fools. Similarly, admiration, affection, and respect are not won with a fist. Material objectives (e.g., resource reallocation, to use a familiar euphemism) achieved through force are just common theft.

So what is Antifa doing? It’s forcibly silencing others. It’s doing the work of fascist government operatives by proxy. It’s fighting fascism by becoming fascist, not unlike the Republican-led U.S. government in 2008 seeking bailouts for banks and large corporations, handily transforming our economy into a socialist experiment (e.g., crowd-funding casino capitalism through taxation). Becoming the enemy to fight the enemy is a nice trick of inversion, and many are so flummoxed by these contradictions they resort to Orwellian doublethink to reconcile the paradox. Under such conditions, there are no arguments that can convince. Battle lines are drawn, tribal affiliations are established, and the ideological war of brother against brother, American against American, intensifies until civility crumbles around us. Civil war and revolution haven’t occurred in the U.S. for 150 years, but they are popping up regularly around the globe, often at the instigation of the U.S. government (again, acting against the public interest). Is our turn coming because we Americans have been divided and conquered instead of recognizing the real source of threat?