Posts Tagged ‘Rants’

rant on/

Authors I read and podcasters to whom I listen, mostly minor celebrities of the nonentertainment kind, often push their points of view using lofty appeals to reason and authority as though they possess unique access to truth that is lacking among those whose critical thinking may be more limited. Seems to be the special province of pundits and thought leaders shilling their own books, blogs, newspaper columns, and media presence (don’t forget to comment and subscribe! ugh …). The worst offender on the scene may well be Sam Harris, who has run afoul of so many others recently that a critical mass is now building against him. With calm, even tones, he musters his evidence (some of it hotly disputed) and builds his arguments with the serene confidence of a Kung Fu master yet is astonished when others don’t defer to his rhetoric. He has behaved of late like he possesses heroic superpowers only to discover that others wield kryptonite or magic sufficient to defeat him. It’s been quite a show of force and folly. I surmise the indignity of suffering fools, at least from Harris’ perspective, smarts quite a bit, and his mewling does him no credit. So far, the person refusing most intransigently to take the obvious lesson from this teachable moment is Harris himself.

Well, I’m here to say that reason is no superpower. Indeed, it can be thwarted rather handily by garden-variety ignorance, stupidity, emotion, superstition, and fantasy. All of those are found in abundance in the public sphere, whereas reason is in rather short supply. Nor is reason a panacea, if only one could get everyone on board. None of this is even remotely surprising to me, but Harris appears to be taken aback that his interlocutors, many of whom are sophisticated thinkers, are not easily convinced. In the ivory tower or echo chamber Harris has constructed for himself, those who lack scientific rigor and adherence to evidence (or even better, facts and data) are infrequently admitted to the debate. There he would presumably have a level playing field, right? So what’s going on that eludes Sam Harris?

As I’ve been saying for some time, we’re in the midst of an epistemological crisis. Defenders of Enlightenment values (logic, rationalism, detachment, equity, secularism), most of whom are academics, are a shrinking minority in the new democratic age. Moreover, the Internet has put regular, perhaps unschooled folks (Joe the Plumber, Ken Bone, any old Kardashian, and celebrities used to being the undeserved focus of attention) in direct dialogue with everyone else through deplorable comments sections. Journalists get their say, too, and amplify the unwashed masses when resorting to man-on-the-street interviews. At Gin and Tacos (see blogroll), this last is called the Cletus Safari. The marketplace of ideas has accordingly been so corrupted by the likes of, well, ME! that self-appointed public intellectuals like Harris can’t contend effectively with the onslaught of pure, unadulterated democracy where everyone participates. (Authorities claim to want broad civic participation, as when they exhort everyone to vote, but the reverse is more nearly true.) Harris already foundered on the shoals of competing truth claims when he hosted on his webcast a fellow academic, Jordan Peterson, yet failed to make any apparent adjustments in the aftermath. Reason remains for Harris the one true faith.

Furthermore, Jonathan Haidt argues (as I understand him, correct me if I’m mistaken) that motivated reasoning leads to cherry-picking facts and evidence. In practice, that means that selection bias results in opinions being argued as facts. Under such conditions, even well-meaning folks are prone to peddling false certainty. This may well be the case with Charles Murray, who is at the center of the Harris debacle. Murray’s arguments are fundamentally about psychometrics, a data-driven subset of sociology and psychology, which under ideal circumstances has all the dispassion of a stone. But those metrics are applied at the intersection of two taboos, race and intelligence (who knew? everyone but Sam Harris and Charles Murray …), then transmuted into public policy recommendations. If Harris were more circumspect, he might recognize that there is simply no way to divorce emotion from discussions of race and intelligence.

rant off/

More to say on this subject in part 2 to follow.


A year ago, I wrote about charges of cultural appropriation being levied upon fiction writers, as though fiction can now only be some watered-down memoir lest some author have the temerity to conjure a character based on someone other than him- or herself. Specifically, I linked to an opinion piece by Lionel Shriver in the NY Times describing having been sanctioned for writing characters based on ideas, identities, and backgrounds other than her own. Shriver has a new article in Prospect Magazine that provides an update, perhaps too soon to survey the scene accurately since the target is still moving, but nonetheless curious with respect to the relatively recent appearance of call-out culture and outrage engines. In her article, Shriver notes that offense and umbrage are now given equal footing with bodily harm and emotional scarring:

Time was that children were taught to turn aside tormentors with the cry, “Sticks and stones may break my bones, but words will never hurt me!” While you can indeed feel injured because Bobby called you fat, the law has traditionally maintained a sharp distinction between bodily and emotional harm. Even libel law requires a demonstration of palpable damage to reputation, which might impact your livelihood, rather than mere testimony that a passage in a book made you cry.

She also points out that an imagined “right not to be offended” is now frequently invoked, even though there is no possibility of avoiding offense if one is actually conscious in the world. For just one rather mundane example, the extraordinary genocidal violence of 20th-century history, once machines and mechanisms (now called WMDs) were applied to warfare (and dare I say it: statecraft), ought to be highly offensive to any humanitarian. That history cannot be erased, though I suppose it can be denied, revised, buried, and/or lost to living memory. Students or others who insist they be excused from being triggered by knowledge of awful events are proverbial ostriches burying their heads in the sand.

As variations of this behavior multiply and gain social approval, the Thought Police are busily mustering against all offense — real, perceived, or wholly imagined — and waging a broad-spectrum sanitation campaign. Shriver believes this could well spell the end of fiction as publishers morph into censors and authors self-censor in an attempt to pass through the SJW gauntlet. Here’s my counter-argument:

rant on/

I feel mightily offended — OFFENDED I say! — at the arrant stupidity of SJWs whose heads are full of straw (and strawmen), who are so clearly confused about what is even possible within the dictates and strictures of, well, reality, and who have accordingly retreated into cocoons of ideation from which others are scourged for failure to adhere to some bizarre, muddleheaded notion of equity. How dare you compel me to think prescribed thoughts emanating from your thought bubble, you damn bullies? I have my own thoughts and feelings deserving of support, maybe even more than yours considering your obvious naïveté about how the world works. Why aren’t you laboring to promote mine but instead clamoring to infect everyone with yours? Why is my writing so resoundingly ignored while you prance upon the stage demanding my attention? You are an affront to my values and sensibilities and can stuff your false piety and pretend virtue where the sun don’t shine. Go ahead and be offended; this is meant to offend. If it’s gonna be you or me who’s transgressed precisely because all sides of issues can’t be satisfied simultaneously, then on this issue, I vote for you to be in the hot seat.

rant off/

Be forewarned: this is long and self-indulgent. Kinda threw everything and the kitchen sink at it.

In the August 2017 issue of Harper’s Magazine, Walter Kirn’s “Easy Chair” column called “Apocalypse Always” revealed his brief, boyhood fascination with dystopian fiction. This genre has been around for a very long time, as the Cassandra myth attests. Kirn’s column is more concerned with “high mid-twentieth-century dystopian fiction,” which in his view is now classic and canonical, an entire generation of Baby Boomers having been educated in such patterned thought. A new wave of dystopian fiction appeared in the 1990s and yet another more recently in the form of Young Adult novels (and films) that arguably serve better as triumphal coming-of-age stories albeit under dystopian circumstances. Kirn observes a perennial theme present in the genre: the twin disappearances of freedom and information:

In the classic dystopias, which concern themselves with the lack of freedom and not with surplus freedom run amok (the current and unforeseen predicament of many), society is superbly well organized, resembling a kind of hive or factory. People are sorted, classified, and ranked, their individuality suppressed through goon squads, potent narcotics, or breeding programs. Quite often, they wear uniforms, and express themselves, or fail to, in ritual utterance and gestures.

Whether Americans in 2018 resemble hollowed-out zombies suffering under either boot-heel or soft-serve oppression is a good question. Some would argue just that in homage to classic dystopias. Kirn suggests briefly that we might instead suffer from runaway anarchy, where too much freedom and licentiousness have led instead to a chaotic and disorganized society populated by citizens who can neither govern nor restrain themselves.

Disappearance of information might be understood in at least three familiar aspects of narrative framing: what happened to get us to this point (past as exposition, sometimes only hinted at), what the hell is going on (present as conflict and action), and how it gets fixed (future as resolution and denouement). Strict control over information exercised by classic dystopian despots doesn’t track to conditions under which we now find ourselves, where more disorganized, fraudulent, and degraded information than ever is available alongside small caches of wisdom and understanding buried somewhere in the heap and discoverable only with the benefit of critical thinking flatly lost on at least a couple of generations of miseducated graduates. However, a coherent narrative of who and what we are and what realistic prospects the future may hold has not emerged since the stifling version of the 1950s nuclear family and middle-class consumer contentment. Kirn makes this comparison directly, where classic dystopian fiction

focus[es] on bureaucracy, coercion, propaganda, and depersonalization, overstates both the prowess of the hierarchs and the submissiveness of the masses, whom it still thinks of as the masses. It does not contemplate Trump-style charlatanism at the top, or a narcissistic populace that prizes attention over privacy. The threats to individualism are paramount; the scourge of surplus individualism, with everyone playing his own dunce king and slurping up resources until he bursts, goes unexplored.

Kirn’s further observations are worth a look. Go read for yourself.


rant on/

Four years ago, the Daily Mail published an article with the scary title “HALF the world’s wild animals have disappeared in 40 years” [all caps in original just to grab your eyeballs]. This came as no surprise to anyone who’s been paying attention. I blogged on this very topic in my review of Vaclav Smil’s book Harvesting the Biosphere, which observed at the end a 50% decrease in wild mammal populations over the last hundred years. The estimated numbers vary according to which animal population and what time frame are under consideration. For instance, in 2003, CNN reported that only 10% of big ocean fish remained compared to 47 years prior. Predictions indicate that the oceans could be without any fish by midcentury. All this is old news, but it’s difficult to tell what we humans are doing about it other than worsening already horrific trends. The latest disappearing act is flying insects, whose numbers have decreased by 75% in the last 25 years according to this article in The Guardian. The article says, um, scientists are shocked. I don’t know why; these articles and indicators of impending ecological collapse have been appearing regularly for decades. Similar Malthusian prophecies are far older. Remember colony collapse disorder? Are they surprised it’s happening now, as opposed to the end of the 21st century, safely after nearly everyone now alive is long dead? C’mon, pay attention!

Just a couple days ago, the World Meteorological Organization issued a press release indicating that greenhouse gases have surged to a new post-ice age record. Says the press release rather dryly, “The abrupt changes in the atmosphere witnessed in the past 70 years are without precedent.” You don’t say. Even more astoundingly, the popular online news site Engadget had this idiotic headline: “Scientists can’t explain a ‘worrying’ rise in methane levels” (sourcing Professor Euan Nisbet of Royal Holloway, University of London). Um, what’s to explain? We’ve been burning the shit out of planetary resources, temperatures are rising, and methane formerly sequestered in frozen tundra and below polar sea floors is seeping out. As I said, old news. How far up his or her ass has any reputable scientist’s head got to be to make such an outta-touch pronouncement? My answer to my own question: suffocation. Engadget made up that dude just for the quote, right? Nope.

Not to draw too direct a connection between these two issues (wildlife disappearances and greenhouse gases — hey, I said pay attention!) because, ya know, reckless conjecture and unproven conclusions (the future hasn’t happened yet, duh, it’s the future, forever telescoping away from us), but a changing ecosystem means evolutionary niches that used to support nature’s profusion are no longer doing so reliably. Plus, we just plain ate a large percentage of the animals or drove them to extinction, fully or nearly (for now). As these articles routinely and tenderly suggest, trends are “worrying” for humans. After all, how are we gonna put seafood on our plates when all the fish have been displaced by plastic?

rant off/

rant on/

As the next in an as-yet unnumbered series of Storms of the Century (I predict at least a dozen more) is poised to strike nearly the entirety of the State of Florida, we know with confidence from prior experience, recent and not so recent, that any lessons we might take regarding human habitation situated along or near coastlines vulnerable to extreme weather events, now occurring with increasing frequency and vehemence, will remain intransigently unlearned. Instead, we’ll begin rebuilding on the very same sites as soon as construction labor and resources can be mustered and deployed. Happened in New Orleans and New Jersey; is about to happen in Houston; and will certainly happen all across Florida — even the fragile Florida Keys. I mean, shit, we can’t do without The Magic Kingdom and other attractions in the central-Florida tourist mecca, now can we?

This predictable spin around the dance floor might look like a tragicomic circus waltz (e.g., The Daring Young Man on the Flying Trapeze), or even out-of-tune, lopsided calliope music from the carousel, except that positioning ourselves right back in harm’s way would be better characterized as a danse macabre. I dub it the Builder’s Waltz, which could also be the Rebuilder’s Rumba, the Catastrophe Tango, the Demolition Jive … take your pick.

Obstinate refusal to apprehend reality as it slams into us is celebrated as virtue these days. Can’t lose hope even as dark forces coalesce all around us, right? Was it always so? Still, an inkling might be dawning on some addle-brained deniers that perhaps science-informed global warming and climate change news might actually be about something with real-world impact, such as dramatic reduction of oil refinery output or a lost citrus crop. So much for illusions of business as usual continuing unhindered into the foreseeable future. Instead, our future looks more like dominoes lined up to fall — like the line of hurricanes formed in the Atlantic. Good luck hunkering down and weathering once-in-a-lifetime storms that just keep coming. And rebuilding the same things in the same places, well, just let it go, man, ’cuz it’s already gone.

rant off/

For a variety of reasons, I go to see movies in the theater only a handful of times any given year. The reasons are unimportant (and obvious) and I recognize that, by eschewing the theater, I’m giving up the crowd experience. Still, I relented recently and went to see a movie at a new AMC Dolby Cinema, which I didn’t even know existed. The first thing to appreciate was that it was a pretty big room, which used to be standard when cinema was first getting established in the 1920s but gave way sometime in the 1970s to multiplex theaters able to show more than one title at a time in little shoebox compartments with limited seating. Spaciousness was a welcome throwback. The theater also had oversized, powered, leather recliners rather than cloth, fold-down seats with shared armrests. The recliners were quite comfortable but also quite unnecessary (except for now-typical Americans unable to fit their fat asses in what used to be a standard seat). These characteristics are shared with AMC Prime theaters that dress up the movie-going experience and charge accordingly. Indeed, AMC now offers several types of premium cinema, including RealD 3D, IMAX, Dine-In, and BigD.

Aside I: A friend only just reported on her recent trip to the drive-in theater, a dated cinema experience, somewhat degraded (or at least unenhanced), yet one that retains its nostalgic charm for those of us old enough to remember as kids the shabby chic of bringing one’s own pillows, blankets, popcorn, and drinks to a double feature and sprawling out on the hood and/or roof of the car (e.g., the family station wagon). My friend actually brought her dog to the drive-in and said she remembered and sorta missed the last call on dollar hot dogs at 11 PM that used to find all the kids madly, gleefully rushing the concession stand before food ran out.

What really surprised me, however, was how the Dolby Cinema experience turned into a visual, auditory, and kinesthetic assault. True, I was watching Wonder Woman (sorry, no review), which is set in WWI and features lots of gunfire and munitions explosions in addition to the usual invincible superhero punchfest, so I suppose the point is partly to be immersed in the environment, a cinematic stab at verisimilitude. But the immediacy of all the wham-bam, rock ’em-sock ’em action made me feel more like a participant in a theater of war than a viewer. The term shell shock (a/k/a battle fatigue a/k/a combat neurosis) refers to the traumatized disorientation one experiences in moments of high stress and overwhelming sensory input; it applies here. Even the promo before the trailers and feature, offered to demonstrate the theater’s capabilities, was off-putting because of its unnecessary and overweening volume and impact. Unless I’m mistaken, the seats even have built-in subwoofers to rattle theatergoers from below when loud, concussive events occur, which is often, because, well, filmmakers love their spectacle as much as audiences do.

Aside II: One real-life lesson to be gleaned from WWI, or the Great War as it was called before WWII, went well beyond the simplistic truism that war is hell. It was that civility (read: civilization) had failed and human progress was a chimera. Technical progress, however, had made WWI uglier in many respects than previous warfare. It was an entirely new sort of horror. Fun fact: there are numerous districts in France, known collectively as la Zone Rouge, where no one is allowed to live because of all the unexploded ordnance (100 years later!). Wonder Woman ends up having it both ways: acknowledging the horrific nature of war on the one hand yet valorizing and romanticizing personal sacrifice and eventual victory on the other. Worse, perhaps, it establishes that there’s always another enemy in the wings (otherwise, how could there be sequels?), so keep fighting. And for the average viewer, uniformed German antagonists are easily mistakable for Nazis of the subsequent world war, a historical gloss I’m guessing no one minds … because … Nazis.

So here’s my problem with AMC’s Dolby Cinema: why settle for a routine or standard theater experience when it can be amped up to the point of offense? Similarly, why be content with the tame and fleeting though reliable beauty of a sunset when one can enjoy a widescreen, hyperreal view of cinematic worlds that don’t actually exist? Why settle for the subtle, old-timey charm of the carousel (painted horses, dizzying twirling, and calliope music) when instead one can strap in and get knocked sideways by roller coasters so extreme that riders leave wobbly and crying at the end? (Never mind the risk of being stranded on the tracks for hours, injured, or even killed by a malfunction.) Or why bother attending a quaint symphonic band concert in the park or an orchestral performance in the concert hall when instead one can go to Lollapalooza and see/hear/experience six bands in the same cacophonous space grinding it out at ear-splitting volume, along with laser light shows and flash-pot explosions for the sheer sake of goosing one’s senses? Coming soon are VR goggles that trick the wearer’s nervous system into accepting that they are actually in the virtual game space, often first-person shooters depicting killing bugs or aliens or criminals without compunction. Our arts and entertainments have truly gotten out of hand.

If those criticisms don’t register, consider my post more than a decade ago on the Paradox of the Sybarite and Catatonic, which argues that our senses are so overwhelmed by modern life that we’re essentially numb from overstimulation. Similarly, let me reuse this Nietzsche quote (used before here) to suggest that on an aesthetic level, we’re not being served well in display and execution of refined taste so much as being whomped over the head and dragged (willingly?) through ordeals:

… our ears have become increasingly intellectual. Thus we can now endure much greater volume, much greater ‘noise’, because we are much better trained than our forefathers were to listen for the reason in it. All our senses have in fact become somewhat dulled because we always inquire after the reason, what ‘it means’, and no longer for what ‘it is’ … our ear has become coarsened. Furthermore, the ugly side of the world, originally inimical to the senses, has been won over for music … Similarly, some painters have made the eye more intellectual, and have gone far beyond what was previously called a joy in form and colour. Here, too, that side of the world originally considered ugly has been conquered by artistic understanding. What is the consequence of this? The more the eye and ear are capable of thought, the more they reach that boundary line where they become asensual. Joy is transferred to the brain; the sense organs themselves become dull and weak. More and more, the symbolic replaces that which exists … the vast majority, which each year is becoming ever more incapable of understanding meaning, even in the sensual form of ugliness … is therefore learning to reach out with increasing pleasure for that which is intrinsically ugly and repulsive, that is, the basely sensual. [italics not in original]

Even before I begin, you must know what the title means. It’s the proliferation of options that induces dread in the toothpaste aisle of the store. Paste or gel? Tartar control or extra whitening? Plain, mint, cinnamon, or bubble gum? The matrix of combinations is enough to reduce the typical shopper to a quivering state of high anxiety lest the wrong toothpaste be bought. Oh, how I long for the days when choices ran solely between plain Crest and Colgate. I can’t say whether the toothpaste effect originated with oral hygiene. A similarly bewildering host of choices confronts shoppers in the soft drink aisle. Foodstuffs seem especially prone to brand fragmentation. Woe betide the retailer forced to shelve all 38 Heinz products on this page. (True, some are just different packaging of the same basic item, but still.)

Purveyors of alcoholic beverages are on the bandwagon, too. I rather like the bygone cliché of the cowboy/gunslinger riding off the range, swinging into the saloon, and ordering simply “whisky.” Nowadays, even a poorly stocked bar is certain to have a dozen or so whiskies (see this big brand list, which doesn’t include sub-brands or craft distillers). Then come all the varieties of schnapps, rum, and vodka, each brand further fragmented with infusions and flavorings of every imaginable type. Some truly weird ones are found here. Who knew that these spirits were simply blank canvases awaiting the master distiller’s crazy inventiveness?

rant on/

What really gets my bile flowing on this issue, however, is the venerable Lay’s potato chip. Seriously, Frito-Lay, what are you thinking? You arguably perfected the potato chip, much like McDonald’s perfected the French fry. (Both are fried potato, interestingly.) Further, you have a timeless, unbeatable slogan: “betcha can’t eat just one.” The plain, salted chip, the “Classic” of the Lay’s brand, cannot be improved upon and is a staple comfort food. Yet you have succumbed to the toothpaste effect and gone haywire with flavorings (I won’t even countenance the Wavy, Poppables, Kettle-Cooked, Ruffles, and STAX varieties). For variety’s sake, I’d be content with a barbecue chip, maybe even salt & vinegar, but you’ve piled on past the point of ridiculousness:

  • cheddar & sour cream (a favorite of mine)
  • Chile limón
  • deli style
  • dill pickle
  • flamin’ hot
  • honey barbecue
  • limón
  • pico de gallo
  • salt & vinegar (not to my taste)
  • sour cream & onion (a good alternative)
  • sweet Southern heat barbecue
  • Southern biscuits & gravy
  • Tapatío (salsa picante)

rant off/


Continuing from my previous post, Brian Phillips, writing for MTV News, has an article entitled “Shirtless Trump Saves Drowning Kitten: Facebook’s fake-news problem and the rise of the postmodern right.” (Funny title, that.) I navigated to the article via Alan Jacobs’s post at Text Patterns (on my blogroll). Let me consider each in turn.

After chuckling that Phillips is directing his analysis to the wrong audience, an admittedly elitist response on my part, I must further admit that the article is awfully well written and nails the blithe attitude accompanying the epistemological destruction carried out by developers of social media (as distinguished from traditional news media), perhaps unwittingly but now too well established to ignore. Which of the two would be considered more mainstream today is up for debate. Maybe Phillips has the right audience after all. He certainly gets the importance of controlling the narrative:

Confusion is an authoritarian tool; life under a strongman means not simply being lied to but being beset by contradiction and uncertainty until the line between truth and falsehood blurs and a kind of exhaustion settles over questions of fact. Politically speaking, precision is freedom. It’s telling, in that regard, that Trump supporters, the voters most furiously suspicious of journalism, also proved to be the most receptive audience for fictions that looked journalism-like. Authoritarianism doesn’t really want to convince its supporters that their fantasies are true, because truth claims are subject to verification, and thus to the possible discrediting of authority. Authoritarianism wants to convince its supporters that nothing is true, that the whole machinery of truth is an intolerable imposition on their psyches, and thus that they might as well give free rein to their fantasies.

But Phillips is too clever by half, burying the issue in a scholarly style that speaks successfully only to a narrow class of academics and intellectuals, much like the language and memes employed by the alt-right are said to be dog whistles perceptible only to rabid, mouth-breathing bigots. Both charges are probably unfair reductions, though with kernels of truth. Here’s some of Phillips’ overripe language:

Often the battleground for this idea [virtue and respect] was the integrity of language itself. The conservative idea, at that time [20 years ago], was that liberalism had gone insane for political correctness and continental theory, and that the way to resist the encroachment of Derrida was through fortifying summaries of Emerson … What had really happened was that the left had become sensitized to the ways in which conventional moral language tended to shore up existing privilege and power, and had embarked on a critique of this tendency that the right interpreted, with some justification, as an attack on the very concept of meaning.

More plainly, Phillips’ suggestion is that the radical right learned the lessons of Postmodernism (PoMo) even better than did the avant-garde left, the latter having outwitted themselves by giving the right subtle tools used later to outmaneuver everyone. Like other mildly irritating analyses I have read, it’s a statement of inversion: an idea bringing into existence its antithesis, which ironically both proves and undermines the original, though with a dose of Schadenfreude. This was (partially) the subject of a 4-part blog series I wrote called “Dissolving Reality” back in Aug. and Sept. 2015. (Maybe half a dozen read the series; almost no one commented.)

So what does Alan Jacobs add to the discussion? He exhibits his own scholarly flourishes. Indeed, I admire the writing but find myself distracted by its writerly nature, which ejects readers from the flow of ideas to contemplate the writing itself. For instance, this:

It turns out that the children of the ruling classes learned their lessons well, so when they inherited positions in their fathers’ law firms they had some extra, and very useful, weapons in their rhetorical armory.

In precisely the same way, when, somewhat later, academic leftists preached that race and gender were the determinative categories of social analysis, members of the future alt-right were slouching in the back rows of their classrooms, baseball caps pulled down over their eyes, making no external motions but in their dark little hearts twitching with fervent agreement.

Terrific capture of the classroom culture in which teachers are steeped. Drawing identity politics more manifestly into the mix is a fairly obvious extrapolation beyond Phillips and may reflect the results of the presidential election, where pundits, wheeling around to reinterpret results that should not have so surprised them, now suggest Republican victories are a repudiation of leftist moral instruction. The depth of Phillips’ and Jacobs’ remarks is not so typical of most pundits, however, and their follow-up analysis at some point becomes just more PoMo flagellation. Here, Jacobs is even more clearly having some fun:

No longer did we have to fear being brought before the bar of Rational Evidence, that hanging judge of the Enlightenment who had sent so many believers to the gallows! You have your constructs and we have our constructs, and who’s to say which are better, right? O brave new world that hath such a sociology of knowledge in it!

This goes back to the heart of the issue, our epistemological crisis, but I dispute that race and gender are the determinative categories of social analysis, no matter how fashionable they may be in the academy. A simpler and more obvious big picture controls: it’s about life and death. My previous post was about geopolitics, where death is rained down upon foreign peoples and justifying rhetoric is spread domestically. Motivations may be complex and varied, but the destruction of people and truth affects everyone, albeit unevenly, without regard to race, gender, religion, nationality, etc. All are caught in the dragnet.

Moreover, since the advent of Western civilization, intellectuals have been sensitive to the sociology of knowledge. It’s a foundation of philosophy. That it’s grown sclerotic long precedes PoMo theory. In fact, the gradual breaking apart and dismantling of meaning is visible across all expressive genres, not just literature. In painting, it was Impressionism, Cubism, Dada and Surrealism, and Abstract Expressionism. In architecture, it was Art Deco, the International Style, Modernism, Brutalism, and Deconstructivism. In music, it was the Post-Romantic, the Second Viennese School, Modernism, Serialism, and Minimalism. In scientific paradigms, it was electromagnetism, relativity, quantum mechanics, the Nuclear Era, and semiconductors. The most essential characteristics in each case are increasingly dogmatic abstraction and drilling down to minutiae that betray meaningful essences. Factoring in economic and political perversions, we arrive at our current epistemological phase, where truth and consequences matter little (though death and destruction still do) so long as deceits, projections, and distractions hold minds in thrall. In effect, gravity is turned off and historical narratives levitate until reality finally, inevitably comes crashing down in a monstrous Jenga pile, as it does periodically.

In the meantime, I suppose Phillips and Jacobs can issue more gaseous noise into the fog bank the information environment has become. They can’t get much traction (nor can I) considering how most of the affluent West thinks at the level of a TV sitcom. In addition, steps being considered to rein in the worst excesses of fake news would have corporations and traditional news media appointed as watchers and censors. Beyond any free speech objections, which are significant, expecting culprits to police themselves only awards them greater power to dominate, much like bailouts rewarded the banks. More fog, more lies, more levitation.

rant on/

With a new round of presidential debates upon us (not really debates if one understands the nature of debate or indeed moderation — James Howard Kunstler called it “the gruesome spectacle of the so-called debate between Trump and Clinton in an election campaign beneath the dignity of a third-world shit-hole”), it’s worthwhile to keep in the front of one’s mind that the current style of public discourse does not aim to provide useful or actionable information with regard to either the candidates or the issues. Rather, the idea is to pummel the hapless listener, watcher, or reader into a quivering jangle of confusion by maintaining a nonstop onslaught of soundbites, assertions, accusations, grandstanding, and false narratives. Our information environment abets this style of machine-gun discourse, with innumerable feeds from InstaGoogTwitFaceTube (et cetera), all vying simultaneously for our limited attention and thereby guaranteeing that virtually nothing makes a strong impression before the next bit of BS displaces it in a rapid succession of predigested morsels having no nutritional content or value for earnest consumers of information (as opposed to mouth-breathers seeking emotional salve for their worst biases and bigotry). Many feeds are frankly indecipherable, such as when the message is brutally truncated and possessed of acronyms and hashtags, the screen is cluttered with multiple text scrolls, or panel participants talk over each other to claim more screen time (or merely raise their asshole quotient by being the most obnoxious). But no matter so long as the double barrels keep firing.

I caught Republican nominee Donald Trump’s campaign manager Kellyanne Conway being interviewed by some banal featherweight pulling punches (sorry, no link, but she’s eminently searchable). Conway proved adept at deflecting obvious contradictions and reversals (and worse) of the Trump campaign by launching so many ideological bombs that nothing the interviewer raised actually landed. Questions and conflicts just floated away, unaddressed and unanswered. Her bizarre, hyperverbal incoherence is similar to the candidate’s stammering word salad, and ironically, both give new meaning to the decades-old term “Teflon” when applied to politics. Nothing sticks because piling on more and more complete wrongness and cognitive dissonance overwhelms and bewilders anyone trying to track the discussion. Trump and Conway are hardly alone in this, of course, though their mastery is notable (but not admirable). Talking heads gathered in panel discussions on, say, The View or Real Time with Bill Maher, just about any klatch occupying news and morning-show couches, and hosts of satirical news shows (some mentioned here) exhibit the same behavior: a constant barrage of high-speed inanity (and jokes, omigod the jokes!) that discourages consideration of an idea before driving pell-mell onto the next.

Thoughtful persons might pause to wonder whether breathless, even virtuoso delivery results from or creates our abysmally short attention spans and lack of serious discussion of problems plaguing the nation. Well, why can’t it be both? Modern media is now all fast media, delivering hit-and-run spectacle to overloaded nervous systems long habituated to being goosed every few moments. (Or as quoted years ago, “the average Hollywood movie has become indistinguishable from a panic attack.”) Our nervous systems can’t handle it, obviously. We have become insatiable information addicts seeking not just the next fix but a perpetual fix, yet the impatient demand for immediate gratification — Internet always at our fingertips — is never quelled. Some new bit will be added to the torrent of foolishness sooner than it can be pulled down. And so we stumble like zombies, blindly and willingly, into a surreality of our own making, heads down and faces blue from the glare of the phone/tablet/computer. Of course, the shitshow is brightly festooned with buffoon candidates holding court over the masses neither intends to serve faithfully in office. Their special brand of insanity is repeated again and again throughout the ranks of media denizens (celebrity is a curse, much like obscene wealth, or didn’t you know that?) and is seeping into the groundwater to poison all of us.

rant off/

rant on/

Monastic pursuit of a singular objective, away from the maddening and distracting rush of modern life, is a character attribute that receives more than its rightful share of attention. In its salutary forms, monastic pursuit is understood as admirable, visionary, iconic (or iconoclastic), and heroic. In creative endeavors, seclusion and disengagement from feedback are preconditions for finding one’s true voice and achieving one’s vision. In sports, the image of the athlete devoted to training for the big event — race, match, tournament — to the exclusion of all else is by now a tired trope. Indeed, in this Olympics season, athlete profiles — puff pieces of extraordinary predictability — typically depict competitors in isolation, absolutely no one else at the gym, in the pool, on the track, etc., as though everyone goes it alone without the support or presence of coaches or teammates. Over-specialization and -achievement are such that spectators are conditioned to expect successful individuals, champions, to bleed (quite literally) as a mark of devotion to their respective fields.

At some point, however, monastic pursuit morphs into something more recognizably maniacal. The author retreating to his cabin in the woods to write the great American novel becomes the revolutionary hermit composing his political manifesto. Healthy competition among rivals turns into decidedly unsportsmanlike conduct. (Lance Armstrong is the poster boy not just for doping but also for the sociopathy he displayed mistreating teammates and perpetuating the lie as vehemently and as long as he did. Further examples compound quickly in sports.) Business leaders, discontented with (sometimes obscene) profitability, target others in their market sector with the intent of driving them out of business and establishing monopolies. (This contrasts markedly with the ideology of self-correcting markets many CEOs falsely espouse.) In politics, high-minded campaigns and elected politicians formed around sound policy and good governance lose out to such dirty tricks as character assassination, rigged and stolen elections, partisanship, and reflexive obstructionism of projects that enjoy popular support. In journalism, fair and balanced reporting inverts to constant harping on preferred talking points to control narratives through sheer force of repetition. You get the idea.

It’s difficult to say from where this intemperate impulse arises, but we’re undoubtedly in a phase of history where nearly every field of endeavor manifests its own version of the arms race. Some might argue that in a cost-benefit analysis, we’re all better off because we enjoy fruits not obtainable without (some folks at least) taking a scorched-earth approach, raising the bar, and driving everyone to greater heights. The willingness of some to distort and disgrace themselves hideously may be a high price to pay, especially when it’s for simple entertainment, but so long as we aren’t paying the price personally, we’re willing spectators to whatever glory and train wrecks occur. I would argue that, ultimately, we’re all paying the price. Routine competition and conflict resolution have grown so unhinged that, just to be in the game, competitors must be prepared to go all in (poker lingo) at even modest provocation. As a result, for just one example, the spirit of America’s erstwhile pastime (baseball) has been so corrupted that balanced players and fans (!) stay away and are replaced by goons. A true level playing field probably never existed. Now, however, whoever can muster the most force (financial, rhetorical, criminal) wins the trophy, and we’re each in turn encouraged to risk all in our own monastic pursuit.

rant off/