Archive for October, 2022

A recent episode of the Dark Horse Podcast introduced what appeared initially to be a new bit of lingo: the Inversion Fallacy. I've discussed logical fallacies and hidden biases in the past, and this one bears directly on my multipart blog series "Dissolving Reality" from 2015 where I put forward the Ironic and Post-Ironic mindsets. The Ironic is more nearly the reversal of meaning yet tracks with the Inversion Fallacy. Without getting too hung up on the pointless minutiae of terminology (trying to distinguish between, say, reversal, inversion, transposition, contradiction, and opposition), inversion means to turn something upside-down or on its head. It's also related to devil's advocacy, topsy-turvy argumentation, and is not … is too! squabbles where a thing becomes its opposite. Several pundits and commentators have lost my readership because of frequent forays into disingenuous reverse argumentation. I simply lack patience.

As described on Dark Horse, the Inversion Fallacy occurs when a thing or idea is treated as equivalent to its inverse. One example now commonplace in Wokedom is to accuse someone of being racist and then insist denial is proof of racism. (I've also heard this particular example called a Kafka Trap, likewise on Dark Horse.) As math, the equation would be either x = 1/x or x = –x. Inversion is the former, reversal the latter. The x = –x formulation (the Ironic) suggests that an idea or thing automatically invokes (i.e., brings into being) its opposite, especially through the use of sarcasm. Here's the old joke illustrating the point:

A professor of linguistics holds forth before a class of undergraduates, "In language as in mathematics, a double negative is a positive. But in no mathematics or language does a double positive equal a negative."

To which a student replies dryly, “Yeah, right ….”

The modest advantage of the x = 1/x formulation is that when x = 0, the equation has no meaning because dividing by zero is … undefined. The obvious example is the oft-quoted (and misquoted) Vietnam War nonsense, “It became necessary to destroy the town to save it.” That’s dividing by zero in a nutshell.
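A quick bit of algebra (my own gloss, not anything offered on the podcast) makes the contrast between the two formulations concrete: the inversion form has solutions at plus and minus one and breaks down entirely at zero, while the reversal form is satisfied only at zero.

\[
x = \frac{1}{x} \;\Rightarrow\; x^2 = 1 \;\Rightarrow\; x = \pm 1 \quad (\text{undefined at } x = 0),
\qquad
x = -x \;\Rightarrow\; 2x = 0 \;\Rightarrow\; x = 0 .
\]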

The difference between the two formulations does not IMO prevent the fallacy from working. My suspicion is that multiple ways of observing, describing, and naming the fallacy exist. An attribute of the Post-Ironic is that the tension between thing and not thing is expanded to include a fluid spectrum of competing positions. Whether reversal or inversion, Ironic or Post-Ironic, the common element is the necessity to set aside obvious cognitive dissonance and enter a state of flux where meanings cannot be fixed. Just a few blog posts ago, I cited George Orwell’s famous formulation: “War is Peace. Freedom is Slavery. Ignorance is Strength.” Requires Orwellian Doublethink to accept those propositions.

Arranged from short to long.

A collective noun not in use but probably should be: a harassment of technologies. Needs no explanation.

From the Episcopal Church: the church key. A euphemism for a bottle opener for alcoholic beverages with bottle caps.

From various YouTube channels offering cinema reviews: memberberries. A cheap form of fan service, typically citing familiar nostalgic bits, lines, or characters to trigger a pleasing memory of previous TV shows and films. Generally used derogatorily.

Not new but new to me at least: ramekin. A small dish in which food can be baked and served. Reminded me of the far less commonplace hottle, which is a single-serving glass carafe for hot water, tea, or coffee.

From nowhere in particular: the poverty draft. An open secret (arguably, not really lingua nova) that recruitment into the U.S. military is aided substantially by the poverty of potential recruits. Thus, joining a branch of the armed services is not necessarily because of ideological agreement with its functions or an earnest desire to serve but instead — at the risk of life and limb — to get education and training not otherwise available or to expunge debt from more traditional educational institutions.

From Thomas Chatterton Williams (whom I might criticize for a number of reasons, but I’ll abjure): the Age of Theory. The modern age (pick a start date) has been called many things. I tend to call it the Age of Abundance since that quintessential characteristic is now decidedly on the wane. (Age of Oil and Fossil Fuel Era are essentially the same thing.) Age of Theory refers to PoMo reliance on theory and abstraction as a means of understanding and interpreting nearly everything. I’ve blogged quite a bit about living in our heads as distinguished from living in our bodies (i.e., being embodied). My book blogging through Iain McGilchrist’s The Master and His Emissary is most on point (see the McGilchrist tag).

From Peruvian writer and essayist Mario Vargas Llosa: the truth in the lies (translations vary — sometimes given as the truth of lies). Although Vargas Llosa is referencing fiction (writers writing about writing), the notion that a lie can reveal a more significant truth is at the heart of communications. Whether through advertising, public relations, entertainment, politicking, or propaganda, shaping opinion with use of subtle-to-obvious (mis-)framing or with straight-up lies and falsehoods is the contemporary information landscape, though many attempt to adhere rigorously to truth and reality. Separating malefactors from truth-tellers is the warrant and responsibility of any sovereign intellect — a formidable and ongoing task in an increasingly deranging public sphere.

Fast Food is Dead

Posted: October 21, 2022 in Consumerism, Culture, Idle Nonsense, Taste

/rant on

Despite its obvious satisfactions, fast food has always been a guilty pleasure for anyone with … taste. Redirecting the evolved culture of the table to standing in the parking lot, eating in one’s vehicle, or skulking off with the feedbag to eat in isolation (typically in the blue glare of the TV) always felt somehow illicit, like getting away with breaking some unspoken rule regarding how people ought to manage to get sustenance in a civilized fashion. What was lacking in quality and self-respect was typically made up in speed, price, and consistency. If one simply needed something to stuff in one’s face while on the road or in a hurry, familiar fast-food options were easy to settle upon rather than go out of one’s way and/or commit extra time to eating at a diner or mid-tier restaurant with actual waitstaff. Back in the day (get off my lawn …!), one could fill one’s hands (and belly) with a burger, fries, and drink for around $5 within a few minutes of ordering at the counter. The comfort food itself was predictable and gratified the senses, being suffused with all the ingredients now understood as irresistible hyperstimuli: salt, sugar, and fat. Exactly no one thought of fast food as quality, but boy did it satisfy immediate cravings.

Well, that era is over. Although still ubiquitous in a degraded landscape overrun by franchises (franchise hell is what I call it), the fast-food option has stopped supplying food that is any of cheap, quick, or savory. I'm no aficionado of fast food, but I've certainly consumed my share. Over the decades, prices have increased substantially, quality has suffered, and worst of all, it's no longer fast. The saving grace of fast food was getting it into my grubby paws quickly and chowing down eagerly, followed predictably by the sensation of it sitting uncomfortably in my gut as a giant, undigested mass. The last few times I've indulged, the wait at the counter has been 30+ minutes! I've never done it, but I suspect anyone ordering fast food through one of the food-delivery apps on their phone pays extra and waits for roughly an hour. The two obvious factors causing delays are (1) the implicit and wrongheaded (says me) demand to keep the drive-through lanes moving before serving those inside the restaurant (term loosely applied) and (2) the cost-saving reduction of staff to a skeleton crew functioning constantly in crisis mode. I recall how decades ago an entire school bus of "diners" could be unloaded into a fast-food joint and all be served relatively efficiently. That could never happen today with six or fewer people behind the counter trying vainly to take and fill orders. There are no fry cooks anymore flipping burgers, though fries are still dropped into the fryer on site.

Quality suffers, too, as the food, if it can even be called that anymore, is increasingly some chemical concoction of nonfood prepared elsewhere, flash frozen, and reheated on site. The resulting upset stomach, often nearly to the point of vomiting, relegates fast food to an option only for those with iron bellies — formerly an attribute of youth but now considerably less so because of the precipitous rise of autoimmune disorders and food intolerances/allergies. Dunno what's to be done to restore the past glory days (?) of fast food, and I'm not in the business of solving this dilemma, if indeed either a dilemma or a solution exists. Reports indicate some large portion of the public gets most of their meals from fast-food restaurants, and some unaccountably get nearly all their meals that way. Profitability also seems not to suffer even as patrons suffer long waits and considerable gastrointestinal distress. If the death of fast food is another example of the race to the bottom (of what?), it shows no sign of having yet sunk to the floor.

/rant off

A friend put in my hands a copy of Peter Zeihan’s book The End of the World is Just the Beginning: Mapping the Collapse of Globalization (2022) with instructions to read (and return) the book. Without a moment’s pause, I exclaimed “oh, that guy!” Zeihan has been making the rounds of various podcasts and interview shows hawking his book and its conclusions, so I had gotten the bullet, so to speak, a few times already. This is frequently and understandably the case with authors doing the promotional circuit and repeating the same talking points with each appearance. Some fare better in that regard, some worse. Zeihan is among the worse, partly because he has recently entered the doomosphere (or collapse space, if one prefers) publicly, whereas I’m not an ingénue on the subject so not easily led. Thus far, I’ve only read the introduction, so rather than book blogging, let me instead admit a few of my biases openly, mostly based on what I’ve learned about collapse over the past decade and a half, without any expectation that Zeihan will dispel or overcome them in the course of 475 pp. (not counting acknowledgements and index).

Measurement. As a demographer, Zeihan repeats one of the most basic conceptual errors in science, namely, that by taking the measure of something one can reveal its secrets. With human population trends in particular, measurement is unambiguous and easily mistaken for staring into a crystal ball — so long as history remains basically continuous. Thus, the phrase demographics is destiny gets batted around (sometimes disputed — do a search) as though the prophesied future is as inevitable and inescapable as the rising and setting sun. Well, demographics is in fact pretty reliable until the appearance of one or more metaphorical black swans. Flocks of them have been circling around the early 21st century.

Totality. The term globalization might be properly limited to use in economics, but it describes industrial civilization as well. When one collapses, so, too, does the other. They’re inextricably linked and form a unity or totality. No doubt different regions and/or geographies will collapse differently; that’s not in dispute. However, the title suggests grievous loss followed (immediately?) by opportunity. As I’ve understood various collapse scenarios (those parts that can be reliably anticipated), none permit a quick restart or global reset. Rather, the bottleneck will be severe enough, the loss of habitat and resources so egregious, that what remnants manage to survive (no assurances) will be tiny, barbarous, and extremely localized (including the bolthole billionaires, but then, calling them barbarous is a tautology) compared to the nearly eight billion global citizens now alive in the short-lived Age of Abundance. The beginning of what, exactly? After most species succumb just as in previous major extinction events (usually an extensive process but this time sped up by orders of magnitude), it will be a very quiet Earth for tens or hundreds of millions of years if it bounces back at all (no assurances).

Terraforming. Like it or not, human activity and ingenuity have essentially terraformed the planet, but not intentionally or for the better. Sure, we have skyscrapers, giant transportation and energy networks, enough archived knowledge and entertainment to sate even the most insatiable intellects (read: consumers), and all the manifold material glories and know-how of the modern era. But on balance, our own refuse litters literally every place around the Earth (air, water, soil, in orbit); a mixture of plastics and toxic waste in waterways and soils makes water (beyond headwaters) undrinkable and many foods unhealthy, lacking in nutrition, and even carcinogenic; and subtle alterations in atmospheric chemistry are changing the climate. These are catastrophes so big and diffuse they might as well be invisible; many people simply can't grok them. The terraformed planet is now a sacrifice zone, exploited and despoiled ruthlessly for short-term gain, leaving no future worth living. Global supply chains are already breaking down and will not be able to adjust fast enough to avoid a megadeath pulse.

Hubris. Zeihan is a geopolitical strategist. Even if he’s correct in his analyses and prognostications, even if leaders heed his advice and prepare responsibly, even if all of humanity pulls together somehow to address cascade failure and eventual collapse, there is no reason to expect that history writ large can be steered toward desired outcomes to avoid worst case scenarios now barreling at us. That’s simply not the way history unfolds, and experience demonstrates that those who try to exert god-like influence over human affairs become maniacs, despots, and tyrants who generally manage to make matters worse. The world is already experiencing diasporas from politically, socially, economically, and ecologically destabilized regions, and the obvious, humane response (i.e., take them in) has been limited because those countries regarded as lifeboats (true or not) can’t haul them all aboard. The book’s Table of Contents doesn’t indicate consideration of that demographic effect and the index doesn’t list the term diaspora. Yet Zeihan’s got everything figgered well enough to offer strategic advice?

I support the idea of studying history to better understand ourselves in the present. But that can’t be the limit of a book with the tease “… just the beginning” right there in the title. Macrohistory is going to roll over all of us no matter what, and it’s wishful thinking to believe much can be done at this point to redirect the terrible consequences and momentum of past centuries. Although academics can recognize in hindsight major influences, technologies, ideologies, and inflection points that delivered us to this point in history, and perhaps even see how some near-term developments will break good or bad depending on fortuitous circumstance, no one planned, directed, or chose any of the epochal shifts of the past. Rather, human societies and civilizations muddle through and adapt continuously until — at last — they can’t anymore. Then they collapse. It’s happened over and over but never before (that can be ascertained) at a global scale. Yet Zeihan promises a new beginning. I, OTOH, can offer no assurances.

Like another show I dubbed SufferFest, if you haven’t yet forfeited the nearly three-hour run time, save your effort and skip Blonde unless you enjoy watching people suffer. This pointless, highly selective biopic on Marilyn Monroe depicts her solely as struggling under the so-called tyranny of the lurid male gaze. (No real acknowledgement of her participation in that dynamic.) Yeah, it’s been long established that the old Hollywood studio system and the devouring public can’t avoid destroying their most iconic creations — especially the females. That’s the bitter end of the hype cycle if one is unfortunate enough to succumb. Oddly, the film omits Marilyn Monroe’s death at a relatively young age under questionable circumstances.

To its extremely modest credit, the film separates Norma Jeane Mortensen (or Baker) from Marilyn Monroe, meaning that the blonde bombshell character was an act, a Hollywood fiction — at least until it stopped being one. Whereas most would probably prefer to be remembered in a positive light (e.g., an Irish wake), this film insists (and insists and insists and then insists some more) on its subject being a victim and sufferer, which makes the audience into voyeurs and accomplices. A couple brief moments suggesting her exceptional, incandescent talent on multiple levels were reframed as yet more suffering. The film probably fits the contemporary Woke narrative of victimhood and is wholly unenjoyable. I regret having watched it. Earns a negative recommendation: stay away.

The comic below struck a chord and reminded me of Gary Larson's clumsily drawn but often trenchant Far Side comics on scientific subjects.

This one masquerades as science but is merely wordplay, i.e., puns, double entendres, and unexpectedly funny malapropisms (made famous by Yogi Berra, among others). Wordplay is also found in various cultural realms, including comic strips and stand-up comedy, advertising and branding, politics, and now Wokedom (a subset of grassroots politics, some might argue). Playing with words has gone from being a clever, sometimes enjoyable diversion (e.g., crossword puzzles) to fully deranging, weaponized language. Some might be inclined to wave away the seriousness of that contention using the childhood retort "sticks and stones …." Indeed, I'm far less convinced of the psychological power of verbal nastiness than those who insist words are violence. But it's equally wrong to say that words don't matter (much) or have no effect whatsoever. Otherwise, why would those acting in bad faith work so tirelessly to control the narrative, often by restricting free speech (as though writing or out-loud speech were necessary for thoughts to form)?

It's with some exasperation that I observe words no longer retain their meanings. Yeah, yeah … language is dynamic. But semantic shifts usually occur slowly as language evolves. Moreover, for communication to occur effectively, senders and receivers must be aligned in their understandings of words. If you and I have divergent understandings of, say, yellow, we won't get very far in discussions of egg yolks and sunsets. The same is true of words such as liberal, fascist, freedom, and violence. A lack of shared understanding of terms, perhaps born of ignorance, bias, or agenda, leads to communications breakdown. But it's gotten far worse than that. The meanings of words have been thrown wide open to PoMo reinterpretation that often inverts them in precisely the way George Orwell observed in his novel 1984 (published 1949): "War is peace. Freedom is slavery. Ignorance is strength." Thus, earnest discussions of limits on free speech, and actual restrictions on social media platforms (often via algorithmic identification of keywords that fails to account for irony, sarcasm, or context), fail to register that the implementation of such restrictive kludges already means free speech is essentially gone. The usual exceptions (obscenity, defamation, incitement, gag orders, secrecy, deceptive advertising, student speech, etc.) are not nearly as problematic because they have been adjudicated for several generations and accepted as established practice. Indeed, many exceptions have been relaxed considerably (e.g., obscenity that has become standard patois now fails to shock or offend), and slimy workarounds are now commonplace (e.g., using "people are saying …" to say something horrible while shielding oneself from having said it). Another gray area includes fighting words and offensive words, categories now being expanded (out of a misguided campaign to sanitize?) to cover many words that originated as clinical and scientific terms, with faux offense deployed to stifle speech.
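To make the point about context-blind filtering concrete, here is a minimal, hypothetical sketch (my own illustration, not any platform's actual moderation system) of a keyword filter that treats a genuine insult, a critique of that insult, and a sarcastic echo of it identically:

```python
# A toy, hypothetical keyword filter -- not any real platform's algorithm.
# It flags any post containing a blocklisted word, with no notion of
# irony, quotation, or context.

BLOCKLIST = {"scum"}  # hypothetical flagged term

def flag(post: str) -> bool:
    """Flag a post if any blocklisted word appears, regardless of context."""
    words = {w.strip(".,!?\"'").lower() for w in post.split()}
    return not BLOCKLIST.isdisjoint(words)

posts = [
    "Those people are scum.",                       # genuine insult
    "Calling your opponents 'scum' is beneath us.", # critique of the insult
    "Oh sure, we're all just scum, right?",         # sarcasm
]

for p in posts:
    print(flag(p), "-", p)
```

All three posts come back flagged; nothing in the matching step can distinguish use from mention, or sincerity from irony.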

Restrictions on free speech are working in many respects, as many choose to self-censor to avoid running afoul of various self-appointed watchdogs or roving Internet thought police (another Orwell prophecy come true) ready to pounce on some unapproved utterance. One can argue whether self-censorship is cowardly or judicious, I suppose. However, silence and the pretense of agreement only conceal thoughts harbored privately and left unexpressed, which is why restrictions on public speech are fool’s errands and strategic blunders. Maybe the genie can be bottled for a time, but that only produces resentment (not agreement), which boils over into seething rage (and worse) at some point.

At this particular moment in U.S. culture, however, restrictions are not my greatest concern. Rather, it's the wholesale control of information gathering and reporting that misrepresents or removes from the public sphere ingredients needed to form coherent thoughts and opinions. It's not happening only to the hoi polloi; those in positions of power and control are manipulated, too. (How many lobbyists per member of Congress, industry after industry, whispering in their ears like so many Wormtongues?) And in extreme cases of fame and cult of personality, a leader or despot unwittingly surrounds him- or herself with a coterie of yes-men frankly afraid to tell the truth out of careerist self-interest or because of shoot-the-messenger syndrome. It's lonely at the top, right?

Addendum: Mere minutes after publishing this post, I wandered over to Bracing Views (on my blogroll) and found this post saying some of the same things, namely, that choking information off at the source results in a degraded information landscape. Worth a read.