Archive for the ‘Culture’ Category

Even before I begin, you must know what the title means. It’s the proliferation of options that induces dread in the toothpaste aisle of the store. Paste or gel? Tartar control or extra whitening? Plain, mint, cinnamon, or bubble gum? The matrix of combinations is enough to reduce the typical shopper to a quivering state of high anxiety lest the wrong toothpaste be bought. Oh, how I long for the days when choices ran solely between plain Crest and Colgate. I can’t say whether the toothpaste effect originated with oral hygiene. A similarly bewildering host of choices confronts shoppers in the soft drink aisle. Foodstuffs seem especially prone to brand fragmentation. Woe betide the retailer forced to shelve all 38 Heinz products on this page. (True, some are just different packaging of the same basic item, but still.)

Purveyors of alcoholic beverages are on the bandwagon, too. I rather like the bygone cliché of the cowboy/gunslinger riding off the range, swinging into the saloon, and ordering simply “whisky.” Nowadays, even a poorly stocked bar is certain to have a dozen or so whiskies (see this big brand list, which doesn’t include sub-brands or craft distillers). Then come all the varieties of schnapps, rum, and vodka, each brand further fragmented with infusions and flavorings of every imaginable type. Some truly weird ones are found here. Who knew that these spirits were simply blank canvases awaiting the master distiller’s crazy inventiveness?

/rant on

What really gets my bile flowing on this issue, however, is the venerable Lay’s potato chip. Seriously, Frito-Lay, what are you thinking? You arguably perfected the potato chip, much like McDonald’s perfected the French fry. (Both are fried potato, interestingly.) Further, you have a timeless, unbeatable slogan: “betcha can’t eat just one.” The plain, salted chip, the “Classic” of the Lay’s brand, cannot be improved upon and is a staple comfort food. Yet you have succumbed to the toothpaste effect and gone haywire with flavorings (I won’t even countenance the Wavy, Poppables, Kettle-Cooked, Ruffles, and STAX varieties). For variety’s sake, I’d be content with a barbecue chip, maybe even salt & vinegar, but you’ve piled on past the point of ridiculousness:

  • cheddar & sour cream (a favorite of mine)
  • Chile limón
  • deli style
  • dill pickle
  • flamin’ hot
  • honey barbecue
  • limón
  • pico de gallo
  • salt & vinegar (not to my taste)
  • sour cream & onion (a good alternative)
  • sweet Southern heat barbecue
  • Southern biscuits & gravy
  • Tapatío (salsa picante)

(more…)

I finished Graham Hancock’s Fingerprints of the Gods (1995). He saved the best part of the book, an examination of Egyptian megalithic sites, for the final chapters and held back his final conclusion — more conjecture, really — for the tail end. The possible explanation Hancock offers for the destruction and/or disappearance of a supposed civilization long predating the Egyptian dynasties, the subject of the entire book, is earth-crust displacement, a theory developed by Charles Hapgood relating to polar shifts. Long story short, evidence demonstrates that the Antarctic continent used to be 2,000 miles away from the South Pole (about 30° from the pole) in a temperate zone and may have been, according to Hancock, the home of a seafaring civilization that had traveled and mapped the Earth. It’s now buried under ice. I find the explanation plausible, but I wonder how much the science and research have progressed since the publication of Fingerprints. I have not yet picked up Magicians of the Gods (2015) to read Hancock’s update but will get to it eventually.
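As a quick sanity check of that geometry, assuming one degree of latitude spans roughly 69 miles at the Earth’s surface:

30° × ~69 miles per degree ≈ 2,070 miles

which squares reasonably well with the 2,000-mile figure.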

Without having studied the science, I can imagine several competing scenarios for how the Earth’s crust, the lithosphere, might drift, shift, or move over the asthenosphere. First, it’s worth recognizing that the Earth’s rotational axis defines the two poles, which are near but not coincident with magnetic north and south. Axial shifts are figured in relation to the crust, not the entire planet (crust and interior). From a purely geometric perspective, I could well imagine the crust and interior rotating at different speeds, but since we lack more than theoretical knowledge of the Earth’s liquid interior (the inner core is reputedly solid), only the solid portions at the surface of the sphere offer a useful frame of reference. The liquid surfaces (oceans, seas) obviously flow, too, but are also understood primarily in relation to the solid crust both below and above sea level.

The crust could wander slowly and continuously, shift all at once, or some combination of both. If all at once, the inciting event might be a sudden change in magnetic stresses that breaks the entire lithosphere loose or perhaps a gigantic meteor hit that knocks the planet as a whole off its rotational axis. Either would be catastrophic for living things that are suddenly moved into a substantially different climate. Although the spacing of such events is unpredictable and irregular, occurring in geological time, Hancock assembles considerable evidence to conclude that the most recent such occurrence was probably about 12,000 BCE at the conclusion of the last glacial maximum or ice age. This would have been well within the time humans existed on Earth but long enough ago in our prehistory that human memory systems record events only as unreliable myth and legend. The events are also recorded in stone, but we have yet to decipher those messages fully other than to demonstrate that significant scientific knowledge of astronomy and engineering was once possessed by mankind but was lost until redeveloped during the last couple of centuries.

First, a few reminders:

  • The United States has been in an undeclared state of war for 15 years, the longest in U.S. history and long enough that young people today can say legitimately, “we’ve always been at war with Oceania.” The wars encompass the entirety of both terms of the Obama Administration.
  • The inciting events were attacks on U.S. soil carried out on September 11, 2001 (popularly, 9/11), which remain shrouded in controversy and conspiracy despite the official narrative assigning patsy blame to al-Qaida operating in Afghanistan and Iraq.
  • On the heels of the attacks, the Bush Administration commenced a propaganda campaign to sell invasion and regime change in those two countries and, over widespread public protest, went ahead and launched preemptive wars, ostensibly because an existential threat existed with respect to weapons of mass destruction (WMDs) possessed by Iraq in particular.
  • The propaganda campaign has since been revealed to have been cooked up and untrue, yet it buffaloed a lot of people into believing (even to this day) that Iraq was somehow responsible for 9/11.
  • Our preemptive wars succeeded quickly in toppling governments and capturing (and executing) their leaders but immediately got bogged down securing a peace that never came.
  • Even with an embarrassing mismatch of force, periodic troop surges and drawdowns, trillions of dollars wasted prosecuting the wars, and incredible, pointless loss of life (especially on the opposing sides), our objective in the Middle East (other than the oil, stupid!) has never been clear. The prospect of final withdrawal is nowhere on the horizon.

Continuous war — declared or merely waged — has been true of the U.S. my whole life, though one would be hard pressed to argue that it truly represents an immediate threat to U.S. citizens except to those unlucky enough to be deployed in war zones. Still, the monkey-on-the-back is passed from administration to administration. One might hope, based on campaign rhetoric, that the new executive (45) might recognize continuous war as the hot potato it is and dispense with it, but the proposed federal budget, with its $52 billion increase in military spending (+10% over 2016), suggests otherwise. Meanwhile, attention has been turned away from true existential threats that have been bandied about in the public sphere for at least a decade: global warming and climate change leading to Near-Term Extinction (NTE). Proximal threats, largely imagined, have absorbed all our available attention, and depending on whom one polls, our worst fears have already been realized.

The 20th and 21st centuries (so far) have been a series of “hot” wars (as distinguished from the so-called Cold War). Indeed, there has scarcely been a time when the U.S. has not been actively engaged in fighting phantoms. If the Cold War was a bloodless, ideological war to stem the nonexistent spread of communism, we have adopted and coopted the language of wartime to launch various rhetorical wars. First was LBJ’s War on Poverty, the only “war” aimed at truly helping people. Nixon got into the act with his War on Drugs, which was punitive. Reagan expanded the War on Drugs, which became the War on Crime. Clinton increased the punitive character of the War on Crime by instituting mandatory minimum sentencing, which had the side effect of establishing what some call the prison-industrial complex, inflating the incarceration rate of Americans to the point that the U.S. is now ranked second in the world behind the Seychelles (!), a ranking far, far higher than that of any other industrialized nation.

As if U.S. authoritarians hadn’t found enough people to punish or enough ways to convince the public that threats exist on all sides, requiring constant vigilance and a massive security apparatus including military, civil police, and intelligence services comprising 16 separate agencies (that we know of), Bush coined and declared the War on Terror, aimed at punishing those foreign and domestic who dare challenge U.S. hegemony in all things. It’s not called a national security state for nuthin’, folks. I aver that the rhetorical War on Poverty has inverted and now become a War on the Poverty-Stricken. De facto debtors’ prisons have reappeared, predatory lending has become commonplace, and income inequality grows more exaggerated with every passing year, leaving behind large segments of the U.S. population as income and wealth pool in an ever-shrinking number of hands. Admittedly, the trend is global.

At some point, perhaps in the 1960s when The Establishment (or more simply, The Man) became a thing to oppose, the actual Establishment must have decided it was high time to circle the wagons and protect its privileges, essentially going to war with (against, really) the people. Now five decades on, holders of wealth and power demonstrate disdain for those outside their tiny circle, and the government can no longer be said with a straight face to be of, by, and for the people (paraphrasing the last line of Lincoln’s Gettysburg Address). Rather, the government has been hijacked and turned into something abominable. Yet the people are strangely complicit, having allowed history to creep along with social justice in marked retreat. True threats do indeed exist, though not the ones that receive the lion’s share of attention. I surmise that, as with geopolitics, the U.S. government has brought into being an enemy and a conflict that bode not well for its legitimacy. Which collapse occurs first is anyone’s guess.

I often review my past posts when one receives a reader’s attention, sometimes adding tags and fixing typos, grammar, and broken links. One of my greatest hits (based on voting, not traffic) is Low Points in Education. It was among the first to tackle what I have since called our epistemological crisis, though I didn’t begin to use the epistemology tag until later. The crisis has since caught up with us with a vengeance, though I can’t claim I’m the first to observe the problem. That dubious honor probably goes to Stephen Colbert, who coined the word truthiness in 2005. Now that alternative facts and fake news have entered the lingo as well (gaslighting has been revived), everyone has jumped on the bandwagon questioning the truthfulness or falsity of anything coughed up in our media-saturated information environment. But as suggested in the first item discussed in Low Points in Education, what’s so important about truth?

It would be obvious and easy yet futile to argue in favor of high-fidelity appreciation of the world, even if only within the surprisingly narrow limits of human perception, cognition, and memory (all interrelated). Numerous fields of endeavor rely upon consensus reality derived from objectivity, measurement, reason, logic, and, dare I say it, facticity. Regrettably, human cognition doesn’t adhere any too closely to those ideals except when trained to value them. Well-educated folks have better acquaintance with such habits of mind; folks with formidable native intelligence can develop true authority, too. For the masses, however, those attributes are elusive, even for those who have partied through earned college degrees. Ironically worse, perhaps, are specialists, experts, and overly analytical intellectuals who exhibit what the French call a déformation professionnelle. Politicians, pundits, and journalists are chief among the deformed and distorted. Mounting challenges to establishing truth now destabilize even mundane matters of fact, and it doesn’t help that myriad high-profile provocateurs (including the Commander in Chief, to whom I will henceforth refer only as “45”) are constantly throwing out bones for journalists to chase like so many unnourishing rubber chew toys.

Let me suggest, then, that human cognition, or more generally the mind, is an ongoing balancing act, making adjustments to stay upright and sane. Like the routine balance one keeps during locomotion, shifting weight from side to side and falling a bit only to catch oneself, the difficulty is not especially high. But with the foundation below one’s feet shaking furiously, so to speak, legs get wobbly and many end up (figuratively at least) ass over teakettle. Further, the mind is highly situational, contingent, and improvisational and is prone to notoriously faulty perception even before one gets to marketing, spin, and arrant lies promulgated by those intent on coopting or directing one’s thinking. Simply put, we’re not particularly inclined toward accuracy but instead operate within a wide margin of error. Accordingly, we’re quite strong at adapting to ever-changing circumstance.

That strength turns out to be our downfall. Indeed, rootless adjustment to changing narrative is now so grave that basic errors of attribution — which entities said and did what — make it impossible to distinguish allies from adversaries reliably. (Orwell captured this with his line from the novel 1984, “Oceania was at war with Eurasia; therefore Oceania had always been at war with Eurasia.”) Thus, on the back of a brazen propaganda campaign following 9/11, Iraq morphed from U.S. client state to rogue state demanding preemptive war. (Admittedly, the U.S. State Department had already lost control of its puppet despot, who in a foolish act of naked aggression tried to annex Kuwait, but that was a brief, earlier war quite unlike the undeclared one in which the U.S. has been mired for 16 years.) Even though Bush Administration lies have been unmasked and dispelled, many Americans continue to believe (incorrectly) that Iraq possessed WMDs and posed an existential threat to the U.S. The same type of confusion is arguably at work with respect to China, Russia, and Israel, which are mixed up in longstanding conflicts having significant U.S. involvement and provocation. Naturally, the default villain is always Them, never Us.

So we totter from moment to moment, reeling drunkenly from one breathtaking disclosure to the next, and are forced to reorient continuously in response to whatever the latest spin and spew happen to be. Some institutions retain the false sheen of respectability and authority, but for the most part, individuals are free to cherry-pick information and assemble their own truths, indulging along the way in conspiracy and muddle-headedness until at last almost no one can be reached anymore by logic and reason. This is our post-Postmodern world.

As a boy, my home included a coffee table book, title unknown, likely published circa 1960, about the origins of human life on Earth. (A more recent book of this type attracting lots of attention is Sapiens: A Brief History of Humankind (2015) by Yuval Harari, which I haven’t yet read.) It was heavily enough illustrated that my siblings and I consulted it mostly for the pictures, which can probably be excused since we were youngsters at the time. What became of the book escapes me. In the intervening decades, I made no particular study of the ancient world — ancient meaning beyond the reach of human memory systems. Thus, ancient could potentially refer to anthropological history in the tens of thousands of years, evolutionary history stretching across tens of millions of years, geological history over hundreds of millions of years, or cosmological time going back a few billion years. For the purpose of this blog post, let’s limit ancient to no more than fifty thousand years ago.

A few months ago, updates (over the 1960 publication) to the story of human history and civilization finally reached me (can’t account for the delay of several decades) via webcasts published on YouTube featuring Joe Rogan, Randall Carlson, and Graham Hancock. Rogan hosts the conversations; Carlson and Hancock are independent researchers whose investigations converge on evidence of major catastrophes that struck the ancient world during the Younger Dryas period, erasing most but not all evidence of an antediluvian civilization. Whereas I’m a doomer, they are catastrophists. To call this subject matter fascinating is a considerable understatement. And yet, it’s neither here nor there with respect to how we conduct our day-to-day lives. Their revised history connects to religious origin stories, but such narratives have been relegated to myth and allegory for a long time already, making them more symbolic than historical.

In the tradition of Galileo, Copernicus, Newton, and Darwin, all of whom went against the scientific orthodoxy of their times but were ultimately vindicated, Carlson and Hancock appear to be rogue scientists/investigators exploring deep history and struggling against the conventional story of the beginnings of civilization around 6,000 years ago in the Middle East and Egypt. John Anthony West is another who disputes the accepted narratives and timelines. West is also openly critical of “quackademics” who refuse to consider accumulating evidence but instead collude to protect their cherished ideological and professional positions. The vast body of evidence being pieced together is impressive, and I truly appreciate their collective efforts. I’m about 50 pages into Hancock’s Fingerprints of the Gods (1995), which contains copious detail not well suited to the conversational style of a webcast. His follow-up Magicians of the Gods (2015) will have to wait. Carlson’s scholarly work is published at the website Sacred Geometry International (and elsewhere, I presume).

So I have to admit that my blog, launched in 2006 as a culture blog, turned partially into a doomer blog as that narrative gained the weight of overwhelming evidence. What Carlson and Hancock in particular present is evidence of major catastrophes that struck the ancient world and are going to repeat: a different sort of doom, so to speak. Mine is ecological, financial, cultural, and finally civilizational collapse born of exhaustion, hubris, frailty, and most importantly, poor stewardship. Theirs is periodic cataclysmic disaster including volcanic eruptions and explosions, great floods (following ice ages, I believe), meteor strikes, earthquakes, tsunamis, and the like, each capable of ending civilization all at once. Indeed, those inevitable events are scattered throughout our geological history, though at unpredictable intervals often spaced tens or hundreds of thousands of years apart. For instance, the supervolcano under Yellowstone is known to blow roughly every 600,000 years, and we’re overdue. Further, the surface of the Moon indicates bombardment from meteors; the Earth’s history of the same is hidden somewhat by continuous transformation of the landscape, a process lacking on the Moon. The number of near misses by near-Earth objects in the last few decades is rather disconcerting. Any of these disasters could strike at any time, or we could wait another 10,000 years.

Carlson and Hancock warn that we must recognize the dangers, drop our petty international squabbles, and unite as a species to prepare for the inevitable. To do otherwise would be to court disaster. However, far from dismissing the prospect of doom I’ve been blogging about, they merely add another category of things likely to kill us off. They give the impression that we should turn our attention away from sudden climate change, the Sixth Extinction, and other perils to which we have contributed heavily and worry instead about death from above (the skies) and below (the Earth’s crust). It’s impossible to say which is the most worrisome prospect. As a fatalist, I surmise that there is little we can do to forestall any of these eventualities. Our fate is already sealed in one respect or another. That foreknowledge makes life precious for me, and frankly, is another reason to put aside our petty squabbles.

Punchfest

Posted: February 26, 2017 in Cinema, Culture, Idle Nonsense, Sports

Early in the process of socialization, one learns that the schoolyard cry “Fight!” is half an alert (if one is a bystander) to come see and half an incitement to violence (if one is just entering into conflict). Fascination with seeing people duke it out, ostensibly to settle conflicts, never seems to grow old, though the mixed message about violence never solving anything sometimes slows things down. (Violence does in fact at least put an end to things. But the cycle of violence continues.) Fights have also lost the respectability of yore, when the victor (as with a duel or a Game of Thrones fight by proxy) was presumed to be vindicated. Now we mostly know better than to believe that might makes right. Successful aggressors can still be villains. Still, while the primal instinct to fight can be muted, it’s more typically channeled into entertainment and sport, where it’s less destructive than, say, warrior culture extending all the way from clans and gangs up to professional militaries.

Fighting in entertainment, especially in cinema, often depicts invulnerability that renders fighting pointless and inert. Why bother hitting Superman, the Incredible Hulk, Wolverine, or indeed any number of Stallone, Schwarzenegger, Seagal, or Statham characters when there is no honest expectation of doing damage? They never get hurt, just irritated. Easy answer: because the voyeurism inherent in fighting endures. Even when the punchfest is augmented by guns, we watch, transfixed by conflict, even though outcomes are either predictable (heroes and good guys almost always win), moot, or an obvious set-up for the next big, stupid, pointless battle.

Fighting in sport is perhaps most classical in boxing, with weight classes evening out the competition to a certain degree. Boxing’s popularity has waxed and waned over time as charismatic fighters come and go, but like track and field, it’s arguably one of the purest expressions of sport, being about pure dominance. One could also argue that some team sports, such as hockey and American-style football, are as much about the collateral violence as about scoring goals. Professional wrestling, revealed to be essentially athletic acting, blends entertainment and sport, though without appreciable loss of audience appeal. As with cinema, fans seem not to care that the action is scripted. Rising in popularity these days is mixed martial arts (MMA), which ups the ante over boxing by allowing all manner of techniques into the ring, including traditional boxing, judo, jiu-jitsu, wrestling, and straight-up brawling. While brawling may work in the schoolyard and the street against unwilling or inexperienced fighters, it rarely succeeds in the MMA ring. Skill and conditioning matter most, plus the lucky punch.

Every kid, boy or girl, is at different points bigger than, smaller than, or evenly matched with an antagonist when things start to get ugly, so one’s willingness to engage, and one’s strategy, are situational. In childhood, conflict usually ends quickly with the first tears or bloodied nose. I’ve fought on rare occasion, but I’ve never ever actually wanted to hurt someone. Truly wanting to hurt someone seems to be one attribute of a good fighter; another is the lack of fear of getting hit or hurt. Always being smaller than my peers growing up, if I couldn’t evade a fight (true for me most of the time), I would defend myself, but I wasn’t good at it. Reluctant willingness to fight was usually enough to keep aggressors at bay. Kids who grow up in difficult circumstances, fighting with siblings and bullies, and/or abused by a parent or other adult, have a different relationship with fighting. For them, it’s unavoidable. Adults who relish being bullies join the military and/or police or maybe become professional fighters.

One would have to be a Pollyanna to believe that we will eventually rise above violence and use of force. Perhaps it’s a good thing that in a period of relative peace (in the affluent West), we have alternatives to being forced to defend ourselves on an everyday basis, and those who want to can indulge their basic instinct to fight and establish dominance. Notions of masculinity and femininity are still wrapped up in how one expresses these urges, though in characteristic PoMo fashion, traditional boundaries are being erased. Now, everyone can be a warrior.

This past Thursday was an occasion of protest for many immigrant laborers who did not show up to work. Presumably, this action was in response to recent executive attacks on immigrants and was meant to demonstrate how businesses would suffer without immigrant labor doing jobs Americans frequently do not want. Tensions between the ownership and laboring classes have a long, tawdry history I cannot begin to summarize. As with other contextual failures, I daresay the general public believes incorrectly that such conflicts date only from the 19th century, when formal sociopolitical theories such as Marxism, which intersect heavily with labor economics, were first published. An only slightly better understanding is that the labor movement commenced in the United Kingdom some fifty years after the Industrial Revolution began, such as with the Luddites. I pause to remind readers that the most basic, enduring, and abhorrent labor relationship, extending back millennia, is slavery, which ended in the U.S. only 152 years ago but continues even today in slightly revised forms around the globe.

Thursday’s work stoppage was a faint echo of general strikes and unionism from the middle of the 20th century. Gains in wages and benefits, working conditions, and negotiating position transferred some power from owners to laborers during that period, but today, laborers must sense they are back on their heels, defending conditions fought for by their grandparents but ultimately losing considerable ground. Of course, I’m sympathetic to labor, considering I’m not in the ownership class. (It’s all about perspective.) I must also admit, however, to once quitting, after only one day, a job that was simply too, well, laborious. I had that option at the time, though it nearly led me to bankruptcy — a life lesson that continues to inform my attitudes. As I survey the scene today, however, I suspect many laborers — immigrants and native-born Americans alike — have the unenviable choice of accepting difficult, strenuous labor for low pay or being unemployed. Gradual reduction of demand for labor has two main causes: globalization and automation.

(more…)

A long while back, I blogged about things I just don’t get, including on that list the awful specter of identity politics. As I was finishing my undergraduate education some decades ago, the favored term was “political correctness.” That impulse now looks positively tame in comparison to what occurs regularly in the public sphere. It’s no longer merely about adopting what consensus would have one believe is a correct political outlook. Now it’s a broad referendum centered on the issue of identity, construed through the lens of ethnicity, sexual orientation, gender identification, lifestyle, religion, nationality, political orientation, etc.

One frequent charge levied against offenders is cultural appropriation, which is the adoption of an attribute or attributes of a culture by someone belonging to a different culture. Here, the term “culture” is a stand-in for any feature of one’s identity. Thus, wearing a Halloween costume from another culture, say, a bandido, is not merely in poor taste but is understood to be offensive if one is not authentically Mexican. Those who are infected with the meme are often called social justice warriors (SJWs), and policing (of others, natch) is especially vehement on campus. For example, I’ve read of menu items at the school cafeteria being criticized for not being authentic enough. Really? The won ton soup offends Chinese students?

In an opinion-editorial in the NY Times entitled “Will the Left Survive the Millennials?” Lionel Shriver described being sanctioned for suggesting that fiction writers not be too concerned about creating characters from backgrounds different from their own. She contextualizes the motivation of SJWs this way: (more…)

I don’t have the patience or expertise to prepare and offer a detailed political analysis such as those I sometimes (not very often) read on other blogs. Besides, once the comments start filling up at those sites, every possible permutation is trotted out, muddying the initial or preferred interpretation with alternatives that make at least as much sense. They’re interesting brainstorming sessions, but I have to wonder what is accomplished.

My own back-of-the-envelope analysis is much simpler and probably no closer to (or farther from) being correct, what with everything being open to dispute. So the new POTUS was born in 1946, which puts the bulk of his boyhood in the 1950s, overlapping with the Eisenhower Administration. That period has lots of attributes, but the most significant (IMO), which would impact an adolescent, was the U.S. economy launching into the stratosphere, largely on the back of the manufacturing sector (e.g., automobiles, airplanes, TVs, etc.), and creating the American middle class. The interstate highway system also dates from that decade. Secondarily, there was a strong but misplaced sense of American moral leadership (one might also say authority or superiority), since we took (too much) credit for winning WWII.

However, it wasn’t great for everyone. Racism, misogyny, and other forms of bigotry were open and virulent. Still, if one was lucky enough to be a white, middle-class male, things were arguably about as good as they would get, which many remember rather fondly, whether through rose-colored glasses or otherwise. POTUS as a boy wasn’t middle class, but the culture around him supported a worldview that he embodies even now. He’s also never been an industrialist, but he is a real estate developer (some would say slumlord) and media figure, and his models are taken from the 1950s.

The decade of my boyhood was the 1970s, which spanned the Nixon, Ford, and Carter Administrations. Everyone could sense the wheels were already coming off the bus, and white male entitlement was far diminished from previous decades. The Rust Belt was already a thing. Like children from the 1950s forward, however, I spent a lot of time in front of the TV. Much of it was goofy fun such as Gilligan’s Island, The Brady Bunch, and interestingly enough, Happy Days. It was innocent stuff. What are the chances that, as a boy plopped in front of the TV, POTUS would have seen the show below (excerpted) and taken special notice, considering that the character shares his surname?

Snopes confirms that this is a real episode from the TV show Trackdown. Not nearly as innocent as the shows I watched. The coincidences that the character is a con man, promises to build a wall, and claims to be the only person who can save the town are eerie, to say the least. Could that TV show be lodged in the back of POTUS’ brain, along with so many other boyhood memories, misremembered and revised the way memory tends to do?

Some have said that the great economic expansion of the 1950s and ’60s was an anomaly: a constellation of conditions that combined to produce a historical effect, a Golden Era by some reckonings, that cannot be repeated. We simply cannot return to an industrial or manufacturing economy that had once (arguably) made America great. And besides, the attempt would accelerate the collapse of the ecosystem, which is already in free fall. Yet that appears to be the intention of POTUS, whose early regression to childhood is a threat to us all.

I pause periodically to contemplate deep time, ancient history, and other subjects that lie beyond most human conceptual abilities. Sure, we sorta get the idea of a very long ago past out there in the recesses or on the margins, just like we get the idea of U.S. sovereign debt now approaching $20 trillion. Problem is, numbers lose coherence when they mount up too high. Scales differ widely with respect to time and currency. Thus, we can still think reasonably about human history back to roughly 6,000 years ago, but 20,000 years ago or more draws a blank. We can also think about how $1 million might have utility, but $1 billion and $1 trillion are phantoms that appear only on ledgers and contracts and in the news (typically mergers and acquisitions). If deep time or deep debt feel like they don’t exist except as conceptual categories, try wrapping your head around the deep state, which in the U.S. is understood to be a surprisingly large rogues’ gallery of plutocrats, kleptocrats, and oligarchs drawn from the military-industrial-corporate complex, the intelligence community, and Wall Street. It exists but does so far enough outside the frame of reference most of us share that it effectively functions in the shadow of daylight where it can’t be seen for all the glare. Players are plain enough to the eye as they board their private jets to attend annual meetings of the World Economic Forum in Davos-Klosters, Switzerland, or two years ago the Jackson Hole [Economic] Summit in Jackson Hole, WY, in connection with the American Principles Project, whatever that is. They also enjoy plausible deniability precisely because most of us don’t really believe self-appointed masters of the universe can or should exist.
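To get a feel for how quickly those magnitudes outrun intuition, a rough conversion to seconds makes the point (approximate figures):

1 million seconds ≈ 11.6 days
1 billion seconds ≈ 31.7 years
1 trillion seconds ≈ 31,700 years

Pocket-change scale, a human lifetime, and deep prehistory, respectively, which is why the last two register only as abstractions.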

Another example of a really bad trip down the rabbit hole, what I might call deep cynicism (and a place I rarely allow myself to go), appeared earlier this month at Gin and Tacos (on my blogroll):

The way they [conservatives] see it, half the kids coming out of public schools today are basically illiterate. To them, this is fine. We have enough competition for the kinds of jobs a college degree is supposed to qualify one for as it is. Our options are to pump a ton of money into public schools and maybe see some incremental improvement in outcomes, or we can just create a system that selects out the half-decent students for a real education and future and then warehouse the rest until they’re no longer minors and they’re ready for the prison-poverty-violence cycle [add military] to Hoover them up. Vouchers and Charter Schools are not, to the conservative mind, a better way to educate kids well. They are a cheaper way to educate them poorly. What matters is that it costs less to people like six-figure income earners and home owners. Those people can afford to send their kids to a decent school anyway. Public education, to their way of thinking, used to be about educating people just enough that they could provide blue collar or service industry labor. Now that we have too much of that, a public high school is just a waiting room for prison. So why throw money into it? They don’t think education “works” anyway; people are born Good or Bad, Talented or Useless. So it only makes sense to find the cheapest possible way to process the students who were written off before they reached middle school. If charter schools manage to save 1% of them, great. If not, well, then they’re no worse than public schools. And they’re cheaper! Did I mention that they’re cheaper?

There’s more. I provided only the main paragraph. I wish I could reveal that the author is being arch or ironic, but there is no evidence of that. I also wish I could refute him, but there is similarly no useful evidence for that. Rather, the explanation he provides is a reality check that fits the experience of wide swaths of the American public, namely, that “public high school is just a waiting room for prison” (soon and again, debtors’ prison) and that it’s designed to be just that because it’s cheaper than actually educating people. Those truly interested in being educated will take care of it themselves. Plus, there’s additional money to be made operating prisons.

Deep cynicism is a sort of radical awareness that stares balefully at the truth and refuses to blink or pretend. A psychologist might call it the reality principle; a scientist might aver that it relies unflinchingly on objective evidence; a philosopher might call it strict epistemology. To get through life, however, most of us deny abundant evidence presented to us daily in favor of dreams and fantasies that assemble into the dominant paradigm. That paradigm includes the notions that evil doesn’t really exist, that we’re basically good people who care about each other, and that our opportunities and fates are not, on the whole, established long before we begin the journey.