Archive for March, 2017

I often review my past posts when one receives a reader’s attention, sometimes adding tags and fixing typos, grammar, and broken links. One of my greatest hits (based on voting, not traffic) is Low Points in Education. It was among the first to tackle what I have since called our epistemological crisis, though I didn’t begin to use the epistemology tag until later. The crisis has since caught up with us with a vengeance, though I can’t claim I’m the first to observe the problem. That dubious honor probably goes to Stephen Colbert, who coined the word truthiness in 2005. Now that alternative facts and fake news have entered the lingo as well (gaslighting has been revived), everyone has jumped on the bandwagon questioning the truthfulness or falsity behind anything coughed up in our media-saturated information environment. But as suggested in the first item discussed in Low Points in Education, what’s so important about truth?

It would be obvious and easy yet futile to argue in favor of high-fidelity appreciation of the world, even if only within the surprisingly narrow limits of human perception, cognition, and memory (all interrelated). Numerous fields of endeavor rely upon consensus reality derived from objectivity, measurement, reason, logic, and, dare I say it, facticity. Regrettably, human cognition doesn’t adhere all that closely to those ideals except when trained to value them. Well-educated folks have better acquaintance with such habits of mind; folks with formidable native intelligence can develop true authority, too. For the masses, however, those attributes are elusive, even for those who have partied through earned college degrees. Ironically worse, perhaps, are specialists, experts, and overly analytical intellectuals who exhibit what the French call a déformation professionnelle. Politicians, pundits, and journalists are chief among the deformed and distorted. Mounting challenges to establishing truth now destabilize even mundane matters of fact, and it doesn’t help that myriad high-profile provocateurs (including the Commander in Chief, to whom I will henceforth refer only as “45”) are constantly throwing out bones for journalists to chase like so many unnourishing rubber chew toys.

Let me suggest, then, that human cognition, or more generally the mind, is an ongoing balancing act, making adjustments to stay upright and sane. As with the routine balance one keeps during locomotion, shifting weight from side to side, falling a bit only to catch oneself, the difficulty is not especially high. But with the foundation below one’s feet shaking furiously, so to speak, legs get wobbly and many end up (figuratively at least) ass over teakettle. Further, the mind is highly situational, contingent, and improvisational, and it is prone to notoriously faulty perception even before one gets to marketing, spin, and arrant lies promulgated by those intent on coopting or directing one’s thinking. Simply put, we’re not particularly inclined toward accuracy but instead operate within a wide margin of error. Accordingly, we’re quite strong at adapting to ever-changing circumstance.

That strength turns out to be our downfall. Indeed, rootless adjustment to changing narrative is now so grave that basic errors of attribution — which entities said and did what — make it impossible to distinguish allies from adversaries reliably. (Orwell captured this with his line from the novel 1984: “Oceania was at war with Eurasia; therefore Oceania had always been at war with Eurasia.”) Thus, on the back of a brazen propaganda campaign following 9/11, Iraq morphed from U.S. client state to rogue state demanding preemptive war. (Admittedly, the U.S. State Department had already lost control of its puppet despot, who in a foolish act of naked aggression tried to annex Kuwait, but that was a brief, earlier war quite unlike the undeclared one in which the U.S. has been mired for 16 years.) Even though Bush Administration lies have been unmasked and dispelled, many Americans continue to believe (incorrectly) that Iraq possessed WMDs and posed an existential threat to the U.S. The same type of confusion is arguably at work with respect to China, Russia, and Israel, which are mixed up in longstanding conflicts having significant U.S. involvement and provocation. Naturally, the default villain is always Them, never Us.

So we totter from moment to moment, reeling drunkenly from one breathtaking disclosure to the next, and are forced to reorient continuously in response to whatever the latest spin and spew happen to be. Some institutions retain the false sheen of respectability and authority, but for the most part, individuals are free to cherry-pick information and assemble their own truths, indulging along the way in conspiracy and muddle-headedness until at last almost no one can be reached anymore by logic and reason. This is our post-Postmodern world.

As I read into Fingerprints of the Gods by Graham Hancock and learn more about antiquity, it becomes clear that weather conditions on Earth were far more hostile then (say, 15,000 years ago) than now. Looking way, way back, millions of years into the past, scientists have plotted global average temperature and atmospheric carbon, mostly using ice cores as I understand it, yielding this graph:

[Figure: atmospheric CO2 levels and global average temperature over geological time]

I’ve seen this graph before; it is often used by climate change deniers to show a lack of correlation between carbon and temperature. That’s not what concerns me. Instead, the amazing thing is how temperature careens up and down quickly (in geological time) between two limits, 12°C and 22°C, forming steady states known as Ice Age Earth and Hot House Earth. According to the graph, we’re close to the lower limit. It’s worth noting that because of the extremely long timescale, the graph is considerably smoothed.


As a boy, my home included a coffee table book, title unknown, likely published circa 1960, about the origins of human life on Earth. (A more recent book of this type attracting lots of attention is Sapiens: A Brief History of Humankind (2015) by Yuval Harari, which I haven’t yet read.) It was heavily enough illustrated that my siblings and I consulted it mostly for the pictures, which can probably be excused since we were youngsters at the time. What became of the book escapes me. In the intervening decades, I made no particular study of the ancient world — ancient meaning beyond the reach of human memory systems. Thus, ancient could potentially refer to anthropological history in the tens of thousands of years, evolutionary history stretching across tens of millions of years, geological history over hundreds of millions of years, or cosmological time going back a few billion years. For the purpose of this blog post, let’s limit ancient to no more than fifty thousand years ago.

A few months ago, updates (over the 1960 publication) to the story of human history and civilization finally reached me (can’t account for the delay of several decades) via webcasts published on YouTube featuring Joe Rogan, Randall Carlson, and Graham Hancock. Rogan hosts the conversations; Carlson and Hancock are independent researchers whose investigations converge on evidence of major catastrophes that struck the ancient world during the Younger Dryas period, erasing most but not all evidence of an antediluvian civilization. Whereas I’m a doomer, they are catastrophists. To call this subject matter fascinating is a considerable understatement. And yet, it’s neither here nor there with respect to how we conduct our day-to-day lives. Their revised history connects to religious origin stories, but such narratives have been relegated to myth and allegory for a long time already, making them more symbolic than historical.

In the tradition of Galileo, Copernicus, Newton, and Darwin, all of whom went against the scientific orthodoxy of their times but were ultimately vindicated, Carlson and Hancock appear to be rogue scientists/investigators exploring deep history and struggling against the conventional story of the beginnings of civilization around 6,000 years ago in the Middle East and Egypt. John Anthony West is another who disputes the accepted narratives and timelines. West is also openly critical of “quackademics” who refuse to consider accumulating evidence but instead collude to protect their cherished ideological and professional positions. The vast body of evidence being pieced together is impressive, and I truly appreciate their collective efforts. I’m about 50 pages into Hancock’s Fingerprints of the Gods (1995), which contains copious detail not well suited to the conversational style of a webcast. His follow-up Magicians of the Gods (2015) will have to wait. Carlson’s scholarly work is published at the website Sacred Geometry International (and elsewhere, I presume).

So I have to admit that my blog, launched in 2006 as a culture blog, turned partially into a doomer blog as that narrative gained the weight of overwhelming evidence. What Carlson and Hancock in particular present is evidence of major catastrophes that struck the ancient world and are going to repeat: a different sort of doom, so to speak. Mine is ecological, financial, cultural, and finally civilizational collapse born of exhaustion, hubris, frailty, and most importantly, poor stewardship. Theirs is periodic cataclysmic disaster, including volcanic eruptions and explosions, great floods (following ice ages, I believe), meteor strikes, earthquakes, tsunamis, and the like, each capable of ending civilization all at once. Indeed, those inevitable events are scattered throughout our geological history, though at unpredictable intervals often spaced tens or hundreds of thousands of years apart. For instance, the supervolcano under Yellowstone is known to blow roughly every 600,000 years, and we’re overdue. Further, the surface of the Moon shows bombardment by meteors; the Earth’s history of the same is partly hidden by continuous transformation of the landscape, which the Moon lacks. The number of near misses by near-Earth objects in the last few decades is rather disconcerting. Any of these disasters could strike at any time, or we could wait another 10,000 years.

Carlson and Hancock warn that we must recognize the dangers, drop our petty international squabbles, and unite as a species to prepare for the inevitable. To do otherwise would be to court disaster. However, far from dismissing the prospect of doom I’ve been blogging about, they merely add another category of things likely to kill us off. They give the impression that we should turn our attention away from sudden climate change, the Sixth Extinction, and other perils to which we have contributed heavily and worry instead about death from above (the skies) and below (the Earth’s crust). It’s impossible to say which is the most worrisome prospect. As a fatalist, I surmise that there is little we can do to forestall any of these eventualities. Our fate is already sealed in one respect or another. That foreknowledge makes life precious for me, and frankly, is another reason to put aside our petty squabbles.

Nick Carr has an interesting blog post (late getting to it as usual) highlighting a problem with our current information environment. In short, the constant information feed many of us subscribe to and read on smartphones, which I’ve frequently called a fire hose pointed indiscriminately at everyone, has become the new normal. And when it’s absent, people feel anxiety:

The near-universal compulsion of the present day is, as we all know and as behavioral studies prove, the incessant checking of the smartphone. As Begley notes, with a little poetic hyperbole, we all “feel compelled to check our phones before we get out of bed in the morning and constantly throughout the day, because FOMO — the fear of missing out — fills us with so much anxiety that it feels like fire ants swarming every neuron in our brain.” With its perpetually updating, tightly personalized messaging, networking, searching, and shopping apps, the smartphone creates the anxiety that it salves. It’s a machine almost perfectly designed to turn its owner into a compulsive … from a commercial standpoint, the smartphone is to compulsion what the cigarette pack was to addiction

I’ve written about this phenomenon plenty of times (see here for instance) and recommended that wiser folks might adopt a practiced media ecology by regularly turning their attention away from the feed (e.g., no mobile media). Obviously, that’s easier for some of us than others. Although my innate curiosity (shared by almost everyone, I might add) prompts me to gather quite a lot of information in the course of the day/week, I’ve learned to be restrictive and highly judgmental about which sources I read, printed text being far superior in most respects to audio or video. No social media at all, very little mainstream media, and very limited “fast media” of the type that rushes to publication before enough is known. Rather, periodicals (monthly or quarterly) and books, which have longer paths to publication, tend to be more thoughtful and reliable. If I could never again be exposed to noisy newsbits with, say, the word “Kardashian,” that would be an improvement.

Also, being aware that the basic economic model underlying media since the advent of radio and television has been to provide content for free (interesting, entertaining, and hyperpalatable perhaps, but simultaneously pointless ephemera) in order to capture the attention of a large audience and then load up the channel with advertisements at regular intervals, I now use ad blockers and streaming media to avoid being swayed by the manufactured desire that flows from advertising. If a site won’t display its content without disabling the ad blocker, which is becoming more commonplace, then I don’t give it my attention. I can’t avoid all advertising, much as I can’t avoid having my consumer behaviors tracked and aggregated by retailers (and others), but I do better than most. For instance, I never saw any Super Bowl commercials this year, though they have become a major part of the spectacle. Sure, I’m missing out, but I have no anxiety about it. I prefer to avoid colonization of my mind by advertisers in exchange for cheap titillation.

In the political news media, Rachel Maddow has caught on that it’s advantageous to ignore a good portion of the messages flung at the masses like so much monkey shit. A further suggestion is that because of the pathological narcissism of the new U.S. president, denial of the rapt attention he craves by reinforcing only the most reasonable conduct of the office might be worth a try. Such an experiment would be like the apocryphal story of students conditioning their professor to lecture with his/her back to the class by using positive/negative reinforcement, paying attention and being quiet only when his/her back was to them. Considering how much attention is trained on the Oval Office and its utterances, I doubt such an approach would be feasible even if it were only journalists attempting to channel behavior, but it’s a curious thought experiment.

All of this is to say that there are alternatives to being harried and harassed by insatiable desire for more information at all times. There is no actual peril to boredom, though we behave as though an idle mind is either wasteful or fearsome. Perhaps we aren’t well adapted — cognitively or culturally — to the deluge of information pressing on us in modern life, which could explain (partially) this age of anxiety when our safety, security, and material comforts are as good as they’ve ever been. I have other thoughts about what’s really missing in modern life, which I’ll save for another post.