Archive for the ‘History’ Category

Reading further into Anthony Giddens’ book The Consequences of Modernity, I got a fuller (though still incomplete) sense of what is meant by his terms disembedding mechanisms, expert systems, and symbolic tokens, all of which disrupt time and space as formerly understood in traditional societies that enjoyed the benefit of centuries of continuity. I’ve been aware of analyses regarding, for instance, the sociology of money and the widespread effects of the introduction and adoption of mechanical clocks and timepieces. While most understand these developments superficially as unalloyed progress, Giddens argues that they in fact reorder our experience in the world away from an organic, immediate orientation toward an intellectualized adherence to distant, abstract, self-reinforcing (reflexive) mechanisms.

But those matters are not really what this blog post is about. Rather, this passage sparked my interest:

… when the claims of reason replaced those of tradition, they appeared to offer a sense of certitude greater than that provided by preexisting dogma. But this idea only appears persuasive so long as we do not see that the reflexivity of modernity actually subverts reason, at any rate where reason is understood as the gaining of certain knowledge … We are abroad in a world which is thoroughly constituted through reflexively applied knowledge, but where at the same time we can never be sure that any given element of that knowledge will not be revised. [p. 39]

Put another way, science and reason are axiomatically open to examination, challenge, and revision and often undergo disruptive change. That’s what is meant by Karl Popper’s phrase “all science rests upon shifting sand” and informs the central thesis of Thomas Kuhn’s well-known book The Structure of Scientific Revolutions. It’s not the narrow details that shift so much (hard sciences lead pretty reliably to applied engineering) as the overarching narrative, e.g., the story of the Earth, the cosmos, and ourselves as revealed through scientific inquiry and close examination. Historically, the absolute certainty of the medieval church, while not especially accurate in either details or narrative, yielded considerable authority to post-Enlightenment science and reason, which themselves continue to shift periodically.

Some of those paradigm shifts are so boggling and beyond the ken of the average thinker (including many college-educated folks) that our epistemology is now in crisis. Even the hard facts — like the age and shape of the Earth or its orbital relationship to other solar bodies — are hotly contested by some and blithely misunderstood by others. One doesn’t have to get bogged down in the vagaries of relativity, nuclear power and weapons, or quantum theory to lose the thread of what it means to live in the 21st century. Softer sciences such as psychology, anthropology, economics, and even history now deliver new discoveries and (re-)interpretations of facts so rapidly, like the dizzying pace of technological change, that philosophical systems are unmoored and struggling for legitimacy. For instance, earlier this year, a human fossil was found in Morocco that upended our previous knowledge of human evolution (redating the first appearance of biologically modern humans about 100,000 years earlier). More popularly, dieticians still disagree on what sorts of foods are healthy for most of us (though we can probably all agree that excess sugar is bad). Other recent developments include the misguided insistence among some neurobiologists and theorists that consciousness, free will, and the self do not exist (I’ll have a new post regarding that topic as time allows) and outright attacks on religion not just for being in error but for being the source of evil.

I have a hard time imagining other developments in 21st-century intellectual thought that would shake the foundations of our cosmology any more furiously than what we’re now experiencing. Even the dawning realization that we’ve essentially killed ourselves (with delayed effect) by gradually though consistently laying waste to our own habitat is more of an “oops” than the mind-blowing moment of waking up from The Matrix to discover the unreality of everything once believed. Of course, for fervent believers especially, the true facts (as best we can know them, since knowledge is forever provisional) are largely irrelevant in light of desire (what one wants to believe), and that’s true for people on both sides of the schism between church and science/reason.

As Shakespeare wrote in Hamlet, “There are more things in heaven and earth, Horatio, / Than are dreamt of in your philosophy.” So it’s probably wrong to introduce a false dualism, though it has plenty of historical precedent. I’ll suggest instead that there are more facets and worldviews at play in the world than the two that have been warring in the West for the last 600 years.


The storms referenced in the earlier version of this post were civilization-ending cataclysms. The North American hurricanes and earthquakes of September 2017 were natural disasters. I would call that September unprecedented, but reliable weather records do not extend very far back in human history, and the geological record reaching into human prehistory suggests that, except perhaps for their concentration within the span of a month, the latest storms are nothing out of the ordinary. Some have even theorized that hurricanes and earthquakes could be interrelated. In the wider context of weather history, this brief period of destructive activity may still be rather mild. Already in the last twenty years we’ve experienced a series of 50-, 100-, and 500-year weather events that suggest exactly what climate scientists have been saying, namely, that higher global average temperatures and more atmospheric moisture will lead to more activity in the category of superstorms. Throw drought, flood, and desertification into the mix. This (or worse, frankly) may have been the old normal when global average temperatures were several degrees warmer during periods of hothouse earth. All indications are that we’re leaving behind garden earth, the climate steady state (with a relatively narrow band of global temperature variance) enjoyed for roughly the last 12,000 years.

Our response to the latest line of hurricanes that struck the Gulf, Florida, and the Caribbean has been characterized as a little tepid considering we had the experience of Katrina from which to learn and prepare, but I’m not so sure. True, hurricanes can be seen hundreds of miles and days away, allowing folks the opportunity either to batten down the hatches or to flee the area, but we have never been able to handle mass exodus (typically via automobile), and the sheer destructive force of the storms overwhelms most preparations and delays response. After Katrina, it appeared for several days that the federal government’s response was basically this: you’re on your own. That apparent response recurred, especially in Puerto Rico, which like New Orleans quickly devolved into a true humanitarian crisis (one not yet over). My more charitable assessment is that despite foreknowledge of the event and past experience with similar events, we can’t simply swoop in and smooth things out after the storms. Even the first steps of recovery take time.

I’ve cautioned that rebuilding on the same sites, with the reasonable expectation of repeat catastrophes in a destabilized climate that will spawn superstorms reducing entire cities to garbage heaps, is a poor option. No doubt we’ll do it anyway, at least partially; it’s already well underway in Houston. I’ve also cautioned that we need to brace for a diaspora as climate refugees abandon destroyed and inundated cities and regions. It’s already underway with respect to Puerto Rico. This is a storm of an entirely different sort (a flood, actually) and can also be seen from hundreds of miles and weeks, months, even years away. And like superstorms, a diaspora from the coasts, because of the overwhelming force and humanitarian crisis it represents, is not something for which we can prepare adequately. Still, we know it’s coming, like a 20- or 50-year flood.

This is the inverse of a prior post called “Truth Based on Fiction.”

Telling stories about ourselves is one of the most basic of human attributes stretching across oral and recorded history. We continue today to memorialize events in short, compact tellings, frequently movies depicting real-life events. I caught two such films recently: Truth (about what came to be known as Rathergate) and Snowden (about whistle-blower Edward Snowden).

Although Dan Rather is the famous figure associated with Truth, the story focuses more on his producer Mary Mapes and the group decisions leading to the airing of a controversial news report about George W. Bush’s time in the Air National Guard. The film is a dramatization, not a documentary, and so is free to present the story with its own perspective and some embellishment. Since I’m not a news junkie, my memory of the events in 2004 surrounding the controversy is not especially well informed, and I didn’t mind the potential for the movie’s version of events to color my thinking. About some controversies and conspiracies, I feel no particular demand to adopt a strong position. The actors did well enough, but I felt Robert Redford was poorly cast as Dan Rather. Redford is too famous in his own right to succeed as a character actor playing a real-life person.

Debate over the patriotism or treason of Edward Snowden’s actions continues to swirl, but the film covers the issues pretty well, from his discovery of an intelligence services surveillance dragnet (in violation of the 4th Amendment to the U.S. Constitution) to his eventual disclosure of same to a few well-respected journalists. The film’s director and joint screenwriter, Oliver Stone, has made a career out of fiction based on truth, dramatizing many signal events from the nation’s history and repackaging them as entertainment in the process. I’m wary of his interpretations of history when presented in cinematic form, less so his alternative history lessons given as documentary. Unlike with Truth, however, I have clear ideas in my mind regarding Snowden the man and Snowden the movie, so, from a different standpoint, I was again unconcerned about potential bias. Joseph Gordon-Levitt does well enough as the titular character, though he doesn’t project nearly the same insight and keen intelligence as Snowden himself does. I suspect the documentary Citizenfour (which I’ve not yet seen), featuring Snowden doing his own talking, is a far better telling of the same episode of history.

In contrast, I have assiduously avoided several other recent films based on actual events. United 93, World Trade Center, and Deepwater Horizon spring to mind, but there are many others. The wounds and controversies stemming from those real-life events still smart too much for me to consider exposing myself to such historical fictions. Perhaps in a few decades, after living memory of those events has faded or disappeared entirely, such stories can be told effectively, though probably not accurately. A useful comparison might be any one of the several films called The Alamo.

Violent events of the past week (Charlottesville, VA; Barcelona, Spain) and political responses to them have dominated the news cycle, pushing other newsworthy items (e.g., U.S.-South Korean war games and a looming debt ceiling crisis) off the front page and into the darker recesses of everyone’s minds (those paying attention, anyway). We’re absorbed instead with culture wars run amok. I’m loath to apply the term terrorism to regular periodic eruptions of violence, both domestic and foreign. That term carries with it intent, namely, the objective to create day-to-day terror in the minds of a population so as to interfere with the proper functions of society. It’s unclear to me whether recent perpetrators of violence are coherent enough to formulate sophisticated motivations or plans. The dumb, obvious way of doing things — driving into crowds of people — takes little or no planning and may just as well be the result of inchoate rage boiling over in a moment of high stress and opportunity. Of course, it needn’t be all or nothing, and considering our reflexively disproportionate responses, the term terrorism and its attendant destabilization are arguably accurate even without specified intent. That’s why, in the wake of 9/11 some 16 years ago, the U.S. has become a security state.

It’s beyond evident that hostilities have been simmering below the not-so-calm surface. Many of those hostilities, typically born of economic woes but also part of a larger clash of civilizations, take the form of identifying an “other” presumably responsible for one’s difficulties and then victimizing the “other” in order to elevate oneself. Of course, the “other” isn’t truly responsible for one’s struggles, so the violent dance doesn’t actually elevate anyone, as in “supremacy”; it just wrecks both sides (though unevenly). Such warped thinking seems to be a permanent feature of human psychology and enjoys popular acceptance when the right “other” is selected and universal condemnation when the wrong one is chosen. Those doing the choosing and those being chosen haven’t changed much over the centuries. Historically, Anglo-Saxons and Teutons have done the choosing, and people of color (of all types) have gotten chosen. Jews are also chosen with dispiriting regularity, an ironic inversion of being the Chosen People (if you believe in such things — I don’t). However, any group can succumb to this distorted power move, which is why so much ongoing, regional, internecine conflict exists.

As I’ve been saying for years, a combination of condemnation and RightThink has simultaneously freed some people from this cycle of violence but merely driven the holdouts underground. Supremacy in its various forms (nationalism, racism, antisemitism, etc.) has never truly been expunged. RightThink itself has morphed (predictably) into intolerance, which is now veering toward radicalism. Perhaps a positive outcome of this latest resurgence of supremacist ideology is that those infected with the character distortion have been emboldened to identify themselves publicly and thus can be dealt with somehow. Civil authorities and thought leaders are not very good at dealing with hate, often shutting people out of the necessary public conversation and/or seeking to legislate hate out of existence with restrictions on free speech. But it is precisely through free expression and diplomacy that we address conflict. Violence is a failure to remain civil (duh!), and war (especially the genocidal sort) is the extreme instance. It remains to be seen if the lid can be kept on this boiling pot, but considering cascade failures lined up to occur within the foreseeable future, I’m pessimistic that we can see our way past the destructive habit of shifting blame onto others who often suffer even worse than those holding the reins of power.

Previous blogs on this topic are here and here.

Updates to the Bulletin of the Atomic Scientists resetting the metaphorical doomsday clock hands used to appear at intervals of 3–7 years. Updates have been issued in each of the last three years, though the clock hands remained in the same position from 2015 to 2016. Does that suggest heightened geopolitical instability or merely resumed paranoia resulting from the instantaneous news cycle and the radicalization of society and politics? The 2017 update moves the minute hand slightly forward to 2½ minutes to midnight:

For the last two years, the minute hand of the Doomsday Clock stayed set at three minutes before the hour, the closest it had been to midnight since the early 1980s. In its two most recent annual announcements on the Clock, the Science and Security Board warned: “The probability of global catastrophe is very high, and the actions needed to reduce the risks of disaster must be taken very soon.” In 2017, we find the danger to be even greater, the need for action more urgent. It is two and a half minutes to midnight, the Clock is ticking, global danger looms. Wise public officials should act immediately, guiding humanity away from the brink. If they do not, wise citizens must step forward and lead the way …

The principal concern of the Bulletin since its creation has been atomic/nuclear war. Recent updates include climate change in the mix. Perhaps it is not necessary to remind regular readers here, but the timescales for these two threats are quite different: global thermonuclear war (a term from the 1980s, when the superpowers last got weird and paranoid about things) could erupt almost immediately given the right provocation, such as the sabre-rattling now underway between the U.S. and North Korea, whereas climate change is an event typically unfolding across geological time. The millions of years it usually takes for climate change to manifest fully and reach a new steady state (hothouse earth vs. ice age earth), however, appear to have been compressed by human inputs (anthropogenic climate change, or as Guy McPherson calls it, abrupt climate change) into only a few centuries.

Nuclear arsenals around the world are the subject of a curious article at Visual Capitalist (including several reader-friendly infographics) by Nick Routley. The estimated number of weapons in the U.S. arsenal has risen since the last time I blogged about this in 2010. I still find it impossible to fathom why more than a dozen nukes are necessary, or in my more charitable moments toward the world’s inhabitants, why any of them are necessary. Most sober analysts believe we are far safer today than in, say, the 1950s and early 1960s, when brinkmanship was anybody’s game. I find this difficult to judge considering that the two main actors today on the geopolitical stage are both witless, unpredictable, narcissistic maniacs. Moreover, the possibility of some ideologue (religious or otherwise) getting hold of WMDs (not necessarily nukes) and creating mayhem is increasing as the democratization of production filters immense power down to lower and lower elements of society. I for one don’t feel especially safe.

My previous entry on this topic is found here. The quintessential question asked with regard to education (often levied against educators) is “Why can’t Johnnie read?” I believe we now have several answers.

Why Bother With Basics?

A resurrected method of teaching readin’ and writin’ (from the 1930s, as it happens) is “freewriting.” The idea is that students who experience writer’s block should dispense with basic elements such as spelling, punctuation, grammar, organization, and style and simply get something on the page, coming back later to revise and correct. I can appreciate the thinking, namely, that students paralyzed by an inability to produce finished work extemporaneously should focus first on blasting something onto the page. Whether those who use freewriting actually go back to edit (as I do) is unclear, but it’s not a high hurdle to begin with proper rudiments.

Why Bother Learning Math?

At Michigan State University, the algebra requirement has been dropped from its general education requirements. Considering that algebra is a basic part of most high school curricula, jettisoning it from the university core curriculum is astonishing. Again, it’s not a terribly high bar to clear, but for someone granted a degree from an institution of higher learning to fail to do so is remarkable. Though the rationalization offered at the link above is fairly sophisticated, it sounds more like Michigan State is just giving up on asking its students to bother learning. The California State University system has adopted a similar approach. Wayne State University also dropped its math requirement and upped the ante by recommending a new diversity requirement (all the buzz with social justice warriors).

Why Bother Learning Music?

The Harvard Crimson reports changes to the music curriculum, lowering the required courses for the music concentration from 13 to 10. Notably, most of the quotes in the article are from students relieved to have fewer requirements to satisfy. The sole professor quoted makes a bland, meaningless statement about flexibility. So if you want a Harvard degree with a music concentration, the bar has been lowered. But this isn’t educational limbo, where the difficulty increases as the bar goes down; it’s a change from higher education to not-so-high-anymore education. Not learning very much about music has never been a barrier to success, BTW. Lots of successful musicians don’t even read music.

Why Bother Learning History?

According to some conservatives, U.S. history curricula, in particular the Advanced Placement U.S. History course offered by The College Board, teach what’s bad about America and undermine American exceptionalism. In 2015, the Oklahoma House Common Education Committee voted 11-4 for emergency House Bill 1380 (authored by Rep. Dan Fisher) “prohibiting the expenditure of funds on the Advanced Placement United States History course.” This naked attempt to sanitize U.S. history and substitute preferred (patriotic) narratives is hardly a new phenomenon in education.


So why can’t Johnnie read, write, know, understand, or think? Simply put, because we’re not bothering to teach him to read, write, know, understand, or think. Johnnie has instead become a consumer of educational services and a political football. Has lowering standards ever been a solution to the struggle to get a worthwhile education? Passing students through just to be rid of them (while collecting tuition) has only produced a mass of miseducated graduates. Similarly, does a certificate, diploma, or granted degree mean anything as a marker of achievement if students can’t be bothered to learn the time-honored elements of a core curriculum? The real shocker, of course, is massaging the curriculum itself (U.S. history in this instance) to produce citizens ignorant of their own past and compliant with the jingoism of the present.

Having been asked to contribute to a new group blog (Brains Unite), this is my first entry, which is cross-posted here at The Spiral Staircase. The subject of this post is the future of transportation. I’ve no expertise in this area, so treat this writing with the authority it deserves, which is to say, very little.

Any prediction of what awaits us must inevitably consider what has preceded us. Britain and the United States were both in the vanguard during the 19th and early 20th centuries when it came to innovation, and this is no less true of transportation than of any other good or service. I’m not thinking of the routine travels one makes in the course of a day (e.g., work, church, school) but rather of long excursions outside one’s normal range, a radius that has expanded considerably since then. (This holds true for freight transportation, too, but I am leaving that side of the issue aside in favor of passenger transit.) What is interesting is that travel tended to be communal, something we today call mass transit. For example, the Conestoga wagon, the stagecoach, the riverboat, and the rail car are each iconic of the 19th-century American West.

Passenger rail continued into the mid-20th century but was gradually replaced (in the U.S.) by individual conveyance as the automobile became economically available to the masses. Air travel commenced about the same time, having transitioned fairly quickly from 1 or 2 seats in an exposed cockpit to sealed fuselages capable of transporting 30+ people (now several hundred) at once. Still, as romantic as air travel may once have been (it’s lost considerable luster since deregulation as airlines now treat passengers more like freight), nothing beats the freedom and adventure of striking out on the road in one’s car to explore the continent, whether alone or accompanied by others.

The current character of transportation is a mixture of individual and mass transit, but without consulting numbers at the U.S. Dept. of Transportation, I daresay that the automobile is the primary means of travel for most Americans, especially those forced into cars by meager mass transit options. My prediction is that the future of transportation will be a gradual return to mass transit for two reasons: 1) the individual vehicle will become too costly to own and operate and 2) the sheer number of people moving from place to place will necessitate large transports.

While techno-utopians continue to conjure new, exotic (unfeasible) modes of transportation (e.g., the Hyperloop, which will purportedly enable passengers to make a 100-mile trip in about 12 minutes), they are typically aimed at individual transport and are extremely vulnerable to catastrophic failure (like the Space Shuttles) precisely because they must maintain human environments in difficult spaces (low orbit, underwater, inside depressurized tubes, etc.). They are also aimed at circumventing the congestion of conventional ground transportation (a victim of its own success now that highways in many cities resemble parking lots) and shortening transit times, though the extraordinary costs of such systems far exceed the benefit of time saved.
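To put that Hyperloop claim in perspective, here is a back-of-envelope calculation of my own, using only the trip figures quoted above (not any official specification):

\[
v = \frac{d}{t} = \frac{100\ \text{miles}}{12\ \text{minutes}} \times \frac{60\ \text{minutes}}{1\ \text{hour}} = 500\ \text{mph}
\]

That is roughly the cruising speed of a commercial jetliner, but sustained at ground level inside a depressurized tube, which is exactly where the catastrophic-failure worry comes from.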

Furthermore, as climate change ramps up, we will witness a diaspora from regions inundated by rising waters, typically along the coasts where some 80% of human population resides (so I’ve read, can’t remember where). Mass migration out of MENA is already underway and will be joined by large population flows out of, for example, the Indian subcontinent, the Indonesian archipelago, and dustbowls that form in the interiors of continents. Accordingly, the future of transportation may well be the past:

photo: National Geographic


photo: UN Refugee Agency

Back in undergraduate college, when just starting on my music education degree, I received an assignment in which students were asked to formulate a philosophy of education. My thinking then was influenced by a curious textbook I picked up: A Philosophy of Music Education by Bennett Reimer. Of course, it was the wrong time for an undergraduate to perform this exercise, as we had neither maturity nor understanding equal to the task. However, in my naïveté, my answer was all about learning/teaching an aesthetic education — one that focused on appreciating beauty in music and the fine arts. This requires the cultivation of taste, which used to be commonplace among the educated but is now anathema. Money is the preeminent value now. Moreover, anything that smacks of cultural programming and thought control is now repudiated reflexively, though such projects are nonetheless undertaken continuously and surreptitiously through a variety of mechanisms. As a result, the typical American’s sense of what is beautiful and admirable is stunted. Further, knowledge of the historical context in which the fine arts exist is largely absent. (Children are ahistorical in this same way.) Accordingly, many Americans are coarse philistines whose tastes rarely extend beyond those acquired naturally during adolescence (including both biophilia and biophobia), thus the immense popularity of comic book movies, rock and roll music, and all manner of electronica.

When operating with a limited imagination and undeveloped ability to perceive and discern (and disapprove), one is a sitting duck for what ought to be totally unconvincing displays of empty technical prowess. Mere mechanism (spectacle) then possesses the power to transfix and amaze credulous audiences. Thus, the ear-splitting volume of amplified instruments substitutes for true emotional energy produced in exceptional live performance, ubiquitous CGI imagery (vistas and character movements, e.g., fight skills, that simply don’t exist in reality) in cinema produces wonderment, and especially, blinking lights and animated GIFs deliver the equivalent of a sugar hit (cookies, ice cream, soda) when they’re really placebos or toxins. Like hypnosis, the placebo effect is real and pronounced for those unusually susceptible to induction. Sitting ducks.

Having given the fine arts (including their historical contexts) a great deal of my academic attention and acquired an aesthetic education, my response to the video below fell well short of the blasé relativism most exhibit; I actively dislike it.

According to Hal Smith of The Compulsive Explainer (see my blogroll), the tragedy of our time is, simply put, failed social engineering. Most of his blog post is quoted below:

Americans, for example, have decided to let other forces manage their nation — and not let Americans themselves manage it. At least this is what I see happening, with the election of Trump. They have handed the management of their country over to a man with a collection of wacky ideas — and they feel comfortable with this. Mismanagement is going on everywhere — and why not include the government in this?

This is typical behavior for a successful society in decline. They cannot see what made them successful, has been taken too far — and is now working against them. The sensible thing for them to do is back off for awhile, analyze their situation — and ask “What is going wrong here?” But they never do this — and a collapse ensues.

In our present case, the collapse involves a global society based on Capitalism — that cannot adapt itself to a Computer-based economy. The Software ecosystem operates differently — it is based on cooperation, not competition.

Capitalism was based on just that — Capital — money making money. And it was very successful — for those it favored. Money is still important in the Computer economy — people still have to be paid. But what they are being paid for has changed — information is now being managed, something different entirely.

Hardware is still important — but that is not where the Big Money is being made. It is now being made in Computers, and their Software.

I’m sympathetic to this view but believe that a look back through history reveals something other than missed opportunities and too-slow adaptation as we fumbled our way forward, namely, repeated catastrophic failures. Such epic fails include regional and global wars, genocides, and societal collapses that rise well above the rather bland term mismanagement. A really dour view of history, taking into account more than a few truly vicious, barbaric episodes, might regard the world as a nearly continuous stage of horrors from which we periodically take refuge, and the latest of those respites is drawing quickly to a close.

The breakneck speed of technological innovation and information exchange has resulted not in Fukuyama’s mistakenly exuberant “end of history” (kinda-sorta winning the Cold War but nevertheless losing the peace?) but instead in an epoch where humans are frankly left behind by the follow-on effects of their own unrestrained restlessness. Further, if history is a stage of horrors, then geopolitics is theater of the absurd. News reports throughout the new U.S. presidential administration, still less than 6 months in (more precisely, 161 days or 23 weeks), tell of massive economic and geopolitical instabilities threatening to collapse the house of cards with only a slight breeze. The contortions press agents and politicized news organs go through to provide cover for the tweets, lies, and inanities emanating from the disturbed mind of 45 are carnival freak show acts. Admittedly, not much has changed over the previous two administrations — alterations of degree only, not kind — except perhaps to demonstrate beyond any shadow of doubt that our elected, appointed, and purchased leaders (acknowledging many paths to power) are fundamentally incompetent to deal effectively with human affairs, much less enact social engineering projects beyond the false happiness of Facebook algorithms that hide bad news. Look no further than the egregious awfulness of both presidential candidates in the last election, coughed up like hairballs from the mouths of their respective parties. The aftermath of those institutional failures finds both major parties in shambles, even more degraded than their already deplorable pre-election states.

So how much worse can things get? Well, scary as it sounds, lots. The electrical grid is still working, water is still flowing to the taps, and supply lines continue to keep store shelves stocked with booze and brats for extravagant holiday celebrations. More importantly, we in the U.S. have (for now, unlike Europe) avoided repetition of any major terrorist attacks. But everyone with an honest ear to the ground recognizes our current condition as the proverbial calm before the storm. For instance, we’re threatened by the first ice-free Arctic in the history of mankind later this year and a giant cleaving off of the Larsen C Ice Shelf in Antarctica within days. In addition, drought in the Dakotas will result in a failed wheat harvest. Guy McPherson in particular (though there may well be others) has been predicting for years that abrupt, nonlinear climate change as the poles warm will end the ability to grow grain at scale, leading to worldwide famine, collapse, and near-term extinction. Seems like we’re passing the knee of the curve. Makes concerns about maladaptation and failed social engineering pale by comparison.

I pull in my share of information about current events and geopolitics despite a practiced inattention to mainstream media and its noisome nonsense. (See here for another who turned off the MSM.) I read or heard somewhere (can’t remember where) that most news outlets, and indeed most other media, now function as outrage engines to drive traffic, generating no small amount of righteousness, indignation, anger, and frustration at all the things so egregiously wrong in our neighborhoods, communities, regions, and across the world. These are all negative emotions, though legitimate responses to the various scourges plaguing us currently, many of which are self-inflicted. It’s enough aggregate awfulness to draw people into the street again in principled protest, dissent, and resistance; it’s not yet enough to effect change. Alan Jacobs comments about outrage engines, noting that sharing via retweets is not the same as caring. In the Age of Irony, a decontextualized “yo, check this out!” is nearly as likely to be interpreted as support as condemnation (or mere gawking for entertainment value). Moreover, pointing, linking, and retweeting are each costless versions of virtue signaling. True virtue makes no object of publicity.

So where do I get my outrage quotient satisfied? Here is a modest linkfest, in no particular order, of sites not already on my blogroll. I don’t visit these sites daily, but I drop in, often skimming, just enough to keep abreast of themes and events of importance.