Posts Tagged ‘Recent History’

I’m a little gobsmacked that, in the aftermath of someone finally calling out the open secret of the Hollywood casting couch (don’t know, don’t care how this news cycle started) and netting Harvey Weinstein in the process, so many well-known actors have added their “Me, too!” to the growing scandal. Where were all these sheep before now? As with Bill Cosby and Bill Clinton, what good does it do to allow a serial abuser to continue unchallenged until years, decades later a critical mass finally boils over? I have no special knowledge or expertise in this area, so what follows is the equivalent of a thought experiment.

Though the outlines of the power imbalance between a Hollywood executive and an actor seeking a role (or other industry worker seeking employment) are pretty clear, creating a rich opportunity for the possessor of such power to act like a creep or a criminal, the specific details are still a little shrouded — at least in my limited consumption of the scandal press. How much of Weinstein’s behavior veers over the line from poor taste to criminality is a difficult question precisely because lots of pictorial evidence exists showing relatively powerless people playing along. It’s a very old dynamic, and its quasi-transactional nature should be obvious.

In my idealized, principled view, if one has been transgressed, the proper response is not to slink away or hold one’s tongue until enough others are similarly transgressed to spring into action. The powerless are duty bound to assert their own power — the truth — much like a whistleblower feels compelled to disclose corruptions of government and corporate sectors. Admittedly, that’s likely to compound the initial transgression and come at some personal cost, great or small. But for some of us (a small percentage, I reckon), living with ourselves in silent assent presents an even worse option. By way of analogy, if one were molested by a sketchy uncle and said nothing, I can understand just wanting to move on. But if one said nothing yet knew the sketchy uncle had more kids lined up in the extended family to transgress, then stepping up to protect the younger and weaker would be an absolute must.

In the past few decades, clergy of the Catholic Church sexually abused many young people and deployed an institutional conspiracy to hide the behaviors and protect the transgressors. Exposure should have broken trust bonds between the church and the faithful and invalidated the institution as an abject failure. Didn’t quite work out that way. Similar scandals and corruption across a huge swath of institutions (e.g., corporate, governmental, military, educational, entertainment, and sports entities) have been appearing in public view regularly, yet as a culture, we tolerate more creeps and criminals than we shame or prosecute. (Some sites regularly report these corruptions with respect to American empire; I can scarcely bear to read them sometimes.) I suspect part of that is a legitimate desire for continuity, to avoid burning down the house with everyone in it. That places just about everyone squarely within the “Me, too!” collective. Maybe I shouldn’t be so gobsmacked after all.

Caveat: This thought experiment definitely comes from a male perspective. I recognize that females view these issues quite differently, typically in consideration of far greater vulnerability than males experience (excepting the young boys in the Catholic Church example).


The storms referenced in the earlier version of this post were civilization-ending cataclysms. The succession of North American hurricanes and earthquakes earlier this month of September 2017 were natural disasters. I would say that September was unprecedented in history, but reliable weather records do not extend very far back in human history and the geological record extending back into human prehistory would suggest that, except perhaps for their concentration within the span of a month, the latest storms are nothing out of the ordinary. Some have even theorized that hurricanes and earthquakes could be interrelated. In the wider context of weather history, this brief period of destructive activity may still be rather mild. Already in the last twenty years we’ve experienced a series of 50-, 100- and 500-year weather events that would suggest exactly what climate scientists have been saying, namely, that higher global average temperatures and more atmospheric moisture will lead to more activity in the category of superstorms. Throw drought, flood, and desertification into the mix. This (or worse, frankly) may have been the old normal when global average temperatures were several degrees warmer during periods of hothouse earth. All indications are that we’re leaving behind garden earth, the climate steady state (with a relatively narrow band of global temperature variance) enjoyed for roughly 12,000 years.

Our response to the latest line of hurricanes that struck the Gulf, Florida, and the Caribbean has been characterized as a little tepid considering we had the experience of Katrina from which to learn and prepare, but I’m not so sure. True, hurricanes can be seen hundreds of miles and days away, allowing folks the opportunity to either batten down the hatches or flee the area, but we have never handled mass exodus well (typically via automobile), and the sheer destructive force of the storms overwhelms most preparations and delays response. After Katrina, it appeared for several days that the federal government’s response was basically this: you’re on your own. That apparent response recurred, especially in Puerto Rico, which like New Orleans quickly devolved into a true humanitarian crisis (one not yet over). My more charitable assessment is that despite foreknowledge of the event and past experience with similar events, we can’t simply swoop in and smooth things out after the storms. Even the first steps of recovery take time.

I’ve cautioned that rebuilding on the same sites, with the reasonable expectation of repeat catastrophes in a destabilized climate that will spawn superstorms reducing entire cities to garbage heaps, is a poor option. No doubt we’ll do it anyway, at least partially; it’s already well underway in Houston. I’ve also cautioned that we need to brace for a diaspora as climate refugees abandon destroyed and inundated cities and regions. It’s already underway with respect to Puerto Rico. This is a storm of an entirely different sort (a flood, actually) and can also be seen from hundreds of miles and weeks, months, years away. And like superstorms, a diaspora from the coasts, because of the overwhelming force and humanitarian crisis it represents, is not something for which we can prepare adequately. Still, we know it’s coming, like a 20- or 50-year flood.

Here’s a familiar inspirational phrase from The Bible: the truth shall set you free (John 8:32). Indeed, most of us take it as, um, well, gospel that knowledge and understanding are unqualified goods. However, the information age has turned out to be a mixed blessing. Any clear-eyed view of the way the world works and its long, tawdry history carries with it an inevitable awareness of injustice, inequity, suffering, and at the extreme end, some truly horrific episodes of groups victimizing each other. Some of the earliest bits of recorded history, as distinguished from oral history, are financial — keeping count (or keeping accounts). Today differs not so much in character as in the variety of counts being kept and the sophistication of information gathering.

The Bureau of Labor Statistics, a part of the U.S. Department of Labor, is one information clearinghouse that slices and dices available data according to a variety of demographic characteristics. The fundamental truth behind such assessments, regardless of the politics involved, is that when comparisons are made between unlike groups, say, between men and women or young and old, one should expect to find differences and indeed be rather surprised if comparisons revealed none. So the question of gender equality in the workplace, or its implied inverse, gender inequality in the workplace, is a form of begging the question, meaning that if one seeks differences, one shall most certainly find them. But those differences are not prima facie evidence of injustice in the sense of the popular meme that women are disadvantaged or otherwise discriminated against in the workplace. Indeed, the raw data can be interpreted according to any number of agendas, thus the phrase “lying with statistics,” and most of us lack the sophistication to contextualize statistics properly, which is to say, free of the emotional bias that plagues modern politics, and more specifically, identity politics.

The fellow who probably ran up against this difficulty the worst is Charles Murray in the aftermath of publication of his book The Bell Curve (1994), which deals with how intelligence manifests differently across demographic groups yet functions as the primary predictor of social outcomes. Murray is particularly well qualified to interpret data and statistics dispassionately, and in true seek-and-find fashion, differences between groups did appear. It is unclear how much his resulting prescriptions for social programs are borne out of data vs. ideology, but most of us are completely at sea wading through the issues without specialized academic training to make sense of the evidence.

More recently, another fellow caught in the crosshairs on issues of difference is James Damore, who was fired from his job at Google after writing what is being called an anti-diversity manifesto (but might be better termed an internal memo) that was leaked and then went viral. The document can be found here. I have not dug deeply into the details, but my impression is that Damore attempted a fairly academic unpacking of the issue of gender differences in the workplace as they conflicted with institutional policy only to face a hard-set ideology that is more RightThink than truth. In Damore’s case, the truth did set him free — free from employment. Even the NY Times recognizes that the Thought Police sprang into action yet again to demand that its pet illusions about society be supported rather than dispelled. These witch hunts and shaming rituals (vigilante justice carried out in the court of public opinion) are occurring with remarkable regularity.

In a day and age where so much information (too much information, as it turns out) is available to us to guide our thinking, one might hope for careful, rational analysis and critical thinking. However, trends point to the reverse: a return to tribalism, xenophobia, scapegoating, and victimization. There is also a victimization Olympics at work, with identity groups vying for imaginary medals awarded to whoever’s got it worst. I’m no Pollyanna when it comes to the notion that all men are brothers and, shucks, can’t we all just get along? That’s not our nature. But the marked indifference of the natural world to our suffering as it besets us with drought, fire, floods, earthquakes, tsunamis, hurricanes, tornadoes, and the like (and this was just the last week!) might seem like the perfect opportunity to find within ourselves a little grace and recognize our common struggles in the world rather than add to them.

This is the inverse of a prior post called “Truth Based on Fiction.”

Telling stories about ourselves is one of the most basic of human attributes stretching across oral and recorded history. We continue today to memorialize events in short, compact tellings, frequently movies depicting real-life events. I caught two such films recently: Truth (about what came to be known as Rathergate) and Snowden (about whistle-blower Edward Snowden).

Although Dan Rather is the famous figure associated with Truth, the story focuses more on his producer Mary Mapes and the group decisions leading to airing of a controversial news report about George W. Bush’s time in the Air National Guard. The film is a dramatization, not a documentary, and so is free to present the story with its own perspective and some embellishment. Since I’m not a news junkie, my memory of the events in 2004 surrounding the controversy is not especially well informed, and I didn’t mind the potential for the movie’s version of events to color my thinking. About some controversies and conspiracies, I feel no particular demand to adopt a strong position. The actors did well enough, but I felt Robert Redford was poorly cast as Dan Rather. Redford is too famous in his own right to succeed as a character actor playing a real-life person.

Debate over the patriotism or treason of Edward Snowden’s actions continues to swirl, but the film covers the issues pretty well, from his discovery of an intelligence services surveillance dragnet (in violation of the 4th Amendment to the U.S. Constitution) to his eventual disclosure of same to a few well-respected journalists. The film’s director and joint screenwriter, Oliver Stone, has made a career out of fiction based on truth, dramatizing many signal events from the nation’s history, repackaging them as entertainment in the process. I’m wary of his interpretations of history when presented in cinematic form, less so his alternative history lessons given as documentary. Unlike Truth, however, I have clear ideas in my mind regarding Snowden the man and Snowden the movie, so from a different standpoint, was again unconcerned about potential bias. Joseph Gordon-Levitt does well enough as the titular character, though he doesn’t project nearly the same insight and keen intelligence as Snowden himself does. I suspect the documentary Citizenfour (which I’ve not yet seen), featuring Snowden doing his own talking, is a far better telling of the same episode of history.

In contrast, I have assiduously avoided several other recent films based on actual events. United 93, World Trade Center, and Deepwater Horizon spring to mind, but there are many others. The wounds and controversies stemming from those real-life events still smart too much for me to consider exposing myself to such propagandistic historical fictions. Perhaps in a few decades, after living memory of such events has faded or disappeared entirely, such stories can be told effectively, though probably not accurately. A useful comparison might be any one of several films called The Alamo.

Previous blogs on this topic are here and here.

Updates to the Bulletin of the Atomic Scientists resetting the metaphorical doomsday clock hands used to appear at intervals of 3–7 years. Updates have been issued in each of the last three years, though the clock hands remained in the same position from 2015 to 2016. Does that suggest raised geopolitical instability or merely resumed paranoia resulting from the instantaneous news cycle and radicalization of society and politics? The 2017 update resets the minute hand slightly forward to 2½ minutes to midnight:

For the last two years, the minute hand of the Doomsday Clock stayed set at three minutes before the hour, the closest it had been to midnight since the early 1980s. In its two most recent annual announcements on the Clock, the Science and Security Board warned: “The probability of global catastrophe is very high, and the actions needed to reduce the risks of disaster must be taken very soon.” In 2017, we find the danger to be even greater, the need for action more urgent. It is two and a half minutes to midnight, the Clock is ticking, global danger looms. Wise public officials should act immediately, guiding humanity away from the brink. If they do not, wise citizens must step forward and lead the way …

The principal concern of the Bulletin since its creation has been atomic/nuclear war. Recent updates include climate change in the mix. Perhaps it is not necessary to remind regular readers here, but the timescales for these two threats are quite different: global thermonuclear war (a term from the 1980s when superpowers last got weird and paranoid about things) could erupt almost immediately given the right provocation (or lunacy), such as the sabre-rattling now underway between the U.S. and North Korea, whereas climate change is an event typically unfolding across geological time. The millions of years it usually takes to manifest climate change fully and reach a new steady state (hothouse earth vs. ice age earth), however, appears to have been accelerated by human inputs (anthropogenic climate change, or as Guy McPherson calls it, abrupt climate change) to only a few centuries.

Nuclear arsenals around the world are the subject of a curious article at Visual Capitalist (including several reader-friendly infographics) by Nick Routley. The estimated number of weapons in the U.S. arsenal has risen since the last time I blogged about this in 2010. I still find it impossible to fathom why more than a dozen nukes are necessary, or in my more charitable moments toward the world’s inhabitants, why any of them are necessary. Most sober analysts believe we are far safer today than, say, the 1950s and early 1960s when brinkmanship was anybody’s game. I find this difficult to judge considering the two main actors today on the geopolitical stage are both witless, unpredictable, narcissistic maniacs. Moreover, the possibility of some ideologue (religious or otherwise) getting hold of WMDs (not necessarily nukes) and creating mayhem is increasing as the democratization of production filters immense power down to lower and lower elements of society. I for one don’t feel especially safe.

My previous entry on this topic is found here. The quintessential question asked with regard to education (often levied against educators) is “Why can’t Johnnie read?” I believe we now have several answers.

Why Bother With Basics?

A resurrected method of teaching readin’ and writin’ (from the 1930s as it happens) is “freewriting.” The idea is that students who experience writer’s block should dispense with basic elements such as spelling, punctuation, grammar, organization, and style to simply get something on the page, coming back later to revise and correct. I can appreciate the thinking, namely, that students so paralyzed from an inability to produce finished work extemporaneously should focus first on blasting something onto the page. Whether those who use freewriting actually go back to edit (as I do) is unclear, but the basic rudiments are not such a high hurdle to begin with.

Why Bother Learning Math?

At Michigan State University, the algebra requirement has been dropped from its general education requirements. Considering that algebra is a basic part of most high school curricula, jettisoning algebra from the university core curriculum is astonishing. Again, it’s not a terribly high bar to clear, but for someone granted a degree from an institution of higher learning to fail to do so is remarkable. Though the rationalization offered at the link above is fairly sophisticated, it sounds more like Michigan State is just giving up asking its students to bother learning. The California State University system has adopted a similar approach. Wayne State University also dropped its math requirement and upped the ante by recommending a new diversity requirement (all the buzz with social justice warriors).

Why Bother Learning Music?

The Harvard Crimson reports changes to the music curriculum, lowering required courses for the music concentration from 13 to 10. Notably, most of the quotes in the article are from students relieved to have fewer requirements to satisfy. The sole professor quoted makes a bland, meaningless statement about flexibility. So if you want a Harvard degree with a music concentration, the bar has been lowered. But this isn’t educational limbo, where the difficulty is increased as the bar goes down; it’s a change from higher education to not-so-high-anymore education. Not learning very much about music has never been an impediment to success, BTW. Lots of successful musicians don’t even read music.

Why Bother Learning History?

According to some conservatives, U.S. history curricula, in particular the Advanced Placement U.S. History course offered by The College Board, teach what’s bad about America and undermine American exceptionalism. In 2015, the Oklahoma House Common Education Committee voted 11-4 for emergency House Bill 1380 (authored by Rep. Dan Fisher) “prohibiting the expenditure of funds on the Advanced Placement United States History course.” This naked attempt to sanitize U.S. history and substitute preferred (patriotic) narratives is hardly a new phenomenon in education.


So why can’t Johnnie read, write, know, understand, or think? Simply put, because we’re not bothering to teach him to read, write, know, understand, or think. Johnnie has instead become a consumer of educational services and a political football. Has lowering standards ever been a solution to the struggle to get a worthwhile education? Passing students through just to be rid of them (while collecting tuition) has only produced a mass of miseducated graduates. Similarly, does a certificate, diploma, or granted degree mean anything as a marker of achievement if students can’t be bothered to learn time-honored elements of a core curriculum? The real shocker, of course, is massaging the curriculum itself (U.S. history in this instance) to produce citizens ignorant of their own past and compliant with the jingoism of the present.

Having been asked to contribute to a new group blog (Brains Unite), this is my first entry, which is cross-posted here at The Spiral Staircase. The subject of this post is the future of transportation. I’ve no expertise in this area, so treat this writing with the authority it deserves, which is to say, very little.

Any prediction of what awaits us must inevitably consider what has preceded us. Britain and the United States were both in the vanguard during the 19th and early 20th centuries when it came to innovation, and this is no less true of transportation than any other good or service. I’m not thinking of the routine travels one makes in the course of a day (e.g., work, church, school) but rather long excursions outside one’s normal range, a radius that has expanded considerably since then. (This holds true for freight transportation, too, but I am dropping that side of the issue in favor of passenger transit.) What is interesting is that travel tended to be communal, something we today call mass transit. For example, the Conestoga wagon, the stagecoach, the riverboat, and the rail car are each iconic of the 19th-century American West.

Passenger rail continued into the mid-20th century but was gradually replaced (in the U.S.) by individual conveyance as the automobile became economically available to the masses. Air travel commenced about the same time, having transitioned fairly quickly from 1 or 2 seats in an exposed cockpit to sealed fuselages capable of transporting 30+ people (now several hundred) at once. Still, as romantic as air travel may once have been (it’s lost considerable luster since deregulation as airlines now treat passengers more like freight), nothing beats the freedom and adventure of striking out on the road in one’s car to explore the continent, whether alone or accompanied by others.

The current character of transportation is a mixture of individual and mass transit, but without consulting numbers at the U.S. Dept. of Transportation, I daresay that the automobile is the primary means of travel for most Americans, especially those forced into cars by meager mass transit options. My prediction is that the future of transportation will be a gradual return to mass transit for two reasons: 1) the individual vehicle will become too costly to own and operate and 2) the sheer number of people moving from place to place will necessitate large transports.

While techno-utopians continue to conjure new, exotic (unfeasible) modes of transportation (e.g., the Hyperloop, which will purportedly enable passengers to make a 100-mile trip in about 12 minutes), they are typically aimed at individual transport and are extremely vulnerable to catastrophic failure (like the Space Shuttles) precisely because they must maintain human environments in difficult spaces (low orbit, underwater, inside depressurized tubes, etc.). They are also aimed at circumventing the congestion of conventional ground transportation (a victim of its own success now that highways in many cities resemble parking lots) and shortening transit times, though the extraordinary costs of such systems far exceed the benefit of time saved.

Furthermore, as climate change ramps up, we will witness a diaspora from regions inundated by rising waters, typically along the coasts where some 80% of human population resides (so I’ve read, can’t remember where). Mass migration out of MENA is already underway and will be joined by large population flows out of, for example, the Indian subcontinent, the Indonesian archipelago, and dustbowls that form in the interiors of continents. Accordingly, the future of transportation may well be the past:

photo: National Geographic


photo: UN Refugee Agency

I pull in my share of information about current events and geopolitics despite a practiced inattention to mainstream media and its noisome nonsense. (See here for another who turned off the MSM.) I read or heard somewhere (can’t remember where) that most news outlets and indeed most other media, to drive traffic, now function as outrage engines, generating no small amount of righteousness, indignation, anger, and frustration at all the things so egregiously wrong in our neighborhoods, communities, regions, and across the world. These are all negative emotions, though legitimate responses to various scourges plaguing us currently, many of which are self-inflicted. It’s enough aggregate awfulness to draw people into the street again in principled protest, dissent, and resistance; it’s not yet enough to effect change. Alan Jacobs comments about outrage engines, noting that sharing via retweets is not the same as caring. In the Age of Irony, a decontextualized “yo, check this out!” is nearly as likely to be interpreted as support rather than condemnation (or mere gawking for entertainment value). Moreover, pointing, linking, and retweeting are each costless versions of virtue signaling. True virtue makes no object of publicity.

So where do I get my outrage quotient satisfied? Here is a modest linkfest, in no particular order, of sites not already on my blogroll. I don’t habituate these sites daily, but I drop in, often skimming, enough to keep abreast of themes and events of importance.

So we’re back at it: bombing places halfway around the world for having the indignity of being at war and fighting it the wrong way. While a legitimate argument exists regarding a human rights violation requiring a response, that is not AFAIK the principal concern or interpretation of events. Rather, it’s about 45 being “presidential” for having ordered missile strikes. It must have been irresistible, with all the flashy metaphorical buttons demanding to be pushed at the first opportunity. I’m disappointed that his pacifist rhetoric prior to the election was merely oppositional, seeking only to score points against Obama. Although I haven’t absorbed a great deal of the media coverage, what I’ve seen squarely refuses to let a crisis go to waste. Indeed, as geopolitics and military escapades go, we’re like moths to the flame. The most reprehensible media response was MSNBC anchor Brian Williams waxing rhapsodic about the beauty of the missiles as they lit up the air. How many screw-ups does this guy get?

Lessons learned during the 20th century that warfare is not just a messy, unfortunate affair but downright ugly, destructive, pointless, and self-defeating are unjustifiably forgotten. I guess it can’t be helped: it’s nympho-warmaking. We can’t stop ourselves; gotta have it. Consequences be damned. How many screw-ups do we get?

At least Keith Olbermann, the current king of righteous media indignation, had the good sense to put things in their proper context and condemn our actions (as I do). He also characterized the military strike as a stunt, which calls into question whether the provocation was a false flag operation. That’s what Putin is reported as saying. Personally, I cannot take a position on the matter, being at the mercy of the media and unable to gather any first-hand information. Doubts and disillusionment over what’s transpired and the endless spin cycle plague me. There will never be closure.

First, a few reminders:

  • The United States has been in an undeclared state of war for 15 years, the longest in U.S. history and long enough that young people today can say legitimately, “we’ve always been at war with Oceania.” The wars encompass the entirety of both terms of the Obama Administration.
  • The inciting events were attacks on U.S. soil carried out on September 11, 2001 (popularly, 9/11), which remain shrouded in controversy and conspiracy despite the official narrative assigning patsy blame to al-Qaida operating in Afghanistan and Iraq.
  • On the heels of the attacks, the Bush Administration commenced a propaganda campaign to sell invasion and regime change in those two countries and, over widespread public protest, went ahead and launched preemptive wars, ostensibly because an existential threat existed with respect to weapons of mass destruction (WMDs) possessed by Iraq in particular.
  • The propaganda campaign has since been revealed to have been cooked up and untrue, yet it buffaloed a lot of people into believing (even to this day) that Iraq was somehow responsible for 9/11.
  • Our preemptive wars succeeded quickly in toppling governments and capturing (and executing) their leaders but immediately got bogged down securing a peace that never came.
  • Even with an embarrassing mismatch of force, periodic troop surges and draw downs, trillions of dollars spent prosecuting the wars, and incredible, pointless loss of life (especially on the opposing sides), our objective in the Middle East (other than the oil, stupid!) has never been clear. The prospect of final withdrawal is nowhere on the horizon.

Continuous war — declared or merely waged — has been true of the U.S. my whole life, though one would be hard pressed to argue that it truly represents an immediate threat to U.S. citizens except to those unlucky enough to be deployed in war zones. Still, the monkey-on-the-back is passed from administration to administration. One might hope, based on campaign rhetoric, that the new executive (45) might recognize continuous war as the hot potato it is and dispense with it, but the proposed federal budget, with its $52 billion increase in military spending (+10% over 2016), suggests otherwise. Meanwhile, attention has been turned away from true existential threats that have been bandied about in the public sphere for at least a decade: global warming and climate change leading to Near-Term Extinction (NTE). Proximal threats, largely imagined, have absorbed all our available attention, and depending on whom one polls, our worst fears have already been realized.

The 20th and 21st centuries (so far) have been a series of “hot” wars (as distinguished from the so-called Cold War). Indeed, there has scarcely been a time when the U.S. has not been actively engaged fighting phantoms. If the Cold War was a bloodless, ideological war to stem the nonexistent spread of communism, we have adopted and coopted the language of wartime to launch various rhetorical wars. First was LBJ’s War on Poverty, the only “war” aimed at truly helping people. Nixon got into the act with his War on Drugs, which was punitive. Reagan expanded the War on Drugs, which became the War on Crime. Clinton increased the punitive character of the War on Crime by instituting mandatory minimum sentencing, which had the side effect of establishing what some call the prison-industrial complex, inflating the incarceration rate of Americans to the point that the U.S. is now ranked second in the world behind the Seychelles (!), a ranking far, far higher than any other industrialized nation.

As if U.S. authoritarians hadn’t already found enough people to punish, or done enough to convince the public that threats exist on all sides, requiring constant vigilance and a massive security apparatus including military, civil police, and intelligence services comprised of 16 separate agencies (of which we know), Bush coined and declared the War on Terror, aimed at punishing those foreign and domestic who dare challenge U.S. hegemony in all things. It’s not called a national security state for nuthin’, folks. I aver that the rhetorical War on Poverty has inverted and now become a War on the Poverty-Stricken. De facto debtors’ prisons have reappeared, predatory lending has become commonplace, and income inequality grows more exaggerated with every passing year, leaving behind large segments of the U.S. population as income and wealth pool in an ever-shrinking number of hands. Admittedly, the trend is global.

At some point, perhaps in the 1960s when The Establishment (or more simply, The Man) became a thing to oppose, the actual Establishment must have decided it was high time to circle the wagons and protect its privileges, essentially going to war with (against, really) the people. Now five decades on, holders of wealth and power demonstrate disdain for those outside their tiny circle, and the government can no longer be said with a straight face to be of, by, and for the people (paraphrasing the last line of Lincoln’s Gettysburg Address). Rather, the government has been hijacked and turned into something abominable. Yet the people are strangely complicit, having allowed history to creep along with social justice in marked retreat. True threats do indeed exist, though not the ones that receive the lion’s share of attention. I surmise that, as with geopolitics, the U.S. government has brought into being an enemy and conflict that bodes not well for its legitimacy. Which collapse occurs first is anyone’s guess.