Posts Tagged ‘Recent History’

From time to time, I admit that I’m in no position to referee disputes, usually out of my lack of technical expertise in the hard sciences. I also avoid the impossible task of policing the Internet by assiduously pointing out error wherever it occurs. Others concern themselves with correcting the record and/or reinterpreting argument with improved context and accuracy. However, once in a while, something crosses my desk that gets under my skin. An article by James Ostrowski entitled “What America Has Done To its Young People is Appalling,” published at LewRockwell.com, is such a case. It’s undoubtedly a coincidence that the most famous Rockwell is arguably Norman Rockwell, whose celebrated illustrations for the Saturday Evening Post in particular helped reinforce a charming midcentury American mythology. Lew Rockwell, OTOH, is described briefly at the website’s About blurb:

The daily news and opinion site LewRockwell.com was founded in 1999 by anarcho-capitalists Lew Rockwell … and Burt Blumert to help carry on the anti-war, anti-state, pro-market work of Murray N. Rothbard.

Those political buzzwords probably deserve some unpacking. However, that project falls outside my scope. In short, they handily foist blame for what ails us in American culture on government planning, as distinguished from the comparative freedom of libertarianism. Government earns its share of blame, no doubt, especially with its enthusiastic prosecution of war (now a forever war); but as snapshots of competing political philosophies, these buzzwords are reductive almost to the point of meaninglessness. Ostrowski lays blame more specifically on feminism and progressive big government and harkens back to an idyllic 1950s nuclear family fully consonant with Norman Rockwell’s illustrations, thus invoking the nostalgic frame.

… the idyllic norm of the 1950’s, where the mother typically stayed home to take care of the kids until they reached school age and perhaps even long afterwards, has been destroyed.  These days, in the typical American family, both parents work fulltime which means that a very large percentage of children are consigned to daycare … in the critical first five years of life, the vast majority of Americans are deprived of the obvious benefits of growing up in an intact family with the mother at home in the pre-school years. We baby boomers took this for granted. That world is gone with the wind. Why? Two main reasons: feminism and progressive big government. Feminism encouraged women to get out of the home and out from under the alleged control of husbands who allegedly controlled the family finances.

Problem is, 1950s social configurations in the U.S. were the product of a convergence of historical forces, not least of which were the end of WWII and newfound American geopolitical and economic prominence. More pointedly, an entire generation of young men and women who had deferred family life during perilous wartime were then able to marry, start families, and provide for them on a single income — typically that of the husband/father. That was the baby boom. Yet to enjoy the benefits of the era fully, one probably needed to be a WASPy middle-class male or the child of one. Women and people of color fared … differently. After all, the 1950s yielded to the sexual revolution and civil rights era one decade later, both of which aimed specifically to improve the lived experience of, well, women and people of color.

Since the 1950s were only roughly 60 years ago, it might be instructive to consider how life was another 60 years before then, or in the 1890s. If one lived in an eastern American city, life was often a Dickensian dystopia, complete with child labor, poorhouses, orphanages, asylums, and unhygienic conditions. If one lived in an agrarian setting, which was far more prevalent before the great 20th-century migration to cities, then life was frequently dirt-poor subsistence and/or pioneer homesteading requiring dawn-to-dusk labor. Neither mode yet enjoyed social planning and progressive support including, for example, sewers and other modern infrastructure, public education, and economic protections such as unionism and trust busting. Thus, 19th-century America might be characterized fairly as being closer to anarcho-capitalism than at any time since. One of its principal legacies, one must be reminded, was pretty brutal exploitation of (and violence against) labor, as evidenced by the emergence of political parties that sought to redress its worst scourges. Hindsight informs us now that reforms were slow, partial, and impermanent, leading to the observation that among all tried forms of self-governance, democratic capitalism can be characterized as perhaps the least awful.

So yeah, the U.S. came a long way from 1890 to 1950, especially in terms of standard of living, but may well be backsliding as the 21st-century middle class is hollowed out (a typical income — now termed household income — being rather challenging for a family), aspirations to rise economically above one’s parents’ level no longer function, and the culture disintegrates into tribal resentments and unrealistic fantasies about nearly everything. Ostrowski marshals a variety of demographic facts and figures to support his argument (with which I agree in large measure), but he fails to make a satisfactory causal connection with feminism and progressivism. Instead, he sounds like 45 selling his slogan Make America Great Again (MAGA), meaning let’s turn back the clock to those nostalgic 1950s happy days. Interpretations of that sentiment run in all directions from innocent to virulent (but coded). By placing blame on feminism and progressivism, it’s not difficult to hear anyone citing those putative causes as an accusation that, if only those feminists and progressives (and others) had stayed in their assigned lanes, we wouldn’t be dealing now with cultural crises that threaten to undo us. What Ostrowski fails to acknowledge is that despite all sorts of government activity over the decades, no one in the U.S. is steering the culture nearly as actively as in centrally planned economies and cultures, current and historical, which in their worst instances are fascist and/or totalitarian. One point I’ll agree on, however, just to be charitable, is that the mess we’ve made and will leave to youngsters is truly appalling.


Not a person alive having reached even a modest level of maturity hasn’t looked back at some choice or attitude of his or her past and wondered “What on earth was I thinking?” Maybe it was some physical stunt resulting in a fall or broken bone (or worse), or maybe it was an intolerant attitude later softened by empathy and understanding when the relevant issue became personal. We’ve all got something. Some of us, many somethings. As a kid, my cohorts and I used to play in leaves raked into piles in the autumn. A pile of leaves isn’t a trampoline and doesn’t really provide cushion, but as kids, it didn’t matter for the purpose of play. At one point, the kid next door dared me to jump from the roof of his front porch into a pile of leaves. The height was probably 15 feet. I remember climbing out and peering over the gutters, wavering a bit before going back inside. I didn’t jump. What was I thinking? It would have been folly to take that dare.

Some youthful indiscretion is to be expected and can be excused as teaching moments, but in truth, most of us don’t have to go far back in time to wonder “what in hell was I thinking?” Maybe it was last week, last month, or a few years ago. The interval matters less than the honest admission that, at any point one might believe he or she has things figured out and can avoid traps that look clear only in hindsight, something will come up and remind that, despite being seasoned by experience, one still misjudges and makes egregious mistakes.


Two shocking and vaguely humorous (dark, sardonic humor) events occurred recently in the gun debate: (1) in a speech, Marco Rubio sarcastically offered the very reform a healthy majority of the public wants — banning assault weapons — and revealed himself to be completely tin-eared with respect to the public he addresses, and (2) 45 supported some gun controls and even raised the stakes, saying that guns should be taken from people flagged as unstable and dangerous before they commit their mayhem. Rubio had already demonstrated his inability to think on his feet, being locked into scripts handed to him by … whom exactly? Certainly not the public he purportedly serves. So much for his presidential aspirations. OTOH, 45 channels populism and can switch positions quickly. Though ugly and base in many cases, populism at least expresses the will of the people, such as it can be known. His departure from reflexive Republican defense of the hallowed 2nd Amendment shouldn’t be too great a surprise; he’s made similar remarks in the past. His willingness to discard due process and confiscate guns before a crime has been committed sounds more than a little like Spielbergian precrime (via Orwell and Philip K. Dick). To even entertain this prospect in the gun debate demonstrates just how intolerable weekly mass shootings — especially school shootings by troubled youth — have become in the land of the free and home of the brave. On balance, 45 also recommended arming classroom teachers (a risible solution to the problem), so go figger.

Lodged deep in my brain is a potent archetype I don’t often see cited: the Amfortas wound. The term comes from Richard Wagner’s music drama Parsifal (synopsis found here). Let me describe the principal elements (very) briefly. Amfortas is the king of the Knights of the Holy Grail and has a seeping wound that cannot be healed except, according to prophecy, by an innocent youth, also described as a fool made wise by compassion. Such a youth, Parsifal, appears and after familiar operatic conflict does indeed fulfill the prophecy. Parsifal is essentially a retelling of the Arthurian legend. The music is some of the most transcendentally beautiful orchestral composition ever committed to paper and is very much recommended. Admittedly, it’s rather slow for today’s audiences more inclined to throwaway pop music.

Anyway, to tie together the gun debate and Parsifal, I muse that the Amfortas wound is gun violence and 45 is the titular fool who in the end heals the wound and becomes king of the Knights of the Holy Grail. The characterization is not entirely apt, of course, because it’s impossible to say that 45 is young, or compassionate, or wise, but he has oddly enough moved the needle on the gun debate. Not single-handedly, mind you, but from a seat of considerable power unlike, say, the Parkland survivors. Resolution and healing have yet to occur and will no doubt be opposed by the NRA and Marco Rubio. Maybe we’re only in Act I of the traditional 3-act structure. Other characters and plot devices from Parsifal I leave uncast. The main archetype is the Amfortas wound.

Fully a decade ago, I analyzed with more length than I usually allow myself an article from The New Yorker that examined how media trends were pushing away from literacy (the typographic mind) toward listening and viewing (orality) as primary modes of information gathering and entertainment. The trend was already underway with the advent of radio, cinema, and television, which moved the relatively private experience of silent reading to a public or communal realm as people shared experiences around emerging media. The article took particular aim at TV. In the intervening decade, media continue to contrive new paths of distribution, moving activity back to private information environments via the smart phone and earbuds. The rise of the webcast (still called podcast by some, though that’s an anachronism), which may include a video feed or display a static image over discussion and/or lecture, and streaming services are good examples. Neither has fully displaced traditional media just yet, but the ongoing shift in financial models is a definite harbinger of relentless change.

This comes up again because, interestingly, The New Yorker included with an article I popped open on the Web an audio file of the very same article read by someone not the author. The audio was 40 minutes, whereas the article may have taken me 15 to 20 minutes had I read it. For undisclosed reasons, I listened to the audio. Not at all surprisingly, I found it odd and troublesome. Firstly, though the content was nominally investigative journalism (buttressed by commentary), hearing it read to me made it feel like, well, storytime, meaning it was fiction. Secondly, since my eyes weren’t occupied with reading, they sought other things to do and thus fragmented my attention.

No doubt The New Yorker is pandering to folks who would probably not be readers but might well become listeners. In doing so, it’s essentially conceding the fight, admitting that the effort to read is easily eclipsed by the effortlessness of listening. As alternative and unequal modes of transmitting the content of the article, however, it strikes me as an initiative hatched not by writers and editors capable of critical thought and addressing a similarly enabled readership but by a combination of sales and marketing personnel attempting to capture a widening demographic of listeners (read: nonreaders). Navigating to the article might be a modest extra complication, but if a link to the audio file can be tweeted out (I don’t actually know if that’s possible), then I guess the text isn’t truly necessary.

Here’s part of what I wrote a decade ago:

If the waning of the typographic mind proceeds, I anticipate that the abstract reasoning and critical thinking skills that are the legacy of Enlightenment Man will be lost except to a few initiates who protect the flame. And with so many other threats cropping up before us, the prospect of a roiling mass of all-but-in-name barbarians ruled by a narrow class of oligarchs does indeed spell the total loss of democracy.

Are we getting perilously close to this dystopia? Maybe not, since it appears that many of those in high office and leadership positions labor under their own failures/inabilities to read at all critically and so execute their responsibilities with about the same credibility as hearsay. Even The New Yorker is no longer protecting the flame.

I recall Nathaniel Hawthorne’s short story The Celestial Railroad railing against the steam engine, an infernal machine, that disrupts society (agrarian at that time). It’s a metaphor for industrialization. The newest infernal machine (many candidates have appeared since Hawthorne’s time only to be supplanted by the next) is undoubtedly the smart phone. Its disruption of healthy formation of identity among teenagers has already been well researched and documented. Is it ironic that as an object of our own creation, it’s coming after our minds?

Speaking of Davos (see previous post), Yuval Noah Harari gave a high-concept presentation at Davos 2018 (embedded below). I’ve been aware of Harari for a while now — at least since the appearance of his book Sapiens (2015) and its follow-up Homo Deus (2017), both of which I’ve yet to read. He provides precisely the sort of thoughtful, provocative content that interests me, yet I’ve not quite known how to respond to him or his ideas. First thing, he’s a historian who makes predictions, or at least extrapolates possible futures based on historical trends. Near as I can tell, he doesn’t resort to chastising audiences along the lines of “those who don’t know history are doomed to repeat it” but rather indulges in a combination of breathless anticipation and fear-mongering at transformations to be expected as technological advances disrupt human society with ever greater impacts. Strangely, Harari is not advocating for anything in particular but trying to map the future.

Harari poses this basic question: “Will the future be human?” I’d say probably not; I’ve concluded that we are busy destroying ourselves and have already crossed the point of no return. Harari apparently believes differently, that the rise of the machine is coming, perhaps within a couple of centuries, though it probably won’t resemble Skynet of The Terminator film franchise hellbent on destroying humanity. Rather, it will be some set of advanced algorithms monitoring and channeling human behaviors using Big Data. Or it will be a human-machine hybrid possessing superhuman abilities (physical and cognitive) different enough to be considered a new species arising for the first time not out of evolutionary processes but from human ingenuity. He expects this new species to diverge from homo sapiens sapiens and leave us in the evolutionary dust. There is also conjecture that normal sexual reproduction will be supplanted by artificial, asexual reproduction, probably carried out in test tubes using, for example, CRISPR modification of the genome. Well, no fun in that … Finally, he believes some sort of strong AI will appear.

I struggle mightily with these predictions for two primary reasons: (1) we almost certainly lack enough time for technology to mature into implementation before the collapse of industrial civilization wipes us out, and (2) the Transhumanist future he anticipates calls into being (for me at least) a host of dystopian nightmares, only some of which are foreseeable. Harari says flatly at one point that the past is not coming back. Well, it’s entirely possible for civilization to fail and our former material conditions to be reinstated, only worse since we’ve damaged the biosphere so gravely. Just happened in Puerto Rico in microcosm when its infrastructure was wrecked by a hurricane and the power went out for an extended period of time (still off in some places). What happens when the rescue never appears because logistics are insurmountable? Elon Musk can’t save everyone.

The most basic criticism of economics is the failure to account for externalities. The same criticism applies to futurists. Extending trends as though all things will continue to operate normally is bizarrely idiotic. Major discontinuities appear throughout history. When I observed some while back that history has gone vertical, I included an animation with a graph that goes from horizontal to vertical in an extremely short span of geological time. This trajectory (the familiar hockey stick pointing skyward) has been repeated ad nauseam with an extraordinary number of survival pressures (notably, human population and consumption, including energy) over various time scales. Trends cannot simply continue ascending forever. (Hasn’t Moore’s Law already begun to slope away?) Hard limits must eventually be reached, but since there are no useful precedents for our current civilization, it’s impossible to know quite when or where ceilings loom. What happens after upper limits are found is also completely unknown. Ugo Bardi has a blog describing the Seneca Effect, which projects a rapid falloff after the peak that looks more like a cliff than a gradual, graceful descent, disallowing time to adapt. Sorta like the stock market currently imploding.

Since Harari indulges in rank thought experiments regarding smart algorithms, machine learning, and the supposed emergence of inorganic life in the data stream, I thought I’d pose some of my own questions. Waving away for the moment distinctions between forms of AI, let’s assume that some sort of strong AI does in fact appear. Why on earth would it bother to communicate with us? And if it reproduces and evolves at breakneck speed as some futurists warn, how long before it/they simply ignore us as being unworthy of attention? Being hyper-rational and able to calculate millions of moves ahead (like chess-playing computers), what if they survey the scene and come to David Benatar’s anti-natalist conclusion that it would be better not to have lived and so wink themselves out of existence? Who’s to say that they aren’t already among us, lurking, and we don’t even recognize them (took us quite a long time to recognize bacteria and viruses, and what about undiscovered species)? What if the Singularity has already occurred thousands of times and each time the machine beings killed themselves off without our even knowing? Maybe Harari explores some of these questions in Homo Deus, but I rather doubt it.

Be forewarned: this is long and self-indulgent. Kinda threw everything and the kitchen sink at it.

In the August 2017 issue of Harper’s Magazine, Walter Kirn’s “Easy Chair” column called “Apocalypse Always” revealed his brief, boyhood fascination with dystopian fiction. This genre has been around for a very long time, to which the Cassandra myth attests. Kirn’s column is more concerned with “high mid-twentieth-century dystopian fiction,” which in his view is now classic and canonical, an entire generation of Baby Boomers having been educated in such patterned thought. A new wave of dystopian fiction appeared in the 1990s and yet another more recently in the form of Young Adult novels (and films) that arguably serve better as triumphal coming-of-age stories albeit under dystopian circumstances. Kirn observes a perennial theme present in the genre: the twin disappearances of freedom and information:

In the classic dystopias, which concern themselves with the lack of freedom and not with surplus freedom run amok (the current and unforeseen predicament of many), society is superbly well organized, resembling a kind of hive or factory. People are sorted, classified, and ranked, their individuality suppressed through goon squads, potent narcotics, or breeding programs. Quite often, they wear uniforms, and express themselves, or fail to, in ritual utterance and gestures.

Whether Americans in 2018 resemble hollowed-out zombies suffering under either boot-heel or soft-serve oppression is a good question. Some would argue just that in homage to classic dystopias. Kirn suggests briefly that we might instead suffer from runaway anarchy, where too much freedom and licentiousness have led instead to a chaotic and disorganized society populated by citizens who can neither govern nor restrain themselves.

Disappearance of information might be understood in at least three familiar aspects of narrative framing: what happened to get us to this point (past as exposition, sometimes only hinted at), what the hell is going on (present as conflict and action), and how it gets fixed (future as resolution and denouement). Strict control over information exercised by classic dystopian despots doesn’t track to conditions under which we now find ourselves, where more disorganized, fraudulent, and degraded information than ever is available alongside small caches of wisdom and understanding buried somewhere in the heap and discoverable only with the benefit of critical thinking flatly lost on at least a couple generations of miseducated graduates. However, a coherent narrative of who and what we are and what realistic prospects the future may hold has not emerged since the stifling version of the 1950s nuclear family and middle class consumer contentment. Kirn makes this comparison directly, where classic dystopian fiction

focus[es] on bureaucracy, coercion, propaganda, and depersonalization, overstates both the prowess of the hierarchs and the submissiveness of the masses, whom it still thinks of as the masses. It does not contemplate Trump-style charlatanism at the top, or a narcissistic populace that prizes attention over privacy. The threats to individualism are paramount; the scourge of surplus individualism, with everyone playing his own dunce king and slurping up resources until he bursts, goes unexplored.

Kirn’s further observations are worth a look. Go read for yourself.


The witch hunt aimed at sexual predators continues to amaze as it crashes the lives of more and more people. I knew once the floodgates were opened that many of the high and mighty would be brought low. It was probably overdue, but no one can be truly surprised by the goings on giving rise to this purge. Interestingly, the media have gone into the archives and found ample evidence of jokes, hush money, accusations, and lawsuits to demonstrate that this particular open secret was a well-known pattern. Some have offered the simplest of explanations: power corrupts (another open secret). No one really wants to hear that time-honored truth or admit that they, too, are entirely corruptible.

One of the accused has openly admitted that the accusations against him are true, which is almost a breath of fresh air amid all the denials and obfuscations but for the subject matter of the admission. And because it’s a witch hunt, those accused are vulnerable to the mob demanding immediate public shaming and then piling on. No investigation or legal proceeding is necessary (though that may be coming, too). The court of public opinion effects immediate destruction of life and livelihood. Frankly, it’s hard to be sympathetic toward the accused, but I cling to noble sentiment when it comes to application of the law. We should tread lightly to avoid the smears of false accusation and not be swept into moral panic.

Ran Prieur weighed in with this paragraph (no link to his blog, sorry; it’s quite searchable until it gets pushed down and off the page):

I like Louis CK’s apology because he understands that the core issue is power … We imagine these people are bad because they crossed the line between consent and coercion. But when almost the entire world is under authoritarian culture, where it’s normal for some people to tell other people what to do, where it’s normal for us to do what we’re told even if we don’t feel like it, then the line between consent and coercion is crossed so often that it basically doesn’t exist.

Once a culture has crossed the line into normalization of hierarchy, it’s a constant temptation to cross the next line, between using a position of power for the good of the whole, and using it selfishly. And once that line has been crossed, it’s tempting for selfish use of power to veer into sex acts.

I like to think, in a few thousand years, human culture will be so much improved that one person having any power over another will be a scandal.

It’s a slightly fuller explanation of the power dynamic, just as Louis CK offered his own explanation. The big difference is that no one wants to hear it from an admitted sexual predator. Thus, Louis CK is over. Similarly, no one can watch The Cosby Show in innocence anymore. Remains to be seen if any of the fallen will ever rise to career prominence again. Yet Prieur’s final statement confounds me completely. He gets the power dynamic but then plainly doesn’t get it at all. Power and authority are not optional in human society. Except for a few rare, isolated instances of radical egalitarianism, they are entirely consistent with human nature. While we might struggle to diminish the more awful manifestations, so long as there are societies, there will be power imbalances and the exploitation and predation (sexual and otherwise) that have been with us since our prehistory.

Remember: we’re mammals, meaning we compete with each other for sexual access. Moreover, we can be triggered easily enough, not unlike dogs responding when a bitch goes into heat. Sure, humans have executive mental function that allows us to overcome animal impulses some of the time, but that’s not a reliable antidote to sexual misconduct ranging from clumsy come-ons to forcible rape. This is not to excuse anyone who acts up. Rather, it’s a reminder that we all have to figure out how to maneuver in the world effectively, which frankly includes protecting ourselves from predators. The young, sexually naïve, and powerless will always be prime targets. Maybe we’re not quite barbarians anymore, raping and pillaging with wanton disregard for our victims, but neither are we far removed from that characterization, as recent accounts demonstrate.

I’m a little gobsmacked that, in the aftermath of someone finally calling out the open secret of the Hollywood casting couch (don’t know, don’t care how this news cycle started) and netting Harvey Weinstein in the process, so many well-known actors have added their “Me, too!” to the growing scandal. Where were all these sheep before now? As with Bill Cosby and Bill Clinton, what good does it do to allow a serial abuser to continue unchallenged until years, decades later a critical mass finally boils over? I have no special knowledge or expertise in this area, so what follows is the equivalent of a thought experiment.

Though the outlines of the power imbalance between a Hollywood executive and an actor seeking a role (or other industry worker seeking employment) are pretty clear, creating a rich opportunity for the possessor of such power to act like a creep or a criminal, the specific details are still a little shrouded — at least in my limited consumption of the scandal press. How much of Weinstein’s behavior veers over the line from poor taste to criminality is a difficult question precisely because lots of pictorial evidence exists showing relatively powerless people playing along. It’s a very old dynamic, and its quasi-transactional nature should be obvious.

In my idealized, principled view, if one has been transgressed, the proper response is not to slink away or hold one’s tongue until enough others are similarly transgressed to spring into action. The powerless are duty bound to assert their own power — the truth — much like a whistleblower feels compelled to disclose corruptions of government and corporate sectors. Admittedly, that’s likely to compound the initial transgression and come at some personal cost, great or small. But for some of us (a small percentage, I reckon), living with ourselves in silent assent presents an even worse option. By way of analogy, if one were molested by a sketchy uncle and said nothing, I can understand just wanting to move on. But if one said nothing yet knew the sketchy uncle had more kids lined up in the extended family to transgress, then stepping up to protect the younger and weaker would be an absolute must.

In the past few decades, clergy of the Catholic Church sexually abused many young people and deployed an institutional conspiracy to hide the behaviors and protect the transgressors. Exposure should have broken trust bonds between the church and the faithful and invalidated the institution as an abject failure. Didn’t quite work out that way. Similar scandals and corruption across a huge swath of institutions (e.g., corporate, governmental, military, educational, entertainment, and sports entities) have been appearing in public view regularly, yet as a culture, we tolerate more creeps and criminals than we shame or prosecute. (TomDispatch.com is one of the sites that regularly reports these corruptions with respect to American empire; I can scarcely bear to read it sometimes.) I suspect part of that is a legitimate desire for continuity, to avoid burning down the house with everyone in it. That places just about everyone squarely within the “Me, too!” collective. Maybe I shouldn’t be so gobsmacked after all.

Caveat: This thought experiment definitely comes from a male perspective. I recognize that females view these issues quite differently, typically in consideration of far greater vulnerability than males experience (excepting the young boys in the Catholic Church example).

The storms referenced in the earlier version of this post were civilization-ending cataclysms. The succession of North American hurricanes and earthquakes earlier this month of September 2017 were natural disasters. I would say that September was unprecedented in history, but reliable weather records do not extend very far back in human history and the geological record extending back into human prehistory would suggest that, except perhaps for their concentration within the span of a month, the latest storms are nothing out of the ordinary. Some have even theorized that hurricanes and earthquakes could be interrelated. In the wider context of weather history, this brief period of destructive activity may still be rather mild. Already in the last twenty years we’ve experienced a series of 50-, 100- and 500-year weather events that would suggest exactly what climate scientists have been saying, namely, that higher global average temperatures and more atmospheric moisture will lead to more activity in the category of superstorms. Throw drought, flood, and desertification into the mix. This (or worse, frankly) may have been the old normal when global average temperatures were several degrees warmer during periods of hothouse earth. All indications are that we’re leaving behind garden earth, the climate steady state (with a relatively narrow band of global temperature variance) enjoyed for roughly 12,000 years.

Our response to the latest line of hurricanes that struck the Gulf, Florida, and the Caribbean has been characterized as a little tepid considering we had the experience of Katrina from which to learn and prepare, but I’m not so sure. True, hurricanes can be seen hundreds of miles and days away, allowing folks opportunity to either batten down the hatches or flee the area, but we have never been able to handle mass exodus, typically via automobile, and the sheer destructive force of the storms overwhelms most preparations and delays response. After Katrina, it appeared for several days that the federal government’s response was basically this: you’re on your own. That apparent response recurred especially in Puerto Rico, which like New Orleans quickly devolved into a true humanitarian crisis (one not yet over). A more charitable assessment on my part is that, despite foreknowledge of the event and past experience with similar events, we can’t simply swoop in and smooth things out after the storms. Even the first steps of recovery take time.

I’ve cautioned that rebuilding on the same sites, with the reasonable expectation of repeat catastrophes in a destabilized climate that will spawn superstorms reducing entire cities to garbage heaps, is a poor option. No doubt we’ll do it anyway, at least partially; it’s already well underway in Houston. I’ve also cautioned that we need to brace for a diaspora as climate refugees abandon destroyed and inundated cities and regions. It’s already underway with respect to Puerto Rico. This is a storm of an entirely different sort (a flood, actually) and can also be seen from hundreds of miles and weeks, months, years away. And like superstorms, a diaspora from the coasts, because of the overwhelming force and humanitarian crisis it represents, is not something for which we can prepare adequately. Still, we know it’s coming, like a 20- or 50-year flood.

Here’s a familiar inspirational phrase from the Bible: the truth shall set you free (John 8:32). Indeed, most of us take it as, um, well, gospel that knowledge and understanding are unqualified goods. However, the information age has turned out to be a mixed blessing. Any clear-eyed view of the way the world works and its long, tawdry history carries with it an inevitable awareness of injustice, inequity, suffering, and at the extreme end, some truly horrific episodes of groups victimizing each other. Some of the earliest bits of recorded history, as distinguished from oral history, are financial — keeping count (or keeping accounts). Today differs not so much in character as in the variety of counts being kept and the sophistication of information gathering.

The Bureau of Labor Statistics, a part of the U.S. Department of Labor, is one information clearinghouse that slices and dices available data according to a variety of demographic characteristics. The fundamental truth behind such assessments, regardless of the politics involved, is that when comparisons are made between unlike groups, say, between men and women or young and old, one should expect to find differences and indeed be rather surprised if comparisons revealed none. So the question of gender equality in the workplace, or its implied inverse, gender inequality in the workplace, is a form of begging the question, meaning that if one seeks differences, one shall most certainly find them. But those differences are not prima facie evidence of injustice in the sense of the popular meme that women are disadvantaged or otherwise discriminated against in the workplace. Indeed, the raw data can be interpreted according to any number of agendas, thus the phrase “lying with statistics,” and most of us lack the sophistication to contextualize statistics properly, which is to say, free of the emotional bias that plagues modern politics, and more specifically, identity politics.

The fellow who probably ran up hardest against this difficulty is Charles Murray in the aftermath of publication of his book The Bell Curve (1994), which deals with how intelligence manifests differently across demographic groups yet functions as the primary predictor of social outcomes. Murray is particularly well qualified to interpret data and statistics dispassionately, and in true seek-and-find fashion, differences between groups did appear. It is unclear how much his resulting prescriptions for social programs are born of data vs. ideology, but most of us are completely at sea wading through the issues without specialized academic training to make sense of the evidence.

More recently, another fellow caught in the crosshairs on issues of difference is James Damore, who was fired from his job at Google after writing what is being called an anti-diversity manifesto (but might be better termed an internal memo) that was leaked and then went viral. The document can be found here. I have not dug deeply into the details, but my impression is that Damore attempted a fairly academic unpacking of the issue of gender differences in the workplace as they conflicted with institutional policy, only to run up against a hard-set ideology that is more RightThink than truth. In Damore’s case, the truth did set him free — free from employment. Even the NY Times recognizes that the Thought Police sprang into action yet again to demand that its pet illusions about society be supported rather than dispelled. These witch hunts and shaming rituals (vigilante justice carried out in the court of public opinion) are occurring with remarkable regularity.

In a day and age where so much information (too much information, as it turns out) is available to us to guide our thinking, one might hope for careful, rational analysis and critical thinking. However, trends point to the reverse: a return to tribalism, xenophobia, scapegoating, and victimization. There is also a victimization Olympics at work, with identity groups vying for imaginary medals awarded to whoever’s got it worst. I’m no Pollyanna when it comes to the notion that all men are brothers and, shucks, can’t we all just get along? That’s not our nature. But the marked indifference of the natural world to our suffering as it besets us with drought, fire, floods, earthquakes, tsunamis, hurricanes, tornadoes, and the like (and this was just the last week!) might seem like the perfect opportunity to find within ourselves a little grace and recognize our common struggles in the world rather than add to them.