Posts Tagged ‘Economics’

I remarked in an earlier blog that artists, being hypersensitive to emergent patterns and cultural vibes, often get to ideas sooner than the masses and express their sensibilities through creative endeavor. Those expressions in turn give watchers, viewers, listeners, readers, etc. a way of understanding the world through the artist’s interpretive lens. Interpretations may be completely fictitious, based on real-life events, or merely figurative as the medium allows. They are nonetheless an inevitable reflection of ourselves. Philistines who fail to appreciate that the arts function by absorbing and processing human experience at a deep, intuitive level may insist that the arts are optional or unworthy of attention or financial support. That’s an opinion not at all borne out in the culture, however, and though support may be vulnerable to shifts in valuation (e.g., withdrawal of federal funding for the NEA and PBS), the creative class will always seek avenues of expression, even at personal cost. Democratization has made production and distribution of some media quite cheap compared to a couple of decades ago. Other media remain undeniably labor intensive.

What sparked my thinking are several TV series that have caught my attention despite my generally low level of attention to such media. I haven’t watched broadcast television in over a decade, but the ability to stream TV programming has made shows I ignored for years far easier to tune in on my own terms and schedule. “Tune in” is of course the wrong metaphor, but suffice it to say I’ve awarded some of my attention to shows that until now fell outside my scope, cinema being more to my liking. The three shows I’ve been watching (only partway through each) are The Americans, Homeland, and Shameless. The first two are political thrillers (spy stuff) whereas the last is a slice-of-life family drama, which often veers toward comedy but keeps delivering tragedy instead. Not quite the same thing as dark comedy. Conflict is necessary for dramatic purposes, but the ongoing conflict in each of these shows flirts with the worst sorts of disaster, e.g., the spies being discovered and unmasked and the family being thrown out of its home and broken up. The episodic scenarios the writers concoct to threaten catastrophe at every step or at any moment get tiresome after a while. Multiple seasons ensure that dramatic tension is largely dispelled, since the main characters are present year over year. (The trend toward killing off major characters in other popular TV dramas is not yet widespread.) But still, it’s no way to live, constantly in disaster mode. No doubt I’ve cherry-picked three shows from a huge array of entertainments on offer.

Where art reflects reality is that we all now live in the early 21st century under multiple, constantly disquieting threats, large and small, including sudden climate change and ecological disaster, nuclear annihilation, meteor impacts, eruption of the shield volcano under Yellowstone, the Ring of Fire becoming active again (leading to more volcanic and earthquake activity), geopolitical dysfunction on a grand scale, and of course, global financial collapse. This, too, is no way to live. Admittedly, no one was ever promised a carefree life. Yet our inability to manage our own social institutions or shepherd the earth (as though that were our mandate) promises catastrophes in the fullness of time that have no parallels in human history. We’re not flirting with disaster so much as courting it.

Sociologists and historians prepare scholarly works that attempt to provide a grand narrative of the times. Cinema seems to be preoccupied with planetary threats requiring superhero interventions. Television, on the other hand, with its serial form, plumbs the daily angst of its characters to drive suspense, keeping viewers on pins and needles while avoiding final resolution. That final resolution is inevitably disaster, but it won’t appear for a few seasons at least — after the dramatic potential is wrung out of the scenario. I can’t quite understand why these shows are consumed for entertainment (by me no less than anyone else) except perhaps to distract from the clear and present dangers we all face every day.


Oddly, there is no really good antonym for perfectionism. Suggestions include sloppiness, carelessness, and disregard. I’ve settled on approximation, which carries far less moral weight. I raise the contrast between perfectionism and approximation because a recent study published in Psychological Bulletin entitled “Perfectionism Is Increasing Over Time: A Meta-Analysis of Birth Cohort Differences From 1989 to 2016” makes an interesting observation. Here’s the abstract:

From the 1980s onward, neoliberal governance in the United States, Canada, and the United Kingdom has emphasized competitive individualism and people have seemingly responded, in kind, by agitating to perfect themselves and their lifestyles. In this study, the authors examine whether cultural changes have coincided with an increase in multidimensional perfectionism in college students over the last 27 years. Their analyses are based on 164 samples and 41,641 American, Canadian, and British college students, who completed the Multidimensional Perfectionism Scale (Hewitt & Flett, 1991) between 1989 and 2016 (70.92% female, Mage = 20.66). Cross-temporal meta-analysis revealed that levels of self-oriented perfectionism, socially prescribed perfectionism, and other-oriented perfectionism have linearly increased. These trends remained when controlling for gender and between-country differences in perfectionism scores. Overall, in order of magnitude of the observed increase, the findings indicate that recent generations of young people perceive that others are more demanding of them, are more demanding of others, and are more demanding of themselves.

The notion of perfection, perfectness, perfectibility, etc. has a long, tortured history in philosophy, religion, ethics, and other domains I won’t even begin to unpack. From the perspective of the above study, let’s just say that the upswing in perfectionism is about striving to achieve success, however one assesses it (education, career, relationships, lifestyle, ethics, athletics, aesthetics, etc.). The study narrows its subject group to college students (at the outset of adult life) between 1989 and 2016 and characterizes the social milieu as neoliberal, hyper-competitive, meritocratic, and pressured to succeed in a dog-eat-dog environment. How far back into childhood the study’s results (agitation) extend is a good question. If the trope about parents obsessing and competing over preschool admission is accurate (may be just a NYC thang), then it goes all the way back to toddlers. So much for (lost) innocence purchased and perpetuated through late 20th- and early 21st-century affluence. I suspect college students are responding to awareness of two novel circumstances: (1) likelihood they will never achieve levels of success comparable to their own parents, especially financial (a major reversal of historical trends) and (2) recognition that to best enjoy the fruits of life, a quiet, reflective, anonymous, ethical, average life is now quite insufficient. Regarding the second of these, we are inundated by media showing rich celebrities (no longer just glamorous actors/entertainers) balling out of control, and onlookers are enjoined to “keep up.” The putative model is out there, unattainable for most but often conferred by randomness, undercutting the whole enterprise of trying to achieve perfection.


Speaking of Davos (see previous post), Yuval Noah Harari gave a high-concept presentation at Davos 2018 (embedded below). I’ve been aware of Harari for a while now — at least since the appearance of his book Sapiens (2015) and its follow-up Homo Deus (2017), both of which I’ve yet to read. He provides precisely the sort of thoughtful, provocative content that interests me, yet I’ve not quite known how to respond to him or his ideas. First thing, he’s a historian who makes predictions, or at least extrapolates possible futures based on historical trends. Near as I can tell, he doesn’t resort to chastising audiences along the lines of “those who don’t know history are doomed to repeat it” but rather indulges in a combination of breathless anticipation and fear-mongering at transformations to be expected as technological advances disrupt human society with ever greater impacts. Strangely, Harari is not advocating for anything in particular but trying to map the future.

Harari poses this basic question: “Will the future be human?” I’d say probably not; I’ve concluded that we are busy destroying ourselves and have already crossed the point of no return. Harari apparently believes differently: the rise of the machine is coming, in a couple of centuries perhaps, though it probably won’t resemble Skynet of The Terminator film franchise hellbent on destroying humanity. Rather, it will be some set of advanced algorithms monitoring and channeling human behaviors using Big Data. Or it will be a human-machine hybrid possessing superhuman abilities (physical and cognitive) different enough to be considered a new species, arising for the first time not out of evolutionary processes but from human ingenuity. He expects this new species to diverge from homo sapiens sapiens and leave us in the evolutionary dust. There is also conjecture that normal sexual reproduction will be supplanted by artificial, asexual reproduction, probably carried out in test tubes using, for example, CRISPR modification of the genome. Well, no fun in that … Finally, he believes some sort of strong AI will appear.

I struggle mightily with these predictions for two primary reasons: (1) we almost certainly lack enough time for technology to mature into implementation before the collapse of industrial civilization wipes us out, and (2) the Transhumanist future he anticipates calls into being (for me at least) a host of dystopian nightmares, only some of which are foreseeable. Harari says flatly at one point that the past is not coming back. Well, it’s entirely possible for civilization to fail and our former material conditions to be reinstated, only worse since we’ve damaged the biosphere so gravely. Just happened in Puerto Rico in microcosm when its infrastructure was wrecked by a hurricane and the power went out for an extended period of time (still off in some places). What happens when the rescue never appears because logistics are insurmountable? Elon Musk can’t save everyone.

The most basic criticism of economics is the failure to account for externalities. The same criticism applies to futurists. Extending trends as though all things will continue to operate normally is bizarrely idiotic. Major discontinuities appear throughout history. When I observed some while back that history has gone vertical, I included an animation with a graph that goes from horizontal to vertical in an extremely short span of geological time. This trajectory (the familiar hockey stick pointing skyward) has been repeated ad nauseam with an extraordinary number of survival pressures (notably, human population and consumption, including energy) over various time scales. Trends cannot simply continue ascending forever. (Hasn’t Moore’s Law already begun to slope away?) Hard limits must eventually be reached, but since there are no useful precedents for our current civilization, it’s impossible to know quite when or where ceilings loom. What happens after upper limits are found is also completely unknown. Ugo Bardi has a blog describing the Seneca Effect, which projects a rapid falloff after the peak that looks more like a cliff than a gradual, graceful descent, disallowing time to adapt. Sorta like the stock market currently imploding.
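The shape of the problem can be sketched with toy numbers. In the rough illustration below (invented parameters, not a model of any actual trend), an exponential curve and a logistic curve, meaning growth capped by a hard carrying capacity, are nearly indistinguishable during the early ascent, which is exactly why extrapolating the hockey stick is so seductive. They diverge only near the ceiling. Note this still shows a graceful plateau; the Seneca Effect posits instead a rapid falloff after the peak.

```python
# Toy comparison: exponential growth vs. logistic growth toward a ceiling.
# All parameters (x0, r, K) are invented for illustration only.
import math

def exponential(t, x0=1.0, r=0.1):
    """Unbounded growth at constant rate r."""
    return x0 * math.exp(r * t)

def logistic(t, x0=1.0, r=0.1, K=1000.0):
    """Same early growth rate, but capped by carrying capacity K."""
    return K / (1 + (K / x0 - 1) * math.exp(-r * t))

# Early on the two curves track each other; near the ceiling they diverge.
for t in (0, 20, 50, 100, 150):
    print(f"t={t:3d}  exponential={exponential(t):12.1f}  logistic={logistic(t):8.1f}")
```

At t = 20 the two values differ by well under one percent; by t = 150 the exponential has blown past three million while the logistic sits just under its ceiling of 1,000. Hard limits flatten the hockey stick, and the Seneca variant would bend it back down steeply rather than letting it level off.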

Since Harari indulges in rank thought experiments regarding smart algorithms, machine learning, and the supposed emergence of inorganic life in the data stream, I thought I’d pose some of my own questions. Waving away for the moment distinctions between forms of AI, let’s assume that some sort of strong AI does in fact appear. Why on earth would it bother to communicate with us? And if it reproduces and evolves at breakneck speed as some futurists warn, how long before it/they simply ignore us as being unworthy of attention? Being hyper-rational and able to calculate millions of moves ahead (like chess-playing computers), what if they survey the scene and come to David Benatar’s anti-natalist conclusion that it would be better not to have lived and so wink themselves out of existence? Who’s to say that they aren’t already among us, lurking, and we don’t even recognize them (took us quite a long time to recognize bacteria and viruses, and what about undiscovered species)? What if the Singularity has already occurred thousands of times and each time the machine beings killed themselves off without our even knowing? Maybe Harari explores some of these questions in Homo Deus, but I rather doubt it.

Be forewarned: this is long and self-indulgent. Kinda threw everything and the kitchen sink at it.

In the August 2017 issue of Harper’s Magazine, Walter Kirn’s “Easy Chair” column called “Apocalypse Always” revealed his brief, boyhood fascination with dystopian fiction. This genre has been around for a very long time, to which the Cassandra myth attests. Kirn’s column is more concerned with “high mid-twentieth-century dystopian fiction,” which in his view is now classic and canonical, an entire generation of Baby Boomers having been educated in such patterned thought. A new wave of dystopian fiction appeared in the 1990s and yet another more recently in the form of Young Adult novels (and films) that arguably serve better as triumphal coming-of-age stories albeit under dystopian circumstances. Kirn observes a perennial theme present in the genre: the twin disappearances of freedom and information:

In the classic dystopias, which concern themselves with the lack of freedom and not with surplus freedom run amok (the current and unforeseen predicament of many), society is superbly well organized, resembling a kind of hive or factory. People are sorted, classified, and ranked, their individuality suppressed through goon squads, potent narcotics, or breeding programs. Quite often, they wear uniforms, and express themselves, or fail to, in ritual utterance and gestures.

Whether Americans in 2018 resemble hollowed-out zombies suffering under either boot-heel or soft-serve oppression is a good question. Some would argue just that in homage to classic dystopias. Kirn suggests briefly that we might instead suffer from runaway anarchy, where too much freedom and licentiousness have led instead to a chaotic and disorganized society populated by citizens who can neither govern nor restrain themselves.

Disappearance of information might be understood in at least three familiar aspects of narrative framing: what happened to get us to this point (past as exposition, sometimes only hinted at), what the hell is going on (present as conflict and action), and how it gets fixed (future as resolution and denouement). Strict control over information exercised by classic dystopian despots doesn’t track to conditions under which we now find ourselves, where more disorganized, fraudulent, and degraded information than ever is available alongside small caches of wisdom and understanding buried somewhere in the heap, discoverable only with the benefit of critical thinking flatly lost on at least a couple generations of miseducated graduates. However, a coherent narrative of who and what we are and what realistic prospects the future may hold has not emerged since the stifling version of the 1950s nuclear family and middle-class consumer contentment. Kirn makes this comparison directly, where classic dystopian fiction

focus[es] on bureaucracy, coercion, propaganda, and depersonalization, overstates both the prowess of the hierarchs and the submissiveness of the masses, whom it still thinks of as the masses. It does not contemplate Trump-style charlatanism at the top, or a narcissistic populace that prizes attention over privacy. The threats to individualism are paramount; the scourge of surplus individualism, with everyone playing his own dunce king and slurping up resources until he bursts, goes unexplored.

Kirn’s further observations are worth a look. Go read for yourself.


Returning to the discomforts of my culture-critic armchair just in time for best- and worst-of lists, years in review, summaries of celebrity deaths, etc., the past year, tumultuous in many respects, was also strangely stable. Absent were major political and economic crises and calamities of which myriad harbingers and forebodings warned. Present, however, were numerous natural disasters, primary among them a series of North American hurricanes and wildfires. (They are actually part of a larger, ongoing ecocide now being accelerated by the Trump Administration’s ideology-fueled rollback of environmental protections and regulations, but that’s a different blog post.) I don’t usually make predictions, but I do live on pins and needles with expectations things could take a decidedly bad turn at any moment. For example, continuity of government — specifically, the executive branch — was not expected by many pundits to last the year, yet it did, and we’ve settled into a new normal of exceedingly low expectations with regard to the dignity and effectiveness of high office.

I’ve been conflicted in my desire for stability — often understood pejoratively as either the status quo or business as usual — precisely because those things represent extension and intensification of the very trends that spell our collective doom. Yet I’m in no hurry to initiate the suffering and megadeath that will accompany the cascade collapse of industrial civilization, which will undoubtedly hasten my own demise. I usually express this conflict as not knowing what to hope for: a quick end to things that leaves room for survival of some part of the biosphere (not including large primates) or playing things out to their bitter end with the hope that my natural life is preserved (as opposed to an unnatural end to all of us).

The final paragraph at this blog post by PZ Myers, author of Pharyngula seen at left on my blogroll, states the case for stability:

… I grew up in the shadow of The Bomb, where there was fear of a looming apocalypse everywhere. We thought that what was going to kill us was our dangerous technological brilliance — we were just too dang smart for our own good. We were wrong. It’s our ignorance that is going to destroy us, our contempt for the social sciences and humanities, our dismissal of the importance of history, sociology, and psychology in maintaining a healthy, stable society that people would want to live in. A complex society requires a framework of cooperation and interdependence to survive, and without people who care about how it works and monitor its functioning, it’s susceptible to parasites and exploiters and random wreckers. Ignorance and malice allow a Brexit to happen, or a Trump to get elected, or a Sulla to march on Rome to ‘save the Republic’.

So there’s the rub: we developed human institutions and governments ideally meant to function for the benefit and welfare of all people but which have gone haywire and/or been corrupted. It’s probably true that being too dang smart for our own good is responsible for corruptions and dangerous technological brilliance, while not being dang smart enough (meaning even smarter or more clever than we already are) causes our collective failure to achieve anything remotely approaching the utopian institutions we conceive. Hell, I’d be happy for competence these days, but even that low bar eludes us.

Instead, civilization teeters dangerously close to collapse on numerous fronts. The faux stability that characterizes 2017 will carry into early 2018, but who knows how much farther? Curiously, Graham Hancock’s The Magicians of the Gods, which I just finished reading (no review coming from me), ends with a brief discussion of the Younger Dryas impact hypothesis and the potential for additional impacts as Earth passes periodically through a region of space, a torus in geometry, littered with debris from the breakup of a large body. It’s a different death-from-above from that feared throughout the Atomic Age but even more fearsome. If we suffer another impact (or several), it would not be self-annihilation stemming from our dim long-term view of forces we set in motion, but that hardly absolves us of anything.

As time wears on and I add years to this mostly ignored blog, I keep running across ideas expressed herein, sometimes long ago, recapitulated in remarks and comments elsewhere. Absolutely disparate people can develop the same ideas independently, so I’m not claiming that my ideas are stolen. Maybe I’m merely in touch with the Zeitgeist and express it here only then to see or hear it again someplace else. I can’t judge objectively.

The latest coincidence is the growing dread with which I wake up every day, wondering what fresh new hell awaits with the morning news. The times in which we live are both an extension of our received culture and yet unprecedented in their novelty. Not only are there many more people in existence than 100 years ago, with radical opinions and events occurring at extraordinary frequency, but the speed of transmission is also faster than in the past. Indeed, the rush to publication has many news organs reporting before any solid information is available. The first instance of blanket crisis coverage I remember was the Challenger disaster in 1986. It’s unknown to me how quickly news of various U.S. political assassinations in the 1960s spread, but I suspect reporting took more time than today and imparted to those events gravity and composure. Today is more like a renewed Wild West where anything goes, which has been the preferred characterization of the Internet since its creation. We’ll see if the recent vote to remove Net Neutrality has the effect of restraining things. I suspect that particular move is more about a money grab (selling premium open access vs. basic limited access) than thought control, but I can only guess as to true motivations.

I happened to be traveling when the news broke of a mass shooting in Las Vegas. Happily, what news I got was delayed until actual news-gathering had already sorted basic fact from confabulation. Paradoxically, after the first wave of “what the hell just happened?” there formed a second wave of “here’s what happened,” and later a third wave of “what the hell really happened?” appeared as some rather creative interpretations were offered up for consideration. That third wave is by now quite familiar to everyone as the conspiracy wave, and surfing it feels inevitable because the second wave is often so starkly unbelievable. Various websites and shows such as snopes.com, metabunk.org, MythBusters, and Penn & Teller: Bullshit! (probably others, too) presume to settle debates. While I’m inclined to believe scientific and documentary evidence, mere argument often fails to convince me, which is troubling, to say the least.

Fending off all the mis- and disinformation, or separating signal from noise, is a full-time job if one is willing to undertake it. That used to be the mandate of the journalistic news media, at least in principle. Lots of failures on that account stack up throughout history. However, since we’re in the midst of a cultural phase dominated by competing claims to authority and the public’s retreat into ideation, the substitute worlds of extended and virtual reality become attractive alternatives to the fresh new hell we now face every morning. Tune in and check in might be what we think we’re doing, but more accurately, we tune out and check out of responsible engagement with the real world. That’s the domain of incessantly chipper morning TV shows. Moreover, we like to believe in the mythical stories we tell ourselves about ourselves, such as, for example, how privacy doesn’t matter, or that the U.S. is a free, democratic, liberal beacon of hope, or that economic value inheres in made-up currencies. It’s a battle for your attention and subscription in the marketplace of ideas. Caveat emptor.

Societies sometimes employ leveling mechanisms to keep the high and mighty from getting too, well, high and mighty or to pull them back down when they nonetheless manage to scale untenable heights. Some might insist that the U.S. breakaway from the British crown and aristocratic systems in the Revolutionary Era was, among other things, to establish an egalitarian society in accordance with liberal philosophy of the day. This is true to a point, since we in the U.S. don’t have hereditary aristocratic titles, but a less charitable view is that the Founders really only substituted the landed gentry, which is to say themselves, for the tyrannical British. Who scored worse on the tyranny scale is a matter of debate, especially when modern sensibilities are applied to historical practices. Although I don’t generally care for such hindsight moralizing, it’s uncontroversial that the phrase “all men are created equal” (from the U.S. Declaration of Independence) did not then apply, for instance, to slaves and women. We’re still battling to establish equality (a level playing field) among all men and women. For SJWs, the fight has become about equality of outcome (e.g., quotas), which is a perversion of the more reasonable and achievable equality of opportunity.

When and where available resources were more limited, say, in agrarian or subsistence economies, the distance or separation between top and bottom was relatively modest. In a nonresource economy, where activity is financialized and decoupled from productivity (Bitcoin, anyone?), the distance between top and bottom can grow appallingly wide. I suspect that an economist could give a better explanation of this phenomenon than I can, but my suspicion is that it has primarily to do with fiat currency (money issued without sound backing such as precious metals), expansion of credit, and creation of arcane instruments of finance, all of which give rise to an immense bureaucracy of administrative personnel to create, manage, and manipulate them.

The U.S. tax structure of the 1950s — steep taxes levied against the highest earners — was a leveling mechanism. Whether it was intentionally corrective of the excesses of the Jazz Age is beyond my knowledge. However, that progressive tax structure has been dismantled (“leveled,” one might say), shifting from progressive to regressive and now to transgressive. Regressive means disproportionate tax responsibility is borne by those already struggling to satisfy their basic needs. Transgressive is outright punishment of those who fail to earn enough, as though the whip functions as a spur to success. Indeed, as I mentioned in the previous blog post, the mood of the country right now is to abandon and blame those whom financial success has eluded. Though the term debtors’ prison belongs to a bygone era, we still have them, as people are imprisoned over nonviolent infractions such as parking tickets only to have heavy, additional administrative fines and fees levied on them, holding them hostage to payment. That’s victimizing the victim, pure and simple.

At the other end of the scale, the superrich ascend a hierarchy that is absurdly imbalanced since leveling mechanisms are no longer present. Of course, disdain of the nouveau riche exists, primarily because social training does not typically accompany amassing of new fortunes, allowing many of that cohort to be amazingly gauche and intransigently proud of it (names withheld). That disdain is especially the prerogative of those whose wealth is inherited, not the masses, but is not an effective leveling mechanism. If one is rich, famous, and charming enough, indulgences for bad or criminal behavior are commonplace. For instance, those convicted of major financial crime in the past decade are quite few, whereas beneficiaries (multimillionaires) of looting of the U.S. Treasury are many. One very recent exception to indulgences is the wave of people being accused of sexual misconduct, but I daresay the motivation is unrelated to that of standard leveling mechanisms. Rather, it’s moral panic resulting from strains being felt throughout society having to do with sexual orientation and identity.

When the superrich ascend into the billionaire class, they tend to behave supranationally: buying private islands or yachts outside the jurisdiction or control of nation states, becoming nominal residents of the most advantageous tax havens, and shielding themselves from the rabble. While this brand of anarchism may be attractive to some and justified to others, detaching from social hierarchies and abandoning or ignoring others in need once one’s own fortunes are secure is questionable behavior to say the least. Indeed, those of such special character are typically focal points of violence and mayhem when the lives of the masses become too intolerable. That target on one’s back can be ignored or forestalled for a long time, perhaps, but the eventuality of nasty blowback is virtually guaranteed. That’s the final leveling mechanism seen throughout history.

Brief, uncharacteristic foray into national politics. The Senate narrowly approved a tax reform bill that’s been hawked by that shiny-suit-wearing-used-car-salesman-conman-guy over the past months as simply a big, fat tax cut. From all appearances, it won’t quite work out that way. The 479-page bill is available here (PDF link), including last-minute handwritten amendments. I don’t know how typical that is of legislative processes, but I doubt rushing or forcing a vote in the dead of night on an unfinished bill no one has had the opportunity to review leads to good results. Moreover, what does that say to schoolchildren about finishing one’s homework before turning it in?

Considering the tax reform bill is still a work in progress, it’s difficult to know with much certainty its effects if/when signed into law. However, summaries and snapshots of tax effects on typical American households have been provided to aid in the layperson’s grasp of the bill. This one from Mic Network Inc. (a multichannel news/entertainment network with which I am unfamiliar, so I won’t vouch for its reliability) states that the bill is widely unpopular and few trust the advance marketing of the bill:

Only 16% of Americans have said they think the plan is actually going to cut their taxes, less than half the number of people polled who think that their bill is going to go up, according to a Nov. 15 poll from Quinnipiac University.

Yet it seems the Republican-led effort will be successful, despite concerns that many middle class people could actually see their taxes rise, that social programs could suffer, that small businesses could be harmed and that a hoped-for economic boom may never materialize. [links removed]

When a change in tax law goes into effect, one good question is, “who gets help and who gets hurt?” For decades now, the answer has almost always been Reverse Robin Hood: take (or steal) from the poor and give to the rich. That’s why income inequality has increased to extreme levels commencing with the Reagan administration. The economic field of play has been consciously, knowingly tilted in favor of certain groups at the expense of others. Does anyone really believe that those in power are looking out for the poor and downtrodden? Sorry, that’s not the mood of the nation right now. Rather than assisting people who need help, governments at all levels have been withdrawing support and telling people, in effect, “you’re on your own, but first pay your taxes.” I propose we call the new tax bill Reverse Cowgirl, because if anything is certain about it, it’s that lots of people are gonna get fucked.

Here’s a familiar inspirational phrase from The Bible: the truth shall set you free (John 8:32). Indeed, most of us take it as, um, well, gospel that knowledge and understanding are unqualified goods. However, the information age has turned out to be a mixed blessing. Any clear-eyed view of the way the world works and its long, tawdry history carries with it an inevitable awareness of injustice, inequity, suffering, and at the extreme end, some truly horrific episodes of groups victimizing each other. Some of the earliest bits of recorded history, as distinguished from oral history, are financial — keeping count (or keeping accounts). Today differs not so much in character as in the variety of counts being kept and the sophistication of information gathering.

The Bureau of Labor Statistics, a part of the U.S. Department of Labor, is one information clearinghouse that slices and dices available data according to a variety of demographic characteristics. The fundamental truth behind such assessments, regardless of the politics involved, is that when comparisons are made between unlike groups, say, between men and women or young and old, one should expect to find differences and indeed be rather surprised if comparisons revealed none. So the question of gender equality in the workplace, or its implied inverse, gender inequality in the workplace, is a form of begging the question, meaning that if one seeks differences, one shall most certainly find them. But those differences are not prima facie evidence of injustice in the sense of the popular meme that women are disadvantaged or otherwise discriminated against in the workplace. Indeed, the raw data can be interpreted according to any number of agendas, thus the phrase “lying with statistics,” and most of us lack the sophistication to contextualize statistics properly, which is to say, free of the emotional bias that plagues modern politics, and more specifically, identity politics.

The fellow who probably ran up against this difficulty the worst is Charles Murray in the aftermath of publication of his book The Bell Curve (1994), which deals with how intelligence manifests differently across demographic groups yet functions as the primary predictor of social outcomes. Murray is particularly well qualified to interpret data and statistics dispassionately, and in true seek-and-find fashion, differences between groups did appear. It is unclear how much his resulting prescriptions for social programs are born of data vs. ideology, but most of us are completely at sea wading through the issues without specialized academic training to make sense of the evidence.

More recently, another fellow caught in the crosshairs on issues of difference is James Damore, who was fired from his job at Google after writing what is being called an anti-diversity manifesto (but might be better termed an internal memo) that was leaked and then went viral. The document can be found here. I have not dug deeply into the details, but my impression is that Damore attempted a fairly academic unpacking of the issue of gender differences in the workplace as they conflicted with institutional policy only to face a hard-set ideology that is more RightThink than truth. In Damore’s case, the truth did set him free — free from employment. Even the NY Times recognizes that the Thought Police sprang into action yet again to demand that its pet illusions about society be supported rather than dispelled. These witch hunts and shaming rituals (vigilante justice carried out in the court of public opinion) are occurring with remarkable regularity.

In a day and age where so much information (too much information, as it turns out) is available to us to guide our thinking, one might hope for careful, rational analysis and critical thinking. However, trends point to the reverse: a return to tribalism, xenophobia, scapegoating, and victimization. There is also a victimization Olympics at work, with identity groups vying for imaginary medals awarded to whoever’s got it worst. I’m no Pollyanna when it comes to the notion that all men are brothers and, shucks, can’t we all just get along? That’s not our nature. But the marked indifference of the natural world to our suffering as it besets us with drought, fire, floods, earthquakes, tsunamis, hurricanes, tornadoes, and the like (and this was just the last week!) might seem like the perfect opportunity to find within ourselves a little grace and recognize our common struggles in the world rather than add to them.

Allow me to propose a hypothetical, to conduct a thought experiment if you will.

Let’s say that the powers that be, our governmental and corporate overlords, have been fully aware and convinced of impending disaster for some time, decades even. What to do with that burdensome information? How to prepare the public or themselves? Make the truth openly public and possibly spark a global panic or bury the information, denying and obfuscating when news eventually got out? Let’s say that, early on, the decision was made to bury the information and keep plodding through a few more blissfully ignorant decades as though nothing were amiss. After all, prophecies of disaster, extrapolating simple trend lines (such as population growth), were not uncommon as early as the 18th and 19th centuries. Science had made sufficient progress by the 1970s to recognize without much controversy that problems with industrial civilization were brewing and would soon overflow, overtaking our ability to maintain control over the processes we set in motion or indeed ourselves. Thus, at the intuitive level of deep culture, we initiated the ecology movement, the predecessor of environmentalism, and experienced the (first) international oil crisis. The decision to bury the prognosis for civilization (doom!) resulted in keeping a lid on things until the information swung fully into public view in the middle 2000s (the decade, not the century), thanks to a variety of scientists not among the power elite who sounded the alarms anew. At that point, obfuscation and disinformation became the dominant strategies.

Meanwhile, to keep the lights on and the store shelves stocked, the powers that be launched a campaign of massive debt spending, stealing from a future we would never reach anyway, and even dabbled at modest terraforming to forestall the worst by spraying chemicals in the atmosphere, creating global dimming. This program, like many others, was denied and made into a conspiracy theory (chemtrails vs. contrails), enabling the public to ignore the obvious evidence of climate change and resulting slo-mo environmental collapse. Public uprising and outrage were easily quelled with essentially the same bread and circuses in which the Classical Romans indulged as their empire was in the midst of a protracted collapse. Modern global industrial empire will not experience the same centuries-long disintegration.

Now, I’ll admit, I don’t actually believe much of this. As with most conspiracies, this hypothetical doesn’t pass the straight-face test. Nor do the powers that be demonstrate competence sufficient to pull off even routine programs, much less extravagant ones. However, elements are undoubtedly true, such as the knowledge that energy policy and resources simply won’t meet anticipated demand with global population still swelling out of control. Neither will food production. Rather than make a difficult and questionable philosophical decision to serve the public interest by hiding the truth and keeping modern civilization going until the breaking point of a hard crash, at which point few would survive (or want to), the easy decision was probably made to ignore and obfuscate the truth, do nothing to keep the worst ravages of global industry from hastening our demise, and gather to themselves all financial resources, leaving everyone else in the lurch. The two basic options are concern for everyone’s wellbeing over time vs. concern for one’s own position in the short term.

In case the denial and obfuscation have worked on you, the reader of this doom blog, please consider (if you dare) this lengthy article at New York Magazine called “The Uninhabitable Earth” by David Wallace-Wells. Its headings are these:

  1. “Doomsday”
  2. Heat Death
  3. The End of Food
  4. Climate Plagues
  5. Unbreathable Air
  6. Perpetual War
  7. Permanent Economic Collapse
  8. Poisoned Oceans
  9. The Great Filter

No one writes this stuff just to scare the public and get attention. Rather, it’s about telling the truth and whistle-blowing. While captains of industry and kings of the realm slumber, fattened and self-satisfied upon their beds, at least some of the rest of us recognize that the future is barreling at us with the same indifference for human wellbeing (or the natural world) that our leaders have shown.