
Returning to the discomforts of my culture-critic armchair just in time for best- and worst-of lists, years in review, summaries of celebrity deaths, etc., the past year, tumultuous in many respects, was also strangely stable. Absent were the major political and economic crises and calamities of which myriad harbingers and forebodings warned. Present, however, were numerous natural disasters, primary among them a series of North American hurricanes and wildfires. (They are actually part of a larger, ongoing ecocide now being accelerated by the Trump Administration’s ideology-fueled rollback of environmental protections and regulations, but that’s a different blog post.) I don’t usually make predictions, but I do live on pins and needles with expectations things could take a decidedly bad turn at any moment. For example, continuity of government — specifically, the executive branch — was not expected by many pundits to last the year, yet it did, and we’ve settled into a new normal of exceedingly low expectations with regard to the dignity and effectiveness of high office.

I’ve been conflicted in my desire for stability — often understood pejoratively as either the status quo or business as usual — precisely because those things represent extension and intensification of the very trends that spell our collective doom. Yet I’m in no hurry to initiate the suffering and megadeath that will accompany the cascade collapse of industrial civilization, which will undoubtedly hasten my own demise. I usually express this conflict as not knowing what to hope for: a quick end to things that leaves room for survival of some part of the biosphere (not including large primates) or playing things out to their bitter end with the hope that my natural life is preserved (as opposed to an unnatural end to all of us).

The final paragraph of this blog post by PZ Myers (author of Pharyngula, seen at left on my blogroll) states the case for stability:

… I grew up in the shadow of The Bomb, where there was fear of a looming apocalypse everywhere. We thought that what was going to kill us was our dangerous technological brilliance — we were just too dang smart for our own good. We were wrong. It’s our ignorance that is going to destroy us, our contempt for the social sciences and humanities, our dismissal of the importance of history, sociology, and psychology in maintaining a healthy, stable society that people would want to live in. A complex society requires a framework of cooperation and interdependence to survive, and without people who care about how it works and monitor its functioning, it’s susceptible to parasites and exploiters and random wreckers. Ignorance and malice allow a Brexit to happen, or a Trump to get elected, or a Sulla to march on Rome to ‘save the Republic’.

So there’s the rub: we developed human institutions and governments ideally meant to function for the benefit and welfare of all people but which have gone haywire and/or been corrupted. It’s probably true that being too dang smart for our own good is responsible for corruptions and dangerous technological brilliance, while not being dang smart enough (meaning even smarter or more clever than we already are) causes our collective failure to achieve anything remotely approaching the utopian institutions we conceive. Hell, I’d be happy for competence these days, but even that low bar eludes us.

Instead, civilization teeters dangerously close to collapse on numerous fronts. The faux stability that characterizes 2017 will carry into early 2018, but who knows how much farther? Curiously, Graham Hancock ends Magicians of the Gods (which I just finished reading; no review coming from me) with a brief discussion of the Younger Dryas impact hypothesis and the potential for additional impacts as Earth passes periodically through a region of space (a torus, in geometry) littered with debris from the breakup of a large body. It’s a different death-from-above from the one feared throughout the Atomic Age but even more fearsome. If we suffer another impact (or several), it would not be self-annihilation stemming from our dim long-term view of forces we set in motion, but that hardly absolves us of anything.


As time wears on and I add years to this mostly ignored blog, I keep running across ideas expressed herein, sometimes long ago, recapitulated in remarks and comments elsewhere. Absolutely disparate people can develop the same ideas independently, so I’m not claiming that my ideas are stolen. Maybe I’m merely in touch with the Zeitgeist and express it here only then to see or hear it again someplace else. I can’t judge objectively.

The latest coincidence is the growing dread with which I wake up every day, wondering what fresh new hell awaits with the morning news. The times in which we live are both an extension of our received culture and yet unprecedented in their novelty. Not only are there many more people in existence than 100 years ago and thus radical opinions and events occurring with extraordinary frequency, the speed of transmission is also faster than in the past. Indeed, the rush to publication has many news organs reporting before any solid information is available. The first instance of blanket crisis coverage I remember was the Challenger Disaster in 1986. It’s unknown to me how quickly news of various U.S. political assassinations in the 1960s spread, but I suspect reporting took more time than today and imparted to those events gravity and composure. Today is more like a renewed Wild West where anything goes, which has been the preferred characterization of the Internet since its creation. We’ll see if the recent vote to remove Net Neutrality has the effect of restraining things. I suspect that particular move is more about a money grab (selling premium open access vs. basic limited access) than thought control, but I can only guess as to true motivations.

I happened to be traveling when the news broke of a mass shooting in Las Vegas. Happily, what news I got was delayed until actual news-gathering had already sorted basic fact from confabulation. Paradoxically, after the first wave of “what the hell just happened?” there formed a second wave of “here’s what happened,” and later a third wave of “what the hell really happened?” appeared as some rather creative interpretations were offered up for consideration. That third wave is by now quite familiar to everyone as the conspiracy wave, and surfing it feels inevitable because the second wave is often so starkly unbelievable. Various websites and shows such as snopes.com, metabunk.org, MythBusters, and Penn & Teller: Bullshit! (probably others, too) presume to settle debates. While I’m inclined to believe scientific and documentary evidence, mere argument often fails to convince me, which is troubling, to say the least.

Fending off all the mis- and disinformation, or separating signal from noise, is a full-time job if one is willing to undertake it. That used to be the mandate of the journalistic news media, at least in principle. Lots of failures on that account stack up throughout history. However, since we’re in the midst of a cultural phase dominated by competing claims to authority and the public’s retreat into ideation, the substitute worlds of extended and virtual reality become attractive alternatives to the fresh new hell we now face every morning. Tune in and check in might be what we think we’re doing, but more accurately, we tune out and check out of responsible engagement with the real world. That’s the domain of incessantly chipper morning TV shows. Moreover, we like to believe in the mythical stories we tell ourselves about ourselves, such as, for example, how privacy doesn’t matter, or that the U.S. is a free, democratic, liberal beacon of hope, or that economic value inheres in made-up currencies. It’s a battle for your attention and subscription in the marketplace of ideas. Caveat emptor.

Societies sometimes employ leveling mechanisms to keep the high and mighty from getting too, well, high and mighty or to pull them back down when they nonetheless manage to scale untenable heights. Some might insist that the U.S. breakaway from the British crown and aristocratic systems in the Revolutionary Era was, among other things, to establish an egalitarian society in accordance with liberal philosophy of the day. This is true to a point, since we in the U.S. don’t have hereditary aristocratic titles, but a less charitable view is that the Founders really only substituted the landed gentry, which is to say themselves, for the tyrannical British. Who scored worse on the tyranny scale is a matter of debate, especially when modern sensibilities are applied to historical practices. Although I don’t generally care for such hindsight moralizing, it’s uncontroversial that the phrase “all men are created equal” (from the U.S. Declaration of Independence) did not then apply, for instance, to slaves and women. We’re still battling to establish equality (a level playing field) among all men and women. For SJWs, the fight has become about equality of outcome (e.g., quotas), which is a perversion of the more reasonable and achievable equality of opportunity.

When and where available resources were more limited, say, in agrarian or subsistence economies, the distance or separation between top and bottom was relatively modest. In a nonresource economy, where activity is financialized and decoupled from productivity (Bitcoin, anyone?), the distance between top and bottom can grow appallingly wide. I suspect that an economist could give a better explanation of this phenomenon than I can, but my suspicion is that it has primarily to do with fiat currency (money issued without sound backing such as precious metals), expansion of credit, and creation of arcane instruments of finance, all of which give rise to an immense bureaucracy of administrative personnel to create, manage, and manipulate them.

The U.S. tax structure of the 1950s — steep taxes levied against the highest earners — was a leveling mechanism. Whether it was intentionally corrective of the excesses of the Jazz Age is beyond my knowledge. However, that progressive tax structure has been dismantled (“leveled,” one might say), shifting from progressive to regressive and now to transgressive. Regressive is where more or disproportionate tax responsibility is borne by those already struggling to satisfy their basic needs. Transgressive is outright punishment of those who fail to earn enough, as though the whip functions as a spur to success. Indeed, as I mentioned in the previous blog post, the mood of the country right now is to abandon and blame those whom financial success has eluded. Though the term debtors’ prison belongs to a bygone era, we still have them, as people are imprisoned over nonviolent infractions such as parking tickets only to have heavy, additional, administrative fines and fees levied on them, holding them hostage to payment. That’s victimizing the victim, pure and simple.

At the other end of the scale, the superrich ascend a hierarchy that is absurdly imbalanced since leveling mechanisms are no longer present. Of course, disdain of the nouveau riche exists, primarily because social training does not typically accompany amassing of new fortunes, allowing many of that cohort to be amazingly gauche and intransigently proud of it (names withheld). That disdain is especially the prerogative of those whose wealth is inherited, not the masses, but is not an effective leveling mechanism. If one is rich, famous, and charming enough, indulgences for bad or criminal behavior are commonplace. For instance, those convicted of major financial crime in the past decade are quite few, whereas beneficiaries (multimillionaires) of looting of the U.S. Treasury are many. One very recent exception to indulgences is the wave of people being accused of sexual misconduct, but I daresay the motivation is unrelated to that of standard leveling mechanisms. Rather, it’s moral panic resulting from strains being felt throughout society having to do with sexual orientation and identity.

When the superrich ascend into the billionaire class, they tend to behave supranationally: buying private islands or yachts outside the jurisdiction or control of nation states, becoming nominal residents of the most advantageous tax havens, and shielding themselves from the rabble. While this brand of anarchism may be attractive to some and justified to others, detaching from social hierarchies and abandoning or ignoring others in need once one’s own fortunes are secure is questionable behavior to say the least. Indeed, those of such special character are typically focal points of violence and mayhem when the lives of the masses become too intolerable. That target on one’s back can be ignored or forestalled for a long time, perhaps, but the eventuality of nasty blowback is virtually guaranteed. That’s the final leveling mechanism seen throughout history.

Brief, uncharacteristic foray into national politics. The Senate narrowly approved a tax reform bill that’s been hawked by that shiny-suit-wearing-used-car-salesman-conman-guy over the past months as simply a big, fat tax cut. From all appearances, it won’t quite work out that way. The 479-page bill is available here (PDF link), including last-minute handwritten amendments. I don’t know how typical that is of legislative processes, but I doubt rushing or forcing a vote in the dead of night on an unfinished bill no one has had the opportunity to review leads to good results. Moreover, what does that say to schoolchildren about finishing one’s homework before turning it in?

Considering the tax reform bill is still a work in progress, it’s difficult to know with much certainty its effects if/when signed into law. However, summaries and snapshots of tax effects on typical American households have been provided to aid in the layperson’s grasp of the bill. This one from Mic Network Inc. (a multichannel news/entertainment network with which I am unfamiliar, so I won’t vouch for its reliability) states that the bill is widely unpopular and few trust the advance marketing of the bill:

Only 16% of Americans have said they think the plan is actually going to cut their taxes, less than half the number of people polled who think that their bill is going to go up, according to a Nov. 15 poll from Quinnipiac University.

Yet it seems the Republican-led effort will be successful, despite concerns that many middle class people could actually see their taxes rise, that social programs could suffer, that small businesses could be harmed and that a hoped-for economic boom may never materialize. [links removed]

When a change in tax law goes into effect, one good question is, “who gets help and who gets hurt?” For decades now, the answer has almost always been Reverse Robin Hood: take (or steal) from the poor and give to the rich. That’s why income inequality has increased to extreme levels commencing with the Reagan administration. The economic field of play has been consciously, knowingly tilted in favor of certain groups at the expense of others. Does anyone really believe that those in power are looking out for the poor and downtrodden? Sorry, that’s not the mood of the nation right now. Rather than assisting people who need help, governments at all levels have been withdrawing support and telling people, in effect, “you’re on your own, but first pay your taxes.” I propose we call the new tax bill Reverse Cowgirl, because if anything is certain about it, it’s that lots of people are gonna get fucked.

Here’s a familiar inspirational phrase from The Bible: the truth shall set you free (John 8:32). Indeed, most of us take it as, um, well, gospel that knowledge and understanding are unqualified goods. However, the information age has turned out to be a mixed blessing. Any clear-eyed view of the way the world works and its long, tawdry history carries with it an inevitable awareness of injustice, inequity, suffering, and at the extreme end, some truly horrific episodes of groups victimizing each other. Some of the earliest bits of recorded history, as distinguished from oral history, are financial — keeping count (or keeping accounts). Today differs not so much in character as in the variety of counts being kept and the sophistication of information gathering.

The Bureau of Labor Statistics, a part of the U.S. Department of Labor, is one information clearinghouse that slices and dices available data according to a variety of demographic characteristics. The fundamental truth behind such assessments, regardless of the politics involved, is that when comparisons are made between unlike groups, say, between men and women or young and old, one should expect to find differences and indeed be rather surprised if comparisons revealed none. So the question of gender equality in the workplace, or its implied inverse, gender inequality in the workplace, is a form of begging the question, meaning that if one seeks differences, one shall most certainly find them. But those differences are not prima facie evidence of injustice in the sense of the popular meme that women are disadvantaged or otherwise discriminated against in the workplace. Indeed, the raw data can be interpreted according to any number of agendas, thus the phrase “lying with statistics,” and most of us lack the sophistication to contextualize statistics properly, which is to say, free of the emotional bias that plagues modern politics, and more specifically, identity politics.

The fellow who probably ran up against this difficulty the worst is Charles Murray in the aftermath of publication of his book The Bell Curve (1994), which deals with how intelligence manifests differently across demographic groups yet functions as the primary predictor of social outcomes. Murray is particularly well qualified to interpret data and statistics dispassionately, and in true seek-and-find fashion, differences between groups did appear. It is unclear how much his resulting prescriptions for social programs are borne out of data vs. ideology, but most of us are completely at sea wading through the issues without specialized academic training to make sense of the evidence.

More recently, another fellow caught in the crosshairs on issues of difference is James Damore, who was fired from his job at Google after writing what is being called an anti-diversity manifesto (but might be better termed an internal memo) that was leaked and then went viral. The document can be found here. I have not dug deeply into the details, but my impression is that Damore attempted a fairly academic unpacking of the issue of gender differences in the workplace as they conflicted with institutional policy only to face a hard-set ideology that is more RightThink than truth. In Damore’s case, the truth did set him free — free from employment. Even the NY Times recognizes that the Thought Police sprang into action yet again to demand that its pet illusions about society be supported rather than dispelled. These witch hunts and shaming rituals (vigilante justice carried out in the court of public opinion) are occurring with remarkable regularity.

In a day and age where so much information (too much information, as it turns out) is available to us to guide our thinking, one might hope for careful, rational analysis and critical thinking. However, trends point to the reverse: a return to tribalism, xenophobia, scapegoating, and victimization. There is also a victimization Olympics at work, with identity groups vying for imaginary medals awarded to whoever’s got it worst. I’m no Pollyanna when it comes to the notion that all men are brothers and, shucks, can’t we all just get along? That’s not our nature. But the marked indifference of the natural world to our suffering as it besets us with drought, fire, floods, earthquakes, tsunamis, hurricanes, tornadoes, and the like (and this was just the last week!) might seem like the perfect opportunity to find within ourselves a little grace and recognize our common struggles in the world rather than add to them.

Allow me to propose a hypothetical, to conduct a thought experiment if you will.

Let’s say that the powers that be, our governmental and corporate overlords, have been fully aware and convinced of impending disaster for some time, decades even. What to do with that burdensome information? How to prepare the public or themselves? Make the truth openly public and possibly spark a global panic or bury the information, denying and obfuscating when news eventually got out? Let’s say that, early on, the decision was made to bury the information and keep plodding through a few more blissfully ignorant decades as though nothing were amiss. After all, prophecies of disaster, extrapolating simple trend lines (such as population growth), were not uncommon as early as the 18th and 19th centuries. Science had made sufficient progress by the 1970s to recognize without much controversy that problems with industrial civilization were brewing and would soon overflow, overtaking our ability to maintain control over the processes we set in motion or indeed ourselves. Thus, at the intuitive level of deep culture, we initiated the ecology movement, the predecessor of environmentalism, and experienced the (first) international oil crisis. The decision to bury the prognosis for civilization (doom!) resulted in keeping a lid on things until the information swung fully into public view in the middle 2000s (the decade, not the century), thanks to a variety of scientists not among the power elite who sounded the alarms anew. At that point, obfuscation and disinformation became the dominant strategies.

Meanwhile, to keep the lights on and the store shelves stocked, the powers that be launched a campaign of massive debt spending, stealing from a future we would never reach anyway, and even dabbled at modest terraforming to forestall the worst by spraying chemicals in the atmosphere, creating global dimming. This program, like many others, was denied and written off as conspiracy theory (chemtrails vs. contrails), enabling the public to ignore the obvious evidence of climate change and resulting slo-mo environmental collapse. Public uprising and outrage were easily quelled with essentially the same bread and circuses in which the Classical Romans indulged as their empire was in the midst of a protracted collapse. Modern global industrial empire will not experience the same centuries-long disintegration.

Now, I’ll admit, I don’t actually believe much of this. As with most conspiracies, this hypothetical doesn’t pass the straight-face test. Nor do the powers that be demonstrate competence sufficient to pull off even routine programs, much less extravagant ones. However, elements are undoubtedly true, such as the knowledge that energy policy and resources simply won’t meet anticipated demand with global population still swelling out of control. Neither will food production. Rather than make a difficult and questionable philosophical decision to serve the public interest by hiding the truth and keeping modern civilization going until the breaking point of a hard crash, at which point few would survive (or want to), the easy decision was probably made to ignore and obfuscate the truth, do nothing to keep the worst ravages of global industry from hastening our demise, and gather to themselves all financial resources, leaving everyone else in the lurch. The two basic options are to concern ourselves with everyone’s wellbeing over time vs. one’s own position in the short term.

In case the denial and obfuscation has worked on you, the reader of this doom blog, please consider (if you dare) this lengthy article at New York Magazine called “The Uninhabitable Earth” by David Wallace-Wells. Headings are these:

  1. “Doomsday”
  2. Heat Death
  3. The End of Food
  4. Climate Plagues
  5. Unbreathable Air
  6. Perpetual War
  7. Permanent Economic Collapse
  8. Poisoned Oceans
  9. The Great Filter

No one writes this stuff just to scare the public and get attention. Rather, it’s about telling the truth and whistle-blowing. While captains of industry and kings of the realm slumber, fattened and self-satisfied upon their beds, at least some of the rest of us recognize that the future is barrelling at us with the same indifference for human wellbeing (or the natural world) that our leaders have shown.

According to Hal Smith of The Compulsive Explainer (see my blogroll), the tragedy of our time is, simply put, failed social engineering. Most of his blog post is quoted below:

Americans, for example, have decided to let other forces manage their nation — and not let Americans themselves manage it. At least this is what I see happening, with the election of Trump. They have handed the management of their country over to a man with a collection of wacky ideas — and they feel comfortable with this. Mismanagement is going on everywhere — and why not include the government in this?

This is typical behavior for a successful society in decline. They cannot see what made them successful, has been taken too far — and is now working against them. The sensible thing for them to do is back off for awhile, analyze their situation — and ask “What is going wrong here?” But they never do this — and a collapse ensues.

In our present case, the collapse involves a global society based on Capitalism — that cannot adapt itself to a Computer-based economy. The Software ecosystem operates differently — it is based on cooperation, not competition.

Capitalism was based on just that — Capital — money making money. And it was very successful — for those it favored. Money is still important in the Computer economy — people still have to be paid. But what they are being paid for has changed — information is now being managed, something different entirely.

Hardware is still important — but that is not where the Big Money is being made. It is now being made in Computers, and their Software.

I’m sympathetic to this view but believe that a look back through history reveals something other than missed opportunities and too-slow adaptation as we fumbled our way forward, namely, repeated catastrophic failures. Such epic fails include regional and global wars, genocides, and societal collapses that rise well above the rather bland term mismanagement. A really dour view of history, taking into account more than a few truly vicious, barbaric episodes, might regard the world as a nearly continuous stage of horrors from which we periodically take refuge, and the last of these phases is drawing quickly to a close.

The breakneck speed of technological innovation and information exchange has resulted not in Fukuyama’s mistakenly exuberant “end of history” (kinda-sorta winning the Cold War but nevertheless losing the peace?) but instead an epoch where humans are frankly left behind by follow-on effects of their own unrestrained restlessness. Further, if history is a stage of horrors, then geopolitics is theater of the absurd. News reports throughout the new U.S. presidential administration, still less than 6 months in (more precisely, 161 days or 23 weeks), tell of massive economic and geopolitical instabilities threatening to collapse the house of cards with only a slight breeze. The contortions press agents and politicized news organs go through to provide cover for tweets, lies, and inanities emanating from the disturbed mind of 45 are carnival freak show acts. Admittedly, not much has changed over the previous two administrations — alterations of degree only, not kind — except perhaps to demonstrate beyond any shadow of doubt that our elected, appointed, and purchased leaders (acknowledging many paths to power) are fundamentally incompetent to deal effectively with human affairs, much less enact social engineering projects beyond the false happiness of Facebook algorithms that hide bad news. Look no further than the egregious awfulness of both presidential candidates in the last election coughed up like hairballs from the mouths of their respective parties. The aftermath of those institutional failures finds both major parties in shambles, further degraded than their already deplorable states prior to the election.

So how much worse can things get? Well, scary as it sounds, lots. The electrical grid is still working, water is still flowing to the taps, and supply lines continue to keep store shelves stocked with booze and brats for extravagant holiday celebrations. More importantly, we in the U.S. have (for now, unlike Europe) avoided repetition of any major terrorist attacks. But everyone with an honest ear to the ground recognizes our current condition as the proverbial calm before the storm. For instance, we’re threatened by the first ice-free Arctic in the history of mankind later this year and a giant cleaving off of the Larsen C Ice Shelf in Antarctica within days. In addition, drought in the Dakotas will result in a failed wheat harvest. Guy McPherson (in particular, may well be others) has been predicting for years that abrupt, nonlinear climate change when the poles warm will end the ability to grow grain at scale, leading to worldwide famine, collapse, and near-term extinction. Seems like we’re passing the knee of the curve. Makes concerns about maladaptation and failed social engineering pale by comparison.

From the not-really-surprising-news category comes a New Scientist report earlier this month that the entire world was irradiated by follow-on effects of the Fukushima disaster. Perhaps it’s exactly as the article states: the equivalent of one X-ray. I can’t know with certainty, nor can bupkis be done about it by the typical Earth inhabitant (or the atypical inhabitant, I might add). Also earlier this month, a tunnel collapse at the Dept. of Energy’s Hanford nuclear waste storage site in Washington State gave everyone a start regarding a possible release of radiation nearby. As with Fukushima, I judge there is little trust to be had regarding accurate news or disclosure and fuck all anyone can do about any of it.

I’m far too convinced of collapse by now to worry too much about these Tinkerbells, knowing full well that what’s to come will be worse by many orders of magnitude when the firecrackers start popping due to inaction and inevitability. Could be years or decades away still; but as with other aspects of collapse, who knows precisely when? Risky energy plant operations and nuclear waste disposal issues promise to be with us for a very long time indeed. Makes it astonishing to think that we plunged full-steam ahead without realistic (i.e., politically acceptable) plans to contain the problems before creating them. Further, nuclear power is still not economically viable without substantial government subsidy. The likelihood of abandonment of this technological boondoggle seems pretty remote, though perhaps not as remote as the enormous expense of decommissioning all the sites currently operating.

These newsbits and events also reminded me of the despair I felt in 1986 on the heels of the Chernobyl disaster. Maybe in hindsight it’s not such a horrible thing to cede entire districts to nature for a period of several hundred years as what some have called exclusion or sacrifice zones. Absent human presence, such regions demonstrate remarkable resilience and profundity in a relatively short time. Still, it boggles the mind, doesn’t it, to think of two exclusion zones now, Chernobyl and Fukushima, where no one should go until, at the very least, the radioactive half-life has expired? Interestingly, that light at the end of the tunnel, so to speak, seems to be telescoping even farther away from the date of the disaster, a somewhat predictable shifting of the goalposts. I’d conjecture that’s because contamination has not yet ceased and is actually ongoing, but again, what do I know?
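Since I invoked half-lives, a quick back-of-the-envelope shows why the goalposts keep receding: an elapsed half-life doesn’t end the contamination, it merely halves it, and each further half-life halves only the remainder. A minimal sketch of that arithmetic, assuming caesium-137 (half-life roughly 30 years) as the benchmark isotope for long-term contamination:

```python
# Fraction of a radioactive isotope remaining after t years:
# N(t)/N0 = 0.5 ** (t / half_life)
def fraction_remaining(years: float, half_life: float) -> float:
    return 0.5 ** (years / half_life)

# With caesium-137's ~30-year half-life, one half-life from the
# disaster date still leaves half the material in place.
for years in (30, 60, 90, 300):
    print(years, fraction_remaining(years, 30.0))
```

By that arithmetic, even ten half-lives (about 300 years for caesium-137) still leave roughly a tenth of a percent behind, which squares with reopening dates that telescope ever farther from the disaster itself.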

On a lighter note, all this also put me in mind of the hardiness of various foodstuffs. God knows we consume loads of crap that can hardly be called food anymore, from shelf-stable fruit juices and bakery items (e.g., Twinkies) that never go bad to the not-cheese used by Taco Bell and nearly every burger joint in existence to McDonald’s burgers and fries that refuse to spoil even when left out for months to test that very thing. It gives me considerable pause to consider that foodstuff half-lives have been radically and unnaturally extended by creating abominable Frankenfoods that beggar the imagination. For example, strawberries and tomatoes were once known to spoil rather quickly and thus couldn’t withstand long supply lines from farm to table; nor were they available year round. Rather sensibly, people grew their own when they could. Today’s fruits and veggies still spoil, but interventions undertaken to extend their stability have frequently come at the expense of taste and nutrition. Organic and heirloom markets have sprung up to fill those niches, which suggests the true cost of growing and distributing everyday foods that will not survive a nuclear holocaust.

I pull in my share of information about current events and geopolitics despite a practiced inattention to mainstream media and its noisome nonsense. (See here for another who turned off the MSM.) I read or heard somewhere (can’t remember where) that most news outlets, and indeed most other media, now function as outrage engines to drive traffic, generating no small amount of righteous indignation, anger, and frustration at all the things so egregiously wrong in our neighborhoods, communities, regions, and across the world. These are all negative emotions, though legitimate responses to the various scourges plaguing us currently, many of which are self-inflicted. It’s enough aggregate awfulness to draw people into the street again in principled protest, dissent, and resistance; it’s not yet enough to effect change. Alan Jacobs comments about outrage engines, noting that sharing via retweets is not the same as caring. In the Age of Irony, a decontextualized “yo, check this out!” is nearly as likely to be interpreted as support as condemnation (or mere gawking for entertainment value). Moreover, pointing, linking, and retweeting are each costless versions of virtue signaling. True virtue makes no object of publicity.

So where do I get my outrage quotient satisfied? Here is a modest linkfest, in no particular order, of sites not already on my blogroll. I don’t habituate these sites daily, but I drop in, often skimming, enough to keep abreast of themes and events of importance.

First, a few reminders:

  • The United States has been in an undeclared state of war for 15 years, the longest in U.S. history and long enough that young people today can say legitimately, “we’ve always been at war with Oceania.” The wars encompass the entirety of both terms of the Obama Administration.
  • The inciting events were attacks on U.S. soil carried out on September 11, 2001 (popularly, 9/11), which remain shrouded in controversy and conspiracy despite the official narrative assigning patsy blame to al-Qaida, operating out of Afghanistan, with Iraq implicated soon after.
  • On the heels of the attacks, the Bush Administration commenced a propaganda campaign to sell invasion and regime change in those two countries and, over widespread public protest, went ahead and launched preemptive wars, ostensibly because an existential threat existed with respect to weapons of mass destruction (WMDs) possessed by Iraq in particular.
  • The propaganda campaign has since been revealed to have been fabricated, yet it buffaloed a lot of people into believing (even to this day) that Iraq was somehow responsible for 9/11.
  • Our preemptive wars succeeded quickly in toppling governments and capturing (and executing) their leaders but immediately got bogged down securing a peace that never came.
  • Even with an embarrassing mismatch of force, periodic troop surges and drawdowns, trillions of dollars wasted prosecuting the wars, and incredible, pointless loss of life (especially on the opposing sides), our objective in the Middle East (other than the oil, stupid!) has never been clear. The prospect of final withdrawal is nowhere on the horizon.

Continuous war — declared or merely waged — has been true of the U.S. my whole life, though one would be hard pressed to argue that it truly represents an immediate threat to U.S. citizens except those unlucky enough to be deployed in war zones. Still, the monkey-on-the-back is passed from administration to administration. One might have hoped, based on campaign rhetoric, that the new executive (45) would recognize continuous war as the hot potato it is and dispense with it, but the proposed federal budget, with its $52 billion increase in military spending (+10% over 2016), suggests otherwise. Meanwhile, attention has been turned away from true existential threats that have been bandied about in the public sphere for at least a decade: global warming and climate change leading to Near-Term Extinction (NTE). Proximal threats, largely imagined, have absorbed all our available attention, and depending on whom one polls, our worst fears have already been realized.

The 20th and 21st centuries (so far) have been a series of “hot” wars (as distinguished from the so-called Cold War). Indeed, there has scarcely been a time when the U.S. has not been actively engaged fighting phantoms. If the Cold War was a bloodless, ideological war to stem the nonexistent spread of communism, we have since adopted and coopted the language of wartime to launch various rhetorical wars. First was LBJ’s War on Poverty, the only “war” aimed at truly helping people. Nixon got into the act with his War on Drugs, which was punitive. Reagan expanded the War on Drugs, which became the War on Crime. Clinton increased the punitive character of the War on Crime by expanding mandatory minimum sentencing, a side effect of which was establishing what some call the prison-industrial complex. The incarceration rate of Americans has been inflated to the point that the U.S. now ranks second in the world behind the Seychelles (!), far, far higher than any other industrialized nation.

As if U.S. authoritarians hadn’t found enough people to punish, or hadn’t sufficiently convinced the public that threats exist on all sides (requiring constant vigilance and a massive security apparatus including the military, civil police, and intelligence services comprising 16 separate agencies, that we know of), Bush coined and declared the War on Terror, aimed at punishing those, foreign and domestic, who dare challenge U.S. hegemony in all things. It’s not called a national security state for nuthin’, folks. I aver that the rhetorical War on Poverty has inverted and now become a War on the Poverty-Stricken. De facto debtors’ prisons have reappeared, predatory lending has become commonplace, and income inequality grows more exaggerated with every passing year, leaving behind large segments of the U.S. population as income and wealth pool in an ever-shrinking number of hands. Admittedly, the trend is global.

At some point, perhaps in the 1960s when The Establishment (or more simply, The Man) became a thing to oppose, the actual Establishment must have decided it was high time to circle the wagons and protect its privileges, essentially going to war with (against, really) the people. Now five decades on, holders of wealth and power demonstrate disdain for those outside their tiny circle, and the government can no longer be said with a straight face to be of, by, and for the people (paraphrasing the last line of Lincoln’s Gettysburg Address). Rather, the government has been hijacked and turned into something abominable. Yet the people are strangely complicit, having allowed history to creep along with social justice in marked retreat. True threats do indeed exist, though not the ones that receive the lion’s share of attention. I surmise that, as with geopolitics, the U.S. government has brought into being an enemy and conflict that bode ill for its legitimacy. Which collapse occurs first is anyone’s guess.