Archive for the ‘Industrial Collapse’ Category

I don’t have the patience or expertise to prepare and offer a detailed political analysis such as those I sometimes (not very often) read on other blogs. Besides, once the comments start filling up at those sites, every possible permutation is trotted out, muddying the initial or preferred interpretation with alternatives that make at least as much sense. They’re interesting brainstorming sessions, but I have to wonder what is accomplished.

My own back-of-the-envelope analysis is much simpler and probably no closer to (or farther from) being correct, what with everything being open to dispute. So the new POTUS was born in 1946, which puts the bulk of his boyhood in the 1950s, overlapping with the Eisenhower Administration. That period has lots of attributes, but the most significant (IMO), at least as it would impress an adolescent, was the U.S. economy launching into the stratosphere, largely on the back of the manufacturing sector (e.g., automobiles, airplanes, TVs, etc.), and creating the American middle class. The interstate highway system also dates from that decade. Secondarily, there was a strong but misplaced sense of American moral leadership (one might also say authority or superiority), since we took (too much) credit for winning WWII.

However, it wasn’t great for everyone. Racism, misogyny, and other forms of bigotry were open and virulent. Still, if one was lucky enough to be a white, middle-class male, things were arguably about as good as they would get, which many remember rather fondly, whether through rose-colored glasses or otherwise. POTUS as a boy wasn’t middle class, but the culture around him supported a worldview that he embodies even now. He’s also never been an industrialist, but he is a real estate developer (some would say slumlord) and media figure, and his models are taken from the 1950s.

The decade of my boyhood was the 1970s, which spanned the Nixon, Ford, and Carter Administrations. Everyone could sense the wheels were already coming off the bus, and white male entitlement was far diminished from previous decades. The Rust Belt was already a thing. Like children from the 1950s forward, however, I spent a lot of time in front of the TV. Much of it was goofy fun such as Gilligan’s Island, The Brady Bunch, and interestingly enough, Happy Days. It was innocent stuff. What are the chances that, as a boy plopped in front of the TV, POTUS would have seen the show below (excerpted) and taken special notice considering that the character shares his surname?

Snopes confirms that this is a real episode from the TV show Trackdown. Not nearly as innocent as the shows I watched. The coincidences that the character is a con man, promises to build a wall, and claims to be the only person who can save the town are eerie, to say the least. Could that TV show be lodged in the back of POTUS’ brain, along with so many other boyhood memories, misremembered and revised the way memory tends to do?

Some have said that the great economic expansion of the 1950s and 60s was an anomaly: a constellation of conditions configured to produce a historical effect, a Golden Era by some reckonings, that cannot be repeated. We simply cannot return to an industrial or manufacturing economy that had once (arguably) made America great. And besides, the attempt would accelerate the collapse of the ecosystem, which is already in free fall. Yet that appears to be the intention of POTUS, whose early regression to childhood is a threat to us all.

Anthropologists, pundits, armchair cultural critics (like me), and others sometimes offer an aspect or characteristic, usually singular, that separates the human species from other animals. (Note: humans are animals, not the crowning creation of god in his own image, the dogma of major religions.) Typical singular aspects include tool use (very early on, fire), language, agriculture, self-awareness (consciousness), and intelligence, the last of which especially includes the ability to conceptualize time and thus remember and plan ahead. The most interesting candidate suggested to me is our ability to kill from a distance. Without going into a list of things we don’t think we share with other species but surprisingly do, it interests me that no other species possesses the ability to kill at a distance (someone will undoubtedly prove me wrong on this).

Two phrases spring to mind: nature is red in tooth and claw (Tennyson) and human life is nasty, brutish, and short (Hobbes). Both encapsulate what it means to have to kill to eat, which is hardly unique to animals. All sorts of plants, insects, and microorganisms embed themselves in hosts, sometimes killing the host and themselves. Symbiotic relationships also exist. The instance that interests me, though, is the act of killing in the animal kingdom that requires putting one’s own body at risk in a life-or-death attack. Examples falling short of killing abound, such as intimidation to establish hierarchy, but to eat, an animal must kill its prey.

Having watched my share of historical fiction (pre-1800, say, but especially sword-and-sandal and medieval epics) on the TeeVee and at the cinema, I find that the dramatic appeal of warring armies slamming into each other never seems to get old. Fighting is hand-to-hand or sword-to-sword, which amount to the same thing. Archers’ arrows, projectiles launched from catapults and trebuchets, thrown knives, spears, and axes, and pouring boiling oil over parapets are killing from a relatively short distance, but the action eventually ends up being very close. The warrior code in fighting cultures honors the willingness to put oneself in harm’s way, to risk one’s own body. Leaders often exhibit mutual respect and may even share some intimacy. War may not be directly about eating, since humans are not cannibals under most circumstances; rather, it’s usually about control of resources, so secondarily about eating by amassing power. Those historical dramas often depict victors celebrating by enjoying lavish feasts.

Modern examples of warfare and killing from a distance make raining down death from above a bureaucratic action undertaken with little or no personal risk. Artillery, carpet bombing from 20,000 feet, drone strikes (controlled from the comfort of some computer lab in the Utah desert), and nuclear bombs are the obvious examples. No honorable warrior code attaches to such killing. Indeed, the chain of command separates the execution of kill orders from moral responsibility — probably a necessary disconnect when large numbers of casualties (collateral damage, if one prefers the euphemism) can be expected. Only war criminals, either high on killing or banally impervious to empathy and compassion, would dispatch hundreds of thousands at a time.

If killing from a distance is in most cases about proximity or lack thereof, one further example is worth mentioning: killing across time. While most don’t really conceptualize the space-time continuum as interconnected, the prospect of choices made today manifesting in megadeath in the foreseeable future is precisely the sort of bureaucratized killing from a distance that should be recognized and forestalled. Yet despite our supposed intellectual superiority over other species, we cannot avoid waging war, real and rhetorical, to control resources and narratives that enable us to eat. Eating the future would be akin to consuming seed corn, but that metaphor is not apt. Better perhaps to say that we’re killing the host. We’re embedded in the world, as indeed is everything we know to be alive, and rely upon the profundity of the biosphere for survival. Although the frequent charge is that humanity is a parasite or has become a cancer on the world, that tired assessment, while more accurate than not, is a little on the nose. A more charitable view is that humanity, as the apex predator, has expanded its habitat to include the entire biosphere, killing to eat, and is slowly consuming and transforming it into a place uninhabitable by us, just as a yeast culture consumes its medium and grows to fill the space before dying all at once. So the irony or Pyrrhic victory is that while we may fatten ourselves (well, some of us) in the short term, we have also created conditions leading to our own doom. Compared to other species whose time on Earth lasted tens of millions of years, human life on Earth turns out to be exactly what Hobbes said: nasty, brutish, and short.

Back in the day, I studied jazz improvisation. Like many endeavors, it takes dedication and continuous effort to develop the ear and learn to function effectively within the constraints of the genre. Most are familiar with the simplest form: the 12-bar blues. Whether one is more attuned to rhythm, harmony, lyrics, or structure doesn’t much matter; all elements work together to define the blues. For a novice improviser, structure is easy to grasp and lyrics don’t factor in (I’m an instrumentalist), but harmony and rhythm, simple though they may be to understand, are formidable when one is making up a solo on the spot. That’s improvisation. In class one day, after two passes through the chord changes, the instructor asked me how I thought I had done, and I blurted out that I was just trying to fill up the time. Other students heaved a huge sigh of recognition and relief: I had put my thumb on our shared anxiety. None of us were skilled enough yet to be fluent or to actually have something to say — the latter especially the mark of a skilled improviser — but were merely trying to plug the hole when our turn came.
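
For readers who don’t know the form, here is a minimal sketch of one common version of the changes in the key of C. The chord chart and the little printing helper below are purely illustrative (countless variations of the blues exist), not a transcription of anything from that class:

```python
# A minimal sketch of one common 12-bar blues progression in the key of C,
# one dominant-seventh chord per bar. The V-IV turnaround in the last four
# bars is one common choice among many.
TWELVE_BAR_BLUES = [
    "C7", "C7", "C7", "C7",   # bars 1-4:  I  I  I  I
    "F7", "F7", "C7", "C7",   # bars 5-8:  IV IV I  I
    "G7", "F7", "C7", "G7",   # bars 9-12: V  IV I  V (turnaround)
]

def print_changes(chords, bars_per_line=4):
    """Print the chord changes four bars per line, lead-sheet style."""
    for i in range(0, len(chords), bars_per_line):
        print(" | ".join(chords[i:i + bars_per_line]))

print_changes(TWELVE_BAR_BLUES)
```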

These days, weekends feel sorta the same way. On Friday night, the next two days often feel like a yawning chasm where I plan what I know from experience will be an improvisation, filling up the available time with shifting priorities, some combination of chores, duties, obligations, and entertainments (and unavoidable bodily functions such as eating, sleeping, etc.). Often enough I go back to work with stories to tell about enviable weekend exploits, but just as often I have a nagging feeling that I’m still a novice with nothing much to say or contribute, just filling up the time with noise. And as I contemplate what years and decades may be left to me (if the world doesn’t crack up first), the question arises: what big projects would I like to accomplish before I’m done? That, too, seems an act of improvisation.

I suspect recent retirees face these dilemmas with great urgency until they relax and decide “who cares?” What is left to do, really, before one finally checks out? If careers are completed, children are raised, and most of life’s goals are accomplished, what remains besides an indulgent second childhood of light hedonism? Or more pointedly, what about one’s final years keeps them from feeling like quiet desperation or simply waiting for the Grim Reaper? What last improvisations and flourishes are worth undertaking? I have no answers to these questions. They don’t press upon me just yet with any significance, and I have suffered no midlife crisis (so far) that would spur me to address them head on. But I can feel them gathering in the back of my mind like a shadow — especially with the specters of American-style fascism, financial and industrial collapse, and NTE looming.

Predictions are fool’s errands. Useful ones, anyway. The future branches in so many possible directions that truly reliable predictions are banal, such as that the sun will rise in the east, or death and taxes. (NTE is arguably another term for megadeath, but I gotta reinforce that prediction to keep my doomer bona fides.) Now, only a few days prior to the general election, I find myself anxious that the presidential race is still too close to call. More than a few pundits say that Donald Trump could actually win. At the same time, a Hillary Clinton win gives me no added comfort, really. Moreover, potential squabbles over the outcome threaten to turn the streets into riot zones. I had rather expected such disruptions during or after the two nominating conventions, but they settled on their presumptive nominees without drama.

Polls are often predictive, of course, and despite their acknowledged margins of error, they typically forecast results with enough confidence that many voters don’t bother to vote, safe in the assumption that predicted results (an obvious oxymoron) make moot the need to actually cast one’s vote. (The West Coast must experience this phenomenon more egregiously than the East Coast, though California’s rather large population and voting power may be an exception. Has Hawaii ever mattered?) For that reason alone, I’d like to see a blackout on polling in the weeks leading up to an election (2–3 ought to do), including election day. This would allow us to avoid repeating the experience of the Chicago Daily Tribune publishing the headline “Dewey Defeats Truman” back in 1948.

Analysis of voting patterns and results also dissuades voters from considering anything other than a strategic vote for someone able to actually win, as opposed to supporting worthy candidates polling so far behind that they stand no chance. This reinforces a two-party system no one really likes because it keeps delivering supremely lousy candidates. Jesse Ventura, having defied the polls and been elected to office as an independent, has been straightforward about his disdain for the notion that voting outside the two main parties is tantamount to throwing away one’s vote. A related meme is that votes for Green Party candidate Ralph Nader in 2000 effectively split the Democratic vote, handing the win (extraordinarily close and contestable though it was) to George Bush. My thinking aligns with Jesse Ventura, not with those who view votes for Ralph Nader as betrayals.

With the presidential race still too close for comfort, Michael Moore offers a thoughtful explanation of how Trump could win:

This excerpt from Moore’s new film TrumpLand has been taken out of context by many pro-Trump ideologues. I admit that the first time I saw it, I was unsure whether Moore supports Trump. Additional remarks elsewhere indicate that he does not. The spooky thing is that, as emotional appeals go, Trump clearly connects with people powerfully. But Moore is right about another thing: a vote for Trump is really a giant “fuck you” to the establishment, which won’t end well.

In what has become a predictable status quo, President Obama recently renewed our official state of emergency with respect to the so-called War on Terror. It’s far too late to declare a new normal; we’ve been in this holding pattern for 15 years now. The article linked above provides this useful context:

There are now 32 states of national emergency pending in the United States, with the oldest being a 1979 emergency declared by President Jimmy Carter to impose sanctions during the Iran hostage crisis. Most are used to impose economic sanctions — mostly as a formality, because Congress requires it under the International Emergency Economic Powers Act.

In his term in office, Obama has declared 13 new emergencies, continued 21 declared by his predecessors and revoked just two, which imposed sanctions on Liberia and Russia.

Pro forma renewal of multiple states of national emergency is comparable to the 55-year-old U.S. embargo against Cuba, due for reauthorization next month, though indications are that the embargo may finally be relaxed or deauthorized. Both are examples of miserably failed policy, but they confer a semblance of power on the executive branch. Everyone knows by now that no one relinquishes power willingly, so Obama, like chief executives before him, keeps on keeping on ad nauseam.

Considering Obama’s credentials as a Constitutional scholar, rare among U.S. presidents, one might expect him to weigh his options with greater circumspection and with an eye toward restoring suspended civil liberties. However, he has shown little interest in doing so (as far as I know). In combination with the election only a couple months away, the U.S. appears to be in a position similar to Germany in 1932 — ready and willing to elect a despot (take your pick …) and continue its slide into fascism. Can’t even imagine avoiding that outcome now.

The surprising number of ongoing emergencies brings to mind James Howard Kunstler and his book The Long Emergency (2005). Though I haven’t read the book (I’m a failed doomer, I suppose), my understanding is that his prediction of a looming and lingering emergency is based on two intertwined factors currently playing out in geopolitics: peak oil and global warming. (“Climate change” is now preferred over “global warming.”) Those two dire threats (and the California drought) have faded somewhat from the headlines, partially due to fatigue, replaced primarily by terrorism and economic stresses, but the dangers never went away. Melting ice caps and glaciers are probably the clearest incontrovertible indications of anthropogenic global warming, which is poised to trigger nonlinear climate change and hasten the Sixth Extinction. We don’t know when, precisely, though time is growing short. Similarly, reports on energy production and consumption are subject to considerable falsification in the public sphere, making it impossible to know just how close in time we are to a new energy crisis. That inevitability has also been the target of a disinformation campaign, but even a rudimentary understanding of scientific principles is sufficient to enable clear thinkers to penetrate the fog.

I have no plans to return to doom blogging with any vigor. One emergency stacked upon the next, ready to collapse in a cascade of woe, has defeated me, and I have zero expectation that any real, meaningful response can be formulated and executed, especially while we are distracted with terrorism and creeping fascism.

I hate to sound a conspiratorial note, and you’re free to disregard what follows, but it seems worthwhile to take further notice of the rash of violence last week.

In a commentary by John Whitehead at The Rutherford Institute, blame for what Whitehead calls “America’s killing fields” is laid at the feet of a variety of entities, including numerous elected officials and taxpayer-funded institutions. The most important quote, appearing right at the top, is this:

We have long since passed the stage at which a government of wolves would give rise to a nation of sheep. As I point out in my book Battlefield America: The War on the American People, what we now have is a government of psychopaths that is actively breeding a nation of psychopathic killers. [links redacted]

While this may read as unsupported hyperbole to some, I rather suspect Whitehead tells a truth hidden in plain sight — one we refuse to acknowledge because it’s so unsavory. Seeing that Whitehead gave it book-length consideration, I’m inclined to grant his contention. One can certainly argue about intent, objectives, mechanisms, and techniques. Those are open to endless interpretation. I would rather concentrate on results, which speak for themselves. The fact is that in the U.S., Western Europe, and the Middle East, a growing number of people are in effect wind-up toys being radicalized and set loose. Significantly, recent perpetrators of violence are not only the disenfranchised but also police, current and former military, politicians, and pundits whose mindsets are not directed to diplomacy but instead establish “taking out the enemy” as the primary response to conflict. The enemy is also being redefined irrationally to include groups identified by race, religion, vocation, political persuasion, etc. (always has been, in fact, though the more virulent manifestations were driven underground for a time).

Childhood wind-up toys are my chosen metaphor because they’re mindless, pointless devices that are energized, typically by tightening a spring, and released for idle entertainment to move around and bump into things harmlessly until they sputter out. Maniacal mass killers “bump into” targets selected randomly via simple proximity to some venue associated with the killer’s pet peeve, so victims are typically in the wrong place at the wrong time. Uniformed police might be the exception. One might ask who or what is doing the winding of the spring. The pull quote above says it’s a government of psychopaths breeding yet more psychopaths. That is certainly true with respect to the ruling classes — what used to be the aristocracy in older cultures but now is more nearly a kleptocracy in the U.S. — and members of a monstrous security apparatus (military, civil police, intelligence services, etc.) now that the U.S. has effectively become a garrison state. Self-reinforcing structures have hardened over time, and their members perpetuate them. I’ve even heard suspicions that citizens are being “chipped,” that is, programmed in the sense of psyops to explode into mayhem with unpredictable certainty, though for what purpose I can only imagine.

The simpler explanation that makes more sense to me is that our culture is crazy-making. We no longer function well in a hypercomplex world — especially one so overloaded with information — without losing our grounding, our grip on truth, meaning, and value, and going mad. Contemporary demands on the nervous system have outstripped biological adaptation, so we respond to constant strain and stress with varying levels of dysfunction. No doubt some folks handle their difficulties better than others; it’s the ones who snap their springs who are of grave concern these days. Again, the mechanism isn’t all that important, as the example from Nice, France, demonstrates. Rather, it’s about loss of orientation that allows someone to rationalize killing a bunch of people all at once as somehow a good idea. Sadly, there is no solution so long as our collective attention is trained on the wrong things, perpetuating a network of negative feedback loops that makes us all loopy and a few of us highly dangerous. Welcome to the asylum.

An enduring trope of science fiction is the naming of newly imagined gadgets and technologies (often called technobabble with a mixture of humor and derision), as well as the naming of segments of human and alien societies. In practice, that means renaming already familiar things to add a quasi-futuristic gleam, and it’s a challenge faced by every story that adopts an alternative or futuristic setting: describing the operating rules of the fictional world but with reference to recognizable human characteristics and institutions. A variety of recent Young Adult (YA) fiction has indulged in this naming and renaming, some of which has been made into movies, mostly dystopic in tone, e.g., the Hunger Games tetralogy, the Twilight saga, the Harry Potter series, the Maze Runner, and the Divergent trilogy. (I cite these because, as multipart series, they are stronger cultural touchstones, like Star Wars, than once-and-done adult cinematic dystopias such as Interstellar and Elysium. Star Trek is a separate case, considering how it has devolved after being rebooted from its utopian though militaristic origins into a pointless series of action thrillers set in space.) Some exposition rises to the level of lore but is mostly mere scene-setting removed slightly from our own reality. Similar naming schemes are used in cinematic universes born out of comic books, especially character names, powers, and origins. Because comic book source material is extensive, almost all of it becomes lore, which is enjoyed by longtime initiates who grew up inside the alternate universes created by the writers and illustrators but mildly irritating to adult moviegoers like me.

History also has names for eras and events sufficiently far back in time for hindsight to provide a clear vantage point. In the U.S., we had the Colonial Era, the Revolutionary Period, the Frontier Era and Wild West, the Industrial/Mechanical Age, Modernism, and Postmodernism, to name a few but by no means all. Postmodernism is already roughly 40 years old, but we have not yet named the era in which we now live. Indeed, because we’re the proverbial fish inside the fishbowl, unable to recognize the water in which we swim, the contemporary moment may have no need of naming, now or at any given time. That task awaits those who follow. We have, however, given names to the succession of generations following the Baby Boom. How well their signature characteristics fit their members is the subject of considerable debate.

As regular readers of this blog already know, I sense that we’re on the cusp of something quite remarkable, most likely a hard, discontinuous break from our recent past. Being one of the fish in the bowl, I probably possess no better understanding of our current phase of history than the next. Still, if I had to choose one word to describe the moment, it would be dissolution. My 4-part blog post about dissolving reality is one attempt to provide an outline. A much older post called aged institutions considers the time-limited effectiveness of products of human social organization. The grand question of our time might be whether we are on the verge of breaking apart or simply transitioning into something new — will it be catastrophe or progress?

News this past week of Britain’s exit from the European Union may be only one example of break-up vs. unity, but the drive toward secession and separatism (tribal and ideological, typically based on bogus and xenophobic identity groups constantly thrown in our faces) has been gaining momentum even in the face of economic globalization (collectivism). Scotland very nearly seceded from the United Kingdom in 2014; Quebec has had multiple referenda about seceding from Canada, none yet successful; and Vermont, Texas, and California have all flirted with secession from the United States. No doubt some would argue that such examples of dissolution, actual or prospective, are actually transitional, meaning progressive. And perhaps they do in fact fulfill the need for smaller, finer, localized levels of social organization that many have argued are precisely what an era of anticipated resource scarcity demands. Whether what actually manifests is catastrophe (as I expect) or progress is, of course, something history and future historians will eventually name.

The question comes up with some regularity: “Where were you when …?” What goes in place of the ellipsis has changed with the generations. For my parents, it was the (first) Kennedy assassination (1963). For my siblings and chronological peers, it was first the Three Mile Island accident (1979) and then the Challenger disaster (1986). For almost everyone since, it’s been the September 11 attacks (2001), though a generation lacking memory of those events is now entering adulthood. These examples are admittedly taken from mainstream U.S. culture. If one lives elsewhere, it might be the Mexico City earthquake (1985), the Chernobyl disaster (1986), the Indonesian tsunami (2004), the Haitian earthquake (2010), the Fukushima disaster (2011), or any number of other candidates populating the calendar. Even within the U.S., other more local events might take on greater significance, such as Hurricane Katrina (2005) along the Gulf Coast or Hurricane Iniki (1992) in Hawaii, the latter of which gives September 11 a very different meaning for those who suffered through it.

What these events all have in common is partially a loss of innocence (particularly the man-made disasters) and a feeling of suspended animation while events are sorted out and news is processed. I remember with clarity how the TV news went into full disaster mode for the Challenger disaster, which proved to be the ridiculous template for later events, including 9/11. Most of the coverage was denial of the obvious and arrant speculation, but the event itself was captivating enough that journalists escaped the wholesale condemnation they plainly deserved. The “where were you?” question is usually answered with the moment one became aware of some signal event, such as “I was on the bus” or “I was eating dinner.” News vectors changed dramatically from 1986 to 2001, as two relatively arbitrary points of comparison within my lifetime. Being jarred out of complacency and perceiving the world suddenly as a rather dangerous place hardly expresses the weird wait-and-see response most of us experience in the wake of disaster.

Hurricanes typically come with at least a narrow warning, but other events strike without clear expectation — except perhaps in terms of their inevitability. That inevitability informs expectations of further earthquakes (e.g., San Andreas and New Madrid faults) and volcanic eruptions (e.g., the Yellowstone supervolcano), though the predictive margin of error can be hundreds or even thousands of years. My more immediate concern is with avoidable man-made disasters that are lined up to fall like a series of dominoes. As with natural disasters, we’re all basically sitting ducks, completely vulnerable to whatever armed mayhem may arise. But rather than enter into suspended animation in the wake of events, current political, financial, and ecological conditions find me metaphorically ducking for cover in expectation of the inevitable blow(s). Frankly, I’ve been expecting political and financial crack-ups for years, and current events demonstrate extremely heightened risk. (Everyone seems to be asking which will be worse: a Trump or Clinton presidency? No one believes either candidate can guide us successfully through the labyrinth.) A tandem event (highly likely, in my view) could easily trigger a crisis of significant magnitude, given the combination of violent hyperbole and thin operational tolerance for business as usual. I surmise that anyone who offers the line “may you live in interesting times” has a poor understanding of what’s truly in store for us. What happens with full-on industrial collapse is even harder to contemplate.

Today is the 10-year anniversary of the opening of this blog. As a result, there is a pretty sizeable backblog should anyone decide to wade in. As mentioned in my first post, I only opened this blog to get posting privileges at a group blog I admired because it functioned more like a discussion than a broadcast. The group blog died of attrition years ago, yet here I am 10 years later still writing my personal blog (which isn’t really about me).

Social media lives and dies by the numbers, and mine are deplorable. Annual traffic has ranged from about 6,800 to about 12,500 hits, much of which I’m convinced is mere background noise and bot traffic. Cumulative hits number about 90,140, and unique visitors are about 19,350, neither of which is anything to crow about for a blog of this duration. My subscriber count continues to climb pointlessly, now resting at 745. However, I judge I might have only a half dozen regular readers and perhaps half again as many commenters. I’ve never earned a cent for my effort, nor am I likely to ever put up a Patreon link or similar goad for donations. All of which only demonstrates that almost no one cares what I have to write about. C’est la vie. I don’t write for that purpose and frankly wouldn’t know what to write about if I were trying to drive numbers.

So if you have read my blog, what are some of the things you might have gathered from me? Here’s an incomplete synopsis:

  • Torture is unspeakably bad. History is full of devices, methodologies, and torturers, but we learned sometime in the course of two 20th-century world wars that nothing justifies it. Nevertheless, it continues to occur with surprising relish, and those who still torture (or want to) are criminally insane.
  • Skyscrapers are awesomely tall examples of technical brilliance, exuberance, audacity, and hubris. Better expressions of techno-utopian, look-mom-no-hands, self-defeating narcissism can scarcely be found. Yet they continue to be built at a feverish pace. The 2008 financial collapse stalled and/or doomed a few projects, but we’re back to game on.
  • Classical music, despite record budgets for performing ensembles, has lost its bid for anything resembling cultural and artistic relevance by turning itself into a museum (performing primarily works of long-dead composers) and abandoning emotional expression in favor of technical perfection, which is probably an accurate embodiment of the spirit of the times. There is arguably not a single living composer who has become a household name since Aaron Copland, who died in 1990 but whose fame peaked in the 1940s and 50s.
  • We’re doomed — not in any routine sense of the word having to do with individual mortality but in the sense of Near-Term (Human) Extinction (NTE). The idea is not widely accepted in the least, and the arguments are too lengthy to repeat (and unlikely to convince). However, for those few able to decipher it, the writing is on the wall.
  • American culture is a constantly moving target, difficult to define and describe, but its principal features are only getting uglier as time wears on. Resurgent racism, nationalism, and misogyny make clear that while some strides have been made, these attitudes were only driven underground for a while. Similarly, colonialism never really died but morphed into a new version (globalization) that escapes criticism from the masses, because, well, goodies.
  • Human consciousness — another moving target — is cratering (again) after 3,000–5,000 years. We have become hollow men, play actors, projecting false consciousness without core identity or meaning. This cannot be sensed or assessed easily from the first-person perspective.
  • Electronic media makes us tools. The gleaming attractions of sterile perfection and pseudo-sociability have hoodwinked most of the public into relinquishing privacy and intellectual autonomy in exchange for the equivalent of Huxley’s soma. This also cannot be sensed or assessed easily from the first-person perspective.
  • Electoral politics is a game played by the oligarchy for chumps. Although the end results are not always foreseeable (Jeb!), the narrow range of options voters are given (lesser of evils, the devil you know …) guarantees that fundamental change in our dysfunctional style of government will not occur without first burning the house down. After a long period of abstention, I voted in the last few elections, but my heart isn’t really in it.
  • Cinema’s infatuation with superheroes and bankable franchises (large overlap there) signals that, like other institutions mentioned above, it has grown aged and sclerotic. Despite large budgets and impressive receipts (the former often over $100 million and the latter now in the billions for blockbusters) and considerable technical prowess, cinema has lost its ability to be anything more than popcorn entertainment for adolescent fanboys (of all ages).

This is admittedly a pretty sour list. Positive, worthwhile manifestations of the human experience are still out there, but they tend to be private, modest, and infrequent. I still enjoy a successful meal cooked in my own kitchen. I still train for and race in triathlons. I still perform music. I still make new friends. But each of these examples is also marred by corruptions that penetrate everything we do. Perhaps it’s always been so, and as I, too, age, I become increasingly aware of inescapable distortions that can no longer be overcome with innocence, ambition, energy, and doublethink. My plan is to continue writing the blog until it feels like a burden, at which point I’ll stop. But for now, there’s too much to think and write about, albeit at my own leisurely pace.

A long while back (8 years ago), I drew attention to a curious bit of rhyming taking place in the world of architecture: the construction of skyscrapers that twist from base to top (see also here). I even suggested that one per city was needed, which seems to be slowly manifesting. Back then, the newest installment was the Infinity Tower, now fully built and known as the Cayan Tower. The planned Chicago Spire, apparently doomed, has yet to get off the ground. Another incarnation of the basic twisting design is the Evolution Tower in Moscow, completed in 2014 (though I only just learned about it):

[photo: the Evolution Tower, Moscow]

There are plenty more pics at the Skyscraper page devoted to this building.

News of this development comes to me by way of James Howard Kunstler’s Eyesore of the Month feature at his website. I draw attention to Kunstler because he is far better qualified to evaluate and judge architecture than am I, even though most of his remarks are disparaging. Kunstler and I share both aesthetic and doomer perspectives on stunt architecture, and the twisting design seems to be one faddish way to avoid the boxy, straight-line approach to supertall buildings that dominated for some fifty years. Indeed, many buildings of smaller stature now seek that same avoidance, which used to be accomplished via ornamentation but is now structural. Such designs and construction are enabled by computers, though it remains to be seen how long maintenance and repair can be sustained in an era of diminishing financial resources. (Material resources are a different but related matter, but these days, almost no one bothers with anything without financial incentive or reward.)
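
To give a sense of how trivial the underlying geometry is for a computer (the hard part being fabrication and upkeep, not design), here is a toy sketch. The twist angle, story count, and floor-plate width are assumptions for illustration only, roughly echoing figures reported for the Cayan Tower, not anyone’s actual structural model:

```python
import math

# Toy sketch of the parametric geometry behind a uniformly twisting tower.
# All parameters are illustrative assumptions (a 90-degree total twist over
# 75 stories), not the specifications of any real building.
TOTAL_TWIST_DEG = 90.0  # assumed twist from base to top
STORIES = 75            # assumed story count
HALF_WIDTH = 20.0       # assumed half-width of a square floor plate, meters

def corner_position(floor):
    """(x, y) of one corner of the floor plate at a given story."""
    angle = math.radians(floor * TOTAL_TWIST_DEG / STORIES)
    return (HALF_WIDTH * math.cos(angle), HALF_WIDTH * math.sin(angle))

print(f"twist per floor: {TOTAL_TWIST_DEG / STORIES:.2f} degrees")
print("corner at ground:", corner_position(0))
print("corner at roof:  ", corner_position(STORIES))
```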

When the last financial collapse occurred in 2008 (extending into 2009, with recovery since then mostly faked), lots of projects were mothballed. I note, however, that Chicago has many new projects underway, and I can only surmise that other skylines are similarly full of cranes signaling the return of multibillion-dollar construction projects aimed at the well-off. Mention of crumbling infrastructure has been ongoing for decades now. Here’s one recent example. Yet attention and funding seem to flow in the direction of projects that really do not need doing. While it might be true that the discrepancy here lies with public vs. private funding, it appears to me to be another case of mismanaging our own affairs by focusing too much on marquee projects while allowing dated and perhaps less attractive existing structures to decay and crumble.