Posts Tagged ‘Absurdity’

Most of us are familiar with a grandpa, uncle, or father who eventually turns into a cranky old man during late middle age or in his dotage. (Why is it a mostly male phenomenon?) In the last three decades, Clint Eastwood typecast himself as a cranky old man, building on lone-wolf characters (mostly cops, criminals, and cowboys) established earlier in his career. In real life, these guys spout talking points absorbed from mainstream media and narrative managers, or if they are truly lazy and/or can’t articulate anything coherently on their own, merely forward agitprop via e-mail like chain mail of yore. They also demonstrate remarkably forgivable racism, sexism, and bigotry, such as Eastwood’s rather enjoyable and ultimately redeemed character in the film Gran Torino. If interaction with such a fellow is limited to Thanksgiving gatherings once per year, crankiness can be tolerated fairly easily. If interactions are ongoing, then a typical reaction is simply to delete e-mail messages unread, or in the case of unavoidable face-to-face interaction, to chalk it up: Well, that’s just Grandpa Joe or Uncle Bill or Dad. Let him rant; he’s basically harmless now that he’s so old he creaks.

Except that not all of them are so harmless. Only a handful of the so-called Greatest Generation (I tire of the term but it’s solidly established) remain in positions of influence. However, lots of Boomers still wield considerable power despite their advancing age, looming retirement (and death), and basic out-of-touchness with a culture that has left them behind. Nor are their rants and bluster necessarily wrong. See, for instance, this rant by Tom Engelhardt, which begins with these two paragraphs:

Let me rant for a moment. I don’t do it often, maybe ever. I’m not Donald Trump. Though I’m only two years older than him, I don’t even know how to tweet and that tells you everything you really need to know about Tom Engelhardt in a world clearly passing me by. Still, after years in which America’s streets were essentially empty, they’ve suddenly filled, day after day, with youthful protesters, bringing back a version of a moment I remember from my youth and that’s a hopeful (if also, given Covid-19, a scary) thing, even if I’m an old man in isolation in this never-ending pandemic moment of ours.

In such isolation, no wonder I have the urge to rant. Our present American world, after all, was both deeply unimaginable — before 2016, no one could have conjured up President Donald Trump as anything but a joke — and yet in some sense, all too imaginable …

If my own father (who doesn’t read this blog) could articulate ideas as well as Engelhardt, maybe I would stop deleting unread the idiocy he forwards via e-mail. Admittedly, I could well be following in my father’s footsteps, as the tag rants on this blog indicates, but at least I write my own screed. I’m far less accomplished at it than, say, Engelhardt, Andy Rooney (in his day), Ralph Nader, or Dave Barry, but then, I’m only a curmudgeon-in-training, not having fully aged (or elevated?) yet to cranky old manhood.

As the fall presidential election draws near (assuming that it goes forward), the choice in the limited U.S. two-party system is between two cranky old men, neither of whom is remotely capable of guiding the country through this rough patch at the doomer-anticipated end of human history. Oh, and BTW, echoing Engelhardt’s remark above, 45 has been a joke all of my life — a dark parody of success — and remains so despite occupying the Oval Office. Their primary opponent up until only a couple months ago was Bernie Sanders, himself a cranky old man but far more endearing at it. This is what passes for the best leadership on offer?

Many Americans are ready to move on to someone younger and more vibrant, able to articulate a vision for something, well, different from the past. Let’s skip right on past candidates (names withheld) who parrot the same worn-out ideas as our fathers and grandfathers. Indeed, a meme emerged recently to the effect that the Greatest Generation saved us from various early 20th-century scourges (e.g., Nazis and Reds) only for the Boomers to proceed in their turn to mess up the planet so badly nothing will survive new scourges already appearing. It may not be fair to hang such labels uniformly around the necks of either generation (or subsequent ones); each possesses unique characteristics and opportunities (some achieved, others squandered) borne out of their particular moment in history. But this much is clear: whatever happens with the election and whichever generational cohort assumes power, the future is gonna be remarkably different.

I’m aware of at least two authors who describe American character in less than glowing terms: Alexis de Tocqueville and Morris Berman. Tocqueville’s book Democracy in America (two vols., 1835 and 1840) is among the most cited, least read (by 21st-century Americans, anyway) books about America. (I admit to not having read it.) Berman’s American trilogy (titles unnamed, all of which I’ve read) is better known by contemporary Americans (those who read, anyway) and is unflinching in its denunciation of, among other things, our prideful stupidity. Undoubtedly, others have taken a crack at describing American character.

American identity, OTOH, if such a thing even exists, is somewhat more elusive for a variety of reasons. For instance, Americans lack the formative centuries or millennia of history Europeans and Asians have at their disposal. Moreover, Americans (except for Native Americans — multiple synonyms and euphemisms available) are immigrants (or their descendants) drawn from all around the globe. Accordingly, we lack a coherent unifying narrative about who we are. The American melting pot may come closest but is insufficient in its generality. National identity may well be fraying in other societies as each loses its distinctiveness over time. Nonetheless, two influential factors to formation of a loose American identity are negative identity (defining oneself as against others, e.g., adolescent rebellion rather fitting for a young nation) and borrowed identity (better known as cultural appropriation). The latter has been among the chief complaints of social justice warriors.


The crisis consists precisely in the fact that the old is dying and the new
cannot be born; in this interregnum a great variety of morbid symptoms appear.

Antonio Gramsci

As a kid, I was confused when during some TV drama I heard the phrase “The king is dead; long live the king!” I was interpreting events too literally: the king had just died, so how could his subjects proclaim for him long life? Only when age awarded me greater sophistication (probably not wisdom, though) did I realize that the phrase connotes the end of one era and the start of another. Old regent dies; new regent assumes power. We’re in the midst of such a transition from one era to the next, though it isn’t marked clearly by the death of a leader. Indeed, when I launched this blog in 2006, that was what I sensed and said so plainly in the About Brutus link at top, which hasn’t changed since then except to correct my embarrassing typos. I initially thought the transition would be about an emerging style of consciousness. Only slightly later, I fell down the rabbit hole regarding climate change (an anthropogenic, nonlinear, extinction-level process). I still believe my intuitions and/or conclusions on both subjects, but I’ve since realized that consciousness was always a moving target and climate change could unfold slowly enough to allow other fundamental shifts to occur alongside. No promises, though. We could also expire rather suddenly if things go awry quickly and unexpectedly. At this point, however, and in a pique of overconfidence, I’m willing to offer that another big transition has finally come into focus despite its being underway as I write. Let me explain. In his book America: The Farewell Tour (2018), Chris Hedges writes this:

Presently, 42 percent of the U.S. public believes in creationism … [and] nearly a third of the population, 94 million people, consider themselves evangelical. Those who remain in a reality-based universe do not take seriously the huge segment of the public, mostly white and working-class, who because of economic distress have primal yearnings for vengeance, new glory, and moral renewal and are easily seduced by magical thinking … The rational, secular forces, those that speak in the language of fact and reason, are hated and feared, for they seek to pull believers back into “the culture of death” that nearly destroyed them. The magical belief system, as it was for impoverished German workers who flocked to the Nazi Party, is an emotional life raft. It is all that supports them. [pp. 50–51]

That’s where we are now, retreating into magical thinking we supposedly left behind in the wake of the Enlightenment. Call it the Counter-Enlightenment (or Un-Enlightenment). We’re on this track for a variety of reasons but primarily because the bounties of the closing Age of Abundance have been gobbled up by a few plutocrats. Most of the rest of the population, formerly living frankly precarious lives (thus, the precariat), have now become decidedly unnecessary (thus, the unnecessariat). The masses know that they have been poorly served by their own social, political, and cultural institutions, which have been systematically hijacked and diverted into service of the obscenely, absurdly rich.

Three developments occurring right now, this week, indicate that we’re not just entering an era of magical thinking (and severely diminishing returns) but that we’ve lost our shit, gone off the deep end, and sought escape valves to release intolerable pressures. It’s the same madness of crowds writ large — something that periodically overtakes whole societies, as noted above by Chris Hedges. Those developments are (1) the U.S. stock market (and those worldwide?) seesawing wildly on every piece of news, (2) deranged political narratives and brazenly corrupt machinations that attempt to, among other things, install the preferred Democratic presidential candidate to defeat 45, and (3) widespread panic over the Covid-19 virus. Disproportionate response to the virus is already shutting down entire cities and regions even though the growing epidemic so far in the U.S. has killed fewer people than, say, traffic accidents. Which will wreak the worst mayhem is a matter of pointless conjecture since the seriousness of the historical discontinuity will require hindsight to assess. Meanwhile, the king is dead. Long live the king!

Didn’t expect to come back to this one so soon, but an alternative meaning behind my title just appeared. Whereas the first post was about cancel culture, this redux is about finding people willing and able to act as mouthpieces for whatever narrative the powers that be wish to foist on the public, as in “Where do they dig up these people?”

Wide-ranging opinion is not difficult to obtain in large populations, so although plenty of folks are willing to be paid handsomely to mouth whatever words are provided to them (e.g., public relations hacks, social media managers, promoters, spokespersons, actors, and straight-up shills in advertisements of all sorts), a better approach is simply to find people who honestly believe the chosen narrative so that they can do others’ bidding guilelessly, which is to say, without any need of selling their souls. This idea first came to my attention in an interview (can’t remember the source) given by Noam Chomsky, where he chided the interviewer, who had protested that no one was telling him what to say, by observing that if he didn’t already share the desired opinion, he wouldn’t have the job. The interviewer was hired and retained precisely because he was already on board. Those who depart from the prescribed organizational perspective are simply not hired, or if their opinions evolve away from the party line, they are fired. No need to name names, but many have discovered that journalistic objectivity (or at least a pose of objectivity) and independent thought are not high values in the modern media landscape.

Here’s a good example: 19-year-old climate change denier/skeptic Naomi Seibt is being billed as the anti-Greta Thunberg. No doubt Seibt believes the opinions she will be presenting at the Heartland Institute later this week. All the more authentic if she does. But it’s a little suspicious, brazen and clumsy even, that another European teenage girl is being raised up to counter Time Magazine’s 2019 Person of the Year, Greta Thunberg. Maybe it’s even true, as conspiracists suggest, that Thunberg herself is being used to drive someone else’s agenda. The MSM is certainly using her to drive ratings. These questions are all ways to distract from the main point, which is that we’re driving ourselves to extinction (alongside most of the rest of the living world) by virtue of the way we inhabit the planet and consume its finite resources.

Here’s a second example: a “debate” on the subject of socialism between economists Paul Krugman and Richard Wolff on the program Democracy Now!

 

Let me disclose my biases up front. I’ve never liked economists as analysts of culture, sociology, or electoral politics. Krugman in particular has always read like more of an apologist for economic policies that support the dysfunctional status quo, so I pay him little attention. On the other hand, Wolff has engaged his public as a respectable teacher/explainer of the renewed socialist movement of which he is a part, and I give him my attention regularly. In truth, neither of these fellows needed to be “dug up” from obscurity. Both are heavily covered in the media, and they did a good job not attacking each other while making their cases in the debate.

The weird thing was how Krugman is so clearly triggered by the word socialism, even though he acknowledges that the U.S. has many robust examples of socialism already. He was clearly the one designated to object to socialism as an ideology, and he described socialism as an electoral kiss of death. Maybe he has too many childhood memories of ducking, covering, and cowering during those Atomic Era air raid drills, and so socialism and communism were imprinted on him as evils never to be entertained. At least three generations after him lack those memories, however, and are not traumatized by the prospect of socialism. In fact, that’s what the Democratic primaries are demonstrating: no fear but rather enthusiastic support for the avowed Democratic Socialist on the ballots. Who are the fearful ones? Capitalists. They would be wise to learn sooner rather than later that the public, as Wolff says plainly, is ready for change. Change is coming for them.

Color me surprised to learn that 45 is considering a new executive order mandating that the “classical architectural style shall be the preferred and default style” for new and upgraded federal buildings, revising the Guiding Principles for Federal Architecture issued in 1962. Assuredly, 45 is hardly expected to weigh in on respectable aesthetic choices considering his taste runs more toward gaudy, glitzy, ostentatious surface display (more Baroque) than toward restraint, dignity, poise, and balance (more Classical or Neoclassical).

Since I pay little attention to mainstream news propaganda organs, I learned of this from James Howard Kunstler’s blog Clusterfuck Nation (see blogroll) as though the order had already been issued, but it’s apparently still being drafted. Twas nice to read Kunstler returning to his roots in architectural criticism. He’s never left it behind entirely; his website has a regular feature called Eyesore of the Month, which I rather enjoy reading. He provides a brief primer on how architectural styles in the 20th century (all lumped together as Modernism) embody the Zeitgeist, namely, techno-narcissism. (I’m unconvinced that Modernism is a direct rebuke of 20th-century fascists who favored Classicism.) Frankly, with considerably more space at his disposal, Iain McGilchrist explores Modernist architecture better and with far greater erudition in The Master and His Emissary (2010), which I blogged through some while ago. Nonetheless, this statement by Kunstler deserves attention:

The main feature of this particular moment is that techno-industrial society has entered an epochal contraction presaging collapse due to over-investments in hyper-complexity. That hyper-complexity has come to be perfectly expressed in architecture lately in the torqued and tortured surfaces of gigantic buildings designed by computers, with very poor prospects for being maintained, or even being useful, as we reel into a new age of material scarcity and diminished expectations …

This is the life-out-of-balance statement in a nutshell. We are over-extended and wedded to an aesthetic of power that requires preposterous feats of engineering to build and continuous resource inputs to operate and maintain. (Kunstler himself avers elsewhere that an abundance of cheap, easily harvested energy enabled the Modern Era, so chalking up imminent collapse primarily to over-investment in hyper-complexity seems like substituting a secondary or follow-on effect for the main one.) My blogging preoccupation with skyscrapers demonstrates my judgment that the vertical dimension of the human-built world in particular is totally out of whack, an instantiation of now-commonplace stunt architecture. Should power ever fail for any sustained duration, reaching floors above, say, the 10th and delivering basic services to them, such as water for sinks and toilets, quickly becomes daunting.

However, that’s a technical hurdle, not an aesthetic consideration. The Modernist government buildings in question tend to be Brutalist designs, which often look like high-walled concrete fortresses or squat, impenetrable bunkers. (Do your own image search.) They project bureaucratic officiousness and indifference, if not open hostility, toward the people they purport to serve. Basically, enter at your own risk. They share with the International Style a formal adherence to chunky geometric forms, often presented impassively (as pure abstraction) or in an exploded view (analogous to a cubist painting showing multiple perspectives simultaneously). Curiously, commentary at the links above is mostly aligned with perpetuating the Modernist project and aesthetic as described by Kunstler and McGilchrist. No interruptions, difficulties, or vulnerabilities are contemplated. Commentators must not be reading the same analyses I am, or they’re blithely supportive of progress in some vague sense, itself a myth we tell ourselves.

One of the victims of cancel culture, coming to my attention only days ago, is Kate Smith (1907–1986), a singer of American popular song. Though Smith had a singing career spanning five decades, she is best remembered for her version(s) of Irving Berlin’s God Bless America, which justifiably became a bit of Americana. The decades of Smith’s peak activity were the 1930s and 40s.

/rant on

I dunno what goes through people’s heads, performing purity rituals or character excavation on folks long dead. The controversy stems from Smith having a couple other songs in her discography: That’s Why Darkies Were Born (1931) and Pickaninny Heaven from the movie Hello, Everybody! (1933). Hate to break it to anyone still living under a rock, but these dates are not far removed from minstrelsy, blackface, and The Birth of a Nation (1915) — a time when typical Americans referred to blacks with a variety of terms we now consider slurs. Such references were still used during the American civil rights movement (1960s) and are in use among some virulent white supremacists even today. I don’t know the full context of Kate Smith having sung those songs, but I suspect I don’t need to. In that era, popular entertainment had few of the sensibilities regarding race we now have (culture may have moved on, but it’s hard to say with a straight face it’s evolved or progressed humanely), and uttering commonly used terms back then was not automatic evidence of any sort of snarling racism.

I remember having heard my grandparents, nearly exact contemporaries of Kate Smith, referring to blacks (the term I grew up with, still acceptable I think) with other terms we no longer consider acceptable. It shocked me, but to them, that’s simply what blacks were called (the term(s) they grew up with). Absolutely nothing in my grandparents’ character or behavior indicated a nasty, racist intent. I suspect the same was true of Kate Smith in the 1930s.

Back when I was a librarian, I also saw plenty of sheet music published before 1920 or so with the term darkie (or darkey) in the title. See for example this. The Library of Congress still uses the subject headings “negro spirituals” (is there another kind?) and “negro songs” to refer to various subgenres of American folk song that includes slave songs, work songs, spirituals, minstrel music, protest songs, etc. Maybe we should cancel the Library of Congress. Some published music titles from back then even call them coon songs. That last one is totally unacceptable today, but it’s frankly part of our history, and like changing character names in Mark Twain’s Huckleberry Finn, sanitizing the past does not make it go away or any less discomfiting. But if you wanna bury your head in the sand, go ahead, ostrich.

Also, if some person or entity ever did some questionably racist, sexist, or malign thing (even something short of abominable) situated contextually in the past, does that mean he, she, or it must be cancelled irrevocably? If that be the case, then I guess we gotta cancel composer Richard Wagner, one of the most notorious anti-Semites of the 19th century. Also, stop watching Pixar, Marvel, and Star Wars films (among others), because remember that time when Walt Disney Studios (now Walt Disney Company) made a racist musical film, Song of the South (1946)? Disney’s tainted legacy (extending well beyond that one movie) is at least as awful as, say, Kevin Spacey’s, and we’re certainly not about to rehabilitate him.

/rant off

I’ve grown rather tired of hearing the financial 1% to 0.01% (depending on source) being called the “elite.” There is nothing about them most would recognize as elite other than their absurd wealth. As a rule, they’re not particularly admirable men and women; they’re merely aspirational (as in everyone thinking “wish I had all that money” — the moral lesson about the corruptions of excessive wealth rarely having been learned). The manner in which such fortunes are amassed pretty much invalidates claims to moral or ethical superiority. In the typical case, “real” money is acquired by identifying one or more market opportunities and exploiting them ruthlessly. The object of exploitation might be a natural resource, labor, a product or service, or a combination.

Admittedly, effort goes into exploiting a market niche, and it often takes considerable time to develop and mature (less these days in overheated and overvalued markets), but the pattern is well established. Further, those who succeed are often mere beneficiaries of happenstance from among many competing entrepreneurs, speculators, financiers, and media types putting in similar effort. While capitalism is not as blind as rock-paper-scissors or as subtly skilled as poker, both of which are designed to produce an eventual sole winner (and make everyone else losers), this economic system tends over time to pool increasing wealth in the accounts of those who have already “won” the game. Thus, wealth inequality increases until social conditions become so intolerable (e.g., tent cities across the U.S.) that the masses revolt. How many resets of this deplorable game do we get?
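As a purely illustrative aside (my own toy sketch, not anything drawn from the sources discussed here), that pooling tendency shows up even in a dead-simple “yard sale” simulation, where pairs of agents repeatedly stake a fraction of the poorer party’s wealth on a fair coin flip; every parameter below is an arbitrary assumption, and the model is far cruder than any real economy:

```python
import random

AGENTS = 1_000        # number of participants (arbitrary)
ROUNDS = 200_000      # number of pairwise exchanges (arbitrary)
STAKE = 0.1           # fraction of the poorer agent's wealth at risk each round

wealth = [100.0] * AGENTS  # everyone starts out equal

for _ in range(ROUNDS):
    a, b = random.sample(range(AGENTS), 2)      # pick two distinct agents
    stake = STAKE * min(wealth[a], wealth[b])   # wager limited by the poorer party
    if random.random() < 0.5:                   # fair coin flip
        wealth[a] += stake
        wealth[b] -= stake
    else:
        wealth[a] -= stake
        wealth[b] += stake

wealth.sort(reverse=True)
top_share = sum(wealth[:AGENTS // 100]) / sum(wealth)
print(f"share of total wealth held by the top 1%: {top_share:.1%}")
```

Even though every flip is fair, running it long enough leaves a handful of accounts holding most of the pot, which is the concentration dynamic described above (minus all the real-world advantages that tilt the flips further).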

Meanwhile — and here’s another thing I can’t grok — billionaires seem discontent (alert: intentional fallacy) to merely enjoy their wealth or heaven forfend use it to help others. Instead, they announce their ambitions to rule by campaigning for high office, typically skipping the preliminary step of developing actual political skills, because (doncha know?) everything worth having can be bought. Few sane people actually believe that a giant fortune qualifies someone for high office, except of course them who gots the fortunes and have gone off the deep end. They’re so used to being fawned over by sycophants and cozied up to by false admirers that it’s doubtful anyone is ever bold enough to tell them anything resembling truth about themselves (notably including major character deficiencies). So the notion enters the noggin that the next big project ought to be to squat on high office as though it’s a right bequeathed specially to the ultrarich, whether one is Tom Steyer, Michael Bloomberg, Oprah Winfrey, Mark Zuckerberg, Mark Cuban, or (gasp!) that trailblazer who demonstrated it’s possible: 45. In a pinch, mere millions will have to suffice, as most congressfolk and senators can attest.

According to Anand Giridharadas, author of the book Winners Take All, seeking political office and practicing philanthropy are not at all the public service or “giving back” they pretend to be. Rather, they are attempts to maintain the status quo (funneling money upstream to those at the top), thus forestalling one of those nasty resets where the rabble overwhelms their betters with a fury known in past centuries to get quite out of hand. A few supposed elites riding herd over the great unwashed masses sounds rather passé, no? The bygone stuff of barbarian hordes and robber barons? But it describes the current day, too, and considering these folks are basically taking a giant dump on billions of other people, it sorta gives a new, inverted meaning to the term squatter’s rights.

Robots are coming; we all know it. Frankly, for some implementations, they’re already here. For example, I recently took interest in robotic vacuums. I already have an upright vacuum with the usual attachments I push around on weekends, plus brooms and dustpans for hard, uncarpeted floors. But I saw a robotic vacuum in action and found myself considering purchasing something I knew existed but never gave thought to needing. All it took was watching one scuttling along the floor aimlessly, bumping harmlessly into furniture, to think perhaps my living experience would be modestly enhanced by passive clean-up while I’m out of the house — at least I thought so until I saw the price range extends from roughly $150 to $500. Surprised me, too, to see how crowded the marketplace is with competing devices from different manufacturers. Can’t rationalize the expense as a simple labor-saving device. The effort it replaces just isn’t that arduous.

Another robotic device caught my eye: the Gita cargo robot by Piaggio Fast Forward. I will admit that a stuff carrier for those with mobility issues might be a worthwhile device, much like the Segway seemed like a relatively good idea to increase range for those with limited mobility — at least before such devices branched into self-balancing hoverboards and motorized scooters that now clog the sidewalks, create unnecessary hazards, and send thousands each year to emergency rooms with broken wrists (or worse). One of those little Gita buggers following able-bodied folks around seems to me the height of foolishness, not to mention laziness. The video review I saw (sorry, no link, probably outta date and based on a prototype) indicated that the Gita is not ready for prime time and requires the user to wear a camera/belt assembly for the Gita to track and follow its owner. Its limited capacity and operating duration between charges (yeah, another thing to plug in — sigh), plus its inability to negotiate doors effectively, make it seem like more trouble than it’s worth for the hefty price of around $3,250.

Billed as a robot butler, the Gita falls well short of a Jetsons or Star Wars upright robot that’s able, for example, to execute commands and interact verbally. Maybe the Gita represents the first baby steps toward that envisioned future (or long time ago in a galaxy far, far away), but I rather doubt it. Moreover, we’re already irritatingly besieged by people face-planted in their phones. Who wants a future where others (let’s say half of the people we come into contact with in hallways, corridors, and parking lots) are attended by a robot cargo carrier or fully functioning robot butler? In the meantime, just like Google Glass, which was never widely adopted, anyone seen with a Gita trailing behind is a tool.

Much ado over nothing was made this past week regarding a technical glitch (or control room error) during the first of two televised Democratic presidential debates where one pair of moderators’ mics was accidentally left on and extraneous, unintended speech leaked into the broadcast. It distracted the other pair of moderators enough to cause a modest procedural disruption. Big deal. This was not the modal case of a hot mic where someone, e.g., a politician, swears (a big no-no despite the shock value being almost completely erased in today’s media landscape) or accidentally reveals callous attitudes (or worse) thinking that no one important was listening or recording. Hot mics in the past have led to public outrage and criminal investigations. One recent example that still sticks in everyone’s craw was a novice political candidate who revealed he could use his fame and impudent nerve to “grab ’em by the pussy.” Turned out not to be the career killer everyone thought it would be.

The latest minor furor over a hot mic got me thinking, however, about inadvertent revelation of matters of genuine public interest. Three genres spring to mind: documentary films, whistle-blowing, and investigative journalism, that last including category outliers such as WikiLeaks. Whereas a gaffe on a hot mic usually means the leaker/speaker exposes him- or herself and thus has no one else to blame, disclosures occurring in the other three categories are often against the will of those exposed. It’s obviously in the public interest to know about corruption, misbehavior, and malfeasance in corporate and political life, but the manner in which such information is made public is controversial. Those who expose others suffer harassment and persecution. Documentarians probably fare the best with respect to being left alone following release of information. Michael Moore, for all his absurd though entertaining theatrics, is free (so far as I know) to go about his business and do as he pleases. However, gestures to protect whistle-blowers are just that: gestures. Those who have leaked classified government information in particular, because they gained access to such information through security clearances and signed nondisclosure agreements (before knowing what secrets they were obliged to keep, which is frankly the way such obligations work), are especially prone to reprisal and prosecution. Such information is literally not theirs to disclose, but when keeping others’ secrets is heinous enough, some people feel their conscience and moral duty are superior to job security and other risks involved. Opinions vary, sometimes passionately. And now even journalists who uncover or merely come into possession of evidence of wrongdoing and later publish it — again, decidedly in the public interest — are subject to (malicious?) prosecution. Julian Assange is the current test case.

The free speech aspect of revealing someone else’s amoral and criminal acts is a fraught argument. However, it’s clear that as soon as damaging information comes to light, focus shifts away from the acts and their perpetrators to those who publish the information. Shifting the focus is a miserable yet well-established precedent by now, the result being that most folks who might consider coming forward to speak up now keep things to themselves rather than suffer entirely foreseeable consequences. In that light, when someone comes forward anyway, knowing that they will be hounded, vilified, arrested, and worse, he or she deserves more respect for courage and self-sacrifice than is generally accorded in the aftermath of disclosure. The flip side — condemnation, prosecution, and death threats — is already abundant in the public sphere.

Some time after reports of torture at Guantánamo, Abu Ghraib, and Bagram went public, a handful of low-level servicemen (“bad apples” used to deflect attention down the command hierarchy) were prosecuted, but high-level officials (e.g., former U.S. presidents Bush and Obama, anyone in their respective administrations, and commanding officers on site) were essentially immunized from prosecution. That example is not quite the same as going after truth-tellers, but it’s a rather egregious instance of bad actors going unprosecuted. I’m still incensed by it. And that’s why I’m blogging about the hot mic. Lots of awful things go on behind the scenes without public knowledge or sanction. Those who commit high crimes (including war crimes) clearly know what they’re doing is wrong. Claims of national security are often invoked and gag laws are legislated into existence on behalf of private industry. When leaks do inevitably occur, those accused immediately attack the accuser, often with the aid of others in the media. Denials may also be issued (sometimes not — why bother?), but most bad actors hide successfully behind the deflecting shift of focus. When will those acting in the shadows against the public interest and in defiance of domestic and international law ever be brought to justice? I daresay the soul of the nation is at stake, and as long as officialdom escapes all but temporary public relations problems to be spun, the pride everyone wants to take as Americans eludes us. In the meantime, there’s a lot to answer for, and it keeps piling up.

For a time after the 2008 financial collapse, skyscraper projects in Chicago came to a dead halt, mostly due to dried-up financing. My guess (since I don’t know with any reliability) is that much the same obtained worldwide. However, the game appears to be back on, especially in New York City, one of few cities around the globe where so-called “real money” tends to pool and collect. Visual Capitalist has an interesting infographic depicting changes to the NYC skyline every 20 years. The number of supertalls topping 1,000 feet expected by 2020 is quite striking.

Courtesy of Visual Capitalist

The accompanying text admits that NYC is left in the dust by China, specifically, the Pearl River Delta Megacity, which includes Hong Kong, Shenzhen, Macau, and others. As I’ve written before, the mad rush to build (earning ridiculous, absurd, imaginary prestige points awarded by and to exactly no one) takes no apparent notice of a slo-mo crack-up in the way modern societies organize and fund themselves. The new bear market might give one … um, pause.

Also left in the dust is Chicago, home of the original skyscraper. Since the 2008 collapse, Chicago’s most ambitious project, the ill-fated Chicago Spire (a/k/a the Fordham Spire), was abandoned despite a big hole dug in the ground and some foundation work completed. An absence of completed prestige projects since 2008 means Chicago has been lapped several times over by NYC, not that anyone is counting. The proposed site of the Chicago Spire is too enticing, however — just inside Lake Shore Drive at the mouth of the Chicago River — for it to be dormant for long. Indeed, a press release last year (it escaped my attention at the time) announced redevelopment of the site, and a slick website is operating for now (I’ve linked in the past to similar sites that were later abandoned along with their subject projects). Also reported late last year, Chicago appears to have rejoined the game in earnest, with multiple projects already under construction and others in the planning/approval phases.

So if hiatus was called the last time we crashed financially (a regular occurrence, I note), it seems we’ve called hiatus on the hiatus and are back in a mad, futile race to remake modernity into gleaming vertical cities dotting the globe. Such hubris and exuberance might be intoxicating to technophiles, but I’m reminded of an observation (can’t locate a quote, sorry) to the effect that civilizations’ most extravagant projects are undertaken just before their collapses. Our global civilization is no different.