Posts Tagged ‘Absurdity’

Color me surprised to learn that 45 is considering a new executive order mandating that the “classical architectural style shall be the preferred and default style” for new and upgraded federal buildings, revising the Guiding Principles for Federal Architecture issued in 1962. Assuredly, 45 can hardly be expected to weigh in with respectable aesthetic choices, considering his taste runs more toward gaudy, glitzy, ostentatious surface display (more Baroque) than toward restraint, dignity, poise, and balance (more Classical or Neoclassical).

Since I pay little attention to mainstream news propaganda organs, I learned of this from James Howard Kunstler’s blog Clusterfuck Nation (see blogroll) as though the order had already been issued, but it’s apparently still in draft. ’Twas nice to read Kunstler returning to his roots in architectural criticism. He’s never left it behind entirely; his website has a regular feature called Eyesore of the Month, which I rather enjoy reading. He provides a brief primer on how architectural styles in the 20th century (all lumped together as Modernism) embody the Zeitgeist, namely, techno-narcissism. (I’m unconvinced that Modernism is a direct rebuke of 20th-century fascists who favored Classicism.) Frankly, with considerably more space at his disposal, Iain McGilchrist explores Modernist architecture better and with far greater erudition in The Master and his Emissary (2010), which I blogged through some while ago. Nonetheless, this statement by Kunstler deserves attention:

The main feature of this particular moment is that techno-industrial society has entered an epochal contraction presaging collapse due to over-investments in hyper-complexity. That hyper-complexity has come to be perfectly expressed in architecture lately in the torqued and tortured surfaces of gigantic buildings designed by computers, with very poor prospects for being maintained, or even being useful, as we reel into a new age of material scarcity and diminished expectations …

This is the life-out-of-balance statement in a nutshell. We are over-extended and wedded to an aesthetic of power that requires preposterous feats of engineering to build and continuous resource inputs to operate and maintain. (Kunstler himself avers elsewhere that an abundance of cheap, easily harvested energy enabled the Modern Era, so chalking up imminent collapse primarily to over-investment in hyper-complexity seems like substituting a secondary or follow-on effect for the main one.) My blogging preoccupation with skyscrapers demonstrates my judgment that the vertical dimension of the human-built world in particular is totally out of whack, an instantiation of now-commonplace stunt architecture. Should power ever fail for any sustained duration, reaching floors above, say, the 10th, and delivering basic services to them, such as water for sinks and toilets, quickly becomes daunting.

However, that’s a technical hurdle, not an aesthetic consideration. The Modernist government buildings in question tend to be Brutalist designs, which often look like high-walled concrete fortresses or squat, impenetrable bunkers. (Do your own image search.) They project bureaucratic officiousness and unconcern, if not open hostility, toward the people they purport to serve. Basically, enter at your own risk. They share with the International Style a formal adherence to chunky geometric forms, often presented impassively (as pure abstraction) or in an exploded view (analogous to a cubist painting showing multiple perspectives simultaneously). Curiously, commentary at the links above is mostly aligned with perpetuating the Modernist project and aesthetic as described by Kunstler and McGilchrist. No interruptions, difficulties, or vulnerabilities are contemplated. Commentators must not be reading the same analyses I am, or they’re blithely supportive of progress in some vague sense, itself a myth we tell ourselves.

One of the victims of cancel culture, coming to my attention only days ago, is Kate Smith (1907–1986), a singer of American popular song. Though Smith had a singing career spanning five decades, she is best remembered for her version(s) of Irving Berlin’s God Bless America, which justifiably became a bit of Americana. The decades of Smith’s peak activity were the 1930s and 40s.

/rant on

I dunno what goes through people’s heads, performing purity rituals or character excavation on folks long dead. The controversy stems from Smith having a couple other songs in her discography: That’s Why Darkies Were Born (1931) and Pickaninny Heaven from the movie Hello, Everybody! (1933). Hate to break it to anyone still living under a rock, but these dates are not far removed from minstrelsy, blackface, and The Birth of a Nation (1915) — a time when typical Americans referred to blacks with a variety of terms we now consider slurs. Such references were still used during the American civil rights movement (1960s) and are in use among some virulent white supremacists even today. I don’t know the full context of Kate Smith having sung those songs, but I suspect I don’t need to. In that era, popular entertainment had few of the sensibilities regarding race we now have (culture may have moved on, but it’s hard to say with a straight face it’s evolved or progressed humanely), and uttering commonly used terms back then was not automatic evidence of any sort of snarling racism.

I remember having heard my grandparents, nearly exact contemporaries of Kate Smith, referring to blacks (the term I grew up with, still acceptable I think) with other terms we no longer consider acceptable. It shocked me, but to them, that’s simply what blacks were called (the term(s) they grew up with). Absolutely nothing in my grandparents’ character or behavior indicated a nasty, racist intent. I suspect the same was true of Kate Smith in the 1930s.

Back when I was a librarian, I also saw plenty of sheet music published before 1920 or so with the term darkie (or darkey) in the title. See for example this. The Library of Congress still uses the subject headings “negro spirituals” (is there another kind?) and “negro songs” to refer to various subgenres of American folk song that include slave songs, work songs, spirituals, minstrel music, protest songs, etc. Maybe we should cancel the Library of Congress. Some published music titles from back then even call them coon songs. That last one is totally unacceptable today, but it’s frankly part of our history, and like changing character names in Mark Twain’s Huckleberry Finn, sanitizing the past does not make it go away or any less discomfiting. But if you wanna bury your head in the sand, go ahead, ostrich.

Also, if some person or entity ever does some questionably racist, sexist, or malign thing (even something short of abominable) situated contextually in the past, does that mean he, she, or it must be cancelled irrevocably? If that be the case, then I guess we gotta cancel composer Richard Wagner, one of the most notorious anti-Semites of the 19th century. Also, stop watching Pixar, Marvel, and Star Wars films (among others), because remember that time when Walt Disney Studios (now Walt Disney Company) made a racist musical film, Song of the South (1946)? Disney’s tainted legacy (extending well beyond that one movie) is at least as awful as, say, Kevin Spacey, and we’re certainly not about to rehabilitate him.

/rant off

I’ve grown rather tired of hearing the financial 1% to 0.01% (depending on source) being called the “elite.” There is nothing about them most would recognize as elite other than their absurd wealth. As a rule, they’re not particularly admirable men and women; they’re merely aspirational (as in everyone thinking “wish I had all that money” — the moral lesson about the corruptions of excessive wealth rarely having been learned). The manner in which such fortunes are amassed pretty much invalidates claims to moral or ethical superiority. In the typical case, “real” money is acquired by identifying one or more market opportunities and exploiting them ruthlessly. The object of exploitation might be a natural resource, labor, a product or service, or a combination.

Admittedly, effort goes into exploiting a market niche, and it often takes considerable time to develop and mature (less these days in overheated and overvalued markets), but the pattern is well established. Further, those who succeed are often mere beneficiaries of happenstance from among many competing entrepreneurs, speculators, financiers, and media types putting in similar effort. While capitalism is not as blind as rock-paper-scissors or as subtly skilled as poker, both of which are designed to produce an eventual sole winner (making everyone else losers), this economic system tends over time to pool increasing wealth in the accounts of those who have already “won” the game. Thus, wealth inequality increases until social conditions become so intolerable (e.g., tent cities across the U.S.) that the masses revolt. How many resets of this deplorable game do we get?

Meanwhile — and here’s another thing I can’t grok — billionaires seem discontented (alert: intentional fallacy) to merely enjoy their wealth or, heaven forfend, use it to help others. Instead, they announce their ambitions to rule by campaigning for high office, typically skipping the preliminary step of developing actual political skills, because (doncha know?) everything worth having can be bought. Few sane people actually believe that a giant fortune qualifies someone for high office, except of course them who gots the fortunes and have gone off the deep end. They’re so used to being fawned over by sycophants and cozied up to by false admirers that it’s doubtful anyone is ever bold enough to tell them anything resembling truth about themselves (notably including major character deficiencies). So the notion enters the noggin that the next big project ought to be to squat on high office as though it’s a right bequeathed specially to the ultrarich, whether one is Tom Steyer, Michael Bloomberg, Oprah Winfrey, Mark Zuckerberg, Mark Cuban, or (gasp!) that trailblazer who demonstrated it’s possible: 45. In a pinch, mere millions will have to suffice, as most congressfolk and senators can attest.

According to Anand Giridharadas, author of the book Winners Take All, seeking political office and practicing philanthropy are not at all the public service or “giving back” they pretend to be. Rather, they’re an attempt to maintain the status quo (funneling money upstream to those at the top), thus forestalling one of those nasty resets where the rabble overwhelms their betters with a fury known in past centuries to get quite out of hand. A few supposed elites riding herd over the great unwashed masses sounds rather passé, no? The bygone stuff of barbarian hordes and robber barons? But it describes the current day, too, and considering these folks are basically taking a giant dump on billions of other people, it sorta gives a new, inverted meaning to the term squatter’s rights.

Robots are coming; we all know it. Frankly, for some implementations, they’re already here. For example, I recently took interest in robotic vacuums. I already have an upright vacuum with the usual attachments I push around on weekends, plus brooms and dustpans for hard, uncarpeted floors. But I saw a robotic vacuum in action and found myself considering purchasing something I knew existed but never gave thought to needing. All it took was watching one scuttling along the floor aimlessly, bumping harmlessly into furniture, to think perhaps my living experience would be modestly enhanced by passive clean-up while I’m out of the house — at least I thought so until I saw the price range extends from roughly $150 to $500. Surprised me, too, to see how crowded the marketplace is with competing devices from different manufacturers. Can’t rationalize the expense as a simple labor-saving device. The effort it replaces just isn’t that arduous.

Another robotic device caught my eye: the Gita cargo robot by Piaggio Fast Forward. I will admit that a stuff carrier for those with mobility issues might be a worthwhile device, much like the Segway seemed like a relatively good idea to increase range for those with limited mobility — at least before such devices branched into self-balancing hoverboards and motorized scooters that now clog the sidewalks, create unnecessary hazards, and send thousands each year to emergency rooms with broken wrists (or worse). One of those little Gita buggers following able-bodied folks around seems to me the height of foolishness, not to mention laziness. The video review I saw (sorry, no link, probably outta date and based on a prototype) indicated that the Gita is not ready for prime time and requires the user to wear a camera/belt assembly for the Gita to track and follow its owner. Its limited capacity and operating duration between charges (yeah, another thing to plug in — sigh), plus its inability to negotiate doors effectively, make it seem like more trouble than it’s worth for the hefty price of around $3,250.

Billed as a robot butler, the Gita falls well short of a Jetsons or Star Wars upright robot that’s able, for example, to execute commands and interact verbally. Maybe the Gita represents the first baby steps toward that envisioned future (or long time ago in a galaxy far, far away), but I rather doubt it. Moreover, we’re already irritatingly besieged by people face-planted in their phones. Who wants a future where others (let’s say half of the people we come into contact with in hallways, corridors, and parking lots) are attended by a robot cargo carrier or fully functioning robot butler? In the meantime, just like Google Glass, which was never widely adopted, anyone seen with a Gita trailing behind is a tool.

Much ado over nothing was made this past week regarding a technical glitch (or control room error) during the first of two televised Democratic presidential debates where one pair of moderators’ mics was accidentally left on and extraneous, unintended speech leaked into the broadcast. It distracted the other pair of moderators enough to cause a modest procedural disruption. Big deal. This was not the modal case of a hot mic where someone, e.g., a politician, swears (a big no-no despite the shock value being almost completely erased in today’s media landscape) or accidentally reveals callous attitudes (or worse) thinking that no one important was listening or recording. Hot mics in the past have led to public outrage and criminal investigations. One recent example that still sticks in everyone’s craw was a novice political candidate who revealed he could use his fame and impudent nerve to “grab ’em by the pussy.” Turned out not to be the career killer everyone thought it would be.

The latest minor furor over a hot mic got me thinking, however, about inadvertent revelation of matters of genuine public interest. Three genres spring to mind: documentary films, whistle-blowing, and investigative journalism, that last including category outliers such as Wikileaks. Whereas a gaffe on a hot mic usually means the leaker/speaker exposes him- or herself and thus has no one else to blame, disclosures occurring in the other three categories are often against the will of those exposed. It’s obviously in the public interest to know about corruption, misbehavior, and malfeasance in corporate and political life, but the manner in which such information is made public is controversial. Those who expose others suffer harassment and persecution. Documentarians probably fare the best with respect to being left alone following release of information. Michael Moore, for all his absurd though entertaining theatrics, is free (so far as I know) to go about his business and do as he pleases. However, gestures to protect whistle-blowers are just that: gestures. Those who have leaked classified government information in particular, because they gained access to such information through security clearances and signed nondisclosure agreements (before knowing what secrets they were obliged to keep, which is frankly the way such obligations work), are especially prone to reprisal and prosecution. Such information is literally not theirs to disclose, but when keeping others’ secrets is heinous enough, some people feel their conscience and moral duty are superior to job security and other risks involved. Opinions vary, sometimes passionately. And now even journalists who uncover or merely come into possession of evidence of wrongdoing and later publish it — again, decidedly in the public interest — are subject to (malicious?) prosecution. Julian Assange is the current test case.

The free speech aspect of revealing someone else’s amoral and criminal acts is a fraught argument. However, it’s clear that as soon as damaging information comes to light, focus shifts away from the acts and their perpetrators to those who publish the information. Shifting the focus is a miserable yet well-established precedent by now, the result being that most folks who might consider coming forward to speak up now keep things to themselves rather than suffer entirely foreseeable consequences. In that light, when someone comes forward anyway, knowing that he or she will be hounded, vilified, arrested, and worse, that person deserves more respect for courage and self-sacrifice than is generally accorded in the aftermath of disclosure. The flip side — condemnation, prosecution, and death threats — is already abundant in the public sphere.

Some time after reports of torture at Guantánamo, Abu Ghraib, and Bagram went public, a handful of low-level servicemen (“bad apples” used to deflect attention down the command hierarchy) were prosecuted, but high-level officials (e.g., former U.S. presidents Bush and Obama, anyone in their respective administrations, and commanding officers on site) were essentially immunized from prosecution. That example is not quite the same as going after truth-tellers, but it’s a rather egregious instance of bad actors going unprosecuted. I’m still incensed by it. And that’s why I’m blogging about the hot mic. Lots of awful things go on behind the scenes without public knowledge or sanction. Those who commit high crimes (including war crimes) clearly know what they’re doing is wrong. Claims of national security are often invoked and gag laws are legislated into existence on behalf of private industry. When leaks do inevitably occur, those accused immediately attack the accuser, often with the aid of others in the media. Denials may also be issued (sometimes not — why bother?), but most bad actors hide successfully behind the deflecting shift of focus. When will those acting in the shadows against the public interest and in defiance of domestic and international law ever be brought to justice? I daresay the soul of the nation is at stake, and as long as officialdom escapes all but temporary public relations problems to be spun, the pride everyone wants to take as Americans eludes us. In the meantime, there’s a lot to answer for, and it keeps piling up.

For a time after the 2008 financial collapse, skyscraper projects in Chicago came to a dead halt, mostly due to dried-up financing. My guess (since I don’t know with any reliability) is that much the same obtained worldwide. However, the game appears to be back on, especially in New York City, one of few cities around the globe where so-called “real money” tends to pool and collect. Visual Capitalist has an interesting infographic depicting changes to the NYC skyline every 20 years. The number of supertalls topping 1,000 feet expected by 2020 is quite striking.

Courtesy of Visual Capitalist

The accompanying text admits that NYC is left in the dust by China, specifically, the Pearl River Delta Megacity, which includes Hong Kong, Shenzhen, Macau, and others. As I’ve written before, the mad rush to build (earning ridiculous, absurd, imaginary prestige points awarded by and to exactly no one) takes no apparent notice of a slo-mo crack-up in the way modern societies organize and fund themselves. The new bear market might give one … um, pause.

Also left in the dust is Chicago, home of the original skyscraper. Since the 2008 collapse, Chicago’s most ambitious project, the ill-fated Chicago Spire (a/k/a the Fordham Spire), was abandoned despite a big hole dug in the ground and some foundation work completed. An absence of completed prestige projects since 2008 means Chicago has been lapped several times over by NYC, not that anyone is counting. The proposed site of the Chicago Spire is too enticing, however — just inside Lake Shore Drive at the mouth of the Chicago River — for it to be dormant for long. Indeed, a press release last year (which escaped my attention at the time) announced redevelopment of the site, and a slick website is operating for now (I’ve linked in the past to similar sites that were later abandoned along with their subject projects). Also reported late last year, Chicago appears to have rejoined the game in earnest, with multiple projects already under construction and others in the planning/approval phases.

So if hiatus was called the last time we crashed financially (a regular occurrence, I note), it seems we’ve called hiatus on the hiatus and are back in a mad, futile race to remake modernity into gleaming vertical cities dotting the globe. Such hubris and exuberance might be intoxicating to technophiles, but I’m reminded of an observation (can’t locate a quote, sorry) to the effect that civilizations’ most extravagant projects are undertaken just before their collapses. Our global civilization is no different.

For ambulatory creatures, vision is arguably the primary of the five (main) senses. Humans are among those species that stand upright, facilitating a portrait orientation when interacting among ourselves. The terrestrial environment on which we live, however, is in landscape (as distinguished from the more nearly 3D environments of birds and insects in flight or marine life in rivers, lakes, seas, and oceans). My suspicion is that modest visual conflict between portrait and landscape is among the dynamics that give rise to the orienting response, a step down from the startle reflex, that demands full attention when visual environments change.

I recall reading somewhere that wholesale changes in surroundings, such as when crossing a threshold, passing through a doorway, entering or exiting a tunnel, and notably, entering and exiting an elevator, trigger the orienting response. Indeed, the flush of disorientation before one gets his or her bearings is tantamount to a mind wipe, at least momentarily. This response may also help to explain why small, bounded spaces such as interiors of vehicles (large and small) in motion feel like safe, contained, hermetically sealed personal spaces. We orient visually and kinesthetically at the level of the interior, often seated and immobile, rather than at the level of the outer landscape being traversed by the vehicle. This is true, too, of elevators, a modern contraption that confounds the nervous system almost as much as revolving doors — particularly noticeable with small children and pets until they become habituated to managing such doorways with foreknowledge of what lies beyond.

The built environment has historically included transitional spaces between inner and outer environments. Churches and cathedrals include a vestibule or narthex between the exterior door and inner door leading to the church interior or nave. Additional boundaries in church architecture mark increasing levels of hierarchy and intimacy, just as entryways of domiciles give way to increasingly personal spaces: parlor or sitting room, living room, dining room, kitchen, and bedroom. (The sheer utility of the “necessary” room defies these conventions.) Commercial and entertainment spaces use lobbies, atria, and prosceniums in similar fashion.

What most interests me, however, is the transitional space outside of buildings. This came up in a recent conversation, where I observed that local school buildings from the early to middle part of the 20th century have a distinguished architecture set well back from the street where lawns, plazas, sidewalks, and porches leading to entrances function as transitional spaces and encourage social interaction. Ample window space, columnar entryways, and roof embellishments such as dormers, finials, cupolas, and cornices add style and character befitting dignified public buildings. In contrast, 21st-century school buildings in particular and public buildings in general, at least in the city where I live, tend toward porchless big-box warehouses built right up to the sidewalk, essentially robbing denizens of their social space. Blank, institutional walls forbid rather than invite. Consider, for example, how students gathered in a transitional space are unproblematic, whereas those congregated outside a school entrance abutting a narrow sidewalk suggest either a gauntlet to be run or an eruption of violence in the offing. (Or maybe they’re just smoking.) Anyone forced to climb past loiterers outside a commercial establishment experiences similar suspicions and discomforts.

Beautifully designed and constructed public spaces of yore — demonstrations of a sophisticated appreciation of both function and intent — have fallen out of fashion. Maybe they understood then how transitional spaces ease the orienting response, or maybe they only intuited it. Hard to say. Architectural designs of the past acknowledged and accommodated social functions and sophisticated aesthetics that are today actively discouraged except for pointless stunt architecture that usually turns into boondoggles for taxpayers. This has been the experience of many municipalities when replacing or upgrading schools, transit centers, sports arenas, and public parks. Efficient land use today drives toward omission of transitional space. One of my regular reads is James Howard Kunstler’s Eyesore of the Month, which profiles one architectural misfire after the next. He often mocks the lack of transitional space or, when present, observes its open hostility to pedestrian use, including unnecessary obstacles and proximity to vehicular traffic (noise, noxious exhaust, and questionable safety) that discourage use. Chalk this up as another collapsed art (e.g., painting, music, literature, and poetry) so desperate to deny the past and establish new aesthetics that it has ruined itself.

Everyone knows how to play Rock, Paper, Scissors, which typically comes up as a quick means of settling some minor negotiation with the caveat that the winner is entirely arbitrary. The notion of a Rock, Paper, Scissors tournament is therefore a non sequitur, since the winner by no means possesses skill or strategic combinations of throws devised to reliably defeat opponents. Rather, winners are the fortunate recipients of a blind but lucky sequence, an algorithm, that produces an eventual winner yet is indifferent to the outcome. I can’t say quite why, but I’ve been puzzling over how three-way conflicts might be decided were the categories instead Strong, Stupid, and Smart, respectively.

Rock is Strong, obviously, because it’s blunt force, whereas Paper is Stupid because it’s blank, and Scissors is Smart because it’s the only one that has any design or sophistication. For the reassignments to work, however, the circle of what beats what would have to be reversed: Strong beats Stupid, Stupid beats Smart, and Smart beats Strong. One could argue that Strong and Stupid are equally dense, but arguendo, let’s grant Strong supremacy in that contest. Interestingly, Stupid always beats Smart because Smart’s advantage is handily nullified by Stupid. Finally, Smart beats Strong because the David and Goliath parable has some merit. Superhero fanboys are making similar arguments with respect to the hotly anticipated Batman v. Superman (v. Wonder Woman) film to be released in 2016. The Strong argument is that Superman need land only one punch to take out Batman (a mere human with gadgets and bad-ass attitude), but the Smart argument is that Batman will outwit Superman by, say, deploying kryptonite or exploiting Superman’s inherent good-guyness to defeat him.
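
For anyone who wants to see how the reversed circle actually closes, here’s a minimal sketch (mine, purely illustrative — nothing about the game requires code) of the dominance cycle just described:

```python
from typing import Optional

# My own illustrative sketch (not part of the original game) of the reversed
# cycle: Strong beats Stupid, Stupid beats Smart, Smart beats Strong.
BEATS = {
    "Strong": "Stupid",  # blunt force overwhelms dullness
    "Stupid": "Smart",   # cleverness is handily nullified by sheer density
    "Smart": "Strong",   # the David-and-Goliath upset
}

def winner(a: str, b: str) -> Optional[str]:
    """Return the winning throw, or None on a tie."""
    if a == b:
        return None
    return a if BEATS[a] == b else b

print(winner("Smart", "Strong"))   # Smart
print(winner("Stupid", "Smart"))   # Stupid
print(winner("Strong", "Stupid"))  # Strong
```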

A further puzzle is how the game Strong, Stupid, Smart works out in geopolitics. The U.S. is clearly Strong, the last remaining world superpower (though still dense as a board — new revelations keep reinforcing that judgment), and uses its strength to bully Stupids into submission. Numerous countries have shifted categories from Strong to Stupid over time — quite a few in fact if one surveys more than a few decades of world history. Stupids have also fought each other to effective stalemate in most of the world, though not without a few wins and losses chalked up. What remains, however, is for a truly Smart regime to emerge to take down Strong. The parody version of such a match-up is told in the book The Mouse That Roared (also a movie with Peter Sellers). But since Smart is vanquished by Stupid, and the world has an overabundance of Stupids, it is unlikely that Smart can ever do better than momentary victory.

Our current slate of presidential candidates is mostly a field of Stupids with a couple Strongs thrown in (remember: still equally dense as Stupid). Then there are a couple insanely stupid Stupids who distort the circle into an out-of-kilter bizarro obloid. As with geopolitics, a Smart candidate has yet to emerge, but such a candidate would only defeat Strongs, clearing the way for a Stupid victory. This should be obvious to any strategist, and indeed, no truly Smart candidates have declared, knowing full well that they would gain no traction with the half-literate, mouth-breathing public composed largely of Stupids who predictably fall in love with the most insanely stupid Stupid candidate out there. An engaged Smart candidate would thus hand the victory to the insanely Stupid, who should be unelectable from the outset, but go figger. So then the deep strategy (gawd, how I hate this) would be to go with the devil you know, since a saint could never prevail against all the demons.