Archive for the ‘Artistry’ Category

For a variety of reasons, I go to see movies in the theater only a handful of times in any given year. The reasons are unimportant (and obvious), and I recognize that, by eschewing the theater, I’m giving up the crowd experience. Still, I relented recently and went to see a movie at a new AMC Dolby Cinema, which I didn’t even know existed. The first thing to appreciate was that it was a pretty big room, which used to be standard when cinema was first getting established in the 1920s but gave way sometime in the 1970s to multiplex theaters able to show more than one title at a time in little shoebox compartments with limited seating. Spaciousness was a welcome throwback. The theater also had oversized, powered, leather recliners rather than cloth, fold-down seats with shared armrests. The recliners were quite comfortable but also quite unnecessary (except for now-typical Americans unable to fit their fat asses in what used to be a standard seat). These characteristics are shared with AMC Prime theaters, which dress up the movie-going experience and charge accordingly. Indeed, AMC now offers several types of premium cinema, including RealD 3D, Imax, Dine-In, and BigD.

Aside I: A friend only just reported on her recent trip to the drive-in theater, a dated cinema experience, somewhat degraded when left unenhanced, yet one that retains its nostalgic charm for those of us old enough to remember as kids the shabby chic of bringing one’s own pillows, blankets, popcorn, and drinks to a double feature and sprawling out on the hood and/or roof of the car (e.g., the family station wagon). My friend actually brought her dog to the drive-in and said she remembered and sorta missed the last call on dollar hot dogs at 11 PM that used to find all the kids madly, gleefully rushing the concession stand before the food ran out.

What really surprised me, however, was how the Dolby Cinema experience turned into a visual, auditory, and kinesthetic assault. True, I was watching Wonder Woman (sorry, no review), which is set in WWI and features lots of gunfire and munitions explosions in addition to the usual invincible-superhero punchfest, so I suppose the point is partly to be immersed in the environment, a cinematic stab at verisimilitude. But the immediacy of all the wham-bam, rock ’em-sock ’em action made me feel more like a participant in a theater of war than a viewer. The term shell shock (a/k/a battle fatigue a/k/a combat neurosis) refers to the traumatized disorientation one experiences in moments of high stress and overwhelming sensory input; it applies here. Even the promo before the trailers and feature, offered to demonstrate the theater’s capabilities, was off-putting because of its unnecessary, overweening volume and impact. Unless I’m mistaken, the seats even have built-in subwoofers to rattle theatergoers from below when loud, concussive events occur, which is often because, well, filmmakers love their spectacle as much as audiences do.

Aside II: One real-life lesson to be gleaned from WWI, or the Great War as it was called before WWII, went well beyond the simplistic truism that war is hell. It was that civility (read: civilization) had failed and human progress was a chimera. Technical progress, however, had made WWI uglier in many respects than previous warfare. It was an entirely new sort of horror. Fun fact: there are numerous districts in France, known collectively as the Zone Rouge, where no one is allowed to live because of all the unexploded ordnance (100 years later!). Wonder Woman ends up having it both ways: acknowledging the horrific nature of war on the one hand yet valorizing and romanticizing personal sacrifice and eventual victory on the other. Worse, perhaps, it establishes that there’s always another enemy in the wings (otherwise, how could there be sequels?), so keep fighting. And for the average viewer, uniformed German antagonists are easily mistaken for Nazis of the subsequent world war, a historical gloss I’m guessing no one minds … because … Nazis.

So here’s my problem with AMC’s Dolby Cinema: why settle for a routine, standard theater experience when it can be amped up to the point of offense? Similarly, why be content with the tame and fleeting though reliable beauty of a sunset when one can enjoy a widescreen, hyperreal view of cinematic worlds that don’t actually exist? Why settle for the subtle, old-timey charm of the carousel (painted horses, dizzying twirling, and calliope music) when instead one can strap in and get knocked sideways by roller coasters so extreme that riders leave wobbly and crying at the end? (Never mind the risk of being stranded on the tracks for hours, injured, or even killed by a malfunction.) Or why bother attending a quaint symphonic band concert in the park or an orchestral performance in the concert hall when instead one can go to Lollapalooza and see/hear/experience six bands in the same cacophonous space grinding it out at ear-splitting volume, along with laser light shows and flash-pot explosions for the sheer sake of goosing one’s senses? Coming soon are VR goggles that trick the wearer’s nervous system into accepting that they are actually in the virtual game space, often first-person shooters that depict killing bugs or aliens or criminals without compunction. Our arts and entertainments have truly gotten out of hand.

If those criticisms don’t register, consider my post more than a decade ago on the Paradox of the Sybarite and Catatonic, which argues that our senses are so overwhelmed by modern life that we’re essentially numb from overstimulation. Similarly, let me reuse this Nietzsche quote (used before here) to suggest that on an aesthetic level, we’re not being served well in the display and execution of refined taste so much as being whomped over the head and dragged willingly? through ordeals:

… our ears have become increasingly intellectual. Thus we can now endure much greater volume, much greater ‘noise’, because we are much better trained than our forefathers were to listen for the reason in it. All our senses have in fact become somewhat dulled because we always inquire after the reason, what ‘it means’, and no longer for what ‘it is’ … our ear has become coarsened. Furthermore, the ugly side of the world, originally inimical to the senses, has been won over for music … Similarly, some painters have made the eye more intellectual, and have gone far beyond what was previously called a joy in form and colour. Here, too, that side of the world originally considered ugly has been conquered by artistic understanding. What is the consequence of this? The more the eye and ear are capable of thought, the more they reach that boundary line where they become asensual. Joy is transferred to the brain; the sense organs themselves become dull and weak. More and more, the symbolic replaces that which exists … the vast majority, which each year is becoming ever more incapable of understanding meaning, even in the sensual form of ugliness … is therefore learning to reach out with increasing pleasure for that which is intrinsically ugly and repulsive, that is, the basely sensual. [italics not in original]

What’s Missing

Posted: May 9, 2017 in Artistry, Cinema, Consumerism, Culture

Pessimists, misanthropes, fatalists, and doomers (I’m all types) often find themselves defending the position that maybe we don’t live at the very best, super-duper, tippy-top time in history, that although technology in particular has admittedly delivered some amazing innovations and improved our material conditions considerably even over our fairly recent past, we nonetheless lack spirituality and meaning — unless of course one’s spirituality and meaning are mistakenly found in technology. The technological sublime is so deceptively glamorous and ubiquitous that it obscures the idea that maybe another time and place made for a more ethical, moral, and noble life. (Experience varied widely, obviously.) In addition, personal freedoms enjoyed in liberal democracies are difficult to argue against, though they’re more characteristic of those at the very top of the socioeconomic heap than of those of us below who know to keep things buttoned up lest we discomfit our betters. Perhaps the most broadly enjoyed modern development is increased lifespan born of improved healthcare. And quite recently, the grossly expanded communications age (the information age is nested inside) reputedly makes learning and keeping in touch far easier than ever before. I won’t dispute any of these. However, I have two complaints against modernity, as evidenced by a body of posts extending back through the life of this blog: life lacks many salutary aspects once delivered passively by bygone social structures, and technology smuggles in a host of problems with its bounty.

Among the failings are (1) the lack of a cohesive narrative for what life ought to mean beyond grubbing for money, chasing fame and social cachet, and overpopulating the planet with progeny (many of whom suffer neglect while parents are away grubbing for money), (2) the lack of true community and the social conditions necessary to be properly situated within a wholesome context, (3) spiritual and emotional vacuity, (4) destructive social presences and bullying, especially online (as exemplified by the Bully-in-Chief) and by civil authorities, (5) unrelenting, disorienting technological and social change, including slang and memes that are impossible to track fully without being in the thrall of celebrities, pundits, and media in general, (6) little prospect of things improving near term, and (7) the still-dawning realization that, like the existential angst of the so-called Atomic Age and its threat of complete nuclear annihilation, we possess tools sufficient to destroy ourselves many times over (including rotting ourselves out from the inside in a fevered race to the cultural bottom) and have unwittingly fired the slo-mo suicide gun. We’re only just waiting for the bullet to strike its target. Since the gun is industrial civilization, the target is all of us. Dead men walking, or perhaps more accurately, the zombie shuffle.

A significant minority has replied to this set of miserable circumstances by voting into office our current president, 45. According to one astute commentator, 45 was never understood as a solution but rather as a murder weapon used by the disenfranchised to kill the host, namely, our sick society run by plutocrats maniacally hellbent on destroying everyone outside their immediate concern, which is just about everybody.


I see plenty of movies over the course of a year but had not been to a theater since The Force Awakens came out slightly over a year ago. The reason is simple: it costs too much. With ticket prices nearing $15 and what for me had been obligatory popcorn and soda (too much of both the way they’re bundled and sold — ask anyone desperately holding back their pee until the credits roll!), the endeavor climbed to nearly $30 for just one person. Never mind that movie budgets now routinely top $100 million; the movie-going experience simply isn’t worth $30 a pop. Opening weekend crowds (and costumes)? Fuggedaboudit! Instead, I view films at home on DVD (phooey on Blu-ray) or via a streaming service. Although I admit I’m missing out on being part of an audience, which offers the possibility of being carried away on a wave of crowd emotion, I’m perfectly happy watching at home, especially considering most films are forgettable fluff (or worse) and filmmakers seem to have forgotten how to shape and tell good stories. So a friend dragged me out to see Rogue One, somewhat late after its opening by most standards. Seeing Star Wars and other franchise installments now feels like an obligation just to stay culturally relevant. Seriously, soon enough it will be Fast & Furious Infinitum. We went to a newly built theater with individual recliners and waiters (no concession stands). Are film-goers no longer satisfied by popcorn and Milk Duds? It’s meant to be a premium experience, with everything served to you in the recliner, and accordingly it charges premium prices. No way would I order an $80 bottle of wine to go with Rogue One. Too bad most films don’t warrant such treatment. All this is preliminary to the actual review, of course.

I had learned quite a bit about Rogue One prior to seeing it, not really caring about spoilers, and was pleasantly surprised it wasn’t as bad as some complained. Rogue One brings in all the usual Star Wars hallmarks: storm troopers, the Force, X-Wings and TIE Fighters, ray guns and light sabers, the Death Star, and familiar characters such as Grand Moff Tarkin, Darth Vader, Princess Leia, etc. Setting a story within the Star Wars universe makes most of that unavoidable, though some specific instances did feel like gratuitous fan service, such as the 3-second (if that) appearance of C-3PO and R2-D2. The appearance of things and characters I already knew about didn’t feel to me like an extra thrill, but how much I needed to already know about Star Wars just to make sense of Rogue One was a notable weakness. Thus, one could call Rogue One a side story, but it was by no means a stand-alone story. Indeed, characters old and new were given such slipshod introductions (or none at all!) that they functioned basically as chess pieces moved around to drive the game forward. Good luck divining their characteristic movements and motivations. Was there another unseen character manipulating everyone? The Emperor? Who knows? Who cares! It was all a gigantic, faceless pawn sacrifice. When at last the main rebels died, there was no grief or righteousness over having at least accomplished their putative mission. Turns out the story was all about effects, not emotional involvement. And that’s how I felt: uninvolved. It was a fireworks display ending with a pointless though clichéd grand finale. Except I guess that watching a bunch of fake stuff fake blow up was the fake point.

About what passed for a story: the Rebellion learns (somehow?!) that it faces total annihilation from a new superweapon called the Death Star. (Can’t remember whether that term was actually used in the film.) While the decision of leadership is to scatter and flee, a plucky band of rebels within the rebellion insists on flinging itself against the enemy with no plan except to improvise once on site, whereupon leadership irrationally decides to do the same. The strategy, such as it is — distract the enemy from the true mission objective — is straight out of The Return of the King, but the visual style is more like the opening of Saving Private Ryan, which is to say, full, straight-on bombardment and invasion. Visual callbacks to WWII infantry uniforms and formations couldn’t be more out of place. To call these elements charmless is to give them too much credit. Rather, they’re hackneyed. However, they probably fit well enough within the Saturday-morning cartoon, newsreel, swashbuckler sensibility that informed the original Star Wars films of the 1970s. Problem is, those 1970s kids are grown and want something with greater gravitas than live-action space opera, while newer Star Wars audiences are stuck in permanent adolescence because of what cinema has become, with its superhero franchises and cynical money grabs.

As a teenager when the first trilogy came out, I wanted more of the mystical element — the Force — than I wanted aerial battles, sword fights, or chase scenes. The goofy robots, reluctant heroes, and bizarre aliens were fun, but they were balanced by serious, steady leadership (the Jedi) and a couple of really bad-ass villains. While it’s known George Lucas had the entire character arc of Anakin Skywalker/Darth Vader in mind from the start, it’s also fair to say that no one quite knew in Episode IV just how iconic Vader the villain would become, which is why his story became the centerpiece of the first two trilogies (how many more to come?). However, Anakin/Vader struggled with the light/dark sides of the Force, which resonated with anyone familiar with the angel/demon nomenclature of Christianity. When the Force was misguidedly explained away as midi-chlorians (science, not mysticism), well, the bottom dropped out of the Star Wars universe. At that point, it became a grand WWII analogue populated by American GIs and Nazis — with some weird medievalism and sci-fi elements thrown in — except that the wrong side develops the superweapon. Rogue One makes that criticism even more manifest, though it’s fairly plain to see throughout the Star Wars films.

Let me single out one actor for praise: Ben Mendelsohn as Orson Krennic. It’s hard for me to decide whether he chews the scenery, upstaging Darth Vader as a villain in the one scene they share, or whether he’s among a growing gallery of underactors whose flat line delivery and blandness invite viewers to project upon them characterization telegraphed through other mechanisms (costuming, music, plot). Either way, I find him oddly compelling and memorable, unlike the foolish, throwaway, sacrificial band of rebellious rebels against the rebellion and empire alike. Having seen Ben Mendelsohn in other roles, I find he possesses an unusual screen magnetism that reminds me of Sean Connery. He tends to play losers and villains and can be a little one-note (not a bag of tricks but just one trick), but he is riveting on-screen for the right reasons compared to, say, the ookiness of the two gratuitous CGI characters in Rogue One.

So Rogue One is a modestly enjoyable yet ephemeral romp through the Star Wars universe. It delivers and yet fails to deliver, which is about as charitable as I can be.

Stray links build up over time without my being able to handle them adequately, so I have for some time wanted a way of purging them. I am aware of other bloggers who curate and aggregate links with short commentaries quite well, but I have difficulty making my remarks pithy and punchy. That said, here are a few I’m ready to purge in this first attempt to dispose of part of my backlog.

Skyfarm Fantasies

Futurists have offered myriad visions of technologies that have no hope of being implemented, from flying cars to 5-hour workweeks to space elevators. The newest pipe dream is the Urban Skyfarm, a roughly 30-story tree-like structure with 24 acres of space using solar panels and hydroponics to grow food close to the point of consumption. Utopian engineering such as this crops up frequently (pun intended) and may be fun to contemplate, but in the U.S. at least, we can’t even build high-speed rail, and that technology is already well established elsewhere. I suppose that’s why cities such as Seoul and Singapore, straining to make everything vertical for lack of horizontal space, are the logical test sites.

Leaving Nashville

The City of Nashville is using public funds to buy homeless people bus tickets to leave town and go be poor somewhere else. Media spin is that the city is “helping people in need,” but it’s obviously a NIMBY response to a social problem that city officials and residents (not everyone, but enough) would rather not have to address more humanely. How long before cities begin competing with each other over the number of people they can ship off to other cities? Call it the circle of life when the homeless start gaming the programs, revisiting multiple cities in an endless circuit.

Revisioneering

Over at Rough Type, Nick Carr points to an article in The Nation entitled “Instagram and the Fantasy of Mastery,” which argues that a variety of technologies now give “artists” the illusion of skill, merit, and vision by enabling work to be easily executed using prefab templates and stylistic filters. For instance, in pop music, the industry standard is to auto-tune everyone’s singing to hide imperfections. Carr’s summary probably is better than the article itself and shows us the logical endpoint of production art in various media undertaken without the difficult work necessary to develop true mastery.

Too Poor to Shop

The NY Post reported over the summer that many Americans are too poor to shop except for necessities. Here are the first two paragraphs:

Retailers have blamed the weather, slow job growth and millennials for their poor results this past year, but a new study claims that more than 20 percent of Americans are simply too poor to shop.

These 26 million Americans are juggling two to three jobs, earning just around $27,000 a year and supporting two to four children — and exist largely under the radar, according to America’s Research Group, which has been tracking consumer shopping trends since 1979.

Current population in the U.S. is around 325 million. Twenty percent of that number is 65 million; twenty-six million is 8 percent. Pretty basic math, but I guess the NY Post is not to be trusted to report even simple things accurately. Maybe it’s 20% of U.S. households; a quick back-of-envelope check after the list below bears that out. Either way, that’s a pretty damning statistic considering the U.S. stock market continues to set new all-time highs — an economic recovery not shared with average Americans. Indeed, here are a few additional newsbits and links stolen ruthlessly from theeconomiccollapseblog.com:

  • The number of Americans that are living in concentrated areas of high poverty has doubled since the year 2000.
  • In 2007, about one out of every eight children in America was on food stamps. Today, that number is one out of every five.
  • 46 million Americans use food banks each year, and lines start forming at some U.S. food banks as early as 6:30 in the morning because people want to get something before the food supplies run out.
  • The number of homeless children in the U.S. has increased by 60 percent over the past six years.
  • According to Poverty USA, 1.6 million American children slept in a homeless shelter or some other form of emergency housing last year.
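
For what it’s worth, here is that back-of-envelope check (the household count is my own rough assumption, not a figure from the Post):

  26,000,000 ÷ 325,000,000 ≈ 0.08, or 8 percent of individual Americans
  26,000,000 ÷ 125,000,000 ≈ 0.21, or roughly 20 percent, taking U.S. households (about 125 million) as the base

So the Post’s “more than 20 percent” only works if the unit is households rather than people.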

For further context, theeconomiccollapseblog also points to “The Secret Shame of Middle Class Americans” in The Atlantic, which reports, among other things, that fully 47% of Americans would struggle to scrape together a mere $400 in an emergency.

How do such folks respond to the national shopping frenzy kicking off in a few days with Black Friday, Small Business Saturday, Charitable Sunday, and Cyber Monday? I suggest everyone stay home.

See this exchange where Neil deGrasse Tyson chides Sam Harris for failing to speak to his audience in terms it understands:

The upshot is that lay audiences simply don’t subscribe to or possess the logical, rational, abstract style of discourse favored by Harris. Thus, Harris stands accused of talking past his audience — at least somewhat — especially if his audience is understood to be the general public rather than other well-educated professionals. Subject matter is less important than style but revolves around politics, and worse, identity politics. Everyone has abundant opinions about those, whether informed by rational analysis or merely fed by emotion and personal resonance.

The lesson deGrasse Tyson delivers is both instructive and accurate yet also demands that the level of discourse be lowered to a common denominator (like the reputed 9th-grade speech adopted by the evening news) that regrettably forestalls useful discussion. For his part (briefly, at the end), Harris takes the lesson and does not resort to academic elitism, which would be obvious and easy. Kudos to both, I guess, though I struggle (being somewhat an elitist); the style-over-substance argument really goes against the grain for me. Enhancements to style obviously work, and great communicators use them and are convincing as a result. (I distinctly recall Al Gore looking too much like a rock star in An Inconvenient Truth. Maybe it backfired. I tend to think that style could not overcome other blocks to substance on that particular issue.) Slick style also allows those with nefarious agendas to hoodwink the public into believing nonsense.


A couple of posts ago, I used the phrase “pay to play” in reference to our bought-and-paid-for system of political patronage. This is one of those open secrets we all recognize but gloss over because, frankly, in a capitalist economy, anything that can be monetized and corrupted will be. Those who are thus paid to play enjoy fairly handsome rewards for doing not very much, really. Yet the paradigm is self-reinforcing, much like the voting system, which promises increased efficiency and effectiveness with greater levels of participation. Nothing of the sort has proven to be true; it’s simply a goad we continue to hear, some believing in the carrot quite earnestly, others holding their noses and ponying up their dollars and votes, and still others so demoralized and disgusted with the entire pointless constellation of lies and obfuscations that refusing to participate feels like the only honest response. (Periodic arguments lobbed my way that voting is quite important have failed to convince me that my vote matters a whit. Rather, it takes a bizarre sort of doublethink to conclude that casting my ballot is meaningful. Of late, I’ve succumbed to sustained harangues and shown up to vote, but my heart’s not in it.) I can’t distinguish so well anymore between true believers and mere manipulators except to observe that the former are more likely to be what few civic-minded voters remain while the latter are obviously candidates and their PR hacks. Journalists? Don’t get me started.

The phrase put me in mind of two other endeavors (beyond politics) where a few professionals enjoy being paid to play: sports and performing arts. Both enjoy heavy subscription among the masses early in life, as student sports and performing groups offer training and experience. The way most of us start out, in fact, we actually pay to play through classes, lessons, training, dues, and memberships that provide access to experts and put us in position to reap rewards later in life. Maybe you attended tennis camp or music camp as a kid, or you paid for a college education (defrayed perhaps by activity scholarships) majoring in athletics or theater. Lots of variations exist, and they’re not limited to youth. As an endurance athlete, I continue to pay entrance fees to race organizers for the opportunity to race on courses with support that would otherwise be unavailable without the budget provided by participants, sponsorship notwithstanding. Chicago’s popular 16-inch softball leagues are pay-to-play sports.

A second phase might be giving it away for free. As with paying to play, pure enjoyment of the endeavor works as a strong motivation and justification. This is probably more common in the community-level performing arts, where participation is just plain fun. And who knows? Exposure might lead to a big break or discovery. It’s also what motivates quite a lot of amateur athletes, especially for sports that have not gone mainstream. Olympic athletes (tertiary events) might fall roughly into this category, especially when their primary incomes are derived elsewhere. A third phase is being paid to play. If the audience or fan base is big enough, the financial rewards and fame can be considerable. However, those who enter the professional ranks don’t always demonstrate such great prowess, especially early on. More than a few blow up and flame out quickly, unable to sustain the spark that launched their careers. There’s also being paid to play but earning well short of a livable wage, which borders on giving it away for free or at least for too little. A final phase is being paid not to play. A mean interpretation of that would be that one is screwing up or blocking others’ opportunities to the point where it becomes worthwhile to pay someone to not show up or to go away. A more charitable interpretation would be that one’s employment contract includes time-off benefits that require continuous payments even when not playing.

As with my post about the differences between the Participation, Achievement, and Championship Models, I’m now content for numerous endeavors to be pay to play, play for free, or play for too little. Participation makes it worthwhile under any payment regime, the alternative typically being sitting at home on my couch wasting my time in front of the TV. I never made it to the enviable position of being paid to play or paid not to play. Still, as an individual of some attainment and multiple areas of expertise, I admit finding it irksome to observe some truly awful people out there pulling in attention and wealth despite rather feeble efforts or abilities. The meritocracy may not be dead, but it often looks comatose.

Today is the 10-year anniversary of the opening of this blog. As a result, there is a pretty sizeable backblog should anyone decide to wade in. As mentioned in my first post, I only opened this blog to get posting privileges at a group blog I admired because it functioned more like a discussion than a broadcast. The group blog died of attrition years ago, yet here I am 10 years later still writing my personal blog (which isn’t really about me).

Social media lives and dies by the numbers, and mine are deplorable. Annual traffic has ranged from about 6,800 to about 12,500 hits, much of which I’m convinced is mere background noise and bot traffic. Cumulative hits number about 90,140, and unique visitors about 19,350, neither of which is anything to crow about for a blog of this duration. My subscriber count continues to climb pointlessly, now resting at 745. However, I judge I might have only a half dozen regular readers and perhaps half again as many commentators. I’ve never earned a cent for my effort, nor am I likely ever to put up a Patreon link or similar goad for donations. All of which only demonstrates that almost no one cares what I have to write about. C’est la vie. I don’t write for that purpose and frankly wouldn’t know what to write about if I were trying to drive numbers.

So if you have read my blog, what are some of the things you might have gathered from me? Here’s an incomplete synopsis:

  • Torture is unspeakably bad. History is full of devices, methodologies, and torturers, but we learned sometime in the course of two 20th-century world wars that nothing justifies it. Nevertheless, it continues to occur with surprising relish, and those who still torture (or want to) are criminally insane.
  • Skyscrapers are awesomely tall examples of technical brilliance, exuberance, audacity, and hubris. Better expressions of techno-utopian, look-mom-no-hands, self-defeating narcissism can scarcely be found. Yet they continue to be built at a feverish pace. The 2008 financial collapse stalled and/or doomed a few projects, but we’re back to game on.
  • Classical music, despite record budgets for performing ensembles, has lost its bid for anything resembling cultural and artistic relevance by turning itself into a museum (performing primarily works of long-dead composers) and abandoning emotional expression in favor of technical perfection, which is probably an accurate embodiment of the spirit of the times. There is arguably not a single living composer who has become a household name since Aaron Copland, who died in 1990 but was really well-known in the 1940s and 50s.
  • We’re doomed — not in any routine sense of the word having to do with individual mortality but in the sense of Near-Term (Human) Extinction (NTE). The idea is not widely accepted in the least, and the arguments are too lengthy to repeat (and unlikely to convince). However, for those few able to decipher it, the writing is on the wall.
  • American culture is a constantly moving target, difficult to define and describe, but its principal features are only getting uglier as time wears on. Resurgent racism, nationalism, and misogyny make clear that while some strides have been made, these attitudes were only driven underground for a while. Similarly, colonialism never really died but morphed into a new version (globalization) that escapes criticism from the masses, because, well, goodies.
  • Human consciousness — another moving target — is cratering (again) after 3,000–5,000 years. We have become hollow men, play actors, projecting false consciousness without core identity or meaning. This cannot be sensed or assessed easily from the first-person perspective.
  • Electronic media makes us tools. The gleaming attractions of sterile perfection and pseudo-sociability have hoodwinked most of the public into relinquishing privacy and intellectual autonomy in exchange for the equivalent of Huxley’s soma. This also cannot be sensed or assessed easily from the first-person perspective.
  • Electoral politics is a game played by the oligarchy for chumps. Although the end results are not always foreseeable (Jeb!), the narrow range of options voters are given (lesser of evils, the devil you know …) guarantees that fundamental change in our dysfunctional style of government will not occur without first burning the house down. After a long period of abstention, I voted in the last few elections, but my heart isn’t really in it.
  • Cinema’s infatuation with superheroes and bankable franchises (large overlap there) signals that, like other institutions mentioned above, it has grown aged and sclerotic. Despite large budgets and impressive receipts (the former often over $100 million and the latter now in the billions for blockbusters) and considerable technical prowess, cinema has lost its ability to be anything more than popcorn entertainment for adolescent fanboys (of all ages).

This is admittedly a pretty sour list. Positive, worthwhile manifestations of the human experience are still out there, but they tend to be private, modest, and infrequent. I still enjoy a successful meal cooked in my own kitchen. I still train for and race in triathlons. I still perform music. I still make new friends. But each of these examples is also marred by corruptions that penetrate everything we do. Perhaps it’s always been so, and as I, too, age, I become increasingly aware of inescapable distortions that can no longer be overcome with innocence, ambition, energy, and doublethink. My plan is to continue writing the blog until it feels like a burden, at which point I’ll stop. But for now, there’s too much to think and write about, albeit at my own leisurely pace.

A long while back (8 years ago), I drew attention to a curious bit of rhyming taking place in the world of architecture: the construction of skyscrapers that twist from base to top (see also here). I even suggested that one per city was needed, which seems to be slowly manifesting. Back then, the newest installment was the Infinity Tower, now fully built and known as the Cayan Tower. The planned (and apparently doomed) Chicago Spire has yet to get off the ground. Another incarnation of the basic twisting design is the Evolution Tower in Moscow, completed in 2014 (though I only just learned about it):

[photo: the Evolution Tower, Moscow]

There are plenty more pics at the SkyscraperPage entry devoted to this building.

News of this development comes to me by way of James Howard Kunstler’s Eyesore of the Month feature at his website. I draw attention to Kunstler because he is far better qualified to evaluate and judge architecture than am I, even though most of his remarks are disparaging. Kunstler and I share both aesthetic and doomer perspectives on stunt architecture, and the twisting design seems to be one faddish way to avoid the boxy, straight-line approach to supertall buildings that dominated for some fifty years. Indeed, many buildings of smaller stature now seek that same avoidance, which used to be accomplished via ornamentation but is now structural. Such designs and construction are enabled by computers, though it remains to be seen how long maintenance and repair can be sustained in an era of diminishing financial resources. (Material resources are a different but related matter; these days, almost no one bothers with anything without financial incentive or reward.)

When the last financial collapse occurred in 2008 (extending into 2009, with the recovery since then mostly faked), lots of projects were mothballed. I note, however, that Chicago has many new projects underway, and I can only surmise that other skylines are similarly full of cranes signalling the return of multibillion-dollar construction projects aimed at the well-off. Mention of crumbling infrastructure has been ongoing for decades now. Here’s one recent example. Yet attention and funding seem to flow in the direction of projects that really do not need doing. While it might be true that the discrepancy here lies with public vs. private funding, it appears to me another case of mismanaging our own affairs by focusing too much on marquee projects while allowing dated and perhaps less attractive existing structures to decay and crumble.

My work commute typically includes bus, train, and walking legs to arrive at my destination. If wakefulness and an available seat allow, I often read on the bus and train. (This is getting to be exceptional compared to other commuters, who are more typically lost in their phones listening to music, watching video, checking FB, or playing games. Some are undoubtedly reading, like me, but electronic media, which I find distasteful, alter the experience fundamentally compared to ink on paper.) Today, I was so absorbed in my reading that by the time I looked up, I had missed my bus stop, and half an hour later, I nearly missed my train stop, too. The experience of tunnel vision in deep concentration is not at all unfamiliar to me, but it is fleeting and unpredictable. More typical is a relaxed yet alert concentration that for me takes almost no effort anymore.

So what sent me ’round the bend? The book I’m currently reading, Nick Carr’s The Glass Cage, takes a diversion into the work of poet Robert Frost. Carr uses Frost to illustrate his point that immersion in bodily work of manageable difficulty lends the world a more robust character than the detached, frictionless one experienced through too much technological mediation and ease. Carr does a terrific job contextualizing Frost’s lyric observations in a way quite unlike the contextual analysis one might undertake in a high school or college classroom, which too often makes the objects of study lifeless and irrelevant. Carr’s discussion put me unexpectedly into an aesthetic mode of contemplation, as distinguished from analytic or kinesthetic modes. There are probably others.

I don’t often go into aesthetic mode. It requires the right sort of stimulation. The Carr/Frost combination put me there, and so I tunneled into the book and forgot my commute. That momentary disorientation is often pleasurable, but for me, it can also be distressing. My infrequent visits to art museums are often accompanied by a vague unease at the sometimes nauseating emotionalism of the works on display. It’s an honest response, though I expect most folks can’t quite understand why something beautiful would provoke something resembling a negative reaction. In contrast, my experience in the concert hall is usually frustration, as musicians have become ever more corporate and professional in their performance over time, to the detriment and exclusion of latent emotional content. With Super Bowl Sunday almost upon us (about which I care not at all), I suppose the typical viewer gets an emotional/aesthetic charge out of that overhyped event, especially if the game is hotly contested rather than a blowout. I seek and find my moments in less crass expressions of the human spirit.

rant on/

This is the time of year when media pundits pause to look back and consider the previous year, typically compiling unasked-for “best of” lists to recap what everyone may have experienced — at least if one is absorbed by entertainment media. My interest in such nonsense is passive at best, dismissive at worst. Further, more and more lists are weighed and compiled by self-appointed and guileless fanboys and -girls, some of whom are surprisingly knowledgeable (sign of a misspent youth?) and insightful yet almost uniformly lack the longitudinal view necessary to form circumspect and expert opinions. The analogy would be to seek wisdom from a 20- or 30-something in advance of its acquisition. Sure, people can be “wise beyond their years,” which usually means free of the normal illusions of youth without yet having become a jaded, cynical curmudgeon — post-ironic hipster is still available — but a real, valuable, historical perspective takes more than just 2–3 decades to form.

For instance, whenever I bring up media theory to a youngster (from my point of reckoning), usually someone who has scarcely known the world without 24/7/365 access to all things electronic, he or she simply cannot conceive what it means to be without that tether/pacifier/security blanket smothering them. It doesn’t feel like smothering because no other information environment has ever been experienced (excepting perhaps in early childhood, but even that’s not guaranteed). Even a brief hiatus from the information blitzkrieg, a two-week vacation, say, doesn’t suffice. Rather, only someone old enough to remember when it simply wasn’t there — at least in the personal, isolating, handheld sense — can know what it was like. I certainly remember when thought was free to wander, invent, and synthesize without pressure to incorporate a continuous stream of incoming electronic stimuli, most of which amounts to ephemera and marketing. I also remember when people weren’t constantly walled in by their screens and feeds, when life experience was more social, shared, and real rather than private, personal, and virtual. And so that’s why, when I’m away from the radio, TV, computer, etc. (because I purposely and pointedly carry none of it with me), I’m less a mark for the lures, lies, cons, and swindles that have become commonplace in late-stage capitalism than the typical media-saturated fool face-planted in a phone or tablet.

Looking back in another sense, I can’t help but feel a little exasperated by the splendid reviews of the life in music led by Pierre Boulez, who died this past week. Never heard of him? Well, that just goes to show how far classical music has fallen from favor: even a titan such as he makes utterly no impression on the general public, only on specialists in a field that garners almost no attention anymore. Yet I defy anyone not to know who Kim Kardashian is. Here’s the bigger problem: despite being about as favorably disposed toward classical music as it is possible to be, I have to admit that no one I know (including quite a few musicians) would be able to hum or whistle or sing a recognizable tune by Boulez. He simply doesn’t pass the whistle test. But John Williams (of Star Wars fame) certainly does. Nor indeed would anyone put on a recording of one of Boulez’s works just to listen. Not even his work as a conductor is all that compelling, either live or on disc (I’ve experienced plenty of both). As one looks back on the life of Pierre Boulez, as one is wont to do upon an artist’s passing, how can it be that such prodigious talent as he possessed could be of so little relevance?

Consider these two examples flip sides of the same coin. One enjoys widespread subscription but is base (opinions differ); the other is obscure but (arguably) refined. Put differently, one is pedestrian, the other admirable. Or over a lifetime, one is placebo (or worse), the other fulfilling. Looking back upon my own efforts and experiences in life, I would much rather be overlooked or forgotten than be petty and (in)famous. Yet mass media conspires to make us all into nodes on a network with goals decidedly other than human respectability or fulfillment. So let me repeat the challenge question of this blog: are you climbing or descending?

rant off/