Archive for the ‘Taste’ Category

A little more content lite (even though my complaint is unavoidable). Saw on Motherboard a report on a first-person, Web-based shopping game about Black Friday zombie mall shoppers. You can play here. It’s pure kitsch but does reinforce the deplorable behaviors of sale-crazed shoppers swarming over each other to get at goodies (especially cheap electronics), sometimes coming to blows. Videos of 2015 Black Friday brawls appeared almost immediately.

We apparently learn nothing year-over-year as we reenact our ritual feeding frenzy, lasting all the way through New Year’s Eve. (I never go out on Black Friday.) I might have guessed that big box retailers face diminishing returns with store displays torn apart, disgruntled shoppers, traumatized employees, and the additional cost of rent-a-cops to herd the masses and maintain order (which obviously doesn’t work in many instances). Yet my e-mail inbox keeps loading up with promotions and advertisements, even a day later. The video game in particular reminds me of Joe Bageant’s great line: “We have embraced the machinery of our undoing as recreation.”

The video below came to my attention recently, which shows a respectable celebrity, violinist/conductor Itzhak Perlman, being dicked around in an interview he probably undertook in good faith. My commentary follows.

Publicized pranks and gotchas are by no means rare. Some are good-natured and quite funny, but one convention of the prank is to unmask it pretty quickly. In the aftermath, the target typically either laughs it off, leaves without comment, or less often, storms out in disgust. Andy Kaufman as “Tony Clifton” was probably among the first to sustain a prank well past the point of discomfort, never unmasking himself. Others have since gotten in on the antics, though the results are probably no worse in dickishness (dickery?) than Kaufman’s.

Fake interviews by comedians posing as news people are familiar to viewers of The Daily Show and its spinoff The Colbert Report (its run now completed). Zach Galifianakis does the same schtick in Between Two Ferns. It always surprises me when targets fall into the trap, exposing themselves as clueless ideologues willing to be hoisted with their own petards. However, Colbert in particular balanced his arch Republican stage persona with an unmistakable respect for his interview subjects, which was at times inspired. Correspondents from The Daily Show are frequently pretty funny, but they almost never convey any respect for the subjects of their interviews. Nick Canellakis (shown above) apparently has a whole series of interviews with classical musicians in which he feigns idiocy and insult. Whereas some interview subjects are media savvy enough to get the joke and play along, I find this attempt at humor tasteless and unbearable.

Further afield, New Media Rockstars features a burgeoning list of media hosts who typically operate cheaply over the Web via YouTube, supported by an array of social media. At least one, Screen Junkies (the only one I watch), has recently grown into an entire suite of shows. I won’t accuse them all of being talentless hacks or of dicking people around for pointless yuks, but I often pause to wonder what makes the shows worth producing beyond the hosts’ embarrassingly encyclopedic knowledge of comics, cartoons, TV shows, movies, etc. They’re fanboys (and girls) who have leveraged their misspent youth and eternal adolescence to gush and gripe about their passions. Admittedly, this may not be so different from sports fanatics (especially human statisticians), opera geeks, and nerds of other stripes.

Throwaway media may have unintentionally smuggled in tasteless shenanigans such as those by Nick Canellakis. Various comedians (unnamed) have similarly offered humorless discomfort as entertainment. Reality TV shows explored this area a while back, which I called trainwreck television. Cheaply produced video served over the Web has unleashed a barrage of dreck in all these categories. Some shows may eventually find their footing and become worthwhile. In the meantime, I anticipate seeing plenty more self-anointed media hosts dicking around celebrities and audiences alike.

The English language has words for everything, and whenever something new comes along, we coin a new word. The latest neologism I heard is bolthole, which refers to the location one bolts to when collapse and civil unrest reach intolerable proportions. At present, New Zealand is reputed to be a favorite location for boltholes purchased and kept by the ultrarich; it has the advantage of lying in the Southern Hemisphere, remote from the hoi polloi yet reachable by private plane or oceangoing yacht. Actually, bolthole is an older term now being repurposed, but it seems hip and current enough to pass as new coinage.

Banned words are the inverse of neologisms, not in the normal sense that they simply fall out of use but in that their use is actively discouraged. Every kid learns this early on when a parent or older sibling slips and lets an “adult” word pass his or her lips that the kid isn’t (yet) allowed to use. (“Mom, you said fuck!”) George Carlin made a whole routine out of dirty words (formerly) banned from TV. Standards have been liberalized since the 1970s, and now people routinely swear or refer to genitalia on TV and in public. Sit in a restaurant or ride public transportation (as I do), eavesdrop on a little of the speech within easy earshot (especially private cellphone conversations), and just count the casual F-bombs.

The worst field of banned-words nonsense is political correctness, which is intertwined with identity politics. All the slurs and epithets directed at, say, racial groups ought to be disused, no doubt, but we overcompensate by renaming everyone (“____-American”) to avoid terms that carry little or no derogation. Even more ridiculous, at least one egregiously insulting term has been reclaimed as a badge of honor, an unbanned banned word, by the very group it disparages. It takes Orwellian doublethink to hear that term — you all know what it is — used legitimately only by those allowed to use it. (I find it wholly bizarre yet fear to wade in with my own prescriptions.) Self-disparaging language, typically in a comedic context, gets an unwholesome pass, but only if one is within the identity group. (Women disparage women, gays trade on gay stereotypes, Jews indulge in jokey anti-Semitism, etc.) We all laugh and accept it as safe, harmless, and normal. President Obama is continually mixed up in questions of appearances (“optics”), or of what to call things — or not call them, as the case may be. For instance, his apparent refusal to call terrorism originating in the Middle East “Muslim terrorism” has been met with controversy.

I’m all for calling a thing what it is, but the term terrorism is too loosely applied to any violent act committed against (gasp!) innocent Americans. Recent events in Charleston, SC, garnered the terrorism label, though other terms would be more apt. Further, there is nothing intrinsically Muslim about violence and terrorism. Yeah, sure, Muslims have a word or doctrine — jihad — but it doesn’t mean what most think or are led to believe it means. Every religion across human history has some convenient justification for the use of force, mayhem, and nastiness to promulgate its agenda. Sometimes it’s softer and inviting, other times harder and more militant. Unlike Bill Maher, however, circumspect thinkers recognize that violence used to advance an agenda, like words used to shape narratives, is not the province of any particular hateful or hate-filled group. Literally everyone does it to some extent. Indeed, the passion with which anyone pursues an agenda is paradoxically celebrated and reviled depending on content and context, and it’s a long, slow, ugly process of sorting to arrive at some sort of Rightthink®, which then becomes conventional wisdom before crossing over into political correctness.

If I were to get twisted and strained over every example of idiocy on parade, I’d be permanently distorted. Still, a few issues have crossed my path that might be worth bringing forward.

Fealty to the Flag

An Illinois teacher disrespected the American flag during a classroom lesson on free speech. Context provided in this article is pretty slim, but it would seem to me that a lesson on free speech might be precisely the opportunity to demonstrate that tolerance of discomfiting counter-opinion is preferable to the alternative: squelching it. Yet in response to complaints, the local school board voted unanimously to fire the teacher of the offending lesson. The ACLU ought to have a field day with this one, though I must admit there may be no convincing some people that desecrating the flag is protected free speech. Some will remember going round and round on this issue a few years ago over a proposed Constitutional amendment. Patriots stupidly insist on carving out an exception to free speech protections when it comes to the American flag, which shows quite clearly that they are immune to the concept behind the 1st Amendment, which says this:

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances. [emphasis added]

Naturally, interpretations of the Bill of Rights vary widely, but it doesn’t take a Constitutional scholar to parse the absolute character of these rights. Rights are trampled all the time, of course, as the fired Illinois teacher just found out.

Fealty to the Wrong Flag

The Confederate battle flag has come back into the national spotlight following racially inspired events in Charleston, SC. (Was it ever merely a quaint, anachronistic, cultural artifact of the American South?) CNN has a useful article separating fact from fiction, yet some Southerners steadfastly defend the flag. As a private issue of astonishingly poor taste, idiocy, and free speech, individuals should be allowed to say what they want and fly their flags at will, but as a public issue for states and/or institutions that still fly the flag or emblazon it on websites, letterhead, etc., it’s undoubtedly better to give up this symbol and move on.

I’m not a serious cineaste, but I have offered a few reviews on The Spiral Staircase. There are many, many cineastes out there, though, and although cinema is now an old medium (roughly 100 years old), cineastes tend to be on the younger side of 35 years. Sure, lots of established film critics are decidedly older, typically acting under the aegis of major media outlets, but I’m thinking specifically of the cohort who use new, democratized media (e.g., cheap-to-produce and -distribute YouTube channels) to indulge in their predilections. For example, New Media Rockstars has a list of their top 100 YouTube channels (NMR No. 1 contains links to the rest). I have heard of almost none of them, since I don’t live online like so many born after the advent of the Information/Communications Age. The one I pay particular attention to is Screen Junkies (which includes Honest Trailers, the Screen Junkies Show, and Movie Fights), and I find their tastes run toward childhood enthusiasms that mire their criticism in a state of permanent adolescence and self-mocking geekdom. The preoccupation with cartoons, comic books, action figures, superheroes, and popcorn films couldn’t be more clear. Movie Fights presumes to award points on the passion, wit, and rhetoric of the fighters rather than the quality of the films they choose to defend. However, adjudication is rarely neutral, since trump cards tend to get played when a superior film or actor is cited against an inferior one.

So I happened to catch three recent flicks that are central to the Screen Junkies canon: Captain America: The Winter Soldier, Avengers: Age of Ultron, and Transformers: Age of Extinction (links unnecessary). They all qualify as CGI festivals — films centered on hyperkinetic action rather than story or character (opinions differ, naturally). The first two originate from the MCU (acronym alert: MCU = Marvel Cinematic Universe, which is lousy with comic book superheroes) and the last is based on a Saturday-morning children’s cartoon. Watching grown men and a few women on Screen Junkies getting overexcited about content originally aimed at children gives me pause, yet I watch them to see what the fighters say, knowing full well that thoughtful remarks are infrequent.

Were I among the fighters (no chance, since I don’t have my own media fiefdom), I would likely be stumped when a question requires immediate recall (by number, as in M:I:3 for the third Mission Impossible film) of a specific entry from any of the numerous franchises, like those named above, that pump out films regularly. Similarly, my choices would not be limited, as theirs are, to films released after 1990, a year that falls within the childhood of most of the fighters who appear. Nor would my analysis be so embarrassingly visual in orientation, since I understand good cinema to be more about story and character than whiz-bang effects.

Despite the visual feast fanboys adore (what mindless fun!), lazy CGI festivals suffer worst from overkill, far outstripping the eye’s ability to absorb onscreen action fully or effectively. Why bother with repeat viewings of films with little payoff in the first place? CGI characters were interesting in and of themselves the first few times they appeared in movies without breaking the suspension of disbelief, but now they’re so commonplace that they feel like cheating. Worse, moviegoers are now faced with so many CGI crowds, clone and robot armies, zombie swarms, human-animal hybrids, et cetera ad nauseam, that little holds the interest of jaded viewers. Thus, because so few scenes resonate emotionally, sheer novelty substitutes (ineffectively) for meaning, not that most chases or slugfests in the movies offer much that is truly original. The complaint is heard all the time: we’ve seen it before.

Here’s my basic problem with the three CGI-laden franchise installments I saw recently: their overt hypermilitarism. When better storytellers such as Kubrick or Coppola make films depicting the horrors of war (or other existential threats, such as the ever-popular alien invasion), their perspective is indeed that war is horrible, and obvious moral and ethical dilemmas flow from there. When hack filmmakers pile up frenzied depictions of death and destruction, typically with secondary or tertiary characters whose dispatch means and feels like nothing, and with cities destroyed eliciting no emotional response because it’s pure visual titillation, they have no useful, responsible, or respectable commentary. Even the Screen Junkies recognize that, unlike, say, Game of Thrones, none of their putative superheroes really face much more than momentary distress before saving the day in the third act, and certainly no lasting injury (a little make-up blood doesn’t convince me). Dramatic tension simply drains away, since happy resolutions are never in doubt. Now, characters taking fake beatdowns are laughter-inducing, sorta like professional wrestling after the sheepish admission that the wrestlers have been acting all along. Frankly, pretend drama with nothing at stake is a waste of effort and of the audience’s time and trust. That so many fanboys enjoy being goosed or that some films make lots of money is no justification. The latter is one reason why cinema so often fails to rise to the aspiration of art: it’s too bound up in grubbing for money.

I have read a number of exhortations by gurus of one sort or another, usually plumbing the depths of self-delusion, to “imagine the absurd” as a means of unlocking one’s latent creativity, blocked by hewing too closely to convention and, dare I say it, reality. Invoking absurdity seems to me redundant: we (well, some of us) already live with absurd comfort and wealth, purchased by the sweat and suffering of many, not least of which is the Earth itself (or herself, if you prefer). Slights and insults absorbed by the biosphere in the last decade may be individually insignificant, but over time and in aggregate, they constitute a proverbial death by a thousand cuts. But you don’t have to take my word for it: investigate for yourself how many plant and animal species have suffered significant die-offs. Other absurdities are piling up, too, especially in the area of politics, which is looking more than ever (how is that even possible?) like a clown circus as, for example, more candidates from both major political parties enter the 2016 presidential race. We haven’t even gotten to loony third-party candidates yet.

These are familiar ideas to readers of this blog, and although they bear repeating, they are not really what I want to discuss. Rather, it has become increasingly clear that in an age of excess and entitlement — call it the Age of Absurdity — the awful truth can only be told through comedy, just like Michael Moore’s short-lived comic documentary TV show of the same name. Sure, there are a growing number of Cassandras like me prophesying doom, but our claim on public dialogue is thus far negligible. Further, serious documentaries exposing absurd levels of corruption, mendacity, idiocy, and cruelty are currently enjoying a golden age. But compare these to any number of TV shows and movies — all offered for comedic entertainment purposes — that are now functioning as de facto news outlets (The Daily Show and Real Time with Bill Maher have been particularly manifest about this), and it’s easy to see that the public prefers its truth sugar-coated, even if it’s given a reverse twist such as with The Colbert Report. (I can’t watch Colbert for the same reason I can’t watch South Park or The Simpsons: they’re too accurate, too on the nose, even as jokey reflections of or on the Age of Absurdity.) The only thing one needs to reveal truth inside a comedy show (not just the fake news shows) is to ignore the laugh track and turn off one’s sense of humor, treating each comedy bit earnestly, the way a child would. That’s how awful, accurate, and absurd things have become.

Take, for instance, this article in The New Yorker, which is satire on its face but quite literally tells the truth when considered soberly. The last line, “Our research is very preliminary, but it’s possible that they [denialists of all stripes] will become more receptive to facts once they are in an environment without food, water, or oxygen,” is pretty macabre but tells precisely the thing to be expected when supplies falter.

Take, for another instance, the celebrity roasts that Comedy Central has revived. I’ve watched only a few clips, but roasters typically say egregiously insulting things that are quite literally true about the roastee, who laughs and smiles through the humiliation. Insult comedy may be embellished or exaggerated for effect, but it scarcely needs to be. To call someone a hack or comment on his/her undesired, unwarranted overexposure (commonplace now in the era of omnimedia and leaked sex tapes) takes a little comedic shaping, but there is always a sizable kernel of truth behind the jokes. That’s what makes comedy funny, frankly. This might be best exemplified when a joke is made “too soon.” The subject matter will become funny in time, after the shocking truth has worn off some, but when too soon, the insult is just too much to take in good taste and no enjoyment can be had from exposing that particular truth.

Is there a conclusion to be drawn? I dunno. The culture has room for both seriousness and humor, gallows and otherwise. I wonder sometimes if the ability to act with seriousness of purpose to forestall the worst is even possible anymore. Instead, we’re absorbed by distractions and cheap entertainments that divert our attention to trifles. (Why am I aware in even the slightest way of the Kardashians?) A true expression of the Zeitgeist perhaps, we know deep down that the dominoes are tipping over and we’re lined up to take hit after hit until everything has crumbled around us. So why not laugh and party right up to the bitter end?

A Surfeit of Awards

Posted: January 29, 2015 in Culture, Education, Idle Nonsense, Tacky, Taste

/rant on

I get alumni magazines from two colleges/universities I attended. These institutional organs are unapologetic boosters of the accomplishments of alumni, faculty, and students. They also trumpet never-ending capital campaigns, improvements to facilities, and new and refurbished buildings. The latest round of news from my two schools features significant new and rebuilt structures, accompanied by the naming of these structures after the foundations, contributors, and faculty/administrators associated with their execution. Well and good, you might surmise, but I always have mixed feelings. No doubt there are certain thresholds that must be met for programs to function and excel: stadia and gyms, locker rooms, concert halls and theaters, practice and rehearsal spaces, equipment, computer labs, libraries and their holdings, etc. Visiting smaller schools with inadequate facilities always brought that point home. Indeed, that’s one of the reasons why anyone chooses a school: for the facilities.

Since the late sixties or so, I have witnessed one school after another (not just in higher education) becoming what I think of as lifestyle schools. Facilities are not merely sufficient or superior; they range into the lap of luxury and excess. It’s frankly embarrassing that the quality and furnishings of dormitories now exceed what most students will enjoy for decades post-graduation. In my college years, no one found it the slightest bit embarrassing to have meager accommodations. That’s not why one was there. Now the expectation is to luxuriate. Schools clearly compete to attract students using a variety of enticements, but delivering the best lifestyle while in attendance was formerly not one of them. But the façades and accoutrements are much easier to evaluate than the academic programs, which have moved in the opposite direction. Both are now fraudulent at many schools; it’s a game of dress-up.

That rant, however, may be only the tip of the proverbial iceberg. I cannot escape the sense that we celebrate ourselves and our spurious accomplishments with amazing disregard for their irrelevance. Unlike many, who dream of achieving immortality by proxy, I am confounded by the desire to see one’s name on the side of a building, in a hall of fame, on an endowed chair, etched in a record book, or otherwise gouged into posterity. Yet I can’t go anywhere without finding another new feature named after someone, usually posthumously but not always, whose memory must purportedly be preserved. (E.g., Chicago recently renamed the Circle Interchange after its first and only female mayor, Jane Byrne, causing some confusion due to inadequate signage.) The alumni magazines were all about newly named buildings, chairs, scholarships, halls, bricks, and waste cans. It got to be sickening. The reflex is now established: someone gives a pile of money or teaches (or administers) for a time, so name something after him or her. And as we enter championship and awards season in sports and cinema, the surfeit of awards doled out, often just for showing up and doing one’s job, is breathtaking.

Truly memorable work and achievement need no effusive praise. They are perpetuated through subscription. Yet even they, as Shelley reminds us, pass from memory eventually. Such is the way of the world in the long stretches of time (human history) we have inhabited it. Readers of this blog will know that, in fairly awful terms, that time is rapidly drawing to a close due to a variety of factors, but primarily because of our own prominence. So one might wonder, why all this striving and achieving and luxuriating and self-celebrating when its end is our own destruction?

/rant off

Difficult Pleasures

Posted: January 14, 2015 in Consumerism, Culture, Idle Nonsense, Taste

Everyone has acquaintance with what I call the menu problem. One goes to an unfamiliar restaurant and studies the menu to make a selection from among a broad range of options. Those options may change (or not) based on seasonal availability of quality ingredients or some chef/menu designer at the home office in Columbus, Ohio, changing things up to create buzz in a bid to attract greater market share. (McDonald’s rolls out or resurrects moribund sandwiches with surprising regularity.) No matter, one must content oneself with the eventual decision or else suffer buyer’s remorse.

But the problem lies not so much in the grass-is-greener syndrome of food choices not elected but in the presumption that an optimal choice is possible from myriad options. Put another way, if one can make a poor choice (say, something not to taste), then it’s implicit that one can make a superior choice, maybe even one leading to a peak experience — all this simply by showing up and paying (or overpaying) the bill. A similar quandary lies behind the problem of brand competition and fragmentation, where available options for the right (or wrong) toothpaste, cola, cell phone and plan, credit card, TV show, movie, etc. multiply and create bewilderment if one deliberates too much. Even for someone with effectively unlimited funds, time limitations result in the inability to evaluate even a majority of slated offerings.

And therein lies the rub: conspicuous consumption is too easy and carries with it the suggestion of ecstasy if only one chooses well. Little may be done to earn the enjoyment or reward, and without some struggle, getting what one wants often feels hollow. In contrast, honest gratification over even meager portions, quality, or results often follows on hardship and extended effort. For example, those who have roughed it out in the wilderness know that doing without for a spell can transform something as pedestrian as granola into an unexpectedly superlative experience. Or in the classroom, an easy A is unappreciated precisely because students know intuitively that it isn’t really evidence of learning or achievement, whereas a hard-won B– might be the source of considerable pride. Under the right conditions, one might even feel some justifiable righteousness, though in my experience, hardships endured tend to produce humility.

One of the sure-fire ways I discovered of triggering euphoria is endurance racing. When the finish line finally swings into view, I recognize that in a few more moments, I will have accomplished the distance and be able to stop pushing. My time and place relative to my age group are irrelevant. I also know that I can’t have that rush if I don’t first sacrifice and suffer for it. Further, contentment and euphoria cannot be sustained for long. Rather, they typically come at the end of something, inviting a nostalgic glow that fades as normalcy reasserts itself.

I’m writing about this because I have rubbed elbows with some folks for whom the most perfect, exquisite pleasure is their expectation for everything all the time because they use their wealth to procure only the best at every turn. Maybe they subscribe to some form of lifestyle service, such as this one, and have others pampering and praising them round the clock. I contend that they don’t actually know how to be happy or to enjoy themselves because, when something goes awry or falls short, the fits and conniptions would embarrass a three-year-old. See, for example, this. Such shameful behavior also puzzles me because the current marketplace is a veritable cornucopia (not yet the notorious deathtrap from The Hunger Games). Improved distribution and delivery make stuff available cheaply and easily to nearly everyone with a computer and a credit card. Yet many take it all for granted, grind away miserably at service providers who fail their standards, and fail to recognize that most of it is poised to go away. The current Age of Abundance is shifting toward an epoch-making contraction, but in the meantime, some feel only discontentment because their appetites can never be sated. They don’t understand that difficult pleasures and cessation of suffering are far more gratifying than the lap of luxury.

Kyung Wha Chung has been in the back of my mind for decades. Her recording of the Berg (and Bartók) Violin Concerto(s) with the Chicago Symphony Orchestra under Sir Georg Solti has long been on my list of favorite recordings, all the more so for making a difficult work intelligible to the listener. Her other recordings have mostly escaped my attention, and I’ve never heard her perform live. Three interesting developments have brought her again to my attention: Decca’s new release of a box set of her recordings, her return to the London stage that first brought her fame, and her regrettable response to an audience coughing fit from that stage. Coverage of the last two news items has been provided by Norman Lebrecht at his website Slipped Disc. I’ve linked to Lebrecht twice in the past, but he’s not on my blogroll because he writes deplorable clickbait headlines. I appreciate his work aggregating classical music news, which is mostly about personnel (hiring and firing), but his obvious pandering irks me. The incident of the coughing spasm filtering through the audience, however, attracted my attention independent of the individuals involved. Commentary at Slipped Disc runs the gamut from “she was right to respond” to “an artist should never acknowledge the public in such a manner.” The conflict is irresolvable, of course, but let me opine anyway.

Only a few venues/activities exist where cultured people go to enjoy themselves in the exercise of good manners and taste. The concert hall (classical music, including chamber music and solo recitals but not popular musics) is one such oasis. Charges of snobbery and elitism are commonplace when criticisms of the fine arts come into play, but the mere fact that absolutely anyone can buy a ticket and attend puts the lie to that. Better to focus such coarse thinking on places like golf, country, and supper clubs that openly exclude nonmembers, typically on the basis of nonpayment of onerous membership fees. Other bases for exclusion I will leave alone. (The supposition that sophistication accompanies wealth is absurd, as anyone having acquaintance with such places can attest.) I note, too, that democratization of everything has brought more access to the fine arts to everyone — but at a cost, namely, the manners and self-control needed for the audience space to function effectively have eroded in the last few decades.

It has been said that all arts aspire to the condition of music, with its unity of subject matter and form that fosters direct connection to the emotions. As such, the concert artist (or ensemble) in the best-case scenario casts an emotional spell over the audience. In response, audiences cannot sit in stony silence but should be emotionally open and engaged. Distractions, whether visual or aural, unavoidably dispel the tone established in performance, even if they happen to occur during the brief interval between movements rather than during the performance itself. A noisy, extended interval in which the audience coughed, fidgeted, and otherwise rearranged itself reportedly occurred after the first movement of a Mozart sonata performed by Kyung Wha Chung, and she was irritated enough to respond indelicately by upbraiding the parent of a child, the child unfortunately being among the last to be heard coughing. As a result, there was a palpable tension in the room that didn’t wear off, not unlike when an audience turns on a performer.

Audience disruption at concerts is not at all unusual; in some estimations, lack of decorum has only increased over the years. My first memory of a concert being temporarily derailed by the audience was in the middle 1980s. So now the arguments are flying back and forth, such as that the audience pays to see/hear what’s offered onstage and the artist has no business complaining. Another goes that the artist should be operating on a lofty aesthetic plane that would disallow notice-taking of audience behavior. (Miles Davis is renowned and sometimes reviled for having often turned his back to the audience in performance.) Both quite miss the point that it is precisely an emotional circuit among composer (or by proxy, the composer’s work), performer, and audience that makes the endeavor worthwhile. Excellence in composition and performance are requirements, and so too is the thoughtful contribution of the audience to close the circuit. Suggestions that boorish behavior by audiences is irrelevant fail to account for the sensitivity needed among all parties to make the endeavor effective.

It happens that I gave a solo recital a few months ago, my first in more than a decade. I am by no means an artist anywhere near the accomplishment of Kyung Wha Chung (few are, frankly), but I rely on audience response the same as any performer. My first surprise was the number of no-shows among my friends and peers who had confirmed their attendance. Then, after the completion of the first four-movement sonata, the audience sat silently, not making a peep. It fell to me to respond, to invite applause, to overcome the anxiety in the room regarding the proper way to act. (Clapping between movements is not customary, and clumsy audiences who clap in the wrong places have sometimes been shushed, so I surmised there was fear about when applause was supposed to happen.) Further, due to the awkwardness of the performance space (only one place the piano would fit), three latecomers (35+ min. into the performance) paraded right past me, between movements, to get seated. I was affected by these surprises but tried to take them in stride. Still, it’s fair to say my concentration was more than a little rattled. So I have some sympathy for any performer whose audience behaves unpredictably.

At the extremes, there are artists whose performance style is deep concentration or a nearly hypnotic state where even small disruptions take them out of the moment, whereas others can continue unimpeded through an air raid. No one-size-fits-all solution exists, of course, and in hindsight, it’s always possible to imagine better ways to respond to setbacks. Still, I cannot join the side of the debate that condemns Kyung Wha Chung, however regrettable her response may have been.

The Chicago Reader has a feature article on something I have blogged about repeatedly, namely, infiltration of abandoned structures to take photographs and videos in the interest of documenting modern ruins and establishing an aesthetic I called “post-industrial chic.” The Reader article provides new nomenclature for this behavior and sensibility: urban exploration, or urbex for short. The article cites Detroit, Chicago, and Gary (IN) as urbex hubs, but my previous surfing around the Internet revealed plenty of other sites, including those on other continents, though perhaps none so concentrated as the American rust belt. The idea is proliferating, perhaps even faster than the abandonment of structures built to house our more enterprising endeavors, with Facebook pages, Meetup groups, and an already defunct zine/blog/book complex called Infiltration, which is/was devoted to penetrating places where one is not supposed to be. It would be suitably ironic if Infiltration had itself been abandoned, but instead, its founder and chief instigator passed away.

It’s impossible to know what may be going on inside the minds of those who are, by turns, documentarians, aesthetes and artists, thrill-seekers, and voyeurs. Have they pieced together the puzzle yet, using their travels to observe that so many of these crumbling structures represent the ephemeral and illusory might of our economic and technical achievements, often and unexpectedly dating from the Depression Era with its art deco ornamentation? Is there really beauty to be found in squalor?

Answers to those questions are not altogether apparent from urbex sources. Whereas artistic statements are de rigueur in galleries and on artists’ websites, urbex purveyors tend to be uncharacteristically silent about their drive to document. There are frequent paeans to the faded, former glory of the abandoned sites, but what resonates is the suggestion of human activity and optimism no longer enjoyed, held over in the broken fibers of the structures, rather than a recognition that these ruins, not even worth the bother of tearing down, are close reminders of our own uselessness in old age, impermanence, and mortality.

To those more doom-aware, if I can be so presumptuous, another, deeper significance flows from late-modern ruins: our self-defeat. The Pyrrhic victory of human success (in demographic terms) over the rest of creation has lasted long enough to span entire lifetimes, enjoyed innocently by those born at the propitious historical moment (if, indeed, they managed to survive various 20th-century genocides and wars). But for those of us born only a little later, we are already witness to the few decayed bits (thus far) of the far more expansive human-built world we will leave behind.

This fate was explored by the History Channel film Life After People, which omits the obvious reasons for our disappearance but simply leaps ahead in time to contemplate how the natural world reacts to our absence. The film, as it turns out, became the pilot for a series that appears to have run for two seasons, largely on its own recycled bits. Invented imagery of this eventuality is echoed in all manner of cinematic demolition derbies, with New York City and the White House among the most iconic locations to undergo ritual destruction for our, um, what? Enjoyment?