Posts Tagged ‘Celebrity’

I don’t have the patience or expertise to prepare and offer a detailed political analysis such as those I sometimes (not very often) read on other blogs. Besides, once the comments start filling up at those sites, every possible permutation is trotted out, muddying the initial or preferred interpretation with alternatives that make at least as much sense. They’re interesting brainstorming sessions, but I have to wonder what is accomplished.

My own back-of-the-envelope analysis is much simpler and probably no closer to (or farther from) being correct, what with everything being open to dispute. So the new POTUS was born in 1946, which puts the bulk of his boyhood in the 1950s, overlapping with the Eisenhower Administration. That period has lots of attributes, but the most significant (IMO), which would impact an adolescent, was the U.S. economy launching into the stratosphere, largely on the back of the manufacturing sector (e.g., automobiles, airplanes, TVs, etc.), and creating the American middle class. The interstate highway system also dates from that decade. Secondarily, there was a strong but misplaced sense of American moral leadership (one might also say authority or superiority), since we took (too much) credit for winning WWII.

However, it wasn’t great for everyone. Racism, misogyny, and other forms of bigotry were open and virulent. Still, if one was lucky enough to be a white, middle-class male, things were arguably about as good as they would get, which many remember rather fondly, either through rose-colored glasses or otherwise. POTUS as a boy wasn’t middle class, but the culture around him supported a worldview that he embodies even now. He’s also never been an industrialist, but he is a real estate developer (some would say slumlord) and media figure, and his models are taken from the 1950s.

The decade of my boyhood was the 1970s, which were the Nixon, Ford, and Carter Administrations. Everyone could sense the wheels were already coming off the bus, and white male entitlement was far diminished from previous decades. The Rust Belt was already a thing. Like children from the 1950s forward, however, I spent a lot of time in front of the TV. Much of it was goofy fun such as Gilligan’s Island, The Brady Bunch, and interestingly enough, Happy Days. It was innocent stuff. What are the chances that, as a boy plopped in front of the TV, POTUS would have seen the show below (excerpted) and taken special notice considering that the character shares his surname?

Snopes confirms that this is a real episode from the TV show Trackdown. Not nearly as innocent as the shows I watched. The coincidences that the character is a con man, promises to build a wall, and claims to be the only person who can save the town are eerie, to say the least. Could that TV show be lodged in the back of POTUS’ brain, along with so many other boyhood memories, misremembered and revised the way memory tends to do?

Some have said that the great economic expansion of the 1950s and 60s was an anomaly. A constellation of conditions configured to produce an historical effect, a Golden Era by some reckonings, that cannot be repeated. We simply cannot return to an industrial or manufacturing economy that had once (arguably) made America great. And besides, the attempt would accelerate the collapse of the ecosystem, which is already in free fall. Yet that appears to be the intention of POTUS, whose early regression to childhood is a threat to us all.

I discovered “The Joe Rogan Experience” on YouTube recently and have been sampling from among the nearly 900 pod- or webcasts posted there. I’m hooked. Rogan is an impressive fellow. He clearly enjoys the life of the mind but, unlike many who are absorbed solely in ideas, has not ignored the life of the body. Over time, he’s also developed expertise in multiple endeavors and can participate knowledgeably in discussion on many topics. Webcasts are basically long, free-form, one-on-one conversations. This lack of structure gives the webcast ample time to explore topics in depth or simply meander. Guests are accomplished or distinguished in some way and usually have fame and wealth to match, which often affects content (cf. Fitzgerald’s observation that the rich “are different from you and me”). One notable bar to entry is having a strong media presence.

Among the recurring themes, Rogan trots out his techno optimism, which is only a step short of techno utopianism. His optimism is based on two interrelated developments in recent history: widespread diffusion of information over networks and rapid advances in medical devices that can be expected to accelerate, to enhance human capabilities, and soon to transform us into supermen, bypassing evolutionary biology. He extols these views somewhat regularly to his guests, but alas, none of the guests I’ve watched seem to be able to fathom the ideas satisfactorily enough to take up the discussion. (The same is true of Rogan’s assertion that money is just information, which is reductive and inaccurate.) They comment or joke briefly and move on to something more comfortable or accessible. Although I don’t share Rogan’s optimism, I would totally engage in discussion of his flirtation with Transhumanism (a term he doesn’t use). That’s why I’m blogging here about Rogan; that, and I lack enough conventional distinction and fame to score an invite to be a guest on his webcast. Plus, he openly disdains bloggers, many of whom moderate comments (I don’t) or otherwise channel discussion to control content. Oh, well.


/rant on

With a new round of presidential debates upon us (not really debates if one understands the nature of debate or indeed moderation — James Howard Kunstler called it “the gruesome spectacle of the so-called debate between Trump and Clinton in an election campaign beneath the dignity of a third-world shit-hole”), it’s worthwhile to keep in the front of one’s mind that the current style of public discourse does not aim to provide useful or actionable information with regard to either the candidates or the issues. Rather, the idea is to pummel the hapless listener, watcher, or reader into a quivering jangle of confusion by maintaining a nonstop onslaught of soundbites, assertions, accusations, grandstanding, and false narratives. Our information environment abets this style of machine-gun discourse, with innumerable feeds from InstaGoogTwitFaceTube (et cetera), all vying simultaneously for our limited attention and thereby guaranteeing that virtually nothing makes a strong impression before the next bit of BS displaces it in a rapid succession of predigested morsels having no nutritional content or value for earnest consumers of information (as opposed to mouth-breathers seeking emotional salve for their worst biases and bigotry). Many feeds are frankly indecipherable, such as when the message is brutally truncated and possessed of acronyms and hashtags, the screen is cluttered with multiple text scrolls, or panel participants talk over each other to claim more screen time (or merely raise their asshole quotient by being the most obnoxious). But no matter so long as the double barrels keep firing.

I caught Republican nominee Donald Trump’s campaign manager Kellyanne Conway being interviewed by some banal featherweight pulling punches (sorry, no link, but she’s eminently searchable). Conway proved adept at deflecting obvious contradictions and reversals (and worse) of the Trump campaign by launching so many ideological bombs that nothing the interviewer raised actually landed. Questions and conflicts just floated away, unaddressed and unanswered. Her bizarre, hyperverbal incoherence is similar to the candidate’s stammering word salad, and ironically, both give new meaning to the decades-old term “Teflon” when applied to politics. Nothing sticks because piling on more and more complete wrongness and cognitive dissonance overwhelms and bewilders anyone trying to track the discussion. Trump and Conway are hardly alone in this, of course, though their mastery is notable (but not admirable). Talking heads gathered in panel discussions on, say, The View or Real Time with Bill Maher, just about any klatch occupying news and morning-show couches, and hosts of satirical news shows (some mentioned here) exhibit the same behavior: a constant barrage of high-speed inanity (and jokes, omigod the jokes!) that discourages consideration of an idea before driving pell-mell onto the next.

Thoughtful persons might pause to wonder whether breathless, even virtuoso delivery results from or creates our abysmally short attention spans and lack of serious discussion of problems plaguing the nation. Well, why can’t it be both? Modern media is all now fast media, delivering hit-and-run spectacle to overloaded nervous systems long habituated to being goosed every few moments. (Or as quoted years ago, “the average Hollywood movie has become indistinguishable from a panic attack.”) Our nervous systems can’t handle it, obviously. We have become insatiable information addicts seeking not just the next fix but a perpetual fix, yet the impatient demand for immediate gratification — Internet always at our fingertips — is never quelled. Some new bit will be added to the torrent of foolishness sooner than it can be pulled down. And so we stumble like zombies, blindly and willingly, into a surreality of our own making, heads down and faces blue from the glare of the phone/tablet/computer. Of course, the shitshow is brightly festooned with buffoon candidates holding court over the masses neither intends to serve faithfully in office. Their special brand of insanity is repeated again and again throughout the ranks of media denizens (celebrity is a curse, much like obscene wealth, or didn’t you know that?) and is seeping into the ground water to poison all of us.

/rant off

A couple of posts ago, I used the phrase “pay to play” in reference to our bought-and-paid-for system of political patronage. This is one of those open secrets we all recognize but gloss over because, frankly, in a capitalist economy, anything that can be monetized and corrupted will be. Those who are thus paid to play enjoy fairly handsome rewards for doing not very much, really. Yet the paradigm is self-reinforcing, much like the voting system, with promises of increased efficiency and effectiveness with greater levels of participation. Nothing of the sort has proven to be true; it’s simply a goad we continue to hear, some believing in the carrot quite earnestly, others holding their noses and ponying up their dollars and votes, and still others so demoralized and disgusted with the entire pointless constellation of lies and obfuscations that refusing to participate feels like the only honest response. (Periodic arguments levied my way that voting is quite important have failed to convince me that my vote matters a whit. Rather, it takes a bizarre sort of doublethink to conclude that casting my ballot is meaningful. Of late, I’ve succumbed to sustained harangues and shown up to vote, but my heart’s not in it.) I can’t distinguish so well anymore between true believers and mere manipulators except to observe that the former are more likely to be what few civic-minded voters remain and the latter are obviously candidates and their PR hacks. Journalists? Don’t get me started.

The phrase put me in mind of two other endeavors (beyond politics) where a few professionals enjoy being paid to play: sports and performing arts. Both enjoy heavy subscription among the masses early in life, as student sports and performing groups offer training and experience. The way most of us start out, in fact, we actually pay to play through classes, lessons, training, dues, and memberships that provide access to experts and put us in position to reap rewards later in life. Maybe you attended tennis camp or music camp as a kid, or you paid for a college education (defrayed perhaps by activity scholarships) majoring in athletics or theater. Lots of variations exist, and they’re not limited to youth. As an endurance athlete, I continue to pay entrance fees to race organizers for the opportunity to race on courses with support that would otherwise be unavailable without the budget provided by participants, sponsorship notwithstanding. Chicago’s popular 16-inch softball leagues are pay-to-play sports.

A second phase might be giving it away for free. As with paying to play, pure enjoyment of the endeavor works as a strong motivation and justification. This is probably more common in the community-level performing arts, where participation is just plain fun. And who knows? Exposure might lead to a big break or discovery. It’s also what motivates quite a lot of amateur athletes, especially for sports that have not gone mainstream. Olympic athletes (tertiary events) might fall roughly into this category, especially when their primary incomes are derived elsewhere. A third phase is being paid to play. If the audience or fan base is big enough, the financial rewards and fame can be considerable. However, those who enter the professional ranks don’t always demonstrate such great prowess, especially early on. More than a few blow up and flame out quickly, unable to sustain the spark that launched their careers. There’s also being paid to play but earning well short of a livable wage, which borders on giving it away for free or at least for too little. A final phase is being paid not to play. A mean interpretation of that would be that one is screwing up or blocking others’ opportunities to the point where it becomes worthwhile to pay someone to not show up or to go away. A more charitable interpretation would be that one’s employment contract includes time-off benefits that require continuous payments even when not playing.

As with my post about the differences between the Participation, Achievement, and Championship Models, I’m now content with numerous endeavors to be either pay to play, play for free, or play for too little. Participation makes it worthwhile under any payment regime, the alternative typically being sitting at home on my couch wasting my time in front of the TV. I never made it to the enviable position of being paid to play or paid not to play. Still, as an individual of some attainment and multiple areas of expertise, I admit finding it irksome to observe some truly awful people out there pulling in attention and wealth despite rather feeble efforts or abilities. The meritocracy may not be dead, but it often looks comatose.

I’m not paying close attention to the RNC in Cleveland. Actually, I’m ignoring it completely, still hoping that it doesn’t erupt in violence before the closing curtain. Yet I can’t help but hear some relevant news, and I have read a few commentaries. Ultimately, the RNC sounds like a sad, sad nonevent put on by amateurs, with many party members avoiding coming anywhere near. What’s actually transpiring is worthy of out-loud laughter at the embarrassment and indignities suffered by participants. The particular gaffe that caught my attention is cribbing from Michelle Obama in the speech delivered on Monday by Melania Trump. The speech writer, Meredith McIver, has accepted blame for it and characterized it as an innocent mistake.

Maybe someone else has already said or written this, but I suspect innocent plagiarism is probably true precisely because that’s the standard in quite a lot of academe these days. Students get away with it all the time, just not on a national stage. Reworking another’s ideas is far easier than coming up with one’s own original ideas, and Melania Trump has no reputation (near as I can tell) as an original thinker. The article linked to above indicates she admires Michelle Obama, so, from a twisted perspective, the plagiarism is an encomium.

The failure of Trump campaign officials to review the speech (or if they did review it, then do so effectively) is another LOL gaffe. It doesn’t rise to the level of the McCain campaign’s failure to vet Sarah Palin properly and won’t have any enduring effects, but it does reflect upon the Trump campaign’s ineptitude. My secret fear is that ineptitude is precisely why a lot of folks intend to vote for Trump: so that he can accelerate America’s self-destruction. It’s a negative view, and somewhat devil-may-care, to say “let’s make it a quick crash and get things over with already.” Or maybe it’s darkly funny only until suffering ramps up.

The first Gilded Age in the U.S. and the Roaring Twenties were periods that produced an overabundance of wealth for a handful of people. Some of them became synonymous with the term robber baron precisely for their ability to extract and accumulate wealth, often using tactics that, to say the least, lacked scruples when they weren’t downright criminal. The names include Rockefeller, Carnegie, Astor, Mellon, Stanford, Vanderbilt, Duke, Morgan, and Schwab. All have their names associated in posterity with famous institutions. Some are colleges and universities, others are banking and investment behemoths, yet others are place names and commercial establishments. Perhaps the philanthropy they practiced was not entirely generous, as captains of industry (then and today) seem to enjoy burnishing their legacies with a high level of name permanence. Still, one can observe that most of the institutions bearing their names are infrastructure useful to the general public, making them public goods. This may be partly because the early 20th century was still a time of nation building, whereas today is arguably a time of decaying empire.

The second Gilded Age in the U.S. commenced in the 1980s and is still going strong as measured by wealth inequality. However, the fortunes of today’s tycoons appear to be directed less toward public enrichment than toward self-aggrandizement. The very nature of philanthropy has shifted. Two modern philanthropists appear to be transitional: Bill Gates and Ted Turner. The Gates Foundation has a range of missions, including healthcare, education, and access to information technology. Ted Turner’s well-publicized $1 billion gift to the United Nations Foundation in 1997 was an open dare to encourage similar philanthropy among the ultrarich. The Turner Foundation website’s byline is “protecting & restoring the natural world.” Not to be ungrateful or uncharitable, but both figureheads are renowned for highhandedness in the fashion in which they gathered up their considerable fortunes and are redirecting some portion of their wealth toward pet projects that can be interpreted as a little self-serving. Philanthropic efforts by Warren Buffett appear to be less about giving away his own fortune to charities or establishing institutions bearing his name than about using his notoriety to raise charitable funds from other sources, thus stimulating charitable giving. The old saying applies especially to Buffett: “no one gets rich by giving it away.” More galling, perhaps, is another group of philanthropists, who seem to be more interested in building shrines to themselves. Two entries stand out: The Lucas Museum (currently seeking a controversial site in Chicago) and The Walmart Museum. Neither resembles a public good, though their press packages may try to convince otherwise.

Charity has also shifted toward celebrity giving, with this website providing philanthropic news and profiles of celebrities complete with their causes and beneficiaries. With such a wide range of people under consideration, it’s impossible to make any sweeping statements about the use or misuse of celebrity, the way entertainers are overcompensated for their talents, or even how individuals such as Richard Branson and Elon Musk have been elevated to celebrity status primarily for being rich. (They undoubtedly have other legitimate claims to fame, but they’re overshadowed in a culture that celebrates wealth before any other attribute.) And then there are the wealthy contributors to political campaigns, such as the Koch brothers, George Soros, and Sheldon Adelson, just to name a few. It’s fair to say that every contributor wants some bang for their buck, but I daresay that political contributors (not strictly charity givers) expect a higher quotient of influence, or in terms more consistent with their thinking, a greater return on investment.

None of this takes into account the charitable work and political contributions stemming from corporations and unions, or indeed the umbrella corporations that exist solely to raise funds from the general public, taking a sizeable share in administrative fees before passing some portion on to the eventual beneficiary. Topical charities and scams also spring up in response to whatever is the latest natural disaster or atrocity. What’s the average citizen to do when the pittance they can donate pales in comparison to that offered by the 1% (which would be over 3 million people in the U.S. alone)? Or indeed how does one guard against being manipulated by scammers (including the burgeoning number of street panhandlers) and political candidates into throwing money at fundamentally insoluble problems? Are monetary gifts really the best way of demonstrating charity toward the needy? Answers to these questions are not forthcoming.

Update: Closure has been achieved on the Lucas Museum coming to Chicago. After 2 years of litigation blocking any building on his proposed site on the lakefront, George Lucas has decided to seek a site in California instead. Both sides had to put their idiotic PR spin on the result, but most people I know are relieved not to have George Lucas making inroads into Chicago architecture. Now if only we could turn back time and stop Donald Trump.

/rant on

This is the time of year when media pundits pause to look back and consider the previous year, typically compiling unasked-for “best of” lists to recap what everyone may have experienced — at least if one is absorbed by entertainment media. My interest in such nonsense is passive at best, dismissive at worst. Further, more and more lists are weighed and compiled by self-appointed and guileless fanboys and -girls, some of whom are surprisingly knowledgeable (sign of a misspent youth?) and insightful yet almost uniformly lack a sufficiently longitudinal view necessary to form circumspect and expert opinions. The analogy would be to seek wisdom from a 20- or 30-something in advance of its acquisition. Sure, people can be “wise beyond their years,” which usually means free of the normal illusions of youth without yet having become a jaded, cynical curmudgeon — post-ironic hipster is still available — but a real, valuable, historical perspective takes more than just 2-3 decades to form.

For instance, whenever I bring up media theory to a youngster (from my point of reckoning), usually someone who has scarcely known the world without 24/7/365 access to all things electronic, he or she simply cannot conceive what it means to be without that tether/pacifier/security blanket smothering them. It doesn’t feel like smothering because no other information environment has ever been experienced (excepting perhaps in early childhood, but even that’s not guaranteed). Even a brief hiatus from the information blitzkrieg, a two-week vacation, say, doesn’t suffice. Rather, only someone olde enough to remember when it simply wasn’t there — at least in the personal, isolating, handheld sense — can know what it was like. I certainly remember when thought was free to wander, invent, and synthesize without pressure to incorporate a continuous stream of incoming electronic stimuli, most of which amounts to ephemera and marketing. I also remember when people weren’t constantly walled in by their screens and feeds, when life experience was more social, shared, and real rather than private, personal, and virtual. And so that’s why when I’m away from the radio, TV, computer, etc. (because I purposely and pointedly carry none of it with me), I’m less a mark than the typical media-saturated fool face-planted in a phone or tablet for the lures, lies, cons, and swindles that have become commonplace in late-stage capitalism.

Looking back in another sense, I can’t help but feel a little exasperated by the splendid reviews of the life in music led by Pierre Boulez, who died this past week. Never heard of him? Well, that just goes to show how far classical music has fallen from favor that even a titan such as he makes utterly no impression on the general public, only specialists in a field that garners almost no attention anymore. Yet I defy anyone not to know who Kim Kardashian is. Here’s the bigger problem: despite being about as favorably disposed toward classical music as it is possible to be, I have to admit that no one I know (including quite a few musicians) would be able to hum or whistle or sing a recognizable tune by Boulez. He simply doesn’t pass the whistle test. But John Williams (of Star Wars fame) certainly does. Nor indeed would anyone put on a recording of one of Boulez’s works to listen to. Not even his work as a conductor is all that compelling, either live or on disc (I’ve experienced plenty of both). As one looks back on the life of Pierre Boulez, as one is wont to do upon his passing, how can it be that such prodigious talent as he possessed could be of so little relevance?

Consider these two examples flip sides of the same coin. One enjoys widespread subscription but is base (opinions differ); the other is obscure but (arguably) refined. Put differently, one is pedestrian, the other admirable. Or over a lifetime, one is placebo (or worse), the other fulfilling. Looking back upon my own efforts and experiences in life, I would much rather be overlooked or forgotten than be petty and (in)famous. Yet mass media conspires to make us all into nodes on a network with goals decidedly other than human respectability or fulfillment. So let me repeat the challenge question of this blog: are you climbing or descending?

/rant off

The video below, which came to my attention recently, shows a respectable celebrity, violinist/conductor Itzhak Perlman, being dicked around in an interview he probably undertook in good faith. My commentary follows.

Publicized pranks and gotchas are by no means rare. Some are good-natured and quite funny, but one convention of the prank is to unmask it pretty quickly. In the aftermath, the target typically either laughs it off, leaves without comment, or less often, storms out in disgust. Andy Kaufman as “Tony Clifton” was probably among the first to sustain a prank well past the point of discomfort, never unmasking himself. Others have since gotten in on the antics, though the results are probably no worse in dickishness (dickery?) than Kaufman’s.

Fake interviews by comedians posing as news people are familiar to viewers of The Daily Show and its spinoff The Colbert Report (its run now completed). Zach Galifianakis does the same schtick in Between Two Ferns. It always surprises me when targets fall into the trap, exposing themselves as clueless ideologues willing to be hoisted with their own petards. However, Colbert in particular balanced his arch Republican stage persona with an unmistakable respect for his interview subject, which was at times inspired. Correspondents from The Daily Show are frequently pretty funny, but they almost never convey any respect for the subjects of the interview. Nick Canellakis (shown above) apparently has a whole series of interviews with classical musicians where he feigns idiocy and insult. Whereas some interview subjects are media savvy enough to get the joke and play along, I find this attempt at humor tasteless and unbearable.

Further afield, New Media Rockstars features a burgeoning list of media hosts who typically operate cheaply over the Web via YouTube, supported by an array of social media. At least one, Screen Junkies (the only one I watch), has recently blown into an entire suite of shows. I won’t accuse them all of being talentless hacks or dicking people around for pointless yuks, but I often pause to wonder what makes the shows worth producing beyond the hosts’ embarrassingly encyclopedic knowledge of comics, cartoons, TV shows, movies, etc. They’re fanboys (and girls) who have leveraged their misspent youth and eternal adolescence to gush and gripe about their passions. Admittedly, this may not be so different from sports fanatics (especially human statisticians), opera geeks, and nerds of other stripes.

Throwaway media may have unintentionally smuggled in tasteless shenanigans such as those by Nick Canellakis. Various comedians (unnamed) have similarly offered humorless discomfort as entertainment. Reality TV shows explored this area a while back, which I called trainwreck television. Cheaply produced video served over the Web has unleashed a barrage of dreck in all these categories. Some shows may eventually find their footing and become worthwhile. In the meantime, I anticipate seeing plenty more self-anointed media hosts dicking around celebrities and audiences alike.

Since the eruption of bigotry against Islam on Bill Maher’s show Real Time last October, I have been bugged by the ongoing tide of vitriol and fear-mongering as radical Islam becomes this century’s equivalent of 20th-century Nazis. There is no doubt that the Middle East is a troubled region of the world and that many of its issues are wrapped about Islamic dogma (e.g., jihad) that has been hijacked by extremists. Oppression, misogyny, violence, and terrorism will get no apologetics from me. However, the fact that deplorable behaviors often have an Islamic flavor does not, to my mind, excuse bigotry aimed at Islam as a whole. Yet that is precisely the argument offered by many pundits and trolls.

Bill Maher did not get the ball rolling, exactly, but he gave it a good shove, increasing its momentum and seeming righteousness among weak thinkers who take their cues and opinions from television personalities. Maher wasn’t alone, however, as Sam Harris was among his guests and argued that Islam is “the mother lode of bad ideas.” The notable exception on the panel that episode was Ben Affleck (Nicholas Kristof also made good points, though far more diplomatically), who called bullshit on Islam-baiting but failed to convince Maher or Harris, whose minds were already made up. Maher’s appeals to authoritative “facts” and “reality” (a sad bit of failed rhetoric he trots out repeatedly) failed to convince in the other direction.


Kyung Wha Chung has been in the back of my mind for decades. Her recording of the Berg (and Bartók) Violin Concerto(s) with the Chicago Symphony Orchestra under Sir Georg Solti has long been on my list of favorite recordings, all the more so for making a difficult work intelligible to the listener. Her other recordings have mostly escaped my attention, and I’ve never heard her perform live. Three interesting developments have brought her again to my attention: Decca’s new release of a box set of her recordings, her return to the London stage that first brought her fame, and her regrettable response to an audience coughing fit from that stage. Coverage of the last two news items has been provided by Norman Lebrecht at his website Slipped Disc. I’ve linked to Lebrecht twice in the past, but he’s not on my blogroll because he writes deplorable clickbait headlines. I appreciate his work aggregating classical music news, which is mostly about personnel (hiring and firing), but his obvious pandering irks me. The incident of the coughing spasm filtering through the audience, however, attracted my attention independent of the individuals involved. Commentary at Slipped Disc runs the gamut from “she was right to respond” to “an artist should never acknowledge the public in such a manner.” The conflict is irresolvable, of course, but let me opine anyway.

Only a few venues/activities exist where cultured people go to enjoy themselves in the exercise of good manners and taste. The concert hall (classical music, including chamber music and solo recitals but not popular musics) is one such oasis. Charges of snobbery and elitism are commonplace when criticisms of the fine arts come into play, but the mere fact that absolutely anyone can buy a ticket and attend puts the lie to that. Better to focus such coarse thinking on places like golf, country, and supper clubs that openly exclude nonmembers, typically on the basis of nonpayment of onerous membership fees. Other bases for exclusion I will leave alone. (The supposition that sophistication accompanies wealth is absurd, as anyone having acquaintance with such places can attest.) I note, too, that democratization of everything has brought more access to the fine arts to everyone — but at a cost, namely, the manners and self-control needed for the audience space to function effectively have eroded in the last few decades.

It has been said that all arts aspire to the condition of music, with its unity of subject matter and form that fosters direct connection to the emotions. As such, the concert artist (or ensemble) in the best case casts an emotional spell over the audience. In response, audiences cannot sit in stony silence but should be emotionally open and engaged. Distractions, whether visual or aural, unavoidably dispel the tone established in performance, even when they occur during the brief interval between movements rather than during the playing itself. A noisy, extended interval in which the audience coughed, fidgeted, and otherwise rearranged itself reportedly occurred after the first movement of a Mozart sonata performed by Kyung Wha Chung, and she was irritated enough to respond indelicately by upbraiding the parent of a child, the child unfortunately being among the last to be heard coughing. As a result, there was a palpable tension in the room that didn’t wear off, not unlike when an audience turns on a performer.

Audience disruption at concerts is not at all unusual; in some estimations, lack of decorum has only increased over the years. My first memory of a concert being temporarily derailed by the audience dates from the mid-1980s. So now the arguments are flying back and forth, such as that the audience pays to see/hear what’s offered onstage and the artist has no business complaining. Another goes that the artist should be operating on a lofty aesthetic plane that disallows taking notice of audience behavior. (Miles Davis is renowned, and sometimes reviled, for having often turned his back to the audience in performance.) Both quite miss the point that it is precisely an emotional circuit among composer (or by proxy, the composer’s work), performer, and audience that makes the endeavor worthwhile. Excellence in composition and performance is a requirement, and so too is the thoughtful contribution of the audience to close the circuit. Suggestions that boorish behavior by audiences is irrelevant fail to account for the sensitivity needed among all parties to make the endeavor effective.

It happens that I gave a solo recital a few months ago, my first in more than a decade. I am by no means an artist anywhere near the accomplishment of Kyung Wha Chung (few are, frankly), but I rely on audience response the same as any performer. My first surprise was the number of no-shows among my friends and peers who had confirmed their attendance. Then, after the completion of the first four-movement sonata, the audience sat silently, not making a peep. It fell to me to respond, to invite applause, to overcome the anxiety in the room regarding the proper way to act. (Clapping between movements is not customary, and clumsy audiences who clap in the wrong places have sometimes been shushed, so I surmised there was fear about when applause was supposed to happen.) Further, due to the awkwardness of the performance space (only one place the piano would fit), three latecomers (35+ min. into the performance) paraded right past me, between movements, to get seated. I was affected by these surprises but tried to take them in stride. Still, it’s fair to say my concentration was more than a little rattled. So I have some sympathy for any performer whose audience behaves unpredictably.

At the extremes, there are artists whose performance style is deep concentration or a nearly hypnotic state where even small disruptions take them out of the moment, whereas others can continue unimpeded through an air raid. No one-size-fits-all solution exists, of course, and in hindsight, it’s always possible to imagine better ways to respond to setbacks. Still, I cannot join the side of the debate that condemns Kyung Wha Chung, regrettable though her response was.