Archive for the ‘Classical Music’ Category

Among the many complaints that cross my path in the ongoing shitshow that American culture has become is an article titled “The Tragic Decline of Music Literacy (and Quality),” authored by Jon Henschen. His authorship is a rather unexpected circumstance since he is described as a financial advisor rather than an authority on music, technology, or culture. Henschen’s article reports on (without linking to it as I do) an analysis led by Joan Serrà, a postdoctoral scholar at the Artificial Intelligence Research Institute, and colleagues. Curiously, the analysis has been reported on and repackaged by quite a few news sites and blogs since its publication in 2012. For example, the YouTube video embedded below makes many of the same arguments and cites the so-called Millennial Whoop, a hook or gesture now ubiquitous in pop music that’s kinda sorta effective until one starts recognizing it everywhere, at which point it begins to sound trite, then irritating.

I won’t recount or summarize arguments except to say that neither the Henschen article nor the video discusses the underlying musical issues quite the way a trained musician would. Both are primarily quantitative rather than qualitative, equating an observed decrease in variety of timbre, loudness, and pitch/harmony with worse music (less is more worse). Lyrical (or poetical) complexity has also retreated. It’s worth noting, too, that the musical subject is narrowed to recorded pop music from 1955 to 2010. There’s obviously a lot to know about pop music, but it’s not generally the subject of serious study among academic musicians. AFAIK, no accredited music school offers degrees in pop music; Berklee College of Music probably comes the closest. (How exactly does songwriting as a major differ from composition?) That standard may be relaxing.
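For the curious, here is a minimal sketch of what such purely quantitative measures can look like, written in Python and assuming the librosa audio library plus a couple of local recordings (the file names are hypothetical). None of this comes from the Serrà analysis or the Henschen article; it merely illustrates crude proxies for loudness variety, timbre variety, and pitch variety.

```python
# A minimal, illustrative sketch (not from the Serrà study): crude numerical proxies
# for loudness variety, timbre variety, and pitch variety in a recording.
import numpy as np
import librosa

def variety_metrics(path):
    y, sr = librosa.load(path, mono=True)                         # decode audio
    rms = librosa.feature.rms(y=y)[0]                             # frame-wise loudness
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)[0]   # rough timbre proxy
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)              # 12 pitch classes over time
    pc = chroma.mean(axis=1)
    pc = pc / pc.sum()                                            # pitch-class distribution
    pitch_entropy = -np.sum(pc * np.log2(pc + 1e-12))             # 0 (one note) .. log2(12) bits
    return {
        "loudness_variance": float(np.var(rms)),
        "timbre_variance": float(np.var(centroid)),
        "pitch_entropy_bits": float(pitch_entropy),
    }

# Hypothetical file names; substitute any local recordings.
for track in ["song_1955.wav", "song_2010.wav"]:
    print(track, variety_metrics(track))
```

Whether smaller numbers from a script like this actually mean worse music is, of course, precisely the qualitative question that goes begging.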

Do quantitative arguments demonstrate degradation of pop music? Do reduced variety, range, and experimentation make pop music the equivalent of a paint-by-numbers image with a self-imposed limitation allowing only unmixed primary colors? Hard to say, especially if one (like me) has a traditional education in art music and already regards pop music as a rather severe degradation of better music traditions. Reduction of the artistic palette from the richness and variety of, say, 19th-century art music proceeded through the 20th century (musical composition is now understood by the lay public to mean songs, just one genre among many) toward a highly refined hit-making formula that has proven to work remarkably well. Musical refinements also make use of new technological tools (e.g., rhythm machines, autotune, digital soundfield processing), which is another whole discussion.

Musical quality isn’t mere quantity (where more is clearly better), however, and some manage pretty well with limited resources. Still, a sameness or blandness is evident and growing within a genre already rather narrowly restricted to drums, guitars, keyboards, and vocals. The antidote Henschen suggests (incentivizing musical literacy and participation, especially in schools) might prove salutary, but such recommendations have been ubiquitous throughout modern history. The magical combination of factors that actually catalyzes creativity, as opposed to degradation, remains elusive. Despite impassioned pleas not to allow quality to disappear, nothing could be more obvious than that culture drifts according to its own whims (to anthropomorphize) rather than being steered by well-meaning designs.

More to say in part 2 to follow.


Language acquisition in early childhood is aided by heavy doses of repetition and the memorable structure of nursery rhymes, songs, and stories that are repeated ad nauseam to eager children. Please, again! Again, again … Early in life, everything is novel, so repetition and fixity are positive attributes rather than causes for boredom. The music of one’s adolescence is also the subject of endless repetition, typically through recordings (radio and Internet play, mp3s played over headphones or earbuds, dances and dance clubs, etc.). Indeed, most of us have mental archives of songs heard over and over to the point that the standard version becomes canonical: that’s just the way the song goes. When someone covers a Beatles song, it’s recognizably the same song, yet it’s not the same and may even sound wrong somehow. (Is there any acceptable version of “Love Shack” besides that of the B-52’s?) Variations of familiar folk tales and folk songs, or different phrasing in The Lord’s Prayer, imprinted in memory through sheer repetition, also possess discomfiting differences, sometimes being offensive enough to cause real conflict. (Not your Abrahamic deity, mine!)

Performing musicians traverse warhorses many times in rehearsal and public performance so that, after an undetermined point, how one performs a piece just becomes how it goes, admitting few alternatives. Casual joke-tellers may improvise over an outline, but as I understand it, the pros hone and craft material over time until very little is left to chance. Anyone who has listened to old comedy recordings of Bill Cosby, Steve Martin, Richard Pryor, and others has probably learned the jokes (and timing and intonation) by heart — again through repetition. It’s strangely comforting to be able to go back to the very same performance again and again. Personally, I have a rather large catalogue of classical music recordings in my head. I continue to seek out new renditions, but often the first version I learned becomes the default version, the way something goes. Dislodging that version from its definitive status is nearly impossible, especially when it’s the very first recording of a work (like a Beatles song). This is also why live performance often suffers in comparison with the studio recording.

So it goes with a wide variety of phenomena: what is first established as how something goes easily becomes canonical, dogmatic, and unquestioned. For instance, the origin of the universe in the big bang is one story of creation to which many still hold, while various religious creation myths hold sway with others. News that the big bang has been dislodged from its privileged position goes over just about as well as dismissing someone’s religion. Talking someone out of a fixed belief is hardly worth the effort because some portion of one’s identity is anchored to such beliefs. Thus, to question a cherished belief is to impeach a person’s very self.

Political correctness is the doctrine that certain ideas and positions have been worked out effectively and need (or allow) no further consideration. Just subscribe and get with the program. Don’t bother doing the mental work or examining the issue oneself; things have already been decided. In science, steady evidentiary work to break down a fixed understanding is often thankless, or thanks arrives posthumously. This is the main takeaway of Thomas Kuhn’s The Structure of Scientific Revolutions: paradigms are changed as much through attrition as through rational inquiry and accumulation of evidence.

One of the unanticipated effects of the Information and Communications Age is the tsunami of information to which people have ready access. Shaping that information into a cultural narrative (not unlike a creation myth) is either passive (one accepts the frequently shifting dominant paradigm without compunction) or active (one investigates for oneself as an attribute of the examined life, which for the wise never really arrives at a destination, since it’s the journey that’s the point). What’s a principled rationalist to do in the face of a surfeit of alternatives available for or even demanding consideration? Indeed, with so many self-appointed authorities vying for control over cultural narratives like the edit wars on Wikipedia, how can one avoid the dizzying disorientation of gaslighting and mendacity so characteristic of the modern information environment?

Still more to come in part 4.

Two shocking and vaguely humorous (dark, sardonic humor) events occurred recently in the gun debate: (1) in a speech, Marco Rubio sarcastically offered the very reform a healthy majority of the public wants — banning assault weapons — and revealed himself to be completely tin-eared with respect to the public he addresses, and (2) 45 supported some gun controls and even raised the stakes, saying that guns should be taken from people flagged as unstable and dangerous before they commit their mayhem. Rubio had already demonstrated his inability to think on his feet, being locked into scripts handed to him by … whom exactly? Certainly not the public he purportedly serves. So much for his presidential aspirations. OTOH, 45 channels populism and can switch positions quickly. Though ugly and base in many cases, populism at least expresses the will of the people, such as it can be known. His departure from reflexive Republican defense of the hallowed 2nd Amendment shouldn’t be too great a surprise; he’s made similar remarks in the past. His willingness to discard due process and confiscate guns before a crime has been committed sounds more than a little like Spielbergian precrime (via Orwell and Philip K. Dick). To even entertain this prospect in the gun debate demonstrates just how intolerable weekly mass shootings — especially school shootings by troubled youth — have become in the land of the free and home of the brave. On balance, 45 also recommended arming classroom teachers (a risible solution to the problem), so go figger.

Lodged deep in my brain is a potent archetype I don’t often see cited: the Amfortas wound. The term comes from Richard Wagner’s music drama Parsifal (synopsis found here). Let me describe the principal elements (very) briefly. Amfortas is the king of the Knights of the Holy Grail and has a seeping wound that cannot be healed except, according to prophecy, by an innocent youth, also described as a fool made wise by compassion. Such a youth, Parsifal, appears and after familiar operatic conflict does indeed fulfill the prophecy. Parsifal is essentially a retelling of the Arthurian legend. The music is some of the most transcendently beautiful orchestral composition ever committed to paper and is very much recommended. Admittedly, it’s rather slow for today’s audiences, who are more inclined toward throwaway pop music.

Anyway, to tie together the gun debate and Parsifal, I muse that the Amfortas wound is gun violence and 45 is the titular fool who in the end heals the wound and becomes king of the Knights of the Holy Grail. The characterization is not entirely apt, of course, because it’s impossible to say that 45 is young, or compassionate, or wise, but he has oddly enough moved the needle on the gun debate. Not single-handedly, mind you, but from a seat of considerable power unlike, say, the Parkland survivors. Resolution and healing have yet to occur and will no doubt be opposed by the NRA and Marco Rubio. Maybe we’re only in Act I of the traditional 3-act structure. Other characters and plot devices from Parsifal I leave uncast. The main archetype is the Amfortas wound.

Back in undergraduate college, when just starting on my music education degree, I received an assignment where students were asked to formulate a philosophy of education. My thinking then was influenced by a curious textbook I picked up: A Philosophy of Music Education by Bennett Reimer. Of course, it was the wrong time for an undergraduate to perform this exercise, as we had neither maturity nor understanding equal to the task. However, in my naïveté, my answer was all about learning/teaching an aesthetic education — one that focused on appreciating beauty in music and the fine arts. This requires the cultivation of taste, which used to be commonplace among the educated but is now anathema. Money is the preeminent value now. Moreover, anything that smacks of cultural programming and thought control is now repudiated reflexively, though such projects are nonetheless undertaken continuously and surreptitiously through a variety of mechanisms. As a result, the typical American’s sense of what is beautiful and admirable is stunted. Further, knowledge of the historical context in which the fine arts exist is largely absent. (Children are ahistorical in this same way.) Accordingly, many Americans are coarse philistines whose tastes rarely extend beyond those acquired naturally during adolescence (including both biophilia and biophobia), thus the immense popularity of comic book movies, rock and roll music, and all manner of electronica.

When operating with a limited imagination and undeveloped ability to perceive and discern (and disapprove), one is a sitting duck for what ought to be totally unconvincing displays of empty technical prowess. Mere mechanism (spectacle) then possesses the power to transfix and amaze credulous audiences. Thus, the ear-splitting volume of amplified instruments substitutes for true emotional energy produced in exceptional live performance, ubiquitous CGI imagery (vistas and character movements, e.g., fight skills, that simply don’t exist in reality) in cinema produces wonderment, and especially, blinking lights and animated GIFs deliver the equivalent of a sugar hit (cookies, ice cream, soda) when they’re really placebos or toxins. Like hypnosis, the placebo effect is real and pronounced for those unusually susceptible to induction. Sitting ducks.

Having given the fine arts (including their historical contexts) a great deal of my academic attention and acquired an aesthetic education, my response to the video below fell well short of the blasé relativism most exhibit; I actively dislike it.

See this exchange where Neil deGrasse Tyson chides Sam Harris for failing to speak to his audience in terms it understands:

The upshot is that lay audiences simply don’t subscribe to or possess the logical, rational, abstract style of discourse favored by Harris. Thus, Harris stands accused of talking past his audience — at least somewhat — especially if his audience is understood to be the general public rather than other well-educated professionals. The subject matter is less important than the style, but it revolves around politics and, worse, identity politics. Everyone has abundant opinions about those, whether informed by rational analysis or merely fed by emotion and personal resonance.

The lesson deGrasse Tyson delivers is both instructive and accurate yet also demands that the level of discourse be lowered to a common denominator (like the reputed 9th-grade speech adopted by the evening news) that regrettably forestalls useful discussion. For his part (briefly, at the end), Harris takes the lesson and does not resort to academic elitism, which would be obvious and easy. Kudos to both, I guess, though I struggle (being somewhat an elitist); the style-over-substance argument really goes against the grain for me. Enhancements to style obviously work, and great communicators use them and are convincing as a result. (I distinctly recall Al Gore looking too much like a rock star in An Inconvenient Truth. Maybe it backfired. I tend to think that style could not overcome other blocks to substance on that particular issue.) Slick style also allows those with nefarious agendas to hoodwink the public into believing nonsense.


Today is the 10-year anniversary of the opening of this blog. As a result, there is a pretty sizeable backblog should anyone decide to wade in. As mentioned in my first post, I only opened this blog to get posting privileges at a group blog I admired because it functioned more like a discussion than a broadcast. The group blog died of attrition years ago, yet here I am 10 years later still writing my personal blog (which isn’t really about me).

Social media lives and dies by the numbers, and mine are deplorable. Annual traffic has ranged from about 6,800 to about 12,500 hits, much of which I’m convinced is mere background noise and bot traffic. Cumulative hits number about 90,140, and unique visitors are about 19,350, neither of which is anything to crow about for a blog of this duration. My subscriber count continues to climb pointlessly, now resting at 745. However, I judge I might have only a half dozen regular readers and perhaps half again as many commentators. I’ve never earned a cent for my effort, nor am I likely ever to put up a Patreon link or similar goad for donations. All of which only demonstrates that almost no one cares what I have to write about. C’est la vie. I don’t write for that purpose and frankly wouldn’t know what to write about if I were trying to drive numbers.

So if you have read my blog, what are some of the things you might have gathered from me? Here’s an incomplete synopsis:

  • Torture is unspeakably bad. History is full of devices, methodologies, and torturers, but we learned sometime in the course of two 20th-century world wars that nothing justifies it. Nevertheless, it continues to occur with surprising relish, and those who still torture (or want to) are criminally insane.
  • Skyscrapers are awesomely tall examples of technical brilliance, exuberance, audacity, and hubris. Better expressions of techno-utopian, look-mom-no-hands, self-defeating narcissism can scarcely be found. Yet they continue to be built at a feverish pace. The 2008 financial collapse stalled and/or doomed a few projects, but we’re back to game on.
  • Classical music, despite record budgets for performing ensembles, has lost its bid for anything resembling cultural and artistic relevance by turning itself into a museum (performing primarily works of long-dead composers) and abandoning emotional expression in favor of technical perfection, which is probably an accurate embodiment of the spirit of the times. Arguably, no composer has become a household name since Aaron Copland, who died in 1990 but was best known in the 1940s and ’50s.
  • We’re doomed — not in any routine sense of the word having to do with individual mortality but in the sense of Near-Term (Human) Extinction (NTE). The idea is not widely accepted in the least, and the arguments are too lengthy to repeat (and unlikely to convince). However, for those few able to decipher it, the writing is on the wall.
  • American culture is a constantly moving target, difficult to define and describe, but its principal features are only getting uglier as time wears on. Resurgent racism, nationalism, and misogyny make clear that while some strides have been made, these attitudes were only driven underground for a while. Similarly, colonialism never really died but morphed into a new version (globalization) that escapes criticism from the masses, because, well, goodies.
  • Human consciousness — another moving target — is cratering (again) after 3,000–5,000 years. We have become hollow men, play actors, projecting false consciousness without core identity or meaning. This cannot be sensed or assessed easily from the first-person perspective.
  • Electronic media makes us tools. The gleaming attractions of sterile perfection and pseudo-sociability have hoodwinked most of the public into relinquishing privacy and intellectual autonomy in exchange for the equivalent of Huxley’s soma. This also cannot be sensed or assessed easily from the first-person perspective.
  • Electoral politics is a game played by the oligarchy for chumps. Although the end results are not always foreseeable (Jeb!), the narrow range of options voters are given (lesser of evils, the devil you know …) guarantees that fundamental change in our dysfunctional style of government will not occur without first burning the house down. After a long period of abstention, I voted in the last few elections, but my heart isn’t really in it.
  • Cinema’s infatuation with superheroes and bankable franchises (large overlap there) signals that, like other institutions mentioned above, it has grown aged and sclerotic. Despite large budgets and impressive receipts (the former often over $100 million and the latter now in the billions for blockbusters) and considerable technical prowess, cinema has lost its ability to be anything more than popcorn entertainment for adolescent fanboys (of all ages).

This is admittedly a pretty sour list. Positive, worthwhile manifestations of the human experience are still out there, but they tend to be private, modest, and infrequent. I still enjoy a successful meal cooked in my own kitchen. I still train for and race in triathlons. I still perform music. I still make new friends. But each of these examples is also marred by corruptions that penetrate everything we do. Perhaps it’s always been so, and as I, too, age, I become increasingly aware of inescapable distortions that can no longer be overcome with innocence, ambition, energy, and doublethink. My plan is to continue writing the blog until it feels like a burden, at which point I’ll stop. But for now, there’s too much to think and write about, albeit at my own leisurely pace.

My work commute typically includes bus, train, and walking legs to arrive at my destination. If wakefulness and an available seat allow, I often read on the bus and train. (This is getting to be exceptional compared to other commuters, who are more typically lost in their phones listening to music, watching video, checking FB, or playing games. Some are undoubtedly reading, like me, but electronic media, which I find distasteful, alter the experience fundamentally from ink on paper.) Today, I was so absorbed in my reading that by the time I looked up, I had missed my bus stop, and half an hour later, I nearly missed my train stop, too. The experience of tunnel vision in deep concentration is not at all unfamiliar to me, but it is fleeting and unpredictable. More typical is a relaxed yet alert concentration that for me takes almost no effort anymore.

So what sent me ’round the bend? The book I’m currently reading, Nick Carr’s The Glass Cage, takes a diversion into the work of poet Robert Frost. Carr uses Frost to illustrate his point that immersion in bodily work of manageable difficulty lends the world a more robust character than the detached, frictionless one experienced through too much technological mediation and ease. Carr does a terrific job contextualizing Frost’s lyric observations in a way quite unlike the contextual analysis one might undertake in a high school or college classroom, which too often makes the objects of study lifeless and irrelevant. Carr’s discussion put me unexpectedly into an aesthetic mode of contemplation, as distinguished from analytic or kinesthetic modes. There are probably others.

I don’t often go into aesthetic mode. It requires the right sort of stimulation. The Carr/Frost combination put me there, and so I tunneled into the book and forgot my commute. That momentary disorientation is often pleasurable, but for me, it can also be distressing. My infrequent visits to art museums are often accompanied by a vague unease at the sometimes nauseating emotionalism of the works on display. It’s an honest response, though I expect most folks can’t quite understand why something beautiful would provoke something resembling a negative response. In contrast, my experience in the concert hall is usually frustration, as musicians have become ever more corporate and professional in their performance over time, to the detriment and exclusion of latent emotional content. I suppose that, with Super Bowl Sunday almost upon us (an event about which I care not at all), the typical viewer gets an emotional/aesthetic charge out of that overhyped spectacle, especially if the game is hotly contested rather than a blowout. I seek and find my moments in less crass expressions of the human spirit.

rant on/

This is the time of year when media pundits pause to look back and consider the previous year, typically compiling unasked-for “best of” lists to recap what everyone may have experienced — at least if one is absorbed by entertainment media. My interest in such nonsense is passive at best, dismissive at worst. Further, more and more lists are weighed and compiled by self-appointed and guileless fanboys and -girls, some of whom are surprisingly knowledgeable (sign of a misspent youth?) and insightful yet almost uniformly lack the longitudinal view necessary to form circumspect, expert opinions. The analogy would be to seek wisdom from a 20- or 30-something in advance of its acquisition. Sure, people can be “wise beyond their years,” which usually means free of the normal illusions of youth without yet having become a jaded, cynical curmudgeon — post-ironic hipster is still available — but a real, valuable, historical perspective takes more than just two or three decades to form.

For instance, whenever I bring up media theory to a youngster (from my point of reckoning), usually someone who has scarcely known the world without 24/7/365 access to all things electronic, he or she simply cannot conceive what it means to be without that tether/pacifier/security blanket smothering them. It doesn’t feel like smothering because no other information environment has ever been experienced (excepting perhaps in early childhood, but even that’s not guaranteed). Even a brief hiatus from the information blitzkrieg, a two-week vacation, say, doesn’t suffice. Rather, only someone olde enough to remember when it simply wasn’t there — at least in the personal, isolating, handheld sense — can know what it was like. I certainly remember when thought was free to wander, invent, and synthesize without pressure to incorporate a continuous stream of incoming electronic stimuli, most of which amounts to ephemera and marketing. I also remember when people weren’t constantly walled in by their screens and feeds, when life experience was more social, shared, and real rather than private, personal, and virtual. And so that’s why when I’m away from the radio, TV, computer, etc. (because I purposely and pointedly carry none of it with me), I’m less a mark than the typical media-saturated fool face-planted in a phone or tablet for the lures, lies, cons, and swindles that have become commonplace in late-stage capitalism.

Looking back in another sense, I can’t help but feel a little exasperated by the splendid reviews of the life in music led by Pierre Boulez, who died this past week. Never heard of him? Well, that just goes to show how far classical music has fallen from favor: even a titan such as he makes utterly no impression on the general public, only on specialists in a field that garners almost no attention anymore. Yet I defy anyone not to know who Kim Kardashian is. Here’s the bigger problem: despite being about as favorably disposed toward classical music as it is possible to be, I have to admit that no one I know (including quite a few musicians) would be able to hum or whistle or sing a recognizable tune by Boulez. He simply doesn’t pass the whistle test. But John Williams (of Star Wars fame) certainly does. Nor indeed would anyone put on a recording of one of Boulez’s works just to listen. Not even his work as a conductor is all that compelling, either live or on disc (I’ve experienced plenty of both). As one looks back on the life of Pierre Boulez, as one is wont to do upon his passing, how can it be that such prodigious talent as he possessed could be of so little relevance?

Consider these two examples flip sides of the same coin. One enjoys widespread subscription but is base (opinions differ); the other is obscure but (arguably) refined. Put differently, one is pedestrian, the other admirable. Or over a lifetime, one is placebo (or worse), the other fulfilling. Looking back upon my own efforts and experiences in life, I would much rather be overlooked or forgotten than be petty and (in)famous. Yet mass media conspires to make us all into nodes on a network with goals decidedly other than human respectability or fulfillment. So let me repeat the challenge question of this blog: are you climbing or descending?

rant off/

The video below, which came to my attention recently, shows a respectable celebrity, violinist/conductor Itzhak Perlman, being dicked around in an interview he probably undertook in good faith. My commentary follows.

Publicized pranks and gotchas are by no means rare. Some are good-natured and quite funny, but one convention of the prank is to unmask it pretty quickly. In the aftermath, the target typically either laughs it off, leaves without comment, or less often, storms out in disgust. Andy Kaufman as “Tony Clifton” was probably among the first to sustain a prank well past the point of discomfort, never unmasking himself. Others have since gotten in on the antics, though the results are probably no worse in dickishness (dickery?) than Kaufman’s.

Fake interviews by comedians posing as news people are familiar to viewers of The Daily Show and its spinoff The Colbert Report (its run now completed). Zach Galifianakis does the same schtick in Between Two Ferns. It always surprises me when targets fall into the trap, exposing themselves as clueless ideologues willing to be hoisted with their own petards. However, Colbert in particular balanced his arch-Republican stage persona with an unmistakable respect for his interview subjects, which was at times inspired. Correspondents from The Daily Show are frequently pretty funny, but they almost never convey any respect for the subjects of their interviews. Nick Canellakis (shown above) apparently has a whole series of interviews with classical musicians in which he feigns idiocy and insult. Whereas some interview subjects are media savvy enough to get the joke and play along, I find this attempt at humor tasteless and unbearable.

Further afield, New Media Rockstars features a burgeoning list of media hosts who typically operate cheaply over the Web via YouTube, supported by an array of social media. At least one, Screen Junkies (the only one I watch), has recently blown up into an entire suite of shows. I won’t accuse them all of being talentless hacks or of dicking people around for pointless yuks, but I often pause to wonder what makes the shows worth producing beyond the hosts’ embarrassingly encyclopedic knowledge of comics, cartoons, TV shows, movies, etc. They’re fanboys (and girls) who have leveraged their misspent youth and eternal adolescence to gush and gripe about their passions. Admittedly, this may not be so different from sports fanatics (especially human statisticians), opera geeks, and nerds of other stripes.

Throwaway media may have unintentionally smuggled in tasteless shenanigans such as those by Nick Canellakis. Various comedians (unnamed) have similarly offered humorless discomfort as entertainment. Reality TV shows explored this area a while back, which I called trainwreck television. Cheaply produced video served over the Web has unleashed a barrage of dreck in all these categories. Some shows may eventually find their footing and become worthwhile. In the meantime, I anticipate seeing plenty more self-anointed media hosts dicking around celebrities and audiences alike.

Updates to my blogroll are infrequent. I only add blogs that present interesting ideas (with which I don’t always agree) and/or admirable writing. Deletions are typically the result of a change of focus at the linked blog, or regrettably, the result of a blogger becoming abusive or self-absorbed. This time, it’s the latter. So alas, another one bites the dust. Dropping off my blogroll — no loss since almost no one reads my blog — is On an Overgrown Path (no link), which is about classical music.

My indignation isn’t about disagreements (we’ve had a few); it’s about inviting discussion in bad faith. I’m very interested in contributing to discussion and don’t mind comment moderation as a way of contending with trolls. However, my comments drive at ideas, not authors, and I’m scarcely a troll. Here’s the disingenuously titled blog post, “Let’s Start a Conversation about Concert Hall Sound,” where the blogger declined to publish my comment, handily blocking conversation. So for maybe the second time in the nearly 10-year history of this blog, I am reproducing the entirety of another’s blog post (minus the profusion of links, since that blogger tends to create link mazes, defying readers to actually explore), followed by my unpublished comment, and then I’ll expound and perhaps rant a bit. Apologies for the uncharacteristic length.