Archive for the ‘Tacky’ Category

Already widely reported but only just having come to my awareness is an initiative by Rolling Stone to establish a Culture Council: “an Invitation-Only Community of Influencers, Innovatives, and Creatives.” The flattering terms tastemakers and thought leaders are also used. One must presume that submissions will be promotional and propaganda pieces masquerading as news articles. Selling advertising disguised as news is an old practice, but the ad usually has the notation “advertisement” somewhere on the page. Who knows whether submissions will be subject to editorial review?

To be considered for membership, candidates must sit in a senior-level position at a company generating at least $500K in annual revenue or have obtained at least $1M in total institutional funding.

Rolling Stone’s website doesn’t say it anywhere I can locate, but third-party reports indicate that members pay either a $1,500 annual fee plus a $500 submission fee (one-time? recurring?) or a flat $2,000 submission fee. Not certain which. Just to be abundantly clear, fees would be paid by the submitter to the magazine, reversing how published content is normally acquired (i.e., by paying staff writers and freelancers). I’d say this move by Rolling Stone is unprecedented, but of course, it’s not. However, it is a more brazen pay-to-play scheme than most and may be a harbinger of even worse developments to come.

Without describing fully how creative content (arts and news) was supported in the past, I will at least observe that prior to the rise of full-time creative professions in the 18th and 19th centuries (those able to scratch out a living on commissions and royalties), creative work was either a labor of love/dedication, typically remunerated very poorly if at all, or was undertaken through the patronage of wealthy European monarchs, aristocrats, and religious institutions (at least in the developing West). Unless I’m mistaken, self-sustaining news organizations and magazines came later. More recent developments include video news releases and crowdsourcing, the latter sometimes accomplished under the pretense of running contests. The creative commons is how many now operate (including me — I’ve refused to monetize my blog), and it’s exploited ruthlessly by HuffPost (an infotainment source I ignore entirely), which (correct me if wrong) doesn’t pay for content but offers exposure as an inducement to journalists trying to develop a byline and/or audience. Podcasts, YouTube channels, and news sites also offer a variety of subscription, membership, and voluntary patronage (tipping) schemes to pay the bills (or hit it big if an outlier). Thus, business models have changed considerably over time and are in the midst of another major transformation, especially for news-gathering organizations and the music recording industry, both in marked retreat from their former positions.

Rolling Stone has always been a niche publication specializing in content that falls outside my usual scope of interest. I read Matt Taibbi’s reporting that appeared in Rolling Stone, but the magazine’s imprint (read: reputation) was not the draw. Now that Rolling Stone is openly soliciting content through paid membership in the Culture Council, well, the magazine sinks past irrelevance to active avoidance.

It’s always been difficult to separate advertising and propaganda from reliable news, and some don’t find it important to keep these categories discrete, but this new initiative is begging to be gamed by motivated PR hacks and self-promoters with sufficient cash to burn. It’s essentially Rolling Stone whoring itself out. Perhaps more worrying is that others will inevitably follow Rolling Stone’s example and sell their journalistic integrity with similar programs, effectively putting the final nails in their own coffins (via brand self-destruction). The models in this respect are cozy, incestuous relationships between PACs, lobbying groups, think tanks, and political campaigns. One might assume that legacy publications such as Rolling Stone would have the good sense to retain as much of their valuable brand identity as possible, but the relentless force of corporate/capitalist dynamics is corrupting even the incorruptible.

I admit it: I’m a bit triggered. The storming of the U.S. Capitol Building last week, even though it was over in one day, sent a lot of us back to the drawing board, wondering how things could come to that. Not that civil unrest, attempted coups and secession, and even revolution haven’t been predicted for months. Still, the weirdness of this particular manifestation of citizen frustrations is hard to fathom. See, for instance, this blog post, which offers a reckoning not easy to face. Simply put, crowds that form into protests and physical occupations fully recognize their abandonment at the hands of oligarchs and political leaders and as a result act out their desperation and nihilism. Their question becomes “why not take over and occupy a building?” Doesn’t matter, nothing to lose anymore. It’s already all gone. Whether it’s a college administrative building, governor’s mansion, federal or state office building, or the U.S. Capitol Building, the sentiment appears to be the same: why the hell not? Doesn’t matter that there was no plan for what to do once the building was breached; doesn’t matter that it wasn’t occupied for long; doesn’t matter that property was damaged; doesn’t matter that lives were ruined and lost; doesn’t matter that no replacement government or executive was installed as a real coup or revolution would demand. It still works as an expression of outrage over the dysfunctions of society.

On the bright side, actual death and injury were quite limited compared to what might have obtained. Mayhem was largely limited to property destruction. Plus, it was a potent reminder to legislators (filmed scrambling for safety) that maybe they ought to fear backing the citizenry into corners with nowhere to turn. Conjecture that, had the racial make-up of the protesters been different, a massacre would have ensued remains just that: conjecture.


Caveat: this post is uncharacteristically long and perhaps a bit disjointed. Or perhaps an emerging blogging style is being forged. Be forewarned.

Sam Harris has been the subject of or mentioned in numerous previous blog posts. His podcast Making Sense (formerly, Waking Up), partially behind a paywall but generously offered for free (no questions asked) to those claiming financial hardship, used to be among those I would tune in to regularly. Like the Joe Rogan Experience (soon moving to Spotify — does that mean its disappearance from YouTube?), the diversity of guests and reliable intellectual stimulation have been attractive. Calling his podcast Making Sense aligns with my earnest concern over actually making sense of things as the world spins out of control and our epistemological crisis deepens. Yet Harris has been a controversial figure since coming to prominence as a militant atheist. I really want to like what Harris offers, but regrettably, he has lost (most of) my attention. Others reaching the same conclusion have written or vlogged their reasons, e.g., “Why I’m no longer a fan of ….” Do a search.

Having already ranted over specific issues Harris has raised, let me instead register three general complaints. First, once a subject is open for discussion, it’s flogged to death, often without reaching any sort of conclusion, or frankly, helping to make sense. For instance, Harris’ solo discussion (no link) regarding facets of the killing of George Floyd in May 2020, an event that sparked still-unabated civil unrest, did more to confuse than clarify. It was as though Harris were trying the court case by himself, without a judge, jury, or opposing counsel. My second complaint is that Harris’ verbosity, while impressive in many respects, leads to interviews marred by long-winded, one-sided speeches where the thread is hopelessly lost, blocking an interlocutor from tracking and responding effectively. Whether Harris intends to bury others under an avalanche of argument or does so uncontrollably doesn’t matter. It’s still a Gish gallop. Third is his over-emphasis on hypotheticals and thought experiments. Extrapolation is a useful but limited rhetorical technique, as is distillation. However, treating prospective events as certainties is tantamount to building arguments on poor foundations, namely, abstractions. Much as I admire Harris’ ambition to carve out a space within the public sphere to get paid for thinking and discussing topics of significant political and philosophical currency, he frustrates me enough that I rarely tune in anymore.


One of the victims of cancel culture, coming to my attention only days ago, is Kate Smith (1907–1986), a singer of American popular song. Though Smith had a singing career spanning five decades, she is best remembered for her version(s) of Irving Berlin’s God Bless America, which justifiably became a bit of Americana. The decades of Smith’s peak activity were the 1930s and 40s.

/rant on

I dunno what goes through people’s heads, performing purity rituals or character excavation on folks long dead. The controversy stems from Smith having a couple other songs in her discography: That’s Why Darkies Were Born (1931) and Pickaninny Heaven from the movie Hello, Everybody! (1933). Hate to break it to anyone still living under a rock, but these dates are not far removed from minstrelsy, blackface, and The Birth of a Nation (1915) — a time when typical Americans referred to blacks with a variety of terms we now consider slurs. Such references were still used during the American civil rights movement (1960s) and are in use among some virulent white supremacists even today. I don’t know the full context of Kate Smith having sung those songs, but I suspect I don’t need to. In that era, popular entertainment had few of the sensibilities regarding race we now have (culture may have moved on, but it’s hard to say with a straight face it’s evolved or progressed humanely), and uttering commonly used terms back then was not automatic evidence of any sort of snarling racism.

I remember having heard my grandparents, nearly exact contemporaries of Kate Smith, referring to blacks (the term I grew up with, still acceptable I think) with other terms we no longer consider acceptable. It shocked me, but to them, that’s simply what blacks were called (the term(s) they grew up with). Absolutely nothing in my grandparents’ character or behavior indicated a nasty, racist intent. I suspect the same was true of Kate Smith in the 1930s.

Back when I was a librarian, I also saw plenty of sheet music published before 1920 or so with the term darkie (or darkey) in the title. See for example this. The Library of Congress still uses the subject headings “negro spirituals” (is there another kind?) and “negro songs” to refer to various subgenres of American folk song that include slave songs, work songs, spirituals, minstrel music, protest songs, etc. Maybe we should cancel the Library of Congress. Some published music titles from back then even call them coon songs. That last one is totally unacceptable today, but it’s frankly part of our history, and like changing character names in Mark Twain’s Huckleberry Finn, sanitizing the past does not make it go away or any less discomfiting. But if you wanna bury your head in the sand, go ahead, ostrich.

Also, if some person or entity ever did something questionably racist, sexist, or malign (even something short of abominable) situated contextually in the past, does that mean he, she, or it must be cancelled irrevocably? If that be the case, then I guess we gotta cancel composer Richard Wagner, one of the most notorious anti-Semites of the 19th century. Also, stop watching Pixar, Marvel, and Star Wars films (among others), because remember that time when Walt Disney Studios (now Walt Disney Company) made a racist musical film, Song of the South (1946)? Disney’s tainted legacy (extending well beyond that one movie) is at least as awful as, say, Kevin Spacey’s, and we’re certainly not about to rehabilitate him.

/rant off

Robots are coming; we all know it. Frankly, for some implementations, they’re already here. For example, I recently took interest in robotic vacuums. I already have an upright vacuum with the usual attachments I push around on weekends, plus brooms and dustpans for hard, uncarpeted floors. But I saw a robotic vacuum in action and found myself considering purchasing something I knew existed but never gave thought to needing. All it took was watching one scuttling along the floor aimlessly, bumping harmlessly into furniture, to think perhaps my living experience would be modestly enhanced by passive clean-up while I’m out of the house — at least I thought so until I saw the price range extends from roughly $150 to $500. Surprised me, too, to see how crowded the marketplace is with competing devices from different manufacturers. Can’t rationalize the expense as a simple labor-saving device. The effort it replaces just isn’t that arduous.

Another robotic device caught my eye: the Gita cargo robot by Piaggio Fast Forward. I will admit that a stuff carrier for those with mobility issues might be a worthwhile device, much like Segway seemed like a relatively good idea to increase range for those with limited mobility — at least before such devices branched into self-balancing hoverboards and motorized scooters that now clog the sidewalks, create unnecessary hazards, and send thousands each year to emergency rooms with broken wrists (or worse). One of those little Gita buggers following able-bodied folks around seems to me the height of foolishness, not to mention laziness. The video review I saw (sorry, no link, probably outta date and based on a prototype) indicated that the Gita is not ready for prime time and requires the user to wear a camera/belt assembly for the Gita to track and follow its owner. Its limited capacity and operating duration between charges (yeah, another thing to plug in — sigh), plus its inability to negotiate doors effectively, makes it seem like more trouble than it’s worth for the hefty price of around $3,250.

Billed as a robot butler, the Gita falls well short of a Jetsons or Star Wars upright robot that’s able, for example, to execute commands and interact verbally. Maybe the Gita represents the first baby steps toward that envisioned future (or long time ago in a galaxy far, far away), but I rather doubt it. Moreover, we’re already irritatingly besieged by people face-planted in their phones. Who wants a future where others (let’s say half of the people we come into contact with in hallways, corridors, and parking lots) are attended by a robot cargo carrier or fully functioning robot butler? In the meantime, just like the Google Glass that was never adopted widely, anyone seen with a Gita trailing behind is a tool.

For ambulatory creatures, vision is arguably the primary of the five (main) senses. Humans are among those species that stand upright, facilitating a portrait orientation when interacting among ourselves. The terrestrial environment on which we live, however, is in landscape (as distinguished from the more nearly 3D environments of birds and insects in flight or marine life in rivers, lakes, seas, and oceans). My suspicion is that modest visual conflict between portrait and landscape is among the dynamics that give rise to the orienting response, a step down from the startle reflex, that demands full attention when visual environments change.

I recall reading somewhere that wholesale changes in surroundings, such as when crossing a threshold, passing through a doorway, entering or exiting a tunnel, and notably, entering and exiting an elevator, trigger the orienting response. Indeed, the flush of disorientation before one gets his or her bearings is tantamount to a mind wipe, at least momentarily. This response may also help to explain why small, bounded spaces such as interiors of vehicles (large and small) in motion feel like safe, contained, hermetically sealed personal spaces. We orient visually and kinesthetically at the level of the interior, often seated and immobile, rather than at the level of the outer landscape being traversed by the vehicle. This is true, too, of elevators, a modern contraption that confounds the nervous system almost as much as revolving doors — particularly noticeable with small children and pets until they become habituated to managing such doorways with foreknowledge of what lies beyond.

The built environment has historically included transitional spaces between inner and outer environments. Churches and cathedrals include a vestibule or narthex between the exterior door and inner door leading to the church interior or nave. Additional boundaries in church architecture mark increasing levels of hierarchy and intimacy, just as entryways of domiciles give way to increasingly personal spaces: parlor or sitting room, living room, dining room, kitchen, and bedroom. (The sheer utility of the “necessary” room defies these conventions.) Commercial and entertainment spaces use lobbies, atria, and prosceniums in similar fashion.

What most interests me, however, is the transitional space outside of buildings. This came up in a recent conversation, where I observed that local school buildings from the early to middle part of the 20th century have a distinguished architecture set well back from the street where lawns, plazas, sidewalks, and porches leading to entrances function as transitional spaces and encourage social interaction. Ample window space, columnar entryways, and roof embellishments such as dormers, finials, cupolas, and cornices add style and character befitting dignified public buildings. In contrast, 21st-century school buildings in particular and public buildings in general, at least in the city where I live, tend toward porchless big-box warehouses built right up to the sidewalk, essentially robbing denizens of their social space. Blank, institutional walls forbid rather than invite. Consider, for example, how students gathered in a transitional space are unproblematic, whereas those congregated outside a school entrance abutting a narrow sidewalk suggest either a gauntlet to be run or an eruption of violence in the offing. (Or maybe they’re just smoking.) Anyone forced to climb past loiterers outside a commercial establishment experiences similar suspicions and discomforts.

Beautifully designed and constructed public spaces of yore — demonstrations of a sophisticated appreciation of both function and intent — have fallen out of fashion. Maybe they understood then how transitional spaces ease the orienting response, or maybe they only intuited it. Hard to say. Architectural designs of the past acknowledged and accommodated social functions and sophisticated aesthetics that are today actively discouraged except for pointless stunt architecture that usually turns into boondoggles for taxpayers. This has been the experience of many municipalities when replacing or upgrading schools, transit centers, sports arenas, and public parks. Efficient land use today drives toward omission of transitional space. One of my regular reads is James Howard Kunstler’s Eyesore of the Month, which profiles one architectural misfire after the next. He often mocks the lack of transitional space, or when present, observes its open hostility to pedestrian use, including unnecessary obstacles and proximity to vehicular traffic (noise, noxious exhaust, and questionable safety) discouraging use. Chalk this up as another collapsed art (e.g., painting, music, literature, and poetry) so desperate to deny the past and establish new aesthetics that it has ruined itself.

Among the myriad ways we have of mistreating each other, epithets may well be the most ubiquitous. Whether using race, sex, age, nationality, or nominal physical characteristic (especially genital names), we have so many different words with which to insult and slur that it boggles the mind. Although I can’t account for foreign cultures, I doubt there is a person alive or dead who hasn’t suffered being made fun of for some stupid thing. There are so many that I won’t bother to compile a list (by way of example, Wikipedia has a list of ethnic slurs), but I do remember consulting a dictionary of historical slang, mostly disused, and being surprised at how many terms were devoted specifically to insults.

I’m now old and contented enough for the “sticks and stones …” dismissal to nullify any epithets hurled my way. When one comes up, it’s usually an obvious visual characteristic, such as my baldness or ruddiness. Those characteristics are of course true, so why allow them to draw ire when used with malicious intent? However, that doesn’t stop simple words from giving grave offense to those with thin skins or from serving as so-called fighting words for those habituated to answering provocation with physical force. And in an era when political correctness has equated verbal offense with violence, the self-appointed thought police call for blood whenever someone steps out of line in public. Alternatively, when such a person is one’s champion, then the blood sport becomes spectacle, such as when 45 gifts another public figure with a sobriquet.

The granddaddy of all epithets — the elephant in the room, at least in the U.S. — will not be uttered by me, sorta like the he-who-shall-not-be-named villain of the Harry Potter universe or the forbidden language of Mordor from the Tolkien universe. I lack standing to use the term in any context and won’t even venture a euphemism or placeholder using asterisks or capitalized initials. Reclaiming the term in question by adopting it as a self-description — a purported power move — has decidedly failed to neutralize the term. Instead, the term has become even more egregiously insulting than ever, a modern taboo. Clarity over who gets to use the term with impunity and when is elusive, but for my own part, there is no confusion: I can never, ever speak or write it in any context. I also can’t judge whether this development is a mark of cultural progress or regression.

Back in the 1980s when inexpensive news programs proliferated, all wanting to emulate 60 Minutes or 20/20, I recall plenty having no problem working the public into a lather over some crime or injustice. A typical framing trick was to juxtapose two unrelated facts with the intent that the viewer leap to an unwarranted conclusion. Here’s an example I just made up: “On Tuesday, Jane went to her plastic surgeon for a standard liposuction procedure. By Friday, Jane was dead.” Well, what killed Jane? The obvious inference, by virtue of juxtaposition, is the procedure. Turns out it was an entirely unrelated traffic accident. The crap news program could legitimately claim that it never said the procedure killed Jane, yet it led the credulous public to believe so. Author Thomas Sowell resorts to that same sort of nonsense in his books: a habit of misdirection when arguing his point. I initially sought out his writing for balance, as everyone needs others capable of articulating competing ideas to avoid the echo chamber of one’s own mind (or indeed the chorus of the converted). Sowell failed to keep me as a reader.

It’s not always so easy to recognize cheap rhetorical tricks. They appear in movies all the time, but then, one is presumably there to be emotionally affected by the story, so a healthy suspension of disbelief goes a long way to enhance one’s enjoyment. Numerous fanboy sites (typically videos posted to YouTube) offer reviews and analysis that point out failures of logic, plotting, and continuity, as well as character inconsistency and embedded political propaganda messaging, but I’ve always thought that taking movies too seriously misses the point of cheap entertainment. Considering the powerful influence cinematic storytelling has over attitudes and beliefs, perhaps I’m being too cavalier about it.

When it comes to serious debate, however, I’m not nearly so charitable. The favored 5-minute news debate where 3 or 4 floating heads spew their rehearsed talking points, often talking over each other in a mad grab for air time, accomplishes nothing. Formal, long-form debates in a theater in front of an audience offer better engagement if participants can stay within proper debate rules and etiquette. Political debates during campaign season fail on that account regularly, with more spewing of rehearsed talking points mixed with gratuitous swipes at opponents. Typically, both sides claim victory in the aftermath and nothing is resolved, since that’s not really the objective. (Some opine that government, being essentially nonstop campaigning, suffers a similar fate: nothing is resolved because that’s not the true objective anymore.)

I was intrigued to learn recently of the semi-annual Munk Debates, named after their benefactors, that purport to be formal debates with time limits, moderation, and integrity. I had never heard of them before they booked Jordan Peterson alongside Michael Eric Dyson, Michelle Goldberg, and Stephen Fry. Like Donald Trump did for TV and print news, Peterson has turned into a 1-man ratings bonanza for YouTube and attracts viewers to anything in which he participates, which is quite a lot. The proposition the four debaters were provided was this: Be it resolved, what you call political correctness, I call progress … Problem is, that’s not really what was debated most of the time. Instead, Dyson diverted the debate to identity politics, specifically, racism and so-called white privilege. Goldberg mostly attacked Peterson regarding his opinions outside of the debate, Peterson defended himself against repeated personal attacks by Goldberg and Dyson, and Fry stayed relatively true to the intended topic. Lots of analysis and opinion appeared on YouTube almost immediately after the debate, so wade in if that’s what interests you. I viewed some of it. A couple videos called Dyson a grievance merchant, which seems to me accurate.

What concerns me more here are the cheap rhetorical tricks employed by Dyson — the only debater booed by the audience — that fundamentally derailed the proceedings. Dyson speaks with the fervor of a revivalist preacher, a familiar style that has been refined and co-opted many times over to great effect. Whether deserved or not, it carries associations of great moral authority and momentous occasion. Unfortunately, if presented as a written transcript rather than a verbal rant, Dyson’s remarks are incoherent, unhinged, and ineffective except for their disruptive capacity. He reminded everyone of his blackness and his eloquence, the first of which needs no reminder, the second of which immediately backfired and called into question his own claim. Smart, eloquent people never tell you they’re smart and eloquent; the proof is in their behavior. Such boastful announcements tend to work against a person. Similarly, any remark that begins with “As a black/white/red/brown/blue man/woman/hybrid of _______ ethnicity/sexuality/identity …” calls in a host of associations that immediately invalidates the statement that follows as skewed and biased.

The two point-scoring bits of rhetoric Dyson deploys with frequency, which probably form a comfort zone to which he instinctively retreats in all challenges, are his blackness (and by proxy his default victimhood) and historical oppression of blacks (e.g., slavery, Jim Crow laws, etc.). There are no other issues that concern him, as these two suffice to push everyone back on their heels. That’s why the debate failed to address political correctness effectively but instead revolved around identity politics. These issues are largely distinct, unless one debates the wisdom of switching out terminology cyclically, such as occurs even now with various racial epithets (directed to every race, not just blacks). That obvious tie-in, the use of euphemism and neologism to mask negative intent, was never raised. Nor were the twisted relations between free speech, hate speech, and approved speech codes (politically correct speech). Nope, the debate featured various personalities grandstanding on stage and using the opportunity to push and promote their personal brands, much like Trump has over the years. Worse, it was mostly about Michael Eric Dyson misbehaving. He never had my attention in the past; now I intend to avoid him at all costs.

The movie Gladiator depicts the protagonist Maximus addressing spectators directly at gladiatorial games in the Roman Colosseum with this meme-worthy challenge: “Are you not entertained?” Setting the action in an ancient civilization renowned for its decadent final phase prior to collapse, referred to as Bread and Circuses, allows us to share vicariously in the protagonist’s righteous disgust with the public’s blood lust while shielding us from any implication of our own shame because, after all, who could possibly entertain blood sports in the modern era? Don’t answer that.


But this post isn’t about our capacity for cruelty and barbarism. Rather, it’s about the public’s insatiable appetite for spectacle — both fictional and absolutely for real — served up as entertainment. Professional wrestling is fiction; boxing and mixed martial arts are reality. Audiences consuming base entertainment and, in the process, depleting performers who provide that entertainment extend well beyond combat sports, however. For instance, it’s not uncommon for pop musicians to slowly destroy themselves once pulled into the attendant celebrity lifestyle. Three examples spring to mind: Elvis Presley, Michael Jackson, and Whitney Houston. Others go on hiatus or retire altogether from the pressure of public performance, such as Britney Spears, Miles Davis, and Barbra Streisand.

To say that the public devours performers and discards what remains of them is no stretch, I’m afraid. Who remembers countdown clocks tracking when female actors turn 18 so that perving on them is at last okay? A further example is the young starlet who is presumably legitimized as a “serious” actor once she does nudity and/or portrays a hooker but is then forgotten in favor of the next. If one were to seek the full depth of such devouring impulses, I suggest porn is the industry to have all one’s illusions shattered. For rather modest sums, there is absolutely nothing some performers won’t do on film (these days on video at RedTube), and naturally, there’s an audience for it. Such appetites are as bottomless as they come. Are you not entertained?

Speaking of Miles Davis, I take note of his hiatus from public performance in the late 1970s before his limited return to the stage in 1986 and early death in 1991 at age 65. He had cemented a legendary career as a jazz trumpeter but in interviews (as memory serves) dismissed the notion that he was somehow a spokesperson for others, saying dryly “I’m just a trumpet player, man ….” What galled me, though, were Don Cheadle’s remarks in the liner notes of the soundtrack to the biopic Miles Ahead (admittedly a deep pull):

Robert Glasper and I are preparing to record music for the final scene of Miles Ahead — a possible guide track for a live concert that sees the return of Miles Davis after having been flushed from his sanctuary of silence and back onto the stage and into his rightful light. My producers and I are buzzing in disbelief about what our audacity and sheer will may be close to pulling off ….

What they did was record a what-might-have-been track had Miles incorporated rap or hip hop (categories blur) into his music. It’s unclear to me whether the “sanctuary of silence” was inactivity or death, but Miles was essentially forced onstage by proxy. “Flushed” is a strange word to use in this context, as one “flushes” an enemy or prey unwillingly from hiding. The decision to recast him in such “rightful light” strikes me as rather poor taste — a case of cultural appropriation worse than merely donning a Halloween costume.

This is the wave of the future, of course, now that images of dead celebrities can be invoked, say, to sell watches (e.g., Steve McQueen) and holograms of dead musicians are made into singing zombies, euphemized as “virtual performance” (e.g., Tupac Shakur). Newly developed software can now create digitized versions of people saying and doing whatever we desire of them, such as when celebrity faces are superimposed onto porn actors (called “deepfakes”). It might be difficult to argue that in doing so content creators are stealing the souls of others, as used to be believed in the early days of photography. I’m less concerned with those meeting demand than with the demand itself. Are we becoming demons, the equivalents of the succubus/incubus, devouring or destroying frivolously the objects of our enjoyment? Are you not entertained?

Back in undergraduate college, when just starting on my music education degree, I received an assignment where students were asked to formulate a philosophy of education. My thinking then was influenced by a curious textbook I picked up: A Philosophy of Music Education by Bennett Reimer. Of course, it was the wrong time for an undergraduate to perform this exercise, as we had neither maturity nor understanding equal to the task. However, in my naïveté, my answer was all about learning/teaching an aesthetic education — one that focused on appreciating beauty in music and the fine arts. This requires the cultivation of taste, which used to be commonplace among the educated but is now anathema. Money is the preeminent value now. Moreover, anything that smacks of cultural programming and thought control is now repudiated reflexively, though such projects are nonetheless undertaken continuously and surreptitiously through a variety of mechanisms. As a result, the typical American’s sense of what is beautiful and admirable is stunted. Further, knowledge of the historical context in which the fine arts exist is largely absent. (Children are ahistorical in this same way.) Accordingly, many Americans are coarse philistines whose tastes rarely extend beyond those acquired naturally during adolescence (including both biophilia and biophobia), thus the immense popularity of comic book movies, rock and roll music, and all manner of electronica.

When operating with a limited imagination and undeveloped ability to perceive and discern (and disapprove), one is a sitting duck for what ought to be totally unconvincing displays of empty technical prowess. Mere mechanism (spectacle) then possesses the power to transfix and amaze credulous audiences. Thus, the ear-splitting volume of amplified instruments substitutes for true emotional energy produced in exceptional live performance, ubiquitous CGI imagery (vistas and character movements, e.g., fight skills, that simply don’t exist in reality) in cinema produces wonderment, and especially, blinking lights and animated GIFs deliver the equivalent of a sugar hit (cookies, ice cream, soda) when they’re really placebos or toxins. Like hypnosis, the placebo effect is real and pronounced for those unusually susceptible to induction. Sitting ducks.

Having given the fine arts (including their historical contexts) a great deal of my academic attention and acquired an aesthetic education, my response to the video below fell well short of the blasé relativism most exhibit; I actively dislike it.