Posts Tagged ‘Celebrity’

The comic below struck a chord and reminded me of Gary Larson’s clumsily drawn but often trenchant The Far Side comics on scientific subjects.

This one masquerades as science but is merely wordplay, i.e., puns, double entendres, and unexpectedly funny malapropisms (made famous by Yogi Berra, among others). Wordplay is also found in various cultural realms, including comic strips and stand-up comedy, advertising and branding, politics, and now Wokedom (a subset of grassroots politics, some might argue). Playing with words has gone from being a clever, sometimes enjoyable diversion (e.g., crossword puzzles) to fully deranging, weaponized language. Some might be inclined to wave away the seriousness of that contention using the childhood retort “sticks and stones ….” Indeed, I’m far less convinced of the psychological power of verbal nastiness than those who insist words are violence. But it’s equally wrong to say that words don’t matter (much) or have no effect whatsoever. Otherwise, why would those acting in bad faith work so tirelessly to control the narrative, often by restricting free speech (as though writing or out-loud speech were necessary for thoughts to form)?

It’s with some exasperation that I observe words no longer retain their meanings. Yeah, yeah … language is dynamic. But semantic shifts usually occur slowly as language evolves. Moreover, for communication to occur effectively, senders and receivers must be aligned in their understandings of words. If you and I have divergent understandings of, say, yellow, we won’t get very far in discussions of egg yolks and sunsets. The same is true of words such as liberal, fascist, freedom, and violence. A lack of shared understanding of terms, perhaps born of ignorance, bias, or agenda, leads to communication breakdown. But it’s gotten far worse than that. The meanings of words have been thrown wide open to PoMo reinterpretation that often inverts their meanings in precisely the way George Orwell observed in his novel 1984 (published 1949): “War is peace. Freedom is slavery. Ignorance is strength.” Thus, earnest discussion of limitations on free speech, and actual restriction on social media platforms, often via algorithmic identification of keywords that fails to account for irony, sarcasm, or context, fail to register that implementation of restrictive kludges already means free speech is essentially gone. The usual exceptions (obscenity, defamation, incitement, gag orders, secrecy, deceptive advertising, student speech, etc.) are not nearly as problematic because they have been adjudicated over several generations and accepted as established practice. Indeed, many exceptions have been relaxed considerably (e.g., obscenity that has become standard patois now fails to shock or offend), and slimy workarounds are now commonplace (e.g., using “people are saying …” to say something horrible while shielding oneself from responsibility for saying it). Other gray areas include fighting words and offensive words, categories being expanded (out of a misguided campaign to sanitize?) to cover many words that originated as clinical and scientific terms, as well as faux offense used to stifle speech.

Restrictions on free speech are working in many respects, as many choose to self-censor to avoid running afoul of various self-appointed watchdogs or roving Internet thought police (another Orwell prophecy come true) ready to pounce on some unapproved utterance. One can argue whether self-censorship is cowardly or judicious, I suppose. However, silence and the pretense of agreement only conceal thoughts harbored privately and left unexpressed, which is why restrictions on public speech are fool’s errands and strategic blunders. Maybe the genie can be bottled for a time, but that only produces resentment (not agreement), which boils over into seething rage (and worse) at some point.

At this particular moment in U.S. culture, however, restrictions are not my greatest concern. Rather, it’s the wholesale control of information gathering and reporting that misrepresents or removes from the public sphere the ingredients needed to form coherent thoughts and opinions. It’s not happening only to the hoi polloi; those in positions of power and control are manipulated, too. (How many lobbyists per member of Congress, industry after industry, whispering in their ears like so many Wormtongues?) And in extreme cases of fame and cult of personality, a leader or despot unwittingly surrounds him- or herself with a coterie of yes-men frankly afraid to tell the truth out of careerist self-interest or because of shoot-the-messenger syndrome. It’s lonely at the top, right?

Addendum: Mere minutes after publishing this post, I wandered over to Bracing Views (on my blogroll) and found this post saying some of the same things, namely, that choking information off at the source results in a degraded information landscape. Worth a read.

The famous lyric goes “haters gonna hate.” That reflexive structure is equivalent to the meaningless phrase “It is what it is.” Subtexts attach to these phrases, and they take on lives of their own, after a fashion, with everyone pretending to know precisely what is intended and meant. That was the lesson, by the way, of the phrase “Stupid is as stupid does,” made up precisely to confound bullies who were making fun of someone of apparently limited capacity. In light of these commonplace rhetorical injunctions against actual thought, it is unsurprising that practitioners of various endeavors would be revealed as cheerleaders and self-promoters (sometimes rabidly so) for their own passion projects. With most activities, however, one can’t XX about XX, as in sport about sports, music about music, or cook about cooking. If one plays sports, makes music, or cooks, exemplary results are identifiable easily enough, but promotion on behalf of those results, typically after the fact but sometimes in the midst of the activity (i.e., sports commentary), takes place within the context of language. The two major exceptions I can identify are (1) politicking about politics and (2) writing about writing, both heavily laden with speech. (A third example, which I won’t explore, might be celebrating celebrities. Ugh.)

Of the first example I have little to say except that it’s so miserable, ugly, and venal that only politicians, policy wonks, political junkies, and campaign strategists (now full-time political strategists considering campaigns never end) derive much joy or energy from the reflexive trap. The rest of us prefer to think as little as possible about the entirely corrupt nature of political institutions and the associated players. The second example, however, is arguably an inborn feature of writing that still commands attention. Writers writing about writing might typically be understood as fiction writers revealing their processes. A recent example is J.K. Rowling, who leapt from obscurity to international fame in one bound and now offers writing tips (mainly plotting) to aspirants. An older example is Mark Twain, whose recommendation to ward off verbosity is something I practice (sometimes with limited success). Writers writing about writing now extends to journalists, whose self-reflection never seems to wear thin as the famous ones become brands unto themselves (perhaps even newsworthy in their own right). Training attention on themselves (“Look mom, no hands!”) is rather jejune, but again, commonplace. It’s also worth observing that journalists journaling about journalism, especially those who reveal how the proverbial sausage is made (e.g., Matt Taibbi and his book Hate Inc.: Why Today’s Media Makes Us Despise One Another (2019)), are essentially self-cannibalizing (much like celebrities).

What strikes me lately is how many writers, journalists, and commentators (probably including bloggers like me — bloggers blogging about blogging) have become cheerleaders for the media in which they work, which is especially true of those who have abandoned legacy media in favor of newer platforms to connect with readerships and/or audiences. Extolling the benefits of the blog is already passé, but the shift over to podcasting and YouTube/TikTok channels, accompanied by testimonials about the great attributes of the new medium, has passed beyond tiresome now that so many are doing it. Print journalists are also jumping ship from legacy publications, mostly newspapers and magazines, to digital publishing platforms such as Medium, Revue, and Substack. Some create independent newsletters. Broadcast journalists are especially keen on YouTube. A fair bit of incestuous crossover occurs as well, as media figures interview each other endlessly. Despite having restricted my media diet due to basic distrust of the legacy media in particular, I still award a lot of attention to a few outlets I’ve determined deserve it and are sometimes even trustworthy. Or sometimes, they’re just entertaining. I still tune in to the stray episode of someone I find infuriating just to check in and reinforce my decision not to return more frequently.

Stopping here and breaking this post into parts because the remainder of the draft was already growing overlong. More to come in part 2.

I have a memory of John Oliver dressing down a room full of entertainment journalists (ugh …) asking him questions following his Emmy win a few years ago. The first few had failed to offer even perfunctory congratulations for his award but instead leapt straight into questions. After his demand that everyone observe basic courtesy by at least acknowledging the reason for their attention being focused on him, each dutifully offered their compliments, which Oliver accepted graciously, and a question and answer ensued. It was a worthy reminder (something I mistakenly believed superfluous when I was much younger) that we have a sophisticated set of manners developed over time to which we should all subscribe. Behaving otherwise (i.e., skipping straight to matters at hand) is boorish, clownish, rude, and unsophisticated. Thus, routine exchanges at the beginnings of most interviews intended for broadcast go something to the effect of “Thanks for appearing on the show” or “Nice to meet you” followed by “Pleased to be here” or “My pleasure.” It’s part of a formal frame, the introduction or prologue, bearing no significant content but needful for hosts and guests to acknowledge each other.

In the course of viewing many podcasts, often conducted by relative unknowns who nonetheless manage to attract someone of distinction to interview, I notice a tendency to geek out and succumb to effusive fandom. Even a little bit of that has the unfortunate effect of establishing an uneasy tension because the fan often becomes unhinged in the presence of the celebrity. Even when there is no latent threat of something going really wrong, the fanboi sometimes goes to such an extreme heaping praise and adulation on the interview subject that nothing else worthwhile occurs. Instead, one witnesses only the fanboi’s self-debasement. It makes me squirm watching someone figuratively fellating a celebrity (apologies for my coarseness, but that’s really what springs to mind), and those on the receiving end often look just as uncomfortable. There’s simply no good response to the gushing, screaming, fainting, delirious equivalent of a 15-year-old Beatles freak (from back in the day) failing to hold it together and being caught embarrassingly in flagrante delicto.

Like others, I admire some people for their extraordinary accomplishments, but I never describe myself as a fan. Rather, objects of my admiration fall uniformly into the category of heroes: people one shouldn’t scrutinize too closely lest their flaws be uncovered. Further, those times I’ve been in the presence of celebrities are usually the occasion of some discomfort precisely because celebrities’ fame invokes a false sense of intimacy (one might say oversharing), since details of their lives are in full public view. A balanced interaction is impossible because I know quite a bit about them whereas they know nothing about me, and topics gravitate toward the reasons for their celebrity. Most of us average folks feel compelled to acknowledge the films, trophies, recordings, awards, etc. that form their accomplishments, no matter how out of date. I’ve never been in the circumstance where a famous person, recognizing that I don’t recognize him or her (or don’t kowtow as expected), plays the celebrity card: “Don’t you know who I am?”

An interrelated effect occurs when someone has way too much money, that fortune clouding all interactions because it transforms the person into a target for those currying favor or otherwise on the make. Scammers, conmen, golddiggers, sycophants, etc. appear to extract wealth, and the dynamic breeds mutual distrust and wariness even in routine transactions. Chalk it up as another corrupting aspect of inequality run amok, this time affecting wannabes as well. In light of this, I suppose it’s understandable that rich, famous people are most comfortable among those similarly rich and famous, who are thus immune to envy and fandom (but not always). Everyone else is alienated. Weird sort of gilded cage to live in — not one that I admire.

Ours is an era when individuals are encouraged to explore, amplify, and parade various attributes of their identities out in public, typically via social media. For those just coming of age and/or recently having entered adulthood, because identity is not yet fully formed, defining oneself is more nearly a demand. When identity is further complicated by unusual levels of celebrity, wealth, beauty, and athleticism (lots of overlap there), defining oneself is often an act of rebellion against the perceived demands of an insatiable public. Accordingly, it was unsurprising, to me at least, to learn of several well-known people unhappy with their lives and the burdens upon them.

Regular folks can’t truly relate to the glitterati, who are often held up as aspirational models. For example, many of us look upon the discomforts of Prince Harry and Meghan Markle with a combination of perverse fascination and crocodile tears. They were undoubtedly trapped in a strange, gilded prison before repudiating the duties expected of them as “senior royals,” attempting an impossible retreat to normalcy outside of England. Should be obvious that they will continue to be hounded while public interest in them persists. Similarly, Presley Gerber made news, fell out of the news, and then got back into the news as a result of his growing collection of tattoos. Were he simply some anonymous fellow, few would care. However, he has famous parents and had already launched a modeling career before his face tattoo announced his sense of being “misunderstood.” Pretty bold move. With all the presumed resources and opportunities at his disposal, many have wondered in comments and elsewhere whether another, better declaration of self might have been preferred.

Let me give these three the benefit of the doubt. Although they all have numerous enviable attributes, the accident of birth (or in Markle’s case, the decision to marry) landed them in exceptional circumstances. The percentage of celebrities who crack under the pressure of unrelenting attention and proceed to run off the rails is significant. Remaining grounded is no doubt easier if one attains celebrity (or absurdly immense wealth) after, say, the age of 25 or even later. (On some level, we’ve all lost essential groundedness in reality, but that’s another blog post.) Those who are children of celebrities or who become child prodigies may not all be consigned to character distortion or a life irrevocably out of balance, but such outcomes are at least so commonplace that the dangerous potential should be recognized and embraced only with wariness. I’ve heard of programs designed to help professional athletes who become sudden multimillionaires (and thus targets of golddiggers and scammers) make the transition. Good for them that structured support is available. Yet another way average folks can’t relate: we have to work things out for ourselves.

Here’s the example I don’t get: Taylor Swift. She was the subject of a Netflix biography called Miss Americana (2020) that paints her as, well, misunderstood. Thing is, Swift is a runaway success story, raking in money, fans, awards, attention, and, on balance, detractors. That success is something she earnestly desired and struggled to achieve, only to learn that the glossy, popstar image sold especially but nonexclusively to 14-year-old girls comes with a lot of heavy baggage. How can the tragic lives of so many musicians launched into superstardom from the late 1950s onward have escaped Swift’s awareness in our media-saturated world? Naming names is sorta tacky, so I demur, but there are lots of them. Swift obtained her heart’s desire, found her songwriting and political voice, maintains a high public profile, and shows no lack of productivity. Sure, it’s a life out of balance, not remotely normal the way most noncelebrities would understand. However, she signed up for it willingly (if naïvely) and by all accounts perpetuates it. She created her own distinctive gilded prison. I don’t envy her, nor do I particularly feel sorry for her, as the Netflix show appears to instruct.

/rant on

Yet another journalist has unburdened herself (unbidden story of personal discovery masquerading as news) of her addiction to digital media and her steps to free herself from the compulsion to be always logged onto the onslaught of useless information hurled at everyone nonstop. Other breaking news offered by our intrepid late-to-the-story reporter: water is wet, sunburn stings, and the Earth is dying (actually, we humans are actively killing it for profit). Freeing oneself from the screen is variously called digital detoxification (detox for short), digital minimalism, digital disengagement, digital decoupling, and digital decluttering (really ought to be called digital denunciation) and means limiting the duration of exposure to digital media and/or deleting one’s social media accounts entirely. Naturally, there are apps (counters, timers, locks) for that. Although the article offers advice for how to disentangle from screen addictions of the duh! variety (um, just hit the power switch), the hidden-in-plain-sight objective is really how to reengage after breaking one’s compulsions but this time asserting control over the infernal devices that have taken over life. It’s a love-hate style of technophilia, chock-full of illusions embarrassing even to children. Because the article is nominally journalism, the author surveys books, articles, software, media platforms, refuseniks, gurus, and opinions galore. So she’s partially informed but still hasn’t demonstrated a basic grasp of media theory, the attention economy, or surveillance capitalism, all of which relate directly. Perhaps she should bring those investigative journalism skills to bear on Jaron Lanier, one of the more trenchant critics of living online.

I rant because the embedded assumption is that anything, everything occurring online is what truly matters — even though online media didn’t yet exist as recently as thirty years ago — and that one must (must I say! c’mon, keep up!) always be paying attention to matter in turn or suffer from FOMO. Arguments in favor of needing to be online for information and news gathering are weak and ahistorical. No doubt the twisted and manipulated results of Google searches, sometimes contentious Wikipedia entries, and various dehumanizing, self-as-brand social media platforms are crutches we all now use — some waaaay, way more than others — but they’re nowhere close to the only or best way to absorb knowledge or stay in touch with family and friends. Career networking in the gig economy might require some basic level of connection but shouldn’t need to be the all-encompassing, soul-destroying work maintaining an active public persona has become.

Thus, everyone is chasing likes and follows and retweets and reblogs and other analytics as evidence of somehow being relevant on the sea of ephemera floating around us like so much disused, discarded plastic in those infamous garbage gyres. (I don’t bother to chase and wouldn’t know how to drive traffic anyway. Screw all those solicitations for search-engine optimization. Paying for clicks is for chumps, though lots apparently do it to enhance their analytics.) One’s online profile is accordingly a mirror of or even a substitute for the self — a facsimile self. Lost somewhere in my backblog (searched, couldn’t find it) is a post referencing several technophiles positively celebrating the bogus extension of the self accomplished by developing and burnishing an online profile. It’s the domain of celebrities, fame whores, narcissists, and sociopaths, not to mention a few criminals. Oh, and speaking of criminals, recent news is that OJ Simpson just opened a Twitter account to reform (?) his disastrous public image but is fundamentally outta touch with how deeply icky, distasteful, and disgusting it feels to others for him to be participating once again in the public sphere. Disgraced celebrities negatively associated with the Me-Too Movement (is there really such a movement or was it merely a passing hashtag?) have mostly crawled under their respective multimillion-dollar rocks and not been heard from again. Those few who have tried to reemerge are typically met with revulsion and hostility (plus some inevitable star-fuckers with short memories). Hard to say when, if at all, forgiveness and rejoining society become appropriate.

/rant off

Some while back, Scott Adams (my general disdain for him noted but unexpanded, since I’m not in the habit of shitting on people), using his knowledge of hypnosis, began selling the narrative that our Commander-in-Chief is cannily adept at the art of persuasion. I, for one, am persuaded by neither Adams nor 45 but must admit that many others are. Constant shilling for control of narratives by agents of all sorts could not be more transparent (for me at least), rendering the whole enterprise null. Similarly, when I see an advertisement (infrequently, I might add, since I use ad blockers and don’t watch broadcast TV or news programs), I’m rarely inclined to seek more information or make a purchase. Once in a long while, an ad creeps through my defenses and hits one of my interests, and even then, I rarely respond because, duh, it’s an ad.

In the embedded video below, Stuart Ewen describes how some learned to exploit a feature (not a bug) in human cognition, namely, appeals to emotion that overwhelm rational response. The most obvious, well-worn example is striking fear into people’s hearts and minds to convince them of an illusion of safety necessitating relinquishing civil liberties and/or fighting foreign wars.

The way Ewen uses the term consciousness differs from the way I use it. He refers specifically to opinion- and decision-making (the very things vulnerable to manipulation) rather than the more generalized and puzzling property of having an individual identity or mind and with it self-awareness. In fact, Ewen uses the terms consciousness industry and persuasion industry instead of public relations and marketing to name those who spin information and thus public discourse. At some level, absolutely everyone is guilty of seeking to persuade others, which again is a basic feature of communication. (Anyone negotiating the purchase of, say, a new or used car faces the persuasion of the sales agent with some skepticism.) What turns it into something maniacal is using lies and fabrication to advance agendas against the public interest, especially where public opinion is already clear.

Ewen also points to early 20th-century American history, where political leaders and marketers were successful in manipulating mass psychology in at least three ways: (1) drawing the pacifist U.S. public into two world wars of European origin, (2) transforming citizens into consumers, thereby saving capitalism from its inherently self-destructive endgame (creeping up on us yet again), and (3) suppressing emergent collectivism, namely, socialism. Of course, unionism as a collectivist institution still gained considerable strength but only within the larger context of capitalism, e.g., achieving the American Dream in purely financial terms.

So getting back to Scott Adams’ argument, the notion that the American public is under some form of mass hypnosis (persuasion) and that 45 is the master puppeteer is perhaps half true. Societies do sometimes go mad and fall under the spell of a mania or cult leader. But 45 is not the driver of the current episode, merely the embodiment. I wouldn’t say that 45 figured out anything because that awards too much credit to presumed understanding and planning. Rather, he worked out (accidentally and intuitively — really by default considering his job in 2016) that his peculiar self-as-brand could be applied to politics by treating it all as reality TV, which by now everyone knows is its own weird unreality the same way professional wrestling is fundamentally unreal. (The term political theater applies here.) He demonstrated a knack (at best) for keeping the focus firmly on himself and driving ratings (abetted by the mainstream media that had long regarded him as a clown or joke), but those objectives were never really in service of a larger political vision. In effect, the circus brought to town offers its own bizarre constructed narrative, but its principal characteristic is gawking, slack-jawed, made-you-look narcissism, not any sort of proper guidance or governance.

Several politicians on the U.S. national stage have emerged in the past few years as firebrands of new politics and ideas about leadership — some salutary, others less so. Perhaps the quintessential example is Bernie Sanders, who identified himself as a Socialist within the Democratic Party, a tacit acknowledgement that there are no electable third-party candidates for high office thus far. Even 45’s emergence as a de facto independent candidate within the Republican Party points to the same effect (and at roughly the same time). Ross Perot and Ralph Nader came closest in recent U.S. politics to establishing viable candidacies outside the two-party system, but their ultimate failures only reinforce the rigidity of modern party politics; it’s a closed system.

Those infusing energy and new (OK, in truth, they’re old) ideas into this closed system are intriguing. By virtue of his immediate name/brand recognition, Bernie Sanders can now go by his single given name (same is true of Hillary, Donald, and others). Supporters of Bernie’s version of Democratic Socialism are thus known as Bernie Bros, though the term is meant pejoratively. Considering his age, however, Bernie is not widely considered a viable presidential candidate in the next election cycle. Among other firebrands, I was surprised to find Alexandria Ocasio-Cortez (often referred to simply as AOC) described in the video embedded below as a Democratic Socialist but without any reference to Bernie (“single-handedly galvanized the American people”):

Despite the generation(s) gap, young adults had no trouble supporting Bernie three years ago but appear to have shifted their ardent support to AOC. Yet Bernie is still relevant and makes frequent statements demonstrating how well he understands the failings of the modern state, its support of the status quo, and the cult of personality behind certain high-profile politicians.

As I reflect on history, it occurs to me that many of the major advances in society (e.g., abolition, suffrage, the labor movement, civil rights, equal rights and abortion, and the end of U.S. involvement in the Vietnam War) occurred not because our government led us to them but because the American people forced the issues. The most recent examples of the government yielding to the will of the people are gay marriage and cannabis/hemp legalization (still underway). I would venture that Johnson and Nixon were the last U.S. presidents who experienced palpable fear of the public. (Claims that Democrats are afraid of AOC ring hollow — so far.) As time has worn on, later presidents have been confident in their ability to buffalo the public or at least to use the power of the state to quell unrest (e.g., the Occupy movement). (Modern means of crowd control raise serious questions about the legitimacy of any government that would use them against its own citizens. I would include enemy combatants, but that is a separate issue.) Standing in contrast with salutary examples of the public using its disruptive power over a recalcitrant government are arguably more examples where things went badly haywire. Looking beyond the U.S., the French Reign of Terror and the Bolsheviks are the two examples that leap immediately to mind, but there are plenty of others. The pattern appears to be a populist ideology that takes root and turns virulent and violent, followed by consolidation of power by those who manage to survive the inevitable purge of dissidents.

I bring this up because we’re in a period of U.S. history characterized by populist ideological possession on both sides (left/right) of the political continuum, though politics ought to be better understood as a spectrum. Extremism has again found a home (or several), and although the early stages appear to be mild or harmless, I fear that a charismatic leader might unwittingly succeed in raising a mob. As the saying goes (from the Indiana Jones movie franchise), “You are meddling with forces you cannot possibly comprehend,” to which I would add cannot control. Positioning oneself at the head of a movement or rallying behind such an opportunist may feel like the right thing to do but could easily and quickly veer into wildly unintended consequences. How many times in history has that already occurred?

YouTube ratings magnet Jordan Peterson had a sit-down with Susan Blackmore to discuss/debate the question, “Do We Need God to Make Sense of Life?” The conversation is lightly moderated by Justin Brierley and is part of a weekly radio broadcast called Unbelievable? (a/k/a The Big Conversation, “the flagship apologetics and theology discussion show on Premier Christian Radio in the UK”). One might wonder why evangelicals are so eager to pit believers and atheists against each other. I suppose earnest questioning of one’s faith is preferable to proselytizing, though both undoubtedly occur. The full episode (47 min.) is embedded below:

Back in the 1980s when inexpensive news programs proliferated, all wanting to emulate 60 Minutes or 20/20, I recall plenty having no problem working the public into a lather over some crime or injustice. A typical framing trick was to juxtapose two unrelated facts with the intent that the viewer leap to an unwarranted conclusion. Here’s an example I just made up: “On Tuesday, Jane went to her plastic surgeon for a standard liposuction procedure. By Friday, Jane was dead.” Well, what killed Jane? The obvious inference, by virtue of juxtaposition, is the procedure. Turns out it was an entirely unrelated traffic accident. The crap news program could legitimately claim that it never said the procedure killed Jane, yet it led the credulous public to believe so. Author Thomas Sowell resorts to that same sort of nonsense in his books: a habit of misdirection when arguing his point. I initially sought out his writing for balance, as everyone needs others capable of articulating competing ideas to avoid the echo chamber of one’s own mind (or indeed the chorus of the converted). Sowell failed to keep me as a reader.

It’s not always so easy to recognize cheap rhetorical tricks. They appear in movies all the time, but then, one is presumably there to be emotionally affected by the story, so a healthy suspension of disbelief goes a long way to enhance one’s enjoyment. Numerous fanboy sites (typically videos posted to YouTube) offer reviews and analysis that point out failures of logic, plotting, and continuity, as well as character inconsistency and embedded political messaging, but I’ve always thought that taking movies too seriously misses the point of cheap entertainment. Considering the powerful influence cinematic storytelling has over attitudes and beliefs, perhaps I’m being too cavalier about it.

When it comes to serious debate, however, I’m not nearly so charitable. The favored 5-minute news debate where 3 or 4 floating heads spew their rehearsed talking points, often talking over each other in a mad grab for air time, accomplishes nothing. Formal, long-form debates in a theater in front of an audience offer better engagement if participants can stay within proper debate rules and etiquette. Political debates during campaign season fail on that account regularly, with more spewing of rehearsed talking points mixed with gratuitous swipes at opponents. Typically, both sides claim victory in the aftermath and nothing is resolved, since that’s not really the objective. (Some opine that government, being essentially nonstop campaigning, suffers a similar fate: nothing is resolved because that’s not the true objective anymore.)

I was intrigued to learn recently of the semi-annual Munk Debates, named after their benefactors, that purport to be formal debates with time limits, moderation, and integrity. I had never heard of them before they booked Jordan Peterson alongside Michael Eric Dyson, Michelle Goldberg, and Stephen Fry. Like Donald Trump did for TV and print news, Peterson has turned into a 1-man ratings bonanza for YouTube and attracts viewers to anything in which he participates, which is quite a lot. The proposition the four debaters were provided was this: Be it resolved, what you call political correctness, I call progress … Problem is, that’s not really what was debated most of the time. Instead, Dyson diverted the debate to identity politics, specifically, racism and so-called white privilege. Goldberg mostly attacked Peterson regarding his opinions outside of the debate, Peterson defended himself against repeated personal attacks by Goldberg and Dyson, and Fry stayed relatively true to the intended topic. Lots of analysis and opinion appeared on YouTube almost immediately after the debate, so wade in if that’s what interests you. I viewed some of it. A couple videos called Dyson a grievance merchant, which seems to me accurate.

What concerns me more here are the cheap rhetorical tricks employed by Dyson — the only debater booed by the audience — that fundamentally derailed the proceedings. Dyson speaks with the fervor of a revivalist preacher, a familiar style that has been refined and coopted many times over to great effect. Whether deserved or not, it carries associations of great moral authority and momentous occasion. Unfortunately, if presented as a written transcript rather than a verbal rant, Dyson’s remarks are incoherent, unhinged, and ineffective except for their disruptive capacity. He reminded everyone of his blackness and his eloquence, the first of which needs no reminder, the second of which immediately backfired and called into question his own claim. Smart, eloquent people never tell you they’re smart and eloquent; the proof is in their behavior. Such boastful announcements tend to work against a person. Similarly, any remark that begins with “As a black/white/red/brown/blue man/woman/hybrid of _______ ethnicity/sexuality/identity …” calls in a host of associations that immediately invalidates the statement that follows as skewed and biased.

The two point-scoring bits of rhetoric Dyson levies with frequency, which probably form a comfort zone to which he instinctively retreats in all challenges, are his blackness (and by proxy his default victimhood) and historical oppression of blacks (e.g., slavery, Jim Crow laws, etc.). There are no other issues that concern him, as these two suffice to push everyone back on their heels. That’s why the debate failed to address political correctness effectively but instead revolved around identity politics. These issues are largely distinct, unless one debates the wisdom of switching out terminology cyclically, such as occurs even now with various racial epithets (directed to every race, not just blacks). That obvious tie-in, the use of euphemism and neologism to mask negative intent, was never raised. Nor were the twisted relations between free speech, hate speech, and approved speech codes (politically correct speech). Nope, the debate featured various personalities grandstanding on stage and using the opportunity to push and promote their personal brands, much like Trump has over the years. Worse, it was mostly about Michael Eric Dyson misbehaving. He never had my attention in the past; now I intend to avoid him at all costs.

The movie Gladiator depicts the protagonist Maximus addressing spectators directly at gladiatorial games in the Roman Colosseum with this meme-worthy challenge: “Are you not entertained?” Setting the action in an ancient civilization renowned for its decadent final phase prior to collapse, referred to as Bread and Circuses, allows us to share vicariously in the protagonist’s righteous disgust with the public’s blood lust while shielding us from any implication of our own shame because, after all, who could possibly entertain blood sports in the modern era? Don’t answer that.


But this post isn’t about our capacity for cruelty and barbarism. Rather, it’s about the public’s insatiable appetite for spectacle — both fictional and absolutely for real — served up as entertainment. Professional wrestling is fiction; boxing and mixed martial arts are reality. Audiences consuming base entertainment and, in the process, depleting performers who provide that entertainment extend well beyond combat sports, however. For instance, it’s not uncommon for pop musicians to slowly destroy themselves once pulled into the attendant celebrity lifestyle. Three examples spring to mind: Elvis Presley, Michael Jackson, and Whitney Houston. Others go on hiatus or retire altogether from the pressure of public performance, such as Britney Spears, Miles Davis, and Barbra Streisand.

To say that the public devours performers and discards what remains of them is no stretch, I’m afraid. Who remembers countdown clocks tracking when female actors turn 18 so that perving on them is at last okay? A further example is the young starlet who is presumably legitimized as a “serious” actor once she does nudity and/or portrays a hooker but is then forgotten in favor of the next. If one were to seek the full depth of such devouring impulses, I suggest porn is the industry to have all one’s illusions shattered. For rather modest sums, there is absolutely nothing some performers won’t do on film (these days on video at RedTube), and naturally, there’s an audience for it. Such appetites are as bottomless as they come. Are you not entertained?

Speaking of Miles Davis, I take note of his hiatus from public performance in the late 1970s before his return to the stage in 1981 and early death in 1991 at age 65. He had cemented a legendary career as a jazz trumpeter but in interviews (as memory serves) dismissed the notion that he was somehow a spokesperson for others, saying dryly “I’m just a trumpet player, man ….” What galled me, though, were Don Cheadle’s remarks in the liner notes of the soundtrack to the biopic Miles Ahead (admittedly a deep pull):

Robert Glasper and I are preparing to record music for the final scene of Miles Ahead — a possible guide track for a live concert that sees the return of Miles Davis after having been flushed from his sanctuary of silence and back onto the stage and into his rightful light. My producers and I are buzzing in disbelief about what our audacity and sheer will may be close to pulling off ….

What they did was record a what-might-have-been track had Miles incorporated rap or hip hop (categories blur) into his music. It’s unclear to me whether the “sanctuary of silence” was inactivity or death, but Miles was essentially forced onstage by proxy. Flushed is a strange word to use in this context, as one flushes an enemy or prey unwillingly from hiding. The decision to recast him in such “rightful light” strikes me as rather poor taste — a case of cultural appropriation worse than merely donning a Halloween costume.

This is the wave of the future, of course, now that images of dead celebrities can be invoked, say, to sell watches (e.g., Steve McQueen) and holograms of dead musicians are made into singing zombies, euphemized as “virtual performance” (e.g., Tupac Shakur). Newly developed software can create digitized versions of people saying and doing whatever we desire of them, such as when celebrity faces are superimposed onto porn actors (called “deepfakes”). It might be difficult to argue that in doing so content creators are stealing the souls of others, as used to be believed in the early days of photography. I’m less concerned with those meeting demand than with the demand itself. Are we becoming demons, the equivalents of the succubus/incubus, devouring or destroying frivolously the objects of our enjoyment? Are you not entertained?