Archive for the ‘Tacky’ Category

Among the myriad ways we have of mistreating each other, epithets may well be the most ubiquitous. Whether using race, sex, age, nationality, or nominal physical characteristic (especially genital names), we have so many different words with which to insult and slur that it boggles the mind. Although I can't account for foreign cultures, I doubt there is a person alive or dead who hasn't suffered being made fun of for some stupid thing. There are so many that I won't bother to compile a list (by way of example, Wikipedia has a list of ethnic slurs), but I do remember consulting a dictionary of historical slang, mostly disused, and being surprised at how many terms were devoted specifically to insults.

I’m now old and contented enough for the “sticks and stones …” dismissal to nullify any epithets hurled my way. When one comes up, it’s usually an obvious visual characteristic, such as my baldness or ruddiness. Those characteristics are of course true, so why allow them to draw ire when used with malicious intent? However, that doesn’t stop simple words from giving grave offense to those with thin skins, or from becoming so-called fighting words for those habituated to answering provocation with physical force. And in an era when political correctness has equated verbal offense with violence, the self-appointed thought police call for blood whenever someone steps out of line in public. Alternatively, when such a person is one’s champion, then the blood sport becomes spectacle, such as when 45 gifts another public figure with a sobriquet.

The granddaddy of all epithets — the elephant in the room, at least in the U.S. — will not be uttered by me, sorta like the he-who-shall-not-be-named villain of the Harry Potter universe or the forbidden language of Mordor from the Tolkien universe. I lack standing to use the term in any context and won’t even venture a euphemism or placeholder using asterisks or capitalisms. Reclaiming the term in question by adopting it as a self-description — a purported power move — has decidedly failed to neutralize the term. Instead, the term has become even more egregiously insulting than ever, a modern taboo. Clarity over who gets to use the term with impunity and when is elusive, but for my own part, there is no confusion: I can never, ever speak or write it in any context. I also can’t judge whether this development is a mark of cultural progress or regression.


Back in the 1980s when inexpensive news programs proliferated, all wanting to emulate 60 Minutes or 20/20, I recall plenty having no problem working the public into a lather over some crime or injustice. A typical framing trick was to juxtapose two unrelated facts with the intent that the viewer leap to an unwarranted conclusion. Here’s an example I just made up: “On Tuesday, Jane went to her plastic surgeon for a standard liposuction procedure. By Friday, Jane was dead.” Well, what killed Jane? The obvious inference, by virtue of juxtaposition, is the procedure. Turns out it was an entirely unrelated traffic accident. The crap news program could legitimately claim that it never said the procedure killed Jane, yet it led the credulous public to believe so. Author Thomas Sowell resorts to that same sort of nonsense in his books: a habit of misdirection when arguing his point. I initially sought out his writing for balance, as everyone needs others capable of articulating competing ideas to avoid the echo chamber of one’s own mind (or indeed the chorus of the converted). Sowell failed to keep me as a reader.

It’s not always so easy to recognize cheap rhetorical tricks. They appear in movies all the time, but then, one is presumably there to be emotionally affected by the story, so a healthy suspension of disbelief goes a long way to enhance one’s enjoyment. Numerous fanboy sites (typically videos posted to YouTube) offer reviews and analysis that point out failures of logic, plotting, and continuity, as well as character inconsistency and embedded political propaganda messaging, but I’ve always thought that taking movies too seriously misses the point of cheap entertainment. Considering the powerful influence cinematic storytelling has over attitudes and beliefs, perhaps I’m being too cavalier about it.

When it comes to serious debate, however, I’m not nearly so charitable. The favored 5-minute news debate, where 3 or 4 floating heads spew their rehearsed talking points, often talking over each other in a mad grab for air time, accomplishes nothing. Formal, long-form debates in a theater in front of an audience offer better engagement if participants can stay within proper debate rules and etiquette. Political debates during campaign season fail on that account regularly, with more spewing of rehearsed talking points mixed with gratuitous swipes at opponents. Typically, both sides claim victory in the aftermath and nothing is resolved, since that’s not really the objective. (Some opine that government, being essentially nonstop campaigning, suffers a similar fate: nothing is resolved because that’s not the true objective anymore.)

I was intrigued to learn recently of the semi-annual Munk Debates, named after their benefactors, that purport to be formal debates with time limits, moderation, and integrity. I had never heard of them before they booked Jordan Peterson alongside Michael Eric Dyson, Michelle Goldberg, and Stephen Fry. Like Donald Trump did for TV and print news, Peterson has turned into a 1-man ratings bonanza for YouTube and attracts viewers to anything in which he participates, which is quite a lot. The proposition the four debaters were provided was this: Be it resolved, what you call political correctness, I call progress … Problem is, that’s not really what was debated most of the time. Instead, Dyson diverted the debate to identity politics, specifically, racism and so-called white privilege. Goldberg mostly attacked Peterson regarding his opinions outside of the debate, Peterson defended himself against repeated personal attacks by Goldberg and Dyson, and Fry stayed relatively true to the intended topic. Lots of analysis and opinion appeared on YouTube almost immediately after the debate, so wade in if that’s what interests you. I viewed some of it. A couple of videos called Dyson a grievance merchant, which seems to me accurate.

What concerns me more here are the cheap rhetorical tricks employed by Dyson — the only debater booed by the audience — that fundamentally derailed the proceedings. Dyson speaks with the fervor of a revivalist preacher, a familiar style that has been refined and coopted many times over to great effect. Whether deserved or not, it carries associations of great moral authority and momentous occasion. Unfortunately, if presented as a written transcript rather than a verbal rant, Dyson’s remarks are incoherent, unhinged, and ineffective except for their disruptive capacity. He reminded everyone of his blackness and his eloquence, the first of which needs no reminder, the second of which immediately backfired and called into question his own claim. Smart, eloquent people never tell you they’re smart and eloquent; the proof is in their behavior. Such boastful announcements tend to work against a person. Similarly, any remark that begins with “As a black/white/red/brown/blue man/woman/hybrid of _______ ethnicity/sexuality/identity …” calls in a host of associations that immediately invalidate the statement that follows as skewed and biased.

The two point-scoring bits of rhetoric Dyson levies with frequency, which probably form a comfort zone to which he instinctively retreats in all challenges, are his blackness (and by proxy his default victimhood) and historical oppression of blacks (e.g., slavery, Jim Crow laws, etc.). There are no other issues that concern him, as these two suffice to push everyone back on their heels. That’s why the debate failed to address political correctness effectively but instead revolved around identity politics. These issues are largely distinct, unless one debates the wisdom of switching out terminology cyclically, such as occurs even now with various racial epithets (directed at every race, not just blacks). That obvious tie-in, the use of euphemism and neologism to mask negative intent, was never raised. Nor were the twisted relations between free speech, hate speech, and approved speech codes (politically correct speech). Nope, the debate featured various personalities grandstanding on stage and using the opportunity to push and promote their personal brands, much like Trump has over the years. Worse, it was mostly about Michael Eric Dyson misbehaving. He never had my attention in the past; now I intend to avoid him at all costs.

The movie Gladiator depicts the protagonist Maximus addressing spectators directly at gladiatorial games in the Roman Colosseum with this meme-worthy challenge: “Are you not entertained?” Setting the action in an ancient civilization renowned for its decadent final phase prior to collapse, the era of bread and circuses, allows us to share vicariously in the protagonist’s righteous disgust with the public’s blood lust while shielding us from any implication of our own shame because, after all, who could possibly entertain blood sports in the modern era? Don’t answer that.


But this post isn’t about our capacity for cruelty and barbarism. Rather, it’s about the public’s insatiable appetite for spectacle — both fictional and absolutely for real — served up as entertainment. Professional wrestling is fiction; boxing and mixed martial arts are reality. Audiences consuming base entertainment and, in the process, depleting performers who provide that entertainment extend well beyond combat sports, however. For instance, it’s not uncommon for pop musicians to slowly destroy themselves once pulled into the attendant celebrity lifestyle. Three examples spring to mind: Elvis Presley, Michael Jackson, and Whitney Houston. Others go on hiatus or retire altogether from the pressure of public performance, such as Britney Spears, Miles Davis, and Barbra Streisand.

To say that the public devours performers and discards what remains of them is no stretch, I’m afraid. Who remembers countdown clocks tracking when female actors turn 18 so that perving on them is at last okay? A further example is the young starlet who is presumably legitimized as a “serious” actor once she does nudity and/or portrays a hooker but is then forgotten in favor of the next. If one were to seek the full depth of such devouring impulses, I suggest porn is the industry to have all one’s illusions shattered. For rather modest sums, there is absolutely nothing some performers won’t do on film (these days on video at RedTube), and naturally, there’s an audience for it. Such appetites are as bottomless as they come. Are you not entertained?

Speaking of Miles Davis, I take note of his hiatus from public performance in the late 1970s before his limited return to the stage in 1986 and early death in 1991 at age 65. He had cemented a legendary career as a jazz trumpeter but in interviews (as memory serves) dismissed the notion that he was somehow a spokesperson for others, saying dryly “I’m just a trumpet player, man ….” What galled me, though, were Don Cheadle’s remarks in the liner notes of the soundtrack to the biopic Miles Ahead (admittedly a deep pull):

Robert Glasper and I are preparing to record music for the final scene of Miles Ahead — a possible guide track for a live concert that sees the return of Miles Davis after having been flushed from his sanctuary of silence and back onto the stage and into his rightful light. My producers and I are buzzing in disbelief about what our audacity and sheer will may be close to pulling off ….

What they did was record a what-might-have-been track had Miles incorporated rap or hip hop (categories blur) into his music. It’s unclear to me whether the “sanctuary of silence” was inactivity or death, but Miles was essentially forced onstage by proxy. “Flushed” is a strange word to use in this context, as one “flushes” an enemy or prey unwillingly from hiding. The decision to recast him in such “rightful light” strikes me as rather poor taste — a case of cultural appropriation worse than merely donning a Halloween costume.

This is the wave of the future, of course, now that images of dead celebrities can be invoked, say, to sell watches (e.g., Steve McQueen) and holograms of dead musicians are made into singing zombies, euphemized as “virtual performance” (e.g., Tupac Shakur). Newly developed software can now create digitized versions of people saying and doing whatever we desire of them, such as when celebrity faces are superimposed onto porn actors (called “deepfakes”). It might be difficult to argue that in doing so content creators are stealing the souls of others, as used to be believed in the early days of photography. I’m less concerned with those meeting demand than with the demand itself. Are we becoming demons, the equivalents of the succubus/incubus, devouring or destroying frivolously the objects of our enjoyment? Are you not entertained?

Back in undergraduate college, when just starting on my music education degree, I received an assignment where students were asked to formulate a philosophy of education. My thinking then was influenced by a curious textbook I picked up: A Philosophy of Music Education by Bennett Reimer. Of course, it was the wrong time for an undergraduate to perform this exercise, as we had neither maturity nor understanding equal to the task. However, in my naïveté, my answer was all about learning/teaching an aesthetic education — one that focused on appreciating beauty in music and the fine arts. This requires the cultivation of taste, which used to be commonplace among the educated but is now anathema. Money is the preeminent value now. Moreover, anything that smacks of cultural programming and thought control is now repudiated reflexively, though such projects are nonetheless undertaken continuously and surreptitiously through a variety of mechanisms. As a result, the typical American’s sense of what is beautiful and admirable is stunted. Further, knowledge of the historical context in which the fine arts exist is largely absent. (Children are ahistorical in this same way.) Accordingly, many Americans are coarse philistines whose tastes rarely extend beyond those acquired naturally during adolescence (including both biophilia and biophobia), thus the immense popularity of comic book movies, rock and roll music, and all manner of electronica.

When operating with a limited imagination and undeveloped ability to perceive and discern (and disapprove), one is a sitting duck for what ought to be totally unconvincing displays of empty technical prowess. Mere mechanism (spectacle) then possesses the power to transfix and amaze credulous audiences. Thus, the ear-splitting volume of amplified instruments substitutes for true emotional energy produced in exceptional live performance, ubiquitous CGI imagery (vistas and character movements, e.g., fight skills, that simply don’t exist in reality) in cinema produces wonderment, and especially, blinking lights and animated GIFs deliver the equivalent of a sugar hit (cookies, ice cream, soda) when they’re really placebos or toxins. Like hypnosis, the placebo effect is real and pronounced for those unusually susceptible to induction. Sitting ducks.

Having given the fine arts (including their historical contexts) a great deal of my academic attention and acquired an aesthetic education, my response to the video below fell well short of the blasé relativism most exhibit; I actively dislike it.

For a variety of reasons, I go to see movies in the theater only a handful of times any given year. The reasons are unimportant (and obvious) and I recognize that, by eschewing the theater, I’m giving up the crowd experience. Still, I relented recently and went to see a movie at a new AMC Dolby Cinema, which I didn’t even know existed. The first thing to appreciate was that it was a pretty big room, which used to be standard when cinema was first getting established in the 1920s but gave way sometime in the 1970s to multiplex theaters able to show more than one title at a time in little shoebox compartments with limited seating. Spaciousness was a welcome throwback. The theater also had oversized, powered, leather recliners rather than cloth, fold-down seats with shared armrests. The recliners were quite comfortable but also quite unnecessary (except for now typical Americans unable to fit their fat asses in what used to be a standard seat). These characteristics are shared with AMC Prime theaters that dress up the movie-going experience and charge accordingly. Indeed, AMC now offers several types of premium cinema, including RealD 3D, Imax, Dine-In, and BigD.

Aside I: A friend only just reported on her recent trip to the drive-in theater, a dated cinema experience that is somewhat unenhanced yet retains its nostalgic charm for those of us old enough to remember as kids the shabby chic of bringing one’s own pillows, blankets, popcorn, and drinks to a double feature and sprawling out on the hood and/or roof of the car (e.g., the family station wagon). My friend actually brought her dog to the drive-in and said she remembered and sorta missed the last call on dollar hot dogs at 11 PM that used to find all the kids madly, gleefully rushing the concession stand before food ran out.

What really surprised me, however, was how the Dolby Cinema experience turned into a visual, auditory, and kinesthetic assault. True, I was watching Wonder Woman (sorry, no review), which is set in WWI and features lots of gunfire and munitions explosions in addition to the usual invincible superhero punchfest, so I suppose the point is partly to be immersed in the environment, a cinematic stab at verisimilitude. But the immediacy of all the wham-bam, rock ’em-sock ’em action made me feel more like a participant in a theater of war than a viewer. The term shell shock (a/k/a battle fatigue a/k/a combat neurosis) refers to the traumatized disorientation one experiences in moments of high stress and overwhelming sensory input; it applies here. Even the promo before the trailers and feature, offered to demonstrate the theater’s capabilities themselves, was off-putting because of unnecessary and overweening volume and impact. Unless I’m mistaken, the seats even have built-in subwoofers to rattle theatergoers from below when loud, concussive events occur, which is often because, well, filmmakers love their spectacle as much as audiences do.

Aside II: One real-life lesson to be gleaned from WWI, or the Great War as it was called before WWII, went well beyond the simplistic truism that war is hell. It was that civility (read: civilization) had failed and human progress was a chimera. Technical progress, however, had made WWI uglier in many respects than previous warfare. It was an entirely new sort of horror. Fun fact: there are numerous districts in France, known collectively as the Zone Rouge, where no one is allowed to live because of all the unexploded ordnance (100 years later!). Wonder Woman ends up having it both ways: acknowledging the horrific nature of war on the one hand yet valorizing and romanticizing personal sacrifice and eventual victory on the other. Worse, perhaps, it establishes that there’s always another enemy in the wings (otherwise, how could there be sequels?), so keep fighting. And for the average viewer, uniformed German antagonists are easily mistaken for Nazis of the subsequent world war, a historical gloss I’m guessing no one minds … because … Nazis.

So here’s my problem with AMC’s Dolby Cinema: why settle for routine or standard theater experience when it can be amped up to the point of offense? Similarly, why be content with the tame and fleeting though reliable beauty of a sunset when one can enjoy a widescreen, hyperreal view of cinematic worlds that don’t actually exist? Why settle for the subtle, old-timey charm of the carousel (painted horses, dizzying twirling, and calliope music) when instead one can strap in and get knocked sideways by roller coasters so extreme that riders leave wobbly and crying at the end? (Never mind the risk of being stranded on the tracks for hours, injured, or even killed by a malfunction.) Or why bother attending a quaint symphonic band concert in the park or an orchestral performance in the concert hall when instead one can go to Lollapalooza and see/hear/experience six bands in the same cacophonous space grinding it out at ear-splitting volume, along with laser light shows and flash-pot explosions for the sheer sake of goosing one’s senses? Coming soon are VR goggles that trick the wearer’s nervous system into accepting they are actually in the virtual game space, often first-person shooters depicting killing bugs or aliens or criminals without compunction. Our arts and entertainments have truly gotten out of hand.

If those criticisms don’t register, consider my post more than a decade ago on the Paradox of the Sybarite and Catatonic, which argues that our senses are so overwhelmed by modern life that we’re essentially numb from overstimulation. Similarly, let me reuse this Nietzsche quote (used before here) to suggest that on an aesthetic level, we’re not being served well in display and execution of refined taste so much as being whomped over the head and dragged willingly? through ordeals:

… our ears have become increasingly intellectual. Thus we can now endure much greater volume, much greater ‘noise’, because we are much better trained than our forefathers were to listen for the reason in it. All our senses have in fact become somewhat dulled because we always inquire after the reason, what ‘it means’, and no longer for what ‘it is’ … our ear has become coarsened. Furthermore, the ugly side of the world, originally inimical to the senses, has been won over for music … Similarly, some painters have made the eye more intellectual, and have gone far beyond what was previously called a joy in form and colour. Here, too, that side of the world originally considered ugly has been conquered by artistic understanding. What is the consequence of this? The more the eye and ear are capable of thought, the more they reach that boundary line where they become asensual. Joy is transferred to the brain; the sense organs themselves become dull and weak. More and more, the symbolic replaces that which exists … the vast majority, which each year is becoming ever more incapable of understanding meaning, even in the sensual form of ugliness … is therefore learning to reach out with increasing pleasure for that which is intrinsically ugly and repulsive, that is, the basely sensual. [italics not in original]

The video below came to my attention recently; it shows a respectable celebrity, violinist/conductor Itzhak Perlman, being dicked around in an interview he probably undertook in good faith. My commentary follows.

Publicized pranks and gotchas are by no means rare. Some are good-natured and quite funny, but one convention of the prank is to unmask it pretty quickly. In the aftermath, the target typically either laughs it off, leaves without comment, or less often, storms out in disgust. Andy Kaufman as “Tony Clifton” was probably among the first to sustain a prank well past the point of discomfort, never unmasking himself. Others have since gotten in on the antics, though the results are probably no worse in their dickishness (dickery?) than Kaufman’s.

Fake interviews by comedians posing as news people are familiar to viewers of The Daily Show and its spinoff The Colbert Report (its run now completed). Zack Galifianakis does the same schtick in Between Two Ferns. It always surprises me when targets fall into the trap, exposing themselves as clueless ideologues willing to be hoisted with their own petards. However, Colbert in particular balanced his arch Republican stage persona with an unmistakable respect for his interview subject, which was at times inspired. Correspondents from The Daily Show are frequently pretty funny, but they almost never convey any respect for the subjects of the interview. Nick Canellakis (shown above) apparently has a whole series of interviews with classical musicians where he feigns idiocy and insult. Whereas some interview subjects are media savvy enough to get the joke and play along, I find this attempt at humor tasteless and unbearable.

Further afield, New Media Rockstars features a burgeoning list of media hosts who typically operate cheaply over the Web via YouTube, supported by an array of social media. At least one, Screen Junkies (the only one I watch), has recently blown up into an entire suite of shows. I won’t accuse them all of being talentless hacks or dicking people around for pointless yuks, but I often pause to wonder what makes the shows worth producing beyond the hosts’ embarrassingly encyclopedic knowledge of comics, cartoons, TV shows, movies, etc. They’re fanboys (and girls) who have leveraged their misspent youth and eternal adolescence to gush and gripe about their passions. Admittedly, this may not be so different from sports fanatics (especially human statisticians), opera geeks, and nerds of other stripes.

Throwaway media may have unintentionally smuggled in tasteless shenanigans such as those by Nick Canellakis. Various comedians (unnamed) have similarly offered humorless discomfort as entertainment. Reality TV shows explored this area a while back, which I called trainwreck television. Cheaply produced video served over the Web has unleashed a barrage of dreck in all these categories. Some shows may eventually find their footing and become worthwhile. In the meantime, I anticipate seeing plenty more self-anointed media hosts dicking around celebrities and audiences alike.

The English language has words for everything, and whenever something new comes along, we coin a new word. The latest neologism I heard is bolthole, which refers to the location one bolts to when collapse and civil unrest reach intolerable proportions. At present, New Zealand is reputed to be the location of boltholes purchased and kept by the ultrarich; it has the advantage of being located in the Southern Hemisphere, meaning remote from the hoi polloi yet reachable by private plane or oceangoing yacht. Actually, bolthole is an older term now being repurposed, but it seems hip and current enough to be new coin.

Banned words are the inverse of neologisms, not in the normal sense that they simply fall out of use but in their use being actively discouraged. Every kid learns this early on when a parent or older sibling slips and lets an “adult” word pass his or her lips that the kid isn’t (yet) allowed to use. (“Mom, you said fuck!”) George Carlin made a whole routine out of dirty words (formerly) banned from TV. Standards have been liberalized since the 1970s, and now people routinely swear or refer to genitalia on TV and in public. Sit in a restaurant or ride public transportation (as I do), eavesdrop a little speech within easy earshot (especially private cellphone conversations), and just count the casual F-bombs.

The worst field of banned-words nonsense is political correctness, which is intertwined with identity politics. All the slurs and epithets directed at, say, racial groups ought to be disused, no doubt, but we overcompensate by renaming everyone (“____-American”) to avoid terms that have little or no derogation. Even more ridiculous, at least one egregiously insulting term has been reclaimed as an unbanned banned word by the very group it oppresses. It takes Orwellian doublethink to hear that term — you all know what it is — used exclusively by those allowed to use it. (I find it wholly bizarre yet fear to wade in with my own prescriptions.) Self-disparaging language, typically in a comedic context, gets an unwholesome pass, but only if one is within the identity group. (Women disparage women, gays trade on gay stereotypes, Jews indulge in jokey anti-Semitism, etc.) We all laugh and accept it as safe, harmless, and normal. President Obama is continuously mixed up in questions of appearances (“optics”), or what to call things (or not call them, as the case may be). For instance, his apparent refusal to call terrorism originating in the Middle East “Muslim terrorism” has been met with controversy.

I’m all for calling a thing what it is, but the term terrorism is too loosely applied to any violent act committed against (gasp!) innocent Americans. Recent events in Charleston, SC, garnered the terrorism label, though other terms would be more apt. Further, there is nothing intrinsically Muslim about violence and terrorism. Yeah, sure, Muslims have a word or doctrine — jihad — but it doesn’t mean what most think or are led to believe it means. Every religion across human history has some convenient justification for the use of force, mayhem, and nastiness to promulgate its agenda. Sometimes it’s softer and inviting, other times harder and more militant. Unlike Bill Maher, however, circumspect thinkers recognize that violence used to advance an agenda, like words used to shape narratives, is not the province of any particular hateful or hate-filled group. Literally everyone does it to some extent. Indeed, the passion with which anyone pursues an agenda is paradoxically celebrated and reviled depending on content and context, and it’s a long, slow, ugly process of sorting to arrive at some sort of Rightthink®, which then becomes conventional wisdom before crossing over into political correctness.

If I were to get twisted and strained over every example of idiocy on parade, I’d be permanently distorted. Still, a few issues have crossed my path that might be worth bringing forward.

Fealty to the Flag

An Illinois teacher disrespected the American flag during a classroom lesson on free speech. Context provided in this article is pretty slim, but it would seem to me that a lesson on free speech might be precisely the opportunity to demonstrate that tolerance of discomfiting counter-opinion is preferable to the alternative: squelching it. Yet in response to complaints, the local school board voted unanimously to fire the teacher of the offending lesson. The ACLU ought to have a field day with this one, though I must admit there can be no convincing others that desecrating the flag is protected free speech. Some remember a few years ago going round and round on this issue with a proposed Constitutional amendment. Patriots stupidly insist on carving out an exception to free speech protections when it comes to the American flag, which shows quite clearly that they are immune to the concept behind the 1st Amendment, which says this:

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances. [emphasis added]

Naturally, interpretations of the Bill of Rights vary widely, but it doesn’t take a Constitutional scholar to parse the absolute character of these rights. Rights are trampled all the time, of course, as the fired Illinois teacher just found out.

Fealty to the Wrong Flag

The Confederate battle flag has come back into the national spotlight following racially inspired events in Charleston, SC. (Was it ever merely a quaint, anachronistic, cultural artifact of the American South?) CNN has a useful article separating fact from fiction, yet some Southerners steadfastly defend the flag. As a private issue of astonishingly poor taste, idiocy, and free speech, individuals should be allowed to say what they want and fly their flags at will, but as a public issue for states and/or institutions that still fly the flag or emblazon it on websites, letterhead, etc., it’s undoubtedly better to give up this symbol and move on.

I’m not a serious cineaste, but I have offered a few reviews on The Spiral Staircase. There are many, many cineastes out there, though, and although cinema is now an old medium (roughly 100 years old), cineastes tend to be on the younger side of 35. Sure, lots of established film critics are decidedly older, typically acting under the aegis of major media outlets, but I’m thinking specifically of the cohort who use new, democratized media (e.g., cheap-to-produce and -distribute YouTube channels) to indulge their predilections. For example, New Media Rockstars has a list of their top 100 YouTube channels (NMR No. 1 contains links to the rest). I have heard of almost none of them, since I don’t live online like so many born after the advent of the Information/Communications Age. The one I pay particular attention to is Screen Junkies (which includes Honest Trailers, the Screen Junkies Show, and Movie Fights), and I find their tastes run toward childhood enthusiasms that mire their criticism in a state of permanent adolescence and self-mocking geekdom. The preoccupation with cartoons, comic books, action figures, superheroes, and popcorn films couldn’t be more clear. Movie Fights presumes to award points on the passion, wit, and rhetoric of the fighters rather than the quality of the films they choose to defend. However, adjudication is rarely neutral, since trump cards tend to get played when a superior film or actor is cited against an inferior one.

So I happened to catch three recent flicks that are central to the Screen Junkies canon: Captain America: The Winter Soldier, Avengers: Age of Ultron, and Transformers: Age of Extinction (links unnecessary). They all qualify as CGI festivals — films centered on hyperkinetic action rather than story or character (opinions differ, naturally). The first two originate from the MCU (acronym alert: MCU = Marvel Cinematic Universe, which is lousy with comic book superheroes) and the last is based on a Saturday-morning children’s cartoon. Watching grown men and a few women on Screen Junkies getting overexcited about content originally aimed at children gives me pause, yet I watch them to see what fighters say, knowing full well that thoughtful remarks are infrequent.

Were I among the fighters (no chance, since I don’t have my own media fiefdom), I would likely be stumped when a question needs immediate recall (by number, as in M:I:3 for the third Mission Impossible film) of a specific entry from any of numerous franchises pumping out films regularly like those named above. Similarly, my choices would not be limited, as theirs are, to films released after 1990, that year falling within the childhood of most of the fighters who appear. Nor would my analysis be so embarrassingly visual in orientation, since I understand good cinema to be more about story and character than whiz-bang effects.

Despite the visual feast fanboys adore (what mindless fun!), lazy CGI festivals suffer most from overkill, far outstripping the eye’s ability to absorb onscreen action fully or effectively. Why bother with repeat viewing of films with little payoff in the first place? CGI characters were interesting in and of themselves the first few times they appeared in movies without breaking the suspension of disbelief, but now they’re so commonplace that they feel like cheating. Worse, moviegoers are now faced with so many CGI crowds, clone and robot armies, zombie swarms, human-animal hybrids, et cetera ad nauseam, that little holds the interest of jaded viewers. Thus, because so few scenes resonate emotionally, sheer novelty substitutes (ineffectively) for meaning, not that most chases or slugfests in the movies offer much that is truly original. The complaint is heard all the time: we’ve seen it before.

Here’s my basic problem with the three CGI-laden franchise installments I saw recently: their overt hypermilitarism. When better storytellers such as Kubrick or Coppola make films depicting the horrors of war (or other existential threats, such as the ever-popular alien invasion), their perspective is indeed that war is horrible, and obvious moral and ethical dilemmas flow from there. When hack filmmakers pile up frenzied depictions of death and destruction, typically with secondary or tertiary characters whose dispatch means and feels like nothing, and with cities destroyed eliciting no emotional response because it’s pure visual titillation, they have no useful, responsible, or respectable commentary. Even the Screen Junkies recognize that, unlike, say, Game of Thrones, none of their putative superheroes really face much more than momentary distress before saving the day in the third act and certainly no lasting injury (a little make-up blood doesn’t convince me). Dramatic tension simply drains away, since happy resolutions are never in doubt. Now, characters taking fake beatdowns are laughter-inducing, sorta like professional wrestling after its sheepish admission that the wrestlers have been acting all along. Frankly, pretend drama with nothing at stake is a waste of effort and the audience’s time and trust. That so many fanboys enjoy being goosed or that some films make lots of money is no justification. The latter is one reason why cinema so often fails to rise to the aspiration of art: it’s too bound up in grubbing for money.

A Surfeit of Awards

Posted: January 29, 2015 in Culture, Education, Idle Nonsense, Tacky, Taste

/rant on

I get alumni magazines from two colleges/universities I attended. These institutional organs are unapologetic boosters of the accomplishments of alumni, faculty, and students. They also trumpet never-ending capital campaigns, improvements to facilities, and new and refurbished buildings. The latest round of news from my two schools features significant new and rebuilt structures, accompanied by the naming of these structures after the foundations, contributors, and faculty/administrators associated with their execution. Well and good, you might surmise, but I always have mixed feelings. No doubt there are certain thresholds that must be met for programs to function and excel: stadia and gyms, locker rooms, concert halls and theaters, practice and rehearsal spaces, equipment, computer labs, libraries and their holdings, etc. Visiting smaller schools with inadequate facilities always brought that point home. Indeed, that’s one of the reasons why anyone chooses a school: for the facilities.

Since the late sixties or so, I have witnessed one school after another (not just in higher education) becoming what I think of as lifestyle schools. Facilities are not merely sufficient or superior; they range into the lap of luxury and excess. It’s frankly embarrassing that the quality and furnishings of dormitories now exceed what most students will enjoy for decades post-graduation. In my college years, no one found it the slightest bit embarrassing to have meager accommodations. That’s not why one was there. Now the expectation is to luxuriate. Schools clearly compete to attract students using a variety of enticements, but delivering the best lifestyle while in attendance was formerly not one of them. But the façades and accoutrements are much easier to evaluate than the academic programs, which have moved in the opposite direction. Both are now fraudulent at many schools; it’s a game of dress-up.

That rant, however, may be only the tip of the proverbial iceberg. I cannot escape the sense that we celebrate ourselves and our spurious accomplishments with amazing disregard for their irrelevance. Unlike the many who dream of achieving immortality by proxy, I am confounded by the desire to see one’s name on the side of a building, in a hall of fame, on an endowed chair, etched in a record book, or otherwise gouged into posterity. Yet I can’t go anywhere without finding another new feature named after someone, usually posthumously but not always, whose memory must purportedly be preserved. (E.g., Chicago recently renamed the Circle Interchange after its first and only female mayor, Jane Byrne, causing some confusion due to inadequate signage.) The alumni magazines were all about newly named buildings, chairs, scholarships, halls, bricks, and waste cans. It got to be sickening. The reflex is now established: someone gives a pile of money or teaches (or administers) for a time, so name something after him or her. And as we enter championship and awards season in sports and cinema, the surfeit of awards doled out, often just for showing up and doing one’s job, is breathtaking.

Truly memorable work and achievement need no effusive praise. They are perpetuated through subscription. Yet even they, as Shelley’s “Ozymandias” reminds us, pass from memory eventually. Such is the way of the world in the long stretches of time (human history) we have inhabited it. Readers of this blog will know that, in fairly awful terms, that time is rapidly drawing to a close due to a variety of factors, but primarily because of our own prominence. So one might wonder, why all this striving and achieving and luxuriating and self-celebrating when the end is our own destruction?

/rant off