Archive for the ‘Artistry’ Category

I see plenty of movies over the course of a year but had not been to a theater since The Force Awakens came out slightly over a year ago. The reason is simple: it costs too much. With ticket prices nearing $15 and what for me had been obligatory popcorn and soda (too much of both the way they’re bundled and sold — ask anyone desperately holding back their pee until the credits roll!), the endeavor climbed to nearly $30 just for one person. Never mind that movie budgets now routinely top $100 million; the movie-going experience simply isn’t worth $30 a pop. Opening weekend crowds (and costumes)? Fuggedaboudit! Instead, I view films at home on DVD (phooey on Blu-ray) or via a streaming service. Although I admit I’m missing out on being part of an audience, which offers the possibility of being carried away on a wave of crowd emotion, I’m perfectly happy watching at home, especially considering most films are forgettable fluff (or worse) and filmmakers seem to have forgotten how to shape and tell good stories.

So a friend dragged me out to see Rogue One, somewhat late after its opening by most standards. Seeing Star Wars and other franchise installments now feels like an obligation just to stay culturally relevant. Seriously, soon enough it will be Fast & Furious Infinitum. We went to a newly built theater with individual recliners and waiters (no concession stands). Are film-goers no longer satisfied by popcorn and Milk Duds? No way would I order an $80 bottle of wine to go with Rogue One. It’s meant to be a premium experience, with everything served to you in the recliner, and accordingly, it charges premium prices. Too bad most films don’t warrant such treatment. All this is preliminary to the actual review, of course.

I had learned quite a bit about Rogue One prior to seeing it, not really caring about spoilers, and was pleasantly surprised it wasn’t as bad as some complain. Rogue One brings in all the usual Star Wars hallmarks: stormtroopers, the Force, X-Wings and TIE Fighters, ray guns and lightsabers, the Death Star, and familiar characters such as Grand Moff Tarkin, Darth Vader, Princess Leia, etc. Setting a story within the Star Wars universe makes most of that unavoidable, though some specific instances did feel like gratuitous fan service, such as the 3-second (if that) appearance of C-3PO and R2-D2. The appearance of things and characters I already knew about didn’t feel to me like an extra thrill, but how much I needed to already know about Star Wars just to make sense of Rogue One was a notable weakness. Thus, one could call Rogue One a side story, but it was by no means a stand-alone story. Indeed, characters old and new were given such slipshod introductions (or none at all!) that they functioned basically as chess pieces moved around to drive the game forward. Good luck divining their characteristic movements and motivations. Was there another unseen character manipulating everyone? The Emperor? Who knows? Who cares! It was all a gigantic, faceless pawn sacrifice. When at last the main rebels died, there was no grief or righteousness over having at least accomplished their putative mission. Turns out the story was all about effects, not emotional involvement. And that’s how I felt: uninvolved. It was a fireworks display ending with a pointless though clichéd grand finale. Except I guess that watching a bunch of fake stuff fake blow up was the fake point.

About what passed for a story: the Rebellion learns (somehow?!) that it faces total annihilation from a new superweapon called the Death Star. (Can’t remember whether that term was actually used in the film.) While the decision of leadership is to scatter and flee, a plucky band of rebels within the rebellion insists on flinging itself against the enemy with no plan except to improvise once on site, whereupon leadership irrationally decides to do the same. The strategy, such as it is (distract the enemy from the true mission objective), is straight out of The Return of the King, but the visual style is more like the opening of Saving Private Ryan, which is to say, full, straight-on bombardment and invasion. Visual callbacks to WWII infantry uniforms and formations couldn’t be more out of place. To call these elements charmless is to give them too much credit. Rather, they’re hackneyed. However, they probably fit well enough within the Saturday-morning cartoon, newsreel, swashbuckler sensibility that informed the original Star Wars films from the 1970s. Problem is, those 1970s kids are now grown and want something with greater gravitas than live-action space opera. Newer Star Wars audiences are stuck in permanent adolescence because of what cinema has become, with its superhero franchises and cynical money grabs.

As a teenager when the first trilogy came out, I wanted more of the mystical element — the Force — than I wanted aerial battles, sword fights, or chase scenes. The goofy robots, reluctant heroes, and bizarre aliens were fun, but they were balanced by serious, steady leadership (the Jedi) and a couple of really bad-ass villains. While it’s said George Lucas had the entire character arc of Anakin Skywalker/Darth Vader in mind from the start, it’s also fair to say that no one quite knew in Episode IV just how iconic Vader the villain would become, which is why his story became the centerpiece of the first two trilogies (how many more to come?). However, Anakin/Vader struggled with the light/dark sides of the Force, which resonated with anyone familiar with the angel/demon nomenclature of Christianity. When the Force was misguidedly explained away as midi-chlorians (science, not mysticism), well, the bottom dropped out of the Star Wars universe. At that point, it became a grand WWII analogue populated by American GIs and Nazis — with some weird Medievalism and sci-fi elements thrown in — except that the wrong side develops the superweapon. Rogue One makes that criticism even more manifest, though it’s fairly plain to see throughout the Star Wars films.

Let me single out one actor for praise: Ben Mendelsohn as Orson Krennic. It’s hard for me to decide whether he chews the scenery, upstaging Darth Vader as a villain in the one scene they share, or whether he’s among a growing gallery of underactors whose flat line delivery and blandness invite viewers to project upon them characterization telegraphed through other mechanisms (costuming, music, plot). Either way, I find him oddly compelling and memorable, unlike the foolish, throwaway, sacrificial band of rebellious rebels against the rebellion and empire alike. Having seen Ben Mendelsohn in other roles, I can say he possesses an unusual screen magnetism that reminds me of Sean Connery. He tends to play losers and villains and to be a little one-note (not a bag of tricks but just one trick), but he is riveting on-screen for the right reasons compared to, say, the ookiness of the two gratuitous CGI characters in Rogue One.

So Rogue One is a modestly enjoyable and ephemeral romp through the Star Wars universe. It delivers and yet fails to deliver, which is about as charitable as I can be.

Stray links build up over time without my being able to handle them adequately, so I have for some time wanted a way of purging them. I am aware of other bloggers who curate and aggregate links with short commentaries quite well, but I have difficulty making my remarks pithy and punchy. That said, here are a few from my backlog that I’m ready to dispose of in this first attempt at a purge.

Skyfarm Fantasies

Futurists have offered myriad visions of technologies that have no hope of being implemented, from flying cars to 5-hour workweeks to space elevators. The newest pipe dream is the Urban Skyfarm, a roughly 30-story tree-like structure with 24 acres of space using solar panels and hydroponics to grow food close to the point of consumption. Utopian engineering such as this crops up frequently (pun intended) and may be fun to contemplate, but in the U.S. at least, we can’t even build high-speed rail, and that technology is already well established elsewhere. I suppose that’s why cities such as Seoul and Singapore, straining to make everything vertical for lack of horizontal space, are the logical test sites.

Leaving Nashville

The City of Nashville is using public funds to buy homeless people bus tickets to leave town and go be poor somewhere else. Media spin is that the city is “helping people in need,” but it’s obviously a NIMBY response to a social problem city officials and residents (not everyone, but enough) would rather not have to address more humanely. How long before cities begin competing with each other in the numbers of people they can ship off to other cities? Call it the circle of life when the homeless start gaming the programs, revisiting multiple cities in an endless circuit.


The Fantasy of Mastery

Over at Rough Type, Nick Carr points to an article in The Nation entitled “Instagram and the Fantasy of Mastery,” which argues that a variety of technologies now give “artists” the illusion of skill, merit, and vision by enabling work to be easily executed using prefab templates and stylistic filters. For instance, in pop music, the industry standard is to auto-tune everyone’s singing to hide imperfections. Carr’s summary probably is better than the article itself and shows us the logical endpoint of production art in various media undertaken without the difficult work necessary to develop true mastery.

Too Poor to Shop

The NY Post reported over the summer that many Americans are too poor to shop except for necessities. Here are the first two paragraphs:

Retailers have blamed the weather, slow job growth and millennials for their poor results this past year, but a new study claims that more than 20 percent of Americans are simply too poor to shop.

These 26 million Americans are juggling two to three jobs, earning just around $27,000 a year and supporting two to four children — and exist largely under the radar, according to America’s Research Group, which has been tracking consumer shopping trends since 1979.

Current population in the U.S. is around 325 million. Twenty percent of that number is 65 million; twenty-six million is 8 percent. Pretty basic math, but I guess the NY Post is not to be trusted to report even simple things accurately. Maybe it’s 20% of U.S. households. I dunno and can’t be bothered to check. Either way, that’s a pretty damning statistic considering the U.S. stock market continues to set new all-time highs — an economic recovery not shared with average Americans. Indeed, here are a few additional newsbits and links stolen ruthlessly from theeconomiccollapseblog:

  • The number of Americans that are living in concentrated areas of high poverty has doubled since the year 2000.
  • In 2007, about one out of every eight children in America was on food stamps. Today, that number is one out of every five.
  • 46 million Americans use food banks each year, and lines start forming at some U.S. food banks as early as 6:30 in the morning because people want to get something before the food supplies run out.
  • The number of homeless children in the U.S. has increased by 60 percent over the past six years.
  • According to Poverty USA, 1.6 million American children slept in a homeless shelter or some other form of emergency housing last year.

For further context, theeconomiccollapseblog also points to “The Secret Shame of Middle Class Americans” in The Atlantic, which reports, among other things, that fully 47% of Americans would struggle to scrape together a mere $400 in an emergency.
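As for the Post’s percentage problem above, the arithmetic is easy to verify. Here’s a quick back-of-the-envelope sketch in Python (the 325 million population figure is the same rough approximation used earlier, not an official count):

```python
# Rough check of the NY Post's numbers: 26 million people cannot be
# "more than 20 percent" of a population of roughly 325 million.
population = 325_000_000      # approximate U.S. population
reported_count = 26_000_000   # the Post's "too poor to shop" figure

twenty_percent = 0.20 * population          # what 20% would actually be
actual_share = reported_count / population  # the share 26 million represents

print(f"20% of the population would be {twenty_percent:,.0f} people")
print(f"{reported_count:,} people is {actual_share:.0%} of the population")
```

Run as written, this gives 65,000,000 for the first figure and 8% for the second — the mismatch noted above, unless the Post meant households rather than individuals.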

How do such folks respond to the national shopping frenzy kicking off in a few days with Black Friday, Small Business Saturday, Charitable Sunday, and Cyber Monday? I suggest everyone stay home.

See this exchange where Neil deGrasse Tyson chides Sam Harris for failing to speak to his audience in terms it understands:

The upshot is that lay audiences simply don’t subscribe to or possess the logical, rational, abstract style of discourse favored by Harris. Thus, Harris stands accused of talking past his audience — at least somewhat — especially if his audience is understood to be the general public rather than other well-educated professionals. Subject matter is less important than style but revolves around politics, and worse, identity politics. Everyone has abundant opinions about those, whether informed by rational analysis or merely fed by emotion and personal resonance.

The lesson deGrasse Tyson delivers is both instructive and accurate yet also demands that the level of discourse be lowered to a common denominator (like the reputed 9th-grade speech adopted by the evening news) that regrettably forestalls useful discussion. For his part (briefly, at the end), Harris takes the lesson and does not resort to academic elitism, which would be obvious and easy. Kudos to both, I guess, though I struggle (being somewhat an elitist); the style-over-substance argument really goes against the grain for me. Enhancements to style obviously work, and great communicators use them and are convincing as a result. (I distinctly recall Al Gore looking too much like a rock star in An Inconvenient Truth. Maybe it backfired. I tend to think that style could not overcome other blocks to substance on that particular issue.) Slick style also allows those with nefarious agendas to hoodwink the public into believing nonsense.


A couple of posts ago, I used the phrase “pay to play” in reference to our bought-and-paid-for system of political patronage. This is one of those open secrets we all recognize but gloss over because, frankly, in a capitalist economy, anything that can be monetized and corrupted will be. Those who are thus paid to play enjoy fairly handsome rewards for doing not very much, really. Yet the paradigm is self-reinforcing, much like the voting system, with promises of increased efficiency and effectiveness with greater levels of participation. Nothing of the sort has proven to be true; it’s simply a goad we continue to hear, some believing in the carrot quite earnestly, others holding their noses and ponying up their dollars and votes, and still others so demoralized and disgusted with the entire pointless constellation of lies and obfuscations that refusing to participate feels like the only honest response. (Periodic arguments levied my way that voting is quite important have failed to convince me that my vote matters a whit. Rather, it takes a bizarre sort of doublethink to conclude that casting my ballot is meaningful. Of late, I’ve succumbed to sustained harangues and shown up to vote, but my heart’s not in it.) I can’t distinguish so well anymore between true believers and mere manipulators except to observe that the former are more likely to be what few civic-minded voters remain and the latter are obviously candidates and their PR hacks. Journalists? Don’t get me started.

The phrase put me in mind of two other endeavors (beyond politics) where a few professionals enjoy being paid to play: sports and performing arts. Both enjoy heavy subscription among the masses early in life, as student sports and performing groups offer training and experience. The way most of us start out, in fact, we actually pay to play through classes, lessons, training, dues, and memberships that provide access to experts and put us in position to reap rewards later in life. Maybe you attended tennis camp or music camp as a kid, or you paid for a college education (defrayed perhaps by activity scholarships) majoring in athletics or theater. Lots of variations exist, and they’re not limited to youth. As an endurance athlete, I continue to pay entrance fees to race organizers for the opportunity to race on courses with support that would otherwise be unavailable without the budget provided by participants, sponsorship notwithstanding. Chicago’s popular 16-inch softball leagues are pay-to-play sports.

A second phase might be giving it away for free. As with paying to play, pure enjoyment of the endeavor works as a strong motivation and justification. This is probably more common in the community-level performing arts, where participation is just plain fun. And who knows? Exposure might lead to a big break or discovery. It’s also what motivates quite a lot of amateur athletes, especially for sports that have not gone mainstream. Olympic athletes (tertiary events) might fall roughly into this category, especially when their primary incomes are derived elsewhere.

A third phase is being paid to play. If the audience or fan base is big enough, the financial rewards and fame can be considerable. However, those who enter the professional ranks don’t always demonstrate such great prowess, especially early on. More than a few blow up and flame out quickly, unable to sustain the spark that launched their careers. There’s also being paid to play but earning well short of a livable wage, which borders on giving it away for free or at least for too little.

A final phase is being paid not to play. A mean interpretation of that would be that one is screwing up or blocking others’ opportunities to the point where it becomes worthwhile to pay someone to not show up or to go away. A more charitable interpretation would be that one’s employment contract includes time-off benefits that require continuous payments even when not playing.

As with my post about the differences between the Participation, Achievement, and Championship Models, I’m now content with numerous endeavors to be either pay to play, play for free, or play for too little. Participation makes it worthwhile under any payment regime, the alternative typically being sitting at home on my couch wasting my time in front of the TV. I never made it to the enviable position of being paid to play or paid not to play. Still, as an individual of some attainment and multiple areas of expertise, I admit finding it irksome to observe some truly awful people out there pulling in attention and wealth despite rather feeble efforts or abilities. The meritocracy may not be dead, but it often looks comatose.

Today is the 10-year anniversary of the opening of this blog. As a result, there is a pretty sizeable backblog should anyone decide to wade in. As mentioned in my first post, I only opened this blog to get posting privileges at a group blog I admired because it functioned more like a discussion than a broadcast. The group blog died of attrition years ago, yet here I am 10 years later still writing my personal blog (which isn’t really about me).

Social media lives and dies by the numbers, and mine are deplorable. Annual traffic has ranged from about 6,800 to about 12,500 hits, much of which I’m convinced is mere background noise and bot traffic. Cumulative hits number about 90,140, and unique visitors are about 19,350, neither of which is anything to crow about for a blog of this duration. My subscriber count continues to climb pointlessly, now resting at 745. However, I judge I might have only a half dozen regular readers and perhaps half again as many commentators. I’ve never earned a cent for my effort, nor am I likely to ever put up a Patreon link or similar goad for donations. All of which only demonstrates that almost no one cares what I have to write about. C’est la vie. I don’t write for that purpose and frankly wouldn’t know what to write about if I were trying to drive numbers.

So if you have read my blog, what are some of the things you might have gathered from me? Here’s an incomplete synopsis:

  • Torture is unspeakably bad. History is full of devices, methodologies, and torturers, but we learned sometime in the course of two 20th-century world wars that nothing justifies it. Nevertheless, it continues to occur with surprising relish, and those who still torture (or want to) are criminally insane.
  • Skyscrapers are awesomely tall examples of technical brilliance, exuberance, audacity, and hubris. Better expressions of techno-utopian, look-mom-no-hands, self-defeating narcissism can scarcely be found. Yet they continue to be built at a feverish pace. The 2008 financial collapse stalled and/or doomed a few projects, but we’re back to game on.
  • Classical music, despite record budgets for performing ensembles, has lost its bid for anything resembling cultural and artistic relevance by turning itself into a museum (performing primarily works of long-dead composers) and abandoning emotional expression in favor of technical perfection, which is probably an accurate embodiment of the spirit of the times. There is arguably not a single living composer who has become a household name since Aaron Copland, who died in 1990 but was really well-known in the 1940s and 50s.
  • We’re doomed — not in any routine sense of the word having to do with individual mortality but in the sense of Near-Term (Human) Extinction (NTE). The idea is not widely accepted in the least, and the arguments are too lengthy to repeat (and unlikely to convince). However, for those few able to decipher it, the writing is on the wall.
  • American culture is a constantly moving target, difficult to define and describe, but its principal features are only getting uglier as time wears on. Resurgent racism, nationalism, and misogyny make clear that while some strides have been made, these attitudes were only driven underground for a while. Similarly, colonialism never really died but morphed into a new version (globalization) that escapes criticism from the masses, because, well, goodies.
  • Human consciousness — another moving target — is cratering (again) after 3,000–5,000 years. We have become hollow men, play actors, projecting false consciousness without core identity or meaning. This cannot be sensed or assessed easily from the first-person perspective.
  • Electronic media makes us tools. The gleaming attractions of sterile perfection and pseudo-sociability have hoodwinked most of the public into relinquishing privacy and intellectual autonomy in exchange for the equivalent of Huxley’s soma. This also cannot be sensed or assessed easily from the first-person perspective.
  • Electoral politics is a game played by the oligarchy for chumps. Although the end results are not always foreseeable (Jeb!), the narrow range of options voters are given (lesser of evils, the devil you know …) guarantees that fundamental change in our dysfunctional style of government will not occur without first burning the house down. After a long period of abstention, I voted in the last few elections, but my heart isn’t really in it.
  • Cinema’s infatuation with superheroes and bankable franchises (large overlap there) signals that, like other institutions mentioned above, it has grown aged and sclerotic. Despite large budgets and impressive receipts (the former often over $100 million and the latter now in the billions for blockbusters) and considerable technical prowess, cinema has lost its ability to be anything more than popcorn entertainment for adolescent fanboys (of all ages).

This is admittedly a pretty sour list. Positive, worthwhile manifestations of the human experience are still out there, but they tend to be private, modest, and infrequent. I still enjoy a successful meal cooked in my own kitchen. I still train for and race in triathlons. I still perform music. I still make new friends. But each of these examples is also marred by corruptions that penetrate everything we do. Perhaps it’s always been so, and as I, too, age, I become increasingly aware of inescapable distortions that can no longer be overcome with innocence, ambition, energy, and doublethink. My plan is to continue writing the blog until it feels like a burden, at which point I’ll stop. But for now, there’s too much to think and write about, albeit at my own leisurely pace.

A long while back (8 years ago), I drew attention to a curious bit of rhyming taking place in the world of architecture: the construction of skyscrapers that twist from base to top (see also here). I even suggested that one per city was needed, which seems to be slowly manifesting. Back then, the newest installment was the Infinity Tower, now fully built and known as the Cayan Tower. The planned Chicago Spire, apparently doomed, has yet to get off the ground. Another incarnation of the basic twisting design is the Evolution Tower in Moscow, completed in 2014 (though I only just learned about it):


There are plenty more pics at the Skyscraper page devoted to this building.

News of this development comes to me by way of James Howard Kunstler’s Eyesore of the Month feature at his website. I draw attention to Kunstler because he is far better qualified to evaluate and judge architecture than am I, even though most of his remarks are disparaging. Kunstler and I share both aesthetic and doomer perspectives on stunt architecture, and the twisting design seems to be one faddish way to avoid the boxy, straight-line approach to supertall buildings that dominated for some fifty years. Indeed, many buildings of smaller stature now seek that same avoidance, which used to be accomplished via ornamentation but is now structural. Such designs and construction are enabled by computers, though it remains to be seen how long maintenance and repair can be sustained in an era of diminishing financial resources. (Material resources are a different but related matter, but these days, almost no one bothers with anything without financial incentive or reward.)

When the last financial collapse occurred in 2008 (extending into 2009, with recovery since then mostly faked), lots of projects were mothballed. I note, however, that Chicago has many new projects underway, and I can only surmise that other skylines are similarly full of cranes signaling the return of multibillion-dollar construction projects aimed at the well-off. Mention of crumbling infrastructure has been ongoing for decades now. Here’s one recent example. Yet attention and funding seem to flow in the direction of projects that really do not need doing. While it might be true that the discrepancy here lies with public vs. private funding, it appears to me another case of mismanaging our own affairs by focusing too much on marquee projects while allowing dated and perhaps less attractive existing structures to decay and crumble.

My work commute typically includes bus, train, and walking legs to arrive at my destination. If wakefulness and an available seat allow, I often read on the bus and train. (This is getting to be exceptional compared to other commuters, who are more typically lost in their phones listening to music, watching video, checking FB, or playing games. Some are undoubtedly reading, like me, but electronic media, which I find distasteful, alter the experience fundamentally from ink on paper.) Today, I was so absorbed in my reading that by the time I looked up, I missed my bus stop, and half an hour later, I nearly missed my train stop, too. The experience of tunnel vision in deep concentration is not at all unfamiliar to me, but it is fleeting and unpredictable. More typical is a relaxed yet alert concentration that for me takes almost no effort anymore.

So what sent me ’round the bend? The book I’m currently reading, Nick Carr’s The Glass Cage, takes a diversion into the work of poet Robert Frost. Carr uses Frost to illustrate his point about immersion in bodily work with manageable difficulty lending the world a more robust character than the detached, frictionless world experienced with too much technological mediation and ease. Carr does a terrific job contextualizing Frost’s lyric observations in a way quite unlike the contextual analysis one might undertake in a high school or college classroom, which too often makes the objects of study lifeless and irrelevant. Carr’s discussion put me unexpectedly into an aesthetic mode of contemplation, as distinguished from analytic or kinesthetic modes. There are probably others.

I don’t often go into aesthetic mode. It requires the right sort of stimulation. The Carr/Frost combination put me there, and so I tunneled into the book and forgot my commute. That momentary disorientation is often pleasurable, but for me, it can also be distressing. My infrequent visits to art museums are often accompanied by a vague unease at the sometimes nauseating emotionalism of the works on display. It’s an honest response, though I expect most folks can’t quite understand why something beautiful would provoke something resembling a negative response. In contrast, my experience in the concert hall is usually frustration, as musicians have become ever more corporate and professional in their performance over time to the detriment and exclusion of latent emotional content. I suppose that as Super Bowl Sunday is almost upon us (about which I care not at all), the typical viewer gets an emotional/aesthetic charge out of that overhyped event, especially if the game is hotly contested rather than a blowout. I seek and find my moments in less crass expressions of the human spirit.

rant on/

This is the time of year when media pundits pause to look back and consider the previous year, typically compiling unasked-for “best of” lists to recap what everyone may have experienced — at least if one is absorbed by entertainment media. My interest in such nonsense is passive at best, dismissive at worst. Further, more and more lists are weighed and compiled by self-appointed and guileless fanboys and -girls, some of whom are surprisingly knowledgeable (sign of a misspent youth?) and insightful yet almost uniformly lack a sufficiently longitudinal view necessary to form circumspect and expert opinions. The analogy would be to seek wisdom from a 20- or 30-something in advance of its acquisition. Sure, people can be “wise beyond their years,” which usually means free of the normal illusions of youth without yet having become a jaded, cynical curmudgeon — post-ironic hipster is still available — but a real, valuable, historical perspective takes more than just 2-3 decades to form.

For instance, whenever I bring up media theory to a youngster (from my point of reckoning), usually someone who has scarcely known the world without 24/7/365 access to all things electronic, he or she simply cannot conceive what it means to be without that tether/pacifier/security blanket smothering them. It doesn’t feel like smothering because no other information environment has ever been experienced (excepting perhaps in early childhood, but even that’s not guaranteed). Even a brief hiatus from the information blitzkrieg, a two-week vacation, say, doesn’t suffice. Rather, only someone olde enough to remember when it simply wasn’t there — at least in the personal, isolating, handheld sense — can know what it was like. I certainly remember when thought was free to wander, invent, and synthesize without pressure to incorporate a continuous stream of incoming electronic stimuli, most of which amounts to ephemera and marketing. I also remember when people weren’t constantly walled in by their screens and feeds, when life experience was more social, shared, and real rather than private, personal, and virtual. And so, when I’m away from the radio, TV, computer, etc. (because I purposely and pointedly carry none of it with me), I’m less of a mark for the lures, lies, cons, and swindles that have become commonplace in late-stage capitalism than the typical media-saturated fool face-planted in a phone or tablet.

Looking back in another sense, I can’t help but to feel a little exasperated by the splendid reviews of the life in music led by Pierre Boulez, who died this past week. Never heard of him? Well, that just goes to show how far classical music has fallen from favor that even a titan such as he makes utterly no impression on the general public, only specialists in a field that garners almost no attention anymore. Yet I defy anyone not to know who Kim Kardashian is. Here’s the bigger problem: despite being about as favorably disposed toward classical music as it is possible to be, I have to admit that no one I know (including quite a few musicians) would be able to hum or whistle or sing a recognizable tune by Boulez. He simply doesn’t pass the whistle test. But John Williams (of Star Wars fame) certainly does. Nor indeed would anyone put on a recording of one of Boulez’s works to listen to. Not even his work as a conductor is all that compelling, either live or on disc (I’ve experienced plenty of both). As one looks back on the life of Pierre Boulez, as one is wont to do upon his passing, how can it be that such prodigious talent as he possessed could be of so little relevance?

Consider these two examples flip sides of the same coin. One enjoys widespread subscription but is base (opinions differ); the other is obscure but (arguably) refined. Put differently, one is pedestrian, the other admirable. Or over a lifetime, one is placebo (or worse), the other fulfilling. Looking back upon my own efforts and experiences in life, I would much rather be overlooked or forgotten than be petty and (in)famous. Yet mass media conspires to make us all into nodes on a network with goals decidedly other than human respectability or fulfillment. So let me repeat the challenge question of this blog: are you climbing or descending?

rant off/

The video below, which came to my attention recently, shows a respectable celebrity, violinist/conductor Itzhak Perlman, being dicked around in an interview he probably undertook in good faith. My commentary follows.

Publicized pranks and gotchas are by no means rare. Some are good-natured and quite funny, but one convention of the prank is to unmask it pretty quickly. In the aftermath, the target typically either laughs it off, leaves without comment, or less often, storms out in disgust. Andy Kaufman as “Tony Clifton” was probably among the first to sustain a prank well past the point of discomfort, never unmasking himself. Others have since gotten in on the antics, though the results probably amount to no worse dickishness (dickery?) than Kaufman’s.

Fake interviews by comedians posing as news people are familiar to viewers of The Daily Show and its spinoff The Colbert Report (its run now completed). Zach Galifianakis does the same schtick in Between Two Ferns. It always surprises me when targets fall into the trap, exposing themselves as clueless ideologues willing to be hoisted with their own petards. However, Colbert in particular balanced his arch Republican stage persona with an unmistakable respect for his interview subjects, which was at times inspired. Correspondents from The Daily Show are frequently pretty funny, but they almost never convey any respect for the subjects of their interviews. Nick Canellakis (shown above) apparently has a whole series of interviews with classical musicians where he feigns idiocy and insult. Whereas some interview subjects are media-savvy enough to get the joke and play along, I find this attempt at humor tasteless and unbearable.

Further afield, New Media Rockstars features a burgeoning list of media hosts who typically operate cheaply over the Web via YouTube, supported by an array of social media. At least one, Screen Junkies (the only one I watch), has recently blown up into an entire suite of shows. I won’t accuse them all of being talentless hacks or dicking people around for pointless yuks, but I often pause to wonder what makes the shows worth producing beyond the hosts’ embarrassingly encyclopedic knowledge of comics, cartoons, TV shows, movies, etc. They’re fanboys (and girls) who have leveraged their misspent youth and eternal adolescence to gush and gripe about their passions. Admittedly, this may not be so different from sports fanatics (especially human statisticians), opera geeks, and nerds of other stripes.

Throwaway media may have unintentionally smuggled in tasteless shenanigans such as those by Nick Canellakis. Various comedians (unnamed) have similarly offered humorless discomfort as entertainment. Reality TV shows explored this area a while back, which I called trainwreck television. Cheaply produced video served over the Web has unleashed a barrage of dreck in all these categories. Some shows may eventually find their footing and become worthwhile. In the meantime, I anticipate seeing plenty more self-anointed media hosts dicking around celebrities and audiences alike.

Continuing from part 1, the Ironic is characterized by (among other things) reversal of meaning, sometimes understood as the unexpected made manifest but more commonly as sarcasm. The old joke goes that in pompous, authoritarian fashion, the language/semiotics professor says to his class of neophytes, “In many languages, a double negative equals a positive, but in no language does a double positive make a negative.” In response, a student mutters under his breath, “yeah, right ….” Up to a certain age and level of cognitive development, children don’t process sarcasm; they are literal-minded and don’t understand subtext. Transcripts and text (e.g., blog posts and comments) also typically fail to transmit the nonverbal cues that signal one may be less than earnest in making certain statements. Significantly, no one is allowed to make offhand jokes in line at security checkpoints because, in that context, remarks such as “yeah, like my shoes are full of C4” are treated quite literally.

I have a vague memory of the period in my adolescence when I discovered sarcasm, at which time it was deployed almost continuously, saying the opposite of what I meant with the expectation that others (older than me) would understand the implied or latent meaning. I also adopted the same mock abuse being used elsewhere, which regrettably lasted into my late 20s. Maybe it’s a phase everyone must go through, part of growing up, and as a society, our cultural development must also pass through that phase, though I contend we remain mired in irony or ironic posturing.

The model for me was insult comedy, still in style now but more familiar from my childhood. Like most during this developmental phase, I accepted the TV as social tutor for how people communicate and what’s acceptable to say. So who can blame me or other children, fed a diet of snark and attitude (adult writers of TV shows being a lot more clever than the adolescent actors who voice the lines), for speaking the same way? But to appreciate irony more directly, consider the comedian (then and now) who levies criticism using clichés drawn from his or her own gender, race, religion, social class, etc. In comedy, sexism, racism, and class conflict are not just joke fodder but stereotyped bigotry that reinforces the very scourges it ostensibly criticizes. Oh, sure, the jokes are often funny. We all know to laugh at the black comedian who trades nonstop in racial jokes or the female comedian who complains of being nothing more than an object for male titillation. Comedians (and special interest groups — minority or not — that lay claim to victimhood) may co-opt the language of their oppressors (some actual, some imagined — see for instance those complaining about the War on Christmas), but the language and attitudes are broken down and reinforced at the same time.

This isn’t solely the domain of comedy, either. Whereas TV sitcoms are ruled by hip, ironic posturing — the show about nothing that plumbs the surprising depths inside everything trivial, banal, and inane, the show full of nerd archetypes who rise above their inherent nerdiness to be real people worthy of respect (or not surprisingly, not so worthy after all), or the endless parade of sitcom families with unrealistically precocious, smart-aleck kids who take aim at everyone with a continuous stream of baleful insults, take-downs, and mockery but are, despite truly cretinous behavior, always forgiven (or passed over because another joke is imminent) and still lovable — in the virtual world (the Internet, where you are reading this), sarcasm, snark, irony, abuse, and corrosive jokiness are legion. Take, for instance, this video and tell me there isn’t something deeply wrong with it:

One might wonder whether the intent is interdiction or recruitment (or both at once), especially if one acknowledges that most of the awful things depicted in the video are precisely what the U.S. military has been doing in the Middle East for well over a decade. The Fox News blurb linked below the video says, “The State Department is launching a tough and graphic propaganda counteroffensive against the Islamic State, using some of the group’s own images of barbaric acts against fellow Muslims to undercut its message.” Maybe the word propaganda is a mistake and publicity was intended, but I suspect that propaganda is the right word precisely because it’s understood as both pejorative and superlative. As with everything else, meaning has become polysemous.

Iain McGilchrist illustrates this with special emphasis on the arts and how substitution of symbolic tokens normalizes distortion. For instance, art theory of the Aesthetes contains a fundamental paradox:

The Aesthetes’ creed of ‘art for art’s sake’, while it sounds like an elevation of the value of art, in that it denies that it should have an ulterior purpose beyond itself — so far so good — is also a devaluation of art, in that it marginalizes its relationship with life. In other words it sacrifices the betweenness of art with life, instead allowing art to become self-reflexively fulfilled. There is a difference between the forlorn business of creating ‘art for art’s sake’, and art nonetheless being judged solely ‘as art’, not as for another purpose. [p. 409]

Isolating artistic creation in a mental or virtual transactional space ought to be quite familiar (or perhaps more accurately, assumed and thus invisible) to 21st-century people, but it was a new concept at the outset of the 20th century. The paradox is that the doctrine is both a reversal of meaning and a retention of opposites together. Over the course of the 20th century, we became habituated to such thinking, namely, that a thing automatically engenders its opposite and is both things at once. For instance, what used to be called the War on Poverty, meant to help those suffering deprivation, is now also its reverse: literally a war on the poverty-stricken. Similarly, the War on Drugs, meant to eradicate drug use as a social ill, is also quite literally a war against drug users, who are a disproportionately large part of the bloated U.S. prison population. Reduction of government services to the poor and rampant victim-blaming demonstrate that programs once meant to assist those in need now often instead leave them to fend for themselves, or worse, pile on with criminal charges. Disinformation campaigns about welfare cheats and the minimum wage are further examples of information being distorted and used to serve an unwholesome agenda.

My conclusion is not yet ready to be drawn; it’s far too subtle to fit in a Tweet or even a series of blog posts. However, consider what it means when the language we use is laden with ironic twists that force recipients of any message to hold simultaneously forward/backward, up/down, left/right, and true/false meanings. Little can be established beyond reasonable doubt, and not just because so many of us have been poorly served by educational institutions (or is it the students themselves — sort of a chicken-and-egg question) more interested in business and credentialing than in teaching and learning, leaving few with the ability to assess and evaluate information (ironically, from a variety of perspectives) being spun and spoon-fed to us by omnimedia. It’s also because the essential underlying structure of language and communications has been corrupted by disembedding, decontextualization, and deconstruction that relegate reality to something to be dreamt up and then used to convince others. In the end, of course, we’re only fooling ourselves.