Something caught my eye this week as I was surfing around, this time from a mostly abandoned classical music criticism blog I used to read (with some frustration). I reproduce in full a post called “Top Ten Music School Rankings” because it’s content-lite (perhaps not original to the blog):

10. The school where you did your undergrad.
9. The school where you got your Master’s, and to which you are indebted for the gigs it helped you get to pay off the student loans for the school where you did your undergrad.
8. The place where you wrote your DMA dissertation on your teacher’s teacher’s teacher’s pedagogical methods (or lack thereof).
7. Juellerd. Julleard? Julliard. Jewelyard? Whatever.
6. Harvard.
5. The place you wanted to go for undergrad, but you fracked one single note in one single excerpt and then you panicked and broke down and called the trumpet professor “Dad” and then Dave got in even though he couldn’t play Petrushka in time and he’s always been kind of a dick about it and now he’s subbing like every weekend in the fucking BSO.
4. Royal Something of Great British Academy I think? I hear they never let Americans in. Or maybe that’s the other one?
3. The school that everybody knows isn’t as good as the school where you did your undergrad, but is “up-and-coming.” Featuring a lauded entrepreneurship initiative that trains barista skills at one of the three coffee shops housed in its new state-of-the-art building, named for an alumnus of the university’s business school currently facing indictment for fraud.
2. University of Phoenix.
1. The school that has paid to have this list promoted on Facebook.

It’s funny (I guess) in ways that register mostly with music school grads, whose experiences and concerns over musical minutiae diverge from those of the mass of college graduates who majored in business, English, or any number of professional studies (premed, prelaw, journalism) that lead more consistently to employment in those professions. (Music school, BTW, is an unacknowledged type of trade school.) But the jokes are also somewhat ghoulish in ways that are becoming increasingly familiar to everyone seeking employment after completion of the formal phase of education. Mentions of Univ. of Phoenix and Facebook ought to be struck from this particular list except that they’re too emblematic of the systemic fraud that now passes for higher education. So it was curious to read, after the hooting and self-recognition in the comments section, a veritable cry in the wilderness:

I graduated from Oberlin, Michigan and Wisconsin and am currently a custodian in an apartment complex. I even won the concerto competition at 2 of the 3 schools and am in debt up to my eyeballs. I wish music schools would emphasize alternatives in the field of music, offer apprenticeships and internships and even require students to double major or double on a secondary “gig” instrument, so they could do well in the field.

Despite robust demand for education in performance fields (e.g., music, dance, acting) and other fine arts, there have never been waiting jobs anywhere close to the number of (presumably skilled) graduates churned out by schools. Obviously, one can invert the supply-demand nomenclature: an oversupply of skilled performance labor vs. minimal market demand for those skills. The offering of such degrees by every second- and third-tier school is undoubtedly a money-making enterprise but is nonetheless tantamount to intellectual dishonesty of a type distinct from what I blogged about here. Faculty and administrators are certainly hip to the fact that they’re often selling a bill of goods. After all, they’re paid for that expertise. This is why some parents (and some professors, too) do everything in their power to discourage students from pursuing performance studies, but to little avail, as enrollments and selectivity continue to rise even if skill levels and accomplishment don’t.

As the “debt up to my eyeballs” comment above exemplifies, the cost of higher education has mounted far faster than inflation, and crushing student debt (unlikely to ever be repaid) now accompanies attendance at most top-tier schools except perhaps for trust-fund students. And even those top-tier schools find it difficult to deliver graduates into waiting jobs. It’s not that no one gets employed, mind you; it’s just that majoring in performance studies of one sort or another is akin to (essentially) majoring in football or basketball with dreams of joining the NFL or NBA after school. The numbers don’t bode well for graduates without extraordinary talent and/or connections, and unlike sports franchises, the arts don’t operate as pure meritocracies. Scoring ability is far more measurable than artistic expression, making it worthwhile to tolerate the misbehavior of thugs and cretins with speed, power, and attitude. I’m still waiting for the meme to establish itself that perhaps the considerable risk of taking on tens of thousands of dollars in debt to attend music school is not worth the reward. This would clearly be a case of “do as I say, not as I do,” as careful readers of this blog must surmise by now that I, too, went to music school, though some while back, before tuition hikes would have put it out of reach for me.

I don’t normally concern myself overly much with B movies. I may watch one while munching my popcorn, but they hardly warrant consideration beyond the time lost spent plopped in front of the screen. My first thought about World War Z is that there hasn’t been another case of a special effect in search of a story since, well, any of the films from the Transformers franchise (new one due out in a couple weeks). WWZ is a zombie film — the kind with fast zombies (running, jumping, and busting their heads through glass instead of just lumbering around) who transform from the living into the undead in under 20 seconds. None of this works without the genre being well established for viewers. Yet World War Z doesn’t hew to the implicit understanding that it should essentially be a snuff film, concocting all manner of never-before-seen gore from dispatching them-no-longer-us. Instead, its main visual play is distant CGI crowd scenes (from helicopters — how exciting!) of self-building Jenga piles of zombies.

Two intertwined stories run behind the ostensible zombie dreck: (1) an investigation into the origin of the viral outbreak that made the zombies, leading to a pseudo-resolution (not quite a happy ending) Hollywood writers apparently find obligatory, and (2) reuniting the investigator with his family, from whom he has been separated because he’s the kind of reluctant hero with such special, unique skills that he’s extorted into service by his former employer. Why an A-list actor such as Brad Pitt agrees to associate himself with such moronic fare is beyond me. The character could have been played by any number of action stars aging past their ass-kicking usefulness as we watch: Bruce Willis, John Travolta, Nicolas Cage, Pierce Brosnan, Mel Gibson, Liam Neeson, Wesley Snipes, Keanu Reeves (who can at least project problem-solving acumen), and Sylvester Stallone, just to name a few. This list could actually go on quite a bit further.

This is the kind of film for which the term suspension of disbelief was coined. The hero’s implausibly fortunate survival through a variety of threats is assured, tying the story together from front to back, which is a cliché that drains dramatic tension from the story even as everyone around him perishes. I was curious to read P.Z. Myers’ rant discussing the awful science of World War Z, which also observes plot holes and strategic WTFs. The bad science doesn’t stick in my craw quite like it does for Myers, but then, my science background is pretty modest. Like so many fight scenes in action movies where the hero is never really injured, I just sorta go with it.

What really interests me about WWZ, however, is that it presents yet another scenario (rather uninspired, actually) of what might happen when society breaks apart. The film features a fast crash where everything goes utterly haywire within hours — yet the electrical grid stays up — so the first harrowing scene is the family fleeing, first in a car and then a commandeered mobile home, before seeking temporary refuge in a tenement. The main character states early on that people on the move survive and people who hunker down are lost. That may be true in a theater of war, but I can’t judge whether it’s also true in a virulent contagion scenario. In any case, the investigator alternates between movement and refuge as his situation changes.

Because the zombie horde is a functionally external threat, survivors (however temporary) automatically unite and cooperate. This behavior is borne out in various real-world responses to fast-developing events. However, slow-mo threats without the convenient external enemy, such as we’re now experiencing in the real world with protracted industrial collapse, provide a different picture: dog eating dog and fighting to survive another day. Such alternatives cause many who foresee extraordinary difficulties in the decades ahead to wish for events to break over civilization like a tsunami, taking many all at once and uniting those unlucky enough to survive. But until that happens, we’re faced with slow death by a thousand cuts.

Fools Rush In

Posted: July 1, 2014 in Culture, Economics, Education, Literacy

Several highly publicized inventories of the OECD Skills Outlook 2013 hit the media last fall and then promptly fell off the radar. They stayed on my radar, waiting for the propitious time to sort my thinking and develop a blog post. (I’m always late to the party.) The full report is 466 pp., including blank pages, extensive front- and back-matter, and a writing style that positively discourages reading except to pluck quotes or statistics, as I do below. Such reports (e.g., the Intergovernmental Panel on Climate Change (IPCC) report also released in Fall 2013, which I also blogged about here) take considerable effort to compile, but they always leave me wondering whether any of them are actionable or worth going to such lengths to assess, compile, and report. Even the executive summaries expend more effort saying what the reports are than offering a cogent conclusion and/or recommendation. This style may well be a requirement of advanced bureaucracy.

Skills assessed by the OECD Skills Outlook are described here:

The Survey of Adult Skills, a product of the OECD Programme for the International Assessment of Adult Competencies (PIAAC), was designed to provide insights into the availability of some of these key skills in society and how they are used at work and at home. The first survey of its kind, it directly measures proficiency in several information-processing skills — namely literacy, numeracy and problem solving in technology-rich environments.


Any given species has its unique behaviors and preferred habitat, inevitably overlapping with others that are predator or prey. The human species has spread geographically to make nearly the entire world its habitat and every species its prey (sometimes unintentionally). But it’s a Pyrrhic success, because for the ecosystem to work as our habitat as well as theirs, diversity and abundance are needed. As our numbers have expanded to over 7 billion, nonhuman populations have often declined precipitously (when we don’t farm them for food). When we humans are not otherwise busy hunting, harvesting, and exterminating, we harass them and claim their habitats as uniquely our own. Our unwillingness to share space and/or tolerate their presence except on our own terms is audacious, to say the least.


I am, as usual, late getting to the latest controversy in academe, which has been argued to death before I even became aware of it. Inside Higher Ed appears to have gotten there first, followed by editorials at The New York Times, The Los Angeles Times, and The Washington Post. At issue are trigger warnings, a neologism for what might otherwise be called parental advisories (thinking in loco parentis here), to be placed in syllabi and on classroom materials, at first on fiction readings but potentially on history lessons (and frankly, just about anything else), that might trigger a panic attack or some other dolorous response from a student with a phobia or memory of a traumatic experience. The opinion articles linked above (Inside Higher Ed is more purely reporting) are all in agreement that trigger warnings are a bad idea.

Although articles in news organs are more nearly broadcasting and thus lack discussion (unless one ventures into the swamp of the comments section, which I rarely do), I indulged in a long discussion of the subject with fellow alumni of one of the institutions named in the reports. As with other issues, it developed so many facets that a snapshot determination became impossible if one attempted to accommodate or address all perspectives. Therein lies the problem: accommodation. Left-leaning liberals are especially prone to hypersensitivity to identity politics, which gained prominence in the late 1970s or early 80s. I quickly run afoul of anyone who takes such a perspective because I am notoriously white, male, well-educated, and middle class, so I must constantly “check my privilege.” When someone like me refuses others accommodation, it looks like pulling up the ladder behind me after I’ve safely ascended. I can appreciate, I think, how frustrating it must be to have one’s earnestness thwarted, and yet I admit I just don’t get it. At the risk of offending (trigger warning here), let me blunder ahead anyway.

The world (or as I’m beginning to call it more simply, reality) is a messy place, and each of us inevitably carries some phobia, trauma, or history that is unsavory. From one celebrated perspective, what doesn’t kill us makes us stronger; from another, we are trained to request accommodation. Accommodation used to be primarily for physical disabilities; now it’s for feelings, which some argue are just as debilitating. This is the province of every self-selected minority and special interest group, which has spawned predictable backlashes among various majority groups (e.g., the men’s movement, resurgent white supremacy). Naturally, any lobby, whether part of a minority or majority, will seek to promote its agenda, but I regard the brouhaha over trigger warnings as an example of the growing incidence of what’s been called the Strawberry Generation. It’s remarkable that students now regard themselves as dainty flowers in need of special protection lest they be trampled by, well, reality. So trigger warnings are being requested by students, not on their behalf. With so many examples throughout even recent history of flagrant social injustice and oppression, it’s clear that everyone wants to proclaim their special combination of disadvantages and receive accommodation, all the better if multiplied by inclusion in several protected classes. It’s a claim of victimhood before the fact or perhaps permanent victimhood if one is a survivor of some nastiness. (Disclaimer: real oppression and victimhood do exist, which I don’t intend to minimize, but they’re not caused by reading fiction or learning history, scandalous as they may be.)

In addition, what exactly is accomplished by virtue of warnings that one is about to encounter — what should it be called — messy material? Does one steel oneself against impact and thus limit its educational value, or does one expect to be excused from facing reality and receive an alternative assignment minus the offending material? Both are the antithesis of higher education. Arguments in the abstract are easy to ignore, so here are two specific examples: substitution or elimination of the words nigger and injun in modernized editions of Mark Twain’s Adventures of Huckleberry Finn and biology textbooks that give consideration to (literally) unscientific accounts of creation and evolution. If one’s racial or religious background gives rise to excess discomfort over the use of one or another egregious trigger word (nigger in particular now having been reclaimed and repurposed for all sorts of uses but with caveats) borne out of ripe historical context or what science (as opposed to religion) teaches, well, that’s the world (reality) we live in. Sanitizing education to avoid discomfort (or worse) does no one any favors. Discomfort and earnest questioning are inevitable if one is to learn anything worthwhile in the course of getting an education.

In his defense of the canon of English literature published in Harper’s (March 2014), Arthur Krystal wrote that traditionalists argue “its contents were an expression of the human condition: the joy of love, the pain of duty, the horror of war, and the recognition of the self and soul.” I would add to these the exhilaration of understanding, the transcendence of beauty, the bitterness of injustice, and the foreknowledge of death. Ranking or ordering elements of the human condition is a fool’s errand, but I contend that none possesses the power to transfix and motivate as much as knowing that one day, each of us must die.

Thus, we develop narratives of a supposed afterlife, in effect to achieve immortality. The most typical is religious dogma regarding eternity spent in heaven, hell, purgatory, or limbo. Another way to cheat death, or more simply, to be remembered, is passing one’s genes to another generation through procreation and achieving a small measure of proxy immortality. Other examples include acquiring fame and wealth to make a mark on history, such as having one’s name on buildings (like the $100 million presidential library and museum being discussed for siting in Chicago’s Hyde Park neighborhood honoring Barack Obama), or having one’s name inscribed in one of many sports record books, or being preserved on celluloid (which is now increasingly archaic, since everything is going digital). For the creative arts, the earliest works of literature to have achieved immortality, meaning that they are still widely known, read, and performed today, are the plays of William Shakespeare. For musicians, it would probably be the music of J.S. Bach. I discount the works of the ancient Greeks or those of the Middle Ages throughout the rest of Europe because, despite passing familiarity with their names, their legacies lie buried deep below the surface and are penetrated only by scholars.

And therein lies the rub: for posterity to supply meaning to an earthly afterlife by proxy, culture must retain historical continuity or at least some living memory. Yet wide swathes of history have been rendered both mute and moot, as Shelley makes clear in his sonnet Ozymandias, with its memorable injunction, “Look on my Works, ye Mighty, and despair!” Who among us can claim to know much if anything about ancient Egyptian or Chinese dynasties, or indeed any of the other major civilizations now collapsed? Our own civilization, grown like a behemoth to the size of the globe, now faces its own collapse for a host of reasons. Even worse, civilizational collapse, ecological collapse, and depopulation present the very real possibility of near-term human extinction (NTE). All the assiduous work to assure one’s place in history won’t amount to much if history leaves us behind.


Peter Van Buren has a new book out and is flogging it at TomDispatch. He’s a good enough writer, so I have no objection to the promotional aspect of disseminating his own work. But as I read his article describing an America gone to seed, I realized that for all his writerly skill, he misses the point. As a former State Dept. administrator (charged with assisting Iraqi reconstruction) turned whistle-blower, Van Buren is clearly outside the mainstream media and somewhat outside mainstream opinion, yet he appears to be well within the dominant paradigm. His new spin on regime change takes as implicit all the teachings of economics and politics as systems ideally suited to engineering an equitable social contract where everyone benefits. But as cycles of history have shown, those systems are even more prone to manipulation by a power elite who care little about the people they pretend to serve. Whether that carelessness is learned or ingrained in the kleptocracy, er, plutocracy is open to debate.

Van Buren’s article offers a few interesting tidbits, including a couple neologisms (I’m always on the lookout for new coin):

dirt shadow = the faint but legible image left behind by an uninstalled sign on the exterior of a closed storefront or building

street gravy = the dirt and grunge that collects over time on a homeless person

Neither is too picturesque. The second is obviously a (sad because it’s too hip) euphemism, since gravy suggests richness whereas the actuality is downright unpleasant. As Van Buren surveys, similar unpleasantness is currently experienced all across America in towns and niche economies that have imploded. Interestingly, his counterexample is a U.S. Marine Corps base, Camp Lejeune in North Carolina, which functions as a gated community with the added irony that it is supported by public funds. Van Buren also notes that, according to the Congressional Budget Office, an average active-duty service member receives a benefits and pay compensation package estimated to be worth $99,000, some 60 percent of it in noncash compensation.

As to the cause of our regime’s disarray, however, Van Buren busies himself with standard economic and political (one might even say military-industrial) explanations, demonstrating an inability to frame the decline of empire as the beginning of an epochal shift away from plentiful energy resources, famously termed The Long Emergency by James Howard Kunstler. (We ought to resurrect that phrase.) Other frames of reference are certainly not without their impacts, but the inability to connect all the dots to see the underlying cause is commonplace in the mainstream.

In contrast, consider this passage from Harvesting the Biosphere by Vaclav Smil:

There are two simple explanations why food production in traditional agricultural societies — despite its relatively high need for claiming new arable land — had a limited impact on natural ecosystems: very low population growth rates and very slow improvements in prevailing diets. Population growth rates averaged no more than 0.05% during the antiquity and they reached maxima of just 0.07% in medieval Eurasia — resulting in very slow expansion of premodern societies: it took Europe nearly 1,500 years to double the population it had when Rome became an empire, and Asian doubling was only a bit faster, from the time of China’s Han dynasty to the late Ming period. [pp. 118–119]

Smil goes on to provide exhaustive detail, much of it measurement (with acknowledged ranges of error), showing how modern mechanisms and energy exploitation have enabled rapid population growth. Although population has (apparently) not yet peaked, we are already sliding back down the energy gradient we climbed over the past 250 years and will soon enough face widespread food shortages (among other things) as productivity plummets due to diminishing energy inputs and accumulated environmental destruction (including climate change). Economics and politics do not possess solutions to that prospect. That’s the real dirt in the historical narrative, which remains largely uncovered (unreported) by paradigmatic thinking.

Readers have recently been registering hits on my blog backlog (evidenced by stats kept by WordPress in the backstage), which has prompted me to revisit a few of my older posts. (I’d say they’re “gathering dust” except that this blog is entirely virtual. Do electrons have dust?) One post particularly worth revisiting is “Doomsday Creeping Closer” about the Doomsday Clock found at the Bulletin of the Atomic Scientists. My original provocation to blog about this was a 2007 news report of an adjustment to the clock from 7 before midnight to 5 before midnight. The metaphorical hands were adjusted back in 2010 and forward in 2012, now sitting again at 5 before midnight.

My original post was written before I had become fully collapse aware. Although already a pessimist, fatalist, misanthrope, and sometimes harsh critic, I shared many of the same illusions as the public, foremost among them the idea of a future based on historical continuity still stretching out a long way in front of us. Since then, considering how bad news (scientific findings, political gridlock and infighting, and geopolitical stresses, but most of all, accelerating losses in animal and plant populations as climate change ramps up in all its awful manifestations) keeps piling up, my time horizon has shortened considerably. Thus, I find it curious that the esteemed atomic scientists provide the following reasons for moving back the doomsday clock in 2010:

6 minutes to midnight
2010: “We are poised to bend the arc of history toward a world free of nuclear weapons” is the Bulletin’s assessment. Talks between Washington and Moscow for a follow-on agreement to the Strategic Arms Reduction Treaty are nearly complete, and more negotiations for further reductions in the U.S. and Russian nuclear arsenal are already planned. The dangers posed by climate change are growing, but there are pockets of progress. Most notably, at Copenhagen, the developing and industrialized countries agree to take responsibility for carbon emissions and to limit global temperature rise to 2 degrees Celsius.
Armageddon resulting from global nuclear war has always been regarded as a serious threat by the Bulletin, going all the way back to 1947. However, I wonder what “pockets of progress” have been made on climate change. My appreciation is that, in high distinction from the usual political theater, none of the various climate talks and publications throughout the years have yielded anything other than some quasi-hysterical shrieking (handily invalidating the message) and delegates leaving without forming agreements or signing treaties. No one wants to give up the bounties of industrialism. The default is then business as usual (BAU), which ironically has historical continuity — at least until it doesn’t anymore, for reasons quite beyond anyone’s control. Reasons for moving the doomsday clock forward in 2012 fare better:
5 minutes to midnight
2012: “The challenges to rid the world of nuclear weapons, harness nuclear power, and meet the nearly inexorable climate disruptions from global warming are complex and interconnected. In the face of such complex problems, it is difficult to see where the capacity lies to address these challenges.” Political processes seem wholly inadequate; the potential for nuclear weapons use in regional conflicts in the Middle East, Northeast Asia, and South Asia are alarming; safer nuclear reactor designs need to be developed and built, and more stringent oversight, training, and attention are needed to prevent future disasters; the pace of technological solutions to address climate change may not be adequate to meet the hardships that large-scale disruption of the climate portends.
Notice the tiny change in position of the hands on the clock? Neato! (I need a SarcMark or an interrobang (‽) there ….) The 2012 assessment is far more sober and honest, asking quite plainly “whatcha gonna do?” Inadequacy weighs heavy on our institutions and leadership, though incompetence and corruption might be just as applicable. Also, the (ongoing) Fukushima disaster has (re-)raised the specter of a nuclear Armageddon arising from something other than war. It’s notable here, too, that it’s scientists who hedgingly admit that technology may not rescue us. Further, the prospect of near-term human extinction (NTHE or NTE) as part of the Earth’s sixth great (or mass) extinction event (a process rather than a date) might be cause for the Bulletin to reevaluate the more likely cause of doomsday. Indeed, one wonders if the clock will ever register true time if no one survives to update it.

An article in Wired pushes the meme that coal, despite bearing the lion’s share of responsibility for CO2 released into the atmosphere, can be cleaned up to continue to provide (mostly electrical) energy for everything we use. Pshaw, I say. Comments at the magazine’s website also call bullshit on the article, going so far as to baldly accuse Wired of shilling for big energy, and note that hundreds of similar comments following publication of the story have been purged. Pro-and-con debate on the subject lies beyond my technical prowess, though I have my suspicions. Most interesting to me, however, is what’s not said.

The implicit assumption is that energy demand must be met somehow. Totally and utterly outside of consideration is demand destruction, whether through pricing, metering, or simple unavailability. Sure, there’s 100+ years of coal still available to be mined (or harvested, or exploited, or <choose your euphemism>). Guess we have no choice but to go after it, right? The author does shed some hazy light on environmental and health costs from burning coal, especially in China where it’s worst, but nowhere is there a suggestion that we might stop burning so much of the stuff, which I find a serious omission. Instead, in true technophiliac fashion, an unproven innovation will rescue us from the consequences of our own behavior and deliver salvation (BAU, I suppose, including gadgety distraction if that’s your idea of fun) in the form of “clean coal,” namely, underground resequestration of CO2 released in the process of burning coal. Basically, it’s the equivalent of continuing to dig the hole we’re in by attempting to refill it with its own pollutants. And never mind the delayed effects of what’s already done.

The “clean coal” meme was risible on its face when it appeared a few years ago. Innovation notwithstanding, it continues to be primarily the work of fiction authors, er, marketers and, I guess, stringers for Wired. Several coal ash spills and tonnes upon tonnes of CO2 added to the atmosphere (increasing year over year without stalling) since the meme was hatched are far more convincing to me than hopes of a technofix. Facts and figures make better arguments most of the time. I have none to offer. Instead, let me simply point to everyday sights (and smells inferred from the visuals) we confront. Here is an image from twenty years ago of the city where I live:

Here’s a more recent one:

Days like these have become a lot less exceptional. How far down this path do we intend to go? All the way, I surmise.

I heard the title phrase — improper use of celebrity — uttered recently in relation to celebrity feuds that fuel the paparazzi and related parasite press. It was one high-profile celebrity (is there any other kind?) admonishing another to behave himself because it is a mistake to air petty grievances publicly and thus fan media flames. That seemed to me a worthy corrective, considering how little self-restraint most people practice, especially overtly dramatic public personae who run increased risks of believing their own hype and, accordingly, of feeling entitled to publicity, whether good or bad. We all know too much already about the childish antics of media whores who, among other things, throw tantrums with impunity compared to the general public.

rant on/

The Yale Forum on Climate Change & the Media just issued a public relations piece about a new Showtime multipart climate change documentary called Years of Living Dangerously. I call it public relations because, like all good PR, it appeals to prurient interests (look at the beautiful people doing beautiful people stuff) and instructs credulous readers far too much about what to think, lest anyone form opinions without the guidance of the infernal marketing machine. Rampant name-dropping with bullshit glamor-shots showing a few famous people (all filmmakers and actors, laughably relabeled “correspondent”) getting their green on precedes the risible assertion that celebrities function as proxies for the average person despite the average person having absolutely nothing in common with the wealth, overexposure, travel, command of attention, heaping of accolades, and enjoyment of fawning deference that characterize celebrities. Drawing focus to climate change and (one might hope) swinging discussion away from deniers (who champion controversy over truth) are cited as precisely the reasons why celebrities are perfect for this documentary. The PR piece further examines (albeit briefly) celebrity activism and provides links to studies on the social science of celebrity (gawd …) before admitting that some backlash might ensue. I guess I’m fomenting backlash.

As PR, the piece is certainly well written, despite its unabashed star-fucking celebrity worship. Further, celebrities have legitimate interests in politics, culture, climate change, and collapse, just like anyone else, even though exorbitant wealth enables them to behave as supranational entities, like so many stateless multinational corporations. So why not use their fame to influence people, right? Well, we’ve already been down the primrose path of celebrity spokespersons occupying positions of influence, speaking from well-crafted scripts, and selling out issues and policy like commodities. Some celebs even understand those issues, though that’s no guarantee of wise leadership. Consider Arnold Schwarzenegger’s undistinguished tenure as California governor. I have never lived in California to know firsthand, but my dominant impression of Schwarzenegger’s leadership style was unapologetic political theater, with incessant catchphrases from his movies functioning as entertaining drivel, er, misdirection matched against his inability (or anyone else’s, for that matter) to solve intractable problems. Curiously, his name is connected as a backer to Years of Living Dangerously, with a whole section of Yale’s PR piece devoted to charges of hypocrisy over his being a loot-and-pollute industrialist once removed through partial ownership in an investment company. Indeed, such conflicts of interest and hypocrisy are flagrant among celebrities who jet around the globe to movie sets (jet-setting?), then jet off again to have themselves filmed bearing witness (in flyovers, it seems, taking a spurious god’s-eye view from above the fray) to ecological devastation. I hesitate to raise this objection because ideological purity doesn’t exist, and as demonstrated in this lengthy blog post, charges of hypocrisy are hard to make stick after even modest analysis. But still, those who most enjoy the fruits of our passing Age of Abundance might pause to consider how it looks when they throw support behind undoing the same disastrous institutions that rewarded them so handsomely. It may not be quite the same as saying we must all now accept austerity (typically, you first! — as in Harrison Ford confronting Indonesian officials?), but near-universal austerity is inevitably where we’re headed anyway.

These are not my principal reasons for whining and ranting, however. My main reason is that by putting rich, celebrity “correspondents” at the center of the story (perhaps they put themselves there, I can’t really know), the filmmakers adopt an approach similar to too-big-to-fail and too-rich-to-prosecute, except now it’s too-famous-to-ignore. The MSM, revealed as ugly-sister handmaidens to corporate and political power, has failed completely to engage the public sufficiently on climate change, but by putting pretty, loquacious celebrities on display and in charge of the rude awakening, the documentary falls to the level of clickbait despite whatever intentions it may possess. So although nominally about climate change, it’s really about celebrities waking up to climate change. How lovely! But this is a life-and-death (mostly death at this stage) issue for all of humanity, not just entertainers. Further, what do celebrities qua celebrities bring to the discussion? Nothing, really, except the empty glamor of their fame, expert line delivery, and ability to improvise dialogue. Maybe I shouldn’t sniff at that, considering how journalists (now climbing into celebrity ranks for all the wrong reasons and too often themselves at the center of the story, both of which undermine journalistic credibility) and politicians have failed so utterly to address social issues effectively. No matter that it’s the job of journalists and government policymakers to bring to light the harrowing news that we done done ourselves in. I warn, however, that if James Cameron or any other instigator behind Years of Living Dangerously believes the project to be a game changer, he or she has seriously misunderstood the dynamics that shape public opinion. For centuries, we’ve been assiduously ignoring Cassandra-like warnings from far more authoritative scientists and blue-ribbon panels such as the IPCC. Why would that change now by mixing in celebrities?

And why on earth would the earnest celebrity response to recognition of imminent disaster brought on by climate change be to put on a show (the Little Rascals response) with self-serving celebrity spin? Or for that matter, why succumb to notorious solutionism, hopefulness, and the ironically dispiriting happy chapter? The answer is that they have not yet processed the true gravity of our multiple dilemmas and reached the fully foreseeable conclusion after delayed effects are taken into account: we’re totally and irredeemably fucked. But I guess that wouldn’t sell DVDs, now would it?

/rant off