Archive for August, 2010

CGI Doomsday Festivals

Posted: August 24, 2010 in Cinema, Culture, Idle Nonsense, Tacky

I saw the movie 2012 recently, the one where the Earth’s crust comes loose and practically everyone and everything perishes (mostly from a suitably polite camera distance) in nonstop earthquakes, volcanic eruptions, and tsunamis. I’m not sure what’s so entertaining about movies that are so clearly excuses for CGI effects rather than stories the effects serve. Fine, blow shit up, but at least tell a good story! The presence of myriad plot devices and character gimmicks, which audiences see right through anyway, does not make a story any more than all the CGI does.

The public seems to have an insatiable appetite for cinematic disaster porn, where one iconic building or geographical feature after another is digitally wrecked. There is a definite gawk factor, but I tire of this dreck pretty quickly. Besides, false, manufactured drama hasn’t worked on me since all those disaster films of the 70s (dating myself here), such as the Airport franchise, The Towering Inferno and its knock-offs, and The Poseidon Adventure. (It’s no accident that the Airport movies spawned the Airplane! spoofs, which have since become a cottage industry spoofing one self-serious movie genre after another.) Raising the stakes with world-destroying movies like Deep Impact, Armageddon, The Day After Tomorrow, and The Day the Earth Stood Still has an oddly depersonalizing effect. Cinematic doomsday porn is not an improvement over cinematic disaster porn.

For my taste, the franchise that probably got it right — at least the first two of the four — is The Terminator and its sequels. The first Terminator established a dark future without showing much of it at all, and the second developed that improbable future very effectively. However, the effects were not the story, and the drama was human-scale, not planet-scale. Although the Terminator flicks, like other doomsday movies, generally devolve into mere chase movies (or races against time), the characters remained the center of the story. The title character is also arguably Schwarzenegger’s best role, since the minimal dialogue and physicality suit him so well despite his woodenness.

It was curious that in 2012, Danny Glover played the President of the United States. It wasn’t a good role for him, as every line was delivered with maudlin seriousness. In this era, however, the role is apparently open to African-American actors without a second thought. The only other such casting (of which I’m aware) was Morgan Freeman, strangely enough, in Deep Impact. (There’s an obvious joke in there somewhere about black presidents being in office just as cataclysm strikes.)

This month, film crews have been on the streets of Chicago filming the third installment of the Transformers franchise, another heedless CGI festival. The second film was so badly reviewed one wonders why a third outing got the green light. Must have made money. The funny thing to me is that the announcements of street closures circulated to a lot of downtown office workers included the warning that no actual Transformers were being filmed. Those characters are added later using CGI, don’cha know. I can’t help but snigger that some foolish fudgies were trudging around downtown Chicago film sets, pestering production assistants, only to be disheartened to learn that their Saturday morning childhood buddies — five-story, fully articulating robots, mind you — weren’t actually visible amongst the burnt-out cars and crumbled concrete (styrofoam) scattered around the sets.

Truth Abdicated

Harriett Baber has a rather surprising opinion column at The Guardian in response to the question “can we choose what we believe?” I don’t normally care about professions of faith because they can’t be held to a shared standard of logic, evidence, or reason. But when an academic — a professor of philosophy in this case — says that truth doesn’t matter, I take notice:

Truth is overrated. And it’s remarkable that the very individuals who are most vocal in their opposition to religiously motivated puritanism are the most fiercely puritanical when it comes to truth. They condemn Christians for imposing constraints on sensual pleasure but are outraged that we should take pleasure in the consolations of religion instead of squarely facing what they believe to be the hard truths about the human condition.

People in any case overestimate the value of truth and underestimate the difficulty of arriving at it. There are a great many truths in which I have absolutely no interest — truths about the lifecycle of Ctenocephalides felis (the common cat flea) or the extensive body of truths about the condition of my teeth that my dentist imposes on me. I see no reason why I should bother with these truths or make a point of believing them.

She uses the form “I see no reason why …” repeatedly and goes on to say that truth should be no impediment to believing whatever a person wants, especially if believing gives comfort. The idea of customizable belief reminded me of Sheilaism as described by Dick Meyer in his book Why We Hate Us. (The idea of Sheilaism may originate in Robert Bellah’s Habits of the Heart.) Sheilaism is a sort of raw individualism that rejects authority (and truth) and creates a hodgepodge belief system, not unlike assembling a meal haphazardly from a buffet. Sheilaism also demonstrates how ideas are fragmented, torn from their contexts, converted into free-floating consumables, and reconstituted as expressions of one’s personal tastes and predilections. While I have considerable patience for regular folks in this regard, an academic exhibiting such muddy, relativist thinking represents a serious abdication of academic integrity. One can only wonder what goes on in Prof. Baber’s classroom.

Plagiarism Denied

The New York Times has an article discussing plagiarism among students in the digital age. According to the article,

Digital technology makes copying and pasting easy, of course. But that is the least of it. The Internet may also be redefining how students — who came of age with music file-sharing, Wikipedia and Web-linking — understand the concept of authorship and the singularity of any text or image.

What this means is that students increasingly don’t even grasp that using sources without attribution is wrong; plagiarism doesn’t even register as academic misconduct. Who teaches students academic values and upholds their worth? Teachers and professors, who are apparently failing miserably under the pressure of the copy/paste functions of word processors. The article also provides numerous examples of brazen plagiarism committed by students (and parents!) who do in fact know better but do it anyway. Similarly, outside of academe, books, news articles, blog posts, etc. use and recycle large tracts of text without proper attribution and without being taken to task. Some are even award winners.

Aside: the notion that creative works embodied in a digital format suitable for easy reproduction are available for use and reuse has swept away the entire concept of copyright. File sharing via Napster or YouTube raised the legal issues, but for all intents and purposes, the horse has already left the barn since so few respect copyright anymore. Although not true in the legal sense, in practice, the public sphere has become the public domain.

Evidence Irrelevant

Finally, one academic blogger expands on the NYT article linked above, extending the discussion to the principled use of evidence in academic work and beyond:

… I’m enough of a devotee of our recent view of authorship and creativity (and property) to think that the norms established around plagiarism during the 20th Century need some kind of continuing defense, just with sufficient awareness of the changes in textual production and circulation.

What really worries me is what’s happening to the larger purpose of the analytical writing which tempts some to plagiarism. The thing I’m honestly afraid of is that we’ve come to a point where the professional value of learning to build strong arguments based on and determined by a command over solid evidence is in rapid decline.

These are good statements, but the blogger goes on to ask whether teaching sound academic standards is now a disservice to students in the professional world beyond academe, where misshapen evidence, outright errors, omissions, and lies go unchecked and unpunished.

So maybe that’s the kind of writing and speaking we need to train our students to do: rhetorically effective and infinitely mutable on substance, entirely about rather than just sensibly attentive to affect and audience. At what point is it perverse to continue making buggy whips while the Ford plant churns away right next door?

As I said in my comment at the blog, I find it astonishing that an academic could even voice the question. Although I’m certain to be in the minority on this point, the answer is to me as duh! obvious as the answer to the question “should we torture?” All sorts of justifications and rationalizations exist for wriggling out from under the obvious answers, but no person of integrity entertains such a debate for longer than it takes to dismiss the question.

I’ve known for some time about Neil Postman’s analysis in his book Amusing Ourselves to Death contrasting the dystopian views of Aldous Huxley and George Orwell. Here it is presented graphically, which is to say, as a comic strip. (I would embed it but it’s pretty long.) If you are a typical reader nowadays, you accept a graphical presentation — even with captions and text — much more readily than a purely textual one. Either way, the reality we currently have emerges pretty quickly from the comparison as that of Huxley, though trends point toward an eventual shift to a harsher, Orwellian reality as resource scarcity (especially energy) grows.

Against the tiny voice of conscience telling me not to, I started to use the self-checkout lanes at the grocery store. These cost-saving, efficiency-minded services join an already well-established and growing trend toward self-service that probably began when gas stations (they used to be called service stations) stopped employing anyone to pump gas and banks replaced tellers with ATMs. The number of machine interfaces used every day in the service industry continues to grow, as do the fees that discourage talking to an actual person. Some self-service options appear to be neutral or even beneficial, such as the automated check-in lanes at the airport that enable low-maintenance travelers to avoid unnecessary lines. But that tiny voice keeps buzzing in the back of my brain.

A cost-benefit analysis ought to accompany consideration of every new technology and workflow that appears in the marketplace. However, in a culture easily duped by the latest gee-whiz gadgetry, what actually exists is more like an if-you-build-it-they-will-come assurance that subscribers will be lined up outside the door, around the block, and halfway to the next state line for days on end in anticipation of the release of each new version of the Apple iPhone or some similar device.

The automatic embrace of technology turns out to raise a deep question. Techno-Utopians seem to fall in love with every new offering while more sober questioning by cultural conservatives goes mostly unheeded. That questioning typically involves a sense of creeping loss of humanity as we orient more of our daily lives to interactions not with each other but with machines, essentially bypassing the social realm, failing to practice everyday manners or patience, and worst of all, settling for a stunted sense of empathy, which drives (or fails to drive) compassion and forgiveness.

Machines never require forgiveness; they merely execute our commands. So if the grocery store self-service lane fails to recognize our inputs (or is engineered to be hard to trick, locking up to prevent fraudulent transactions), the user feels no reason for restraint, and self-checkout lane rage rises quickly. Such hair-trigger anger is commonplace at self-service devices, as on social media, perhaps because without another human directly involved in the transaction (or with the other human so thoroughly mediated by technology as to be a mere avatar), users lose their normal social inhibitions and vent too easily.