Archive for March, 2011

This is the first of two posts (the second is here) where I delayed blogging about an idea long enough that someone else wrote it first — and probably better than I could. I have no pretensions about being either a journalist or academic, nor do I spend my days writing for a living. Accordingly, what criticism I offer is of the armchair variety.

As I said in a recent comment, I sorta disclaim the nostalgic frame because I recognize it as a trap. The nostalgic frame is described by Adam Gopnik of The New Yorker in an article called “The Information: How the Internet Gets Inside Us.” His primary focus is how we interpret information delivered via electronic media:

The Never-Betters believe that we’re on the brink of a new utopia, where information will be free and democratic, news will be made from the bottom up, love will reign, and cookies will bake themselves. The Better-Nevers think that we would have been better off if the whole thing had never happened, that the world that is coming to an end is superior to the one that is taking its place, and that, at a minimum, books and magazines create private space for minds in ways that twenty-second bursts of information don’t. The Ever-Wasers insist that at any moment in modernity something like this is going on, and that a new way of organizing data and connecting users is always thrilling to some and chilling to others — that something like this is going on is exactly what makes it a modern moment. One’s hopes rest with the Never-Betters; one’s head with the Ever-Wasers; and one’s heart? Well, twenty or so books in, one’s heart tends to move toward the Better-Nevers, and then bounce back toward someplace that looks more like home.


A never-ending debate rages between two camps: language mavens dedicated to preserving standards and language modifiers who embrace change of any sort. Each group characterizes the other uncharitably. Mavens are called schoolmarms, Nazis, and keepers of pointless detail. Modifiers are called philistines and ignoramuses. Can you guess which camp I’m in?

Undoubtedly, language is fluid and undergoes change as particular usages dominate or decay with time. Further, some details matter more (or less) than others. A raging debate was sparked by a recent change to the Associated Press style sheet. Many publishing organs compile style sheets to bring consistency out of chaos with respect to how certain terms are used. For example, which is correct, Web-site or website (or some further variation)? The current debate is over e-mail vs. email. Although this is not a detail worth arguing at length, it is probably worth deciding for the sake of consistency. While I prefer e-mail, the Associated Press decided to change to email. Neither determination is born of ignorance as to how compound words are formed, but loud, vehement cries of “Who cares?” no doubt are. Those are the ignoramuses who can’t believe such detail warrants the slightest bit of attention. The keepers charge that if we don’t care about this, then why care about spelling or punctuation at all?

Even more unsettling, perhaps, is the decision of the Oxford English Dictionary (OED), the self-proclaimed definitive record of the English language, to include initialisms, namely, LOL, FYI, and OMG. (I was a little surprised by the word initialism, but it apparently has a long history of use.) If the OED is trying to be all-inclusive, then sure, throw everything in. But if it’s trying to be definitive, admitting new words to the lexicon needs to be quite conservative. No doubt its editorial board has conditions for inclusion that lie beyond my scrutiny.

Changes in usage do not necessitate decline, but the preponderance of changes now occurring stems not from any need to express ideas better but from liberalization of expression. Shortcuts taken to accommodate texting efficiency or to meet Twitter character limits do not enhance language, though their ubiquity cannot be challenged. And as we transition from a reading public to a viewing public (as I’ve argued here, among other places), the loss of ability to decode subtle usage and indeed think sophisticated thoughts is guaranteed when our lexicon is littered with thoughtless though hip and efficient nonwords.

Some Sad News

Posted: March 28, 2011 in Blogroll, Corporatism, Culture

I just learned that Joe Bageant died. Although I never met him, I read his blog and his first book of the same name, Deer Hunting with Jesus, and felt a deep admiration for him, both for his excellent writing and for his even more excellent cultural analysis. His new book, Rainbow Pie, which I have yet to read since it is just out, is reviewed here.

His blog will probably go abandoned after a short while, so in time I will remove it from my blogroll. In the meantime, if anyone reading this even remotely believes anything I say or recommend, get to Bageant’s blog and read his essays while they’re still available.

This is pretty funny, too (my second post today): an article in the New York Times called “Teaching to the Text Message.” The author, a professor at John Jay College, has embraced the Twitter phenomenon and now insists that (since students can’t be challenged to write in long form) writing assignments should require only one or two lines, like a Tweet or a text message. If one is actually writing photo captions, news headlines, ad copy, product blurbs for eBay, or some other form marked by extreme brevity, then a dense, succinct style is warranted. But the author says this is what all writing should be:

I’ve been teaching college freshmen to write the five-paragraph essay and its bully of a cousin, the research paper, for years. But these forms invite font-size manipulation, plagiarism and clichés. We need to set our sights not lower, but shorter.

This is an educator who has given up. Assigning a research paper is the equivalent of bullying? Pshaw! Next thing we’ll just liquefy all our food to avoid the need for chewing.

In related news, the Washington Post reports on controversy surrounding a writing prompt on the SAT that uses a popular culture phenomenon — the reality TV show — as the topic of the prompt. Critics have complained for years that one topic or another skews the test results toward one demographic or another, so you can’t have, for instance, a writing prompt about a sailing regatta, which would favor a very few students (of means) while disadvantaging those with no exposure to sailing, which is almost everyone. Ironically, TV is among the most democratic of media in its ubiquity, and those complaining loudest are the parents of the best students: the ones without exposure to reality TV because they’re too busy studying.

What these two news articles share is a shocking blindness about developing ideas in writing. The author of the first caves because it’s just too much work — for himself even more than his students. The second misses the point that the writing prompt already provides plenty of information and context to develop into a brief argument, even without ever having seen a reality TV show. Here is the prompt:

Reality television programs, which feature real people engaged in real activities rather than professional actors performing scripted scenes, are increasingly popular. These shows depict ordinary people competing in everything from singing and dancing to losing weight, or just living their everyday lives. Most people believe that the reality these shows portray is authentic, but they are being misled. How authentic can these shows be when producers design challenges for the participants and then editors alter filmed scenes?

Do people benefit from forms of entertainment that show so-called reality, or are such forms of entertainment harmful?

Student writing assignments are evaluated on the quality of the writing, which is to say, the development of ideas, which makes content and conclusions far less important. Stringing together sentences in a sustained, coherent argument from thesis statement through discussion and ending with a conclusion is a basic writing skill.


Posted: March 25, 2011 in Consciousness, Idle Nonsense, Manners

This is pretty funny: an article on “How to Be a Better Listener” in the Chicago Tribune. In next week’s column, learn how to walk on two legs! But in the meantime, listen up! Here’s the set-up:

Did you know that March is International Listening Awareness Month? According to the International Listening Association (ILA), we only retain about 50 percent of what we hear immediately after we hear it, and only another 20 percent beyond that. So how can we get those percentages to rise?

I suspect the author knows nothing about cognition and makes the usual assumption that increasing those percentages means improved cognition. Well, sorry, that’s not the way perception/memory works. We discard the bulk of immediate perception to make room for new stimuli constantly flowing in. If we didn’t, the tank would overflow and nothing new would get in.

If the article were instead about focusing one’s attention, then maybe there would be something useful in it. She gives five suggestions that mostly amount to the same thing:

  1. Don’t take notes at meetings.
  2. Clear your mind.
  3. Absorb the feedback.
  4. Don’t argue, understand.
  5. Body language is key.

All but the last are about eliminating or reducing distractions by getting out of one’s own head and paying attention to someone else. This is good advice all the time. The last is unnecessary: body language is perceived subliminally. Conscious awareness of it is not generally necessary.

Two new pieces of legislation (one here, one abroad) are aimed at further victimizing those already the victims of financial distress. The first, from merry old England, would remove substantial funding from social services for the homeless and actually make it a crime to give out food for free. The second, from wacko Minnesota, would make it a crime for anyone on public assistance to have more than $20 in his or her pockets. Yet another U.S. federal bill would require IRS audits to determine how abortions were paid for, ostensibly to ensure no government funds were used.

The pettiness of such legislation is so flabbergasting I hardly know what to say. Legislators appear to be under the delusion that people sleeping in cardboard boxes and women seeking abortions are merely making lifestyle choices and, given proper motivation by lawmakers, would choose differently. It should be obvious to anyone with a tiny sliver of sense that these new laws, if enacted, guarantee that people will be driven to commit crimes, whether stealing food or risking back-alley abortions.

Who votes for these office holders? Who could possibly support criminalizing charity, Christian or otherwise? When will the situation become so egregious that the people rebel?

Artistry is Dead

Posted: March 15, 2011 in Artistry, Culture

A joke among technophiles obsessed with so many flabbergasting, innovative products is the question, “Where the hell is my flying car?” I don’t care about flying cars. Instead, I want to know: where are the modern-day equivalents of Mozart and Brahms, Van Gogh and Munch, Goethe and Shaw, Shelley and Proust (just to give a few examples of towering creative geniuses whose works have withstood the test of time)? I’ll dare to throw my lot in with many nineteenth-century critics, who moaned even then that the arts were exhausted. However, I believe that with the benefit of hindsight the date of their collapse can be moved to the 1950s or so, with the start of the television age and the appearance of rock and roll. This isn’t mere coincidence, nor should it be startling to anyone who has been paying attention. At the risk of being too reductionist, I would say that the public has simply turned its attention elsewhere, that we no longer value the aesthetic life. We are cut off from the depth and understanding that flow from serious consideration. We’ve basically stalled our artistic development in adolescence; we’re stuck on hot dogs with mac and cheese. In contrast, Susan Sontag’s description of George Steiner, the famous literary critic for The New Yorker, provides one model of artistic maturity.

He thinks that there are great works of art that are clearly superior to anything else in their various forms, that there is such a thing as profound seriousness. And works created out of profound seriousness … have a claim on our attention and our loyalty that surpasses qualitatively and quantitatively any claim made by any other form of art or entertainment.

Steiner’s perspective comes from the audience side, which is an important part of the artistic cycle. My perspective comes from the creative side or the perspective of the practitioner. My sense is that the conditions necessary in the mind of the artist for the creation of great work, namely, depth of feeling, global awareness, seriousness, vision, and inspiration, are nearly impossible to achieve in the modern world. The chance appearance of a very few preternaturally talented savants does not disprove this. Compared to the nineteenth century and those immediately before, which were host to a bewildering concentration of first-class artists, the twentieth century has seen a long, slow erosion of artistry and replacement of high art forms with commercial and populist forms. Even the idea of authorship is eroding, as creative work is increasingly collaborative rather than the product of a single mind. Many of us would struggle to name a living composer, painter, poet, or novelist except the most obvious, commercially successful ones, which may not be a very good measure of greatness.

The prospect of a world where too few possess the wherewithal to reflect upon humanity and use their creative skills to inspire an audience is probably not a grave concern to many folks. This is not to say that great art is completely missing from the modern scene, but what little is still being created is buried beneath an avalanche of populism. In response to my initial question (Where are today’s greats?), the submissions that I typically hear are Michael Jackson, Elvis, The Beatles, and for a few slightly more ambitious, Louis Armstrong and Duke Ellington. These people are certainly at the peak of their genres, but those genres are all variations of popular music of their day, which for a strict conservative like me still seem like entertainments. No doubt they will be remembered as are Bach and Schubert and Rachmaninoff, but entertainers don’t quite rank the same.


Bob Woodward is saying much the same thing about journalism in this interview.

Not sure what the creators of this video were thinking.

The creators say at the source website that it was “originally created for an industry conference and as a promotional video to stimulate discussion within the marketing industry. It aimed to made [sic] projections on what the media landscape could be like in ten years based on what we are seeing now.” They also admit they got it wrong, but I suspect that the strong negative reaction may actually be because they got it unwittingly and disquietingly right. The kids in the video are clearly on script and articulate a viewpoint far more cogently than a typical teenager could, but the desires expressed are consistent with what most of us seem to want: tight integration of our social, consumer, and technological worlds. The mistake was casting those desires as an overt threat rather than a blind demographic wave.

The last such wave was arguably the counter-culture of the 1960s, fueled by baby boomers entering adulthood and demanding their appetites be sated. If those appetites ran early on toward social justice, sexual fulfillment, and creative expression, boomers eventually aged, married, started families, and shifted to more conventional targets, mostly careers, home, and hearth. The counter-culture that had rebelled against The Man became The Establishment. This is the way of things, of course, as people progress through stages of life, but the lingering sense of self-betrayal felt by boomers still stings.

The new demographic wave, what I think of as entitlement kids (many of them now already adults, at least in age), results less from a population bulge than from conversion of an entire generation to a lifestyle characterized by omnipresent connectivity. They don’t yet recognize that their minds have been colonized by makers of shiny electronics, much like counter-culture dissent was commodified, repackaged, and sold back to dissenters. Since current youth has imbibed from birth the empty promises of technophilia, they scarcely recognize alternatives and are in fact leading the blind, furious charge into a poorly charted and dimly understood future. In short, they are the gaping maw of humanity, demanding to be fed not gruel but ambrosia.

But the demand isn’t just to have appetites sated anymore; it’s to completely reengineer society in terms of our electronic devices, and the overt threat is that failure to do so means the destruction of the parent by the child. The threat is real, of course, and the mistake made by the creators of the video was to recognize it and tell the truth, which is that youth do not yet possess the resources, know-how, and power to reshape the world according to their infantile desires but will soon enough take up those reins and drive their parents into the grave.

An article in the Chicago Tribune about libraries moving away from the Dewey Decimal System to bookstore-style shelving and organization covers the issue with typical journalistic banality: telling both sides of a potentially contentious subject (everyone being doctrinaire about nearly everything these days) without rendering judgment. Differences between research and public libraries are mentioned briefly, as well as differences between commonly used classification systems. Library patrons are also quoted for man-in-the-street authenticity. It’s virtually assembly-line news reporting — writing to a template that can be applied universally, not unlike paint-by-numbers movies.