Archive for the ‘Television’ Category

Nick Carr has an interesting blog post (I’m late getting to it, as usual) highlighting a problem with our current information environment. In short, the constant information feed that many of us subscribe to and read on smartphones, which I’ve frequently called a fire hose pointed indiscriminately at everyone, has become the new normal. And when it’s absent, people feel anxiety:

The near-universal compulsion of the present day is, as we all know and as behavioral studies prove, the incessant checking of the smartphone. As Begley notes, with a little poetic hyperbole, we all “feel compelled to check our phones before we get out of bed in the morning and constantly throughout the day, because FOMO — the fear of missing out — fills us with so much anxiety that it feels like fire ants swarming every neuron in our brain.” With its perpetually updating, tightly personalized messaging, networking, searching, and shopping apps, the smartphone creates the anxiety that it salves. It’s a machine almost perfectly designed to turn its owner into a compulsive … from a commercial standpoint, the smartphone is to compulsion what the cigarette pack was to addiction

I’ve written about this phenomenon plenty of times (see here, for instance) and recommended that wise folks might adopt a practiced media ecology by regularly turning their attention away from the feed (e.g., no mobile media). Obviously, that’s easier for some of us than others. Although my innate curiosity (shared by almost everyone, I might add) prompts me to gather quite a lot of information in the course of the day/week, I’ve learned to be restrictive and highly judgmental about what sources I read, printed text being far superior in most respects to audio or video. No social media at all, very little mainstream media, and very limited “fast media” of the type that rushes to publication before enough is known. Rather, periodicals (monthly or quarterly) and books, which have longer paths to publication, tend to be more thoughtful and reliable. If I could never again be exposed to noisy newsbits containing, say, the word “Kardashian,” that would be an improvement.

Also, I’m aware that the basic economic structure underlying media since the advent of radio and television has been to provide content for free (interesting, entertaining, and hyperpalatable perhaps, but simultaneously pointless ephemera) in order to capture the attention of a large audience and then load up the channel with advertisements at regular intervals. So I now use ad blockers and streaming media to avoid being swayed by the manufactured desire that flows from advertising. If a site won’t display its content without my disabling the ad blocker, a demand that is becoming more commonplace, then I don’t give it my attention. I can’t avoid all advertising, much as I can’t avoid having my consumer behaviors tracked and aggregated by retailers (and others), but I do better than most. For instance, I never saw any Super Bowl commercials this year, which have become a major part of the spectacle. Sure, I’m missing out, but I have no anxiety about it. I prefer to avoid colonization of my mind by advertisers in exchange for cheap titillation.

In the political news media, Rachel Maddow has caught on that it’s advantageous to ignore a good portion of the messages flung at the masses like so much monkey shit. A further suggestion is that, because of the pathological narcissism of the new U.S. president, denying him the rapt attention he craves and reinforcing only the most reasonable conduct of the office might be worth a try. Such an experiment would be like the apocryphal story of students conditioning their professor, through positive and negative reinforcement, to lecture facing away from the class, paying attention and staying quiet only when the professor’s back was turned. Considering how much attention is trained on the Oval Office and its utterances, I doubt such an approach would be feasible even if it were only journalists attempting to channel behavior, but it’s a curious thought experiment.

All of this is to say that there are alternatives to being harried and harassed by insatiable desire for more information at all times. There is no actual peril to boredom, though we behave as though an idle mind is either wasteful or fearsome. Perhaps we aren’t well adapted — cognitively or culturally — to the deluge of information pressing on us in modern life, which could explain (partially) this age of anxiety when our safety, security, and material comforts are as good as they’ve ever been. I have other thoughts about what’s really missing in modern life, which I’ll save for another post.


I don’t have the patience or expertise to prepare and offer a detailed political analysis such as those I sometimes (not very often) read on other blogs. Besides, once the comments start filling up at those sites, every possible permutation is trotted out, muddying the initial or preferred interpretation with alternatives that make at least as much sense. They’re interesting brainstorming sessions, but I have to wonder what is accomplished.

My own back-of-the-envelope analysis is much simpler and probably no closer to (or farther from) being correct, what with everything being open to dispute. So the new POTUS was born in 1946, which puts the bulk of his boyhood in the 1950s, overlapping with the Eisenhower Administration. That period has lots of attributes, but the most significant (IMO), which would impact an adolescent, was the U.S. economy launching into the stratosphere, largely on the back of the manufacturing sector (e.g., automobiles, airplanes, TVs, etc.), and creating the American middle class. The interstate highway system also dates from that decade. Secondarily, there was a strong but misplaced sense of American moral leadership (one might also say authority or superiority), since we took (too much) credit for winning WWII.

However, it wasn’t great for everyone. Racism, misogyny, and other forms of bigotry were open and virulent. Still, if one was lucky enough to be a white, middle-class male, things were arguably about as good as they would get, which many remember rather fondly, whether through rose-colored glasses or otherwise. POTUS as a boy wasn’t middle class, but the culture around him supported a worldview that he embodies even now. He’s also never been an industrialist, but he is a real estate developer (some would say slumlord) and media figure, and his models are taken from the 1950s.

The decade of my boyhood was the 1970s, which were the Nixon, Ford, and Carter Administrations. Everyone could sense the wheels were already coming off the bus, and white male entitlement was far diminished from previous decades. The Rust Belt was already a thing. Like children from the 1950s forward, however, I spent a lot of time in front of the TV. Much of it was goofy fun such as Gilligan’s Island, The Brady Bunch, and interestingly enough, Happy Days. It was innocent stuff. What are the chances that, as a boy plopped in front of the TV, POTUS would have seen the show below (excerpted) and taken special notice considering that the character shares his surname?

Snopes confirms that this is a real episode from the TV show Trackdown. Not nearly as innocent as the shows I watched. The coincidences that the character is a con man, promises to build a wall, and claims to be the only person who can save the town are eerie, to say the least. Could that TV show be lodged in the back of POTUS’ brain, along with so many other boyhood memories, misremembered and revised the way memory tends to do?

Some have said that the great economic expansion of the 1950s and ’60s was an anomaly: a constellation of conditions configured to produce an historical effect, a Golden Era by some reckonings, that cannot be repeated. We simply cannot return to the industrial or manufacturing economy that had once (arguably) made America great. And besides, the attempt would accelerate the collapse of the ecosystem, which is already in free fall. Yet that appears to be the intention of POTUS, whose early regression to childhood is a threat to us all.

The English language has words for everything, and whenever something new comes along, we coin a new word. The latest neologism I heard is bolthole, which refers to the location one bolts to when collapse and civil unrest reach intolerable proportions. At present, New Zealand is reputed to be the location of boltholes purchased and kept by the ultrarich; it has the advantage of being in the Southern Hemisphere, remote from the hoi polloi yet reachable by private plane or oceangoing yacht. Actually, bolthole is an older term now being repurposed, but it seems hip and current enough to pass as new coinage.

Banned words are the inverse of neologisms, not in the normal sense that they simply fall out of use but in their use being actively discouraged. Every kid learns this early on when a parent or older sibling slips and lets an “adult” word pass his or her lips that the kid isn’t (yet) allowed to use. (“Mom, you said fuck!”) George Carlin made a whole routine out of dirty words (formerly) banned from TV. Standards have been liberalized since the 1970s, and now people routinely swear or refer to genitalia on TV and in public. Sit in a restaurant or ride public transportation (as I do), eavesdrop a little speech within easy earshot (especially private cellphone conversations), and just count the casual F-bombs.

The worst field of banned-words nonsense is political correctness, which is intertwined with identity politics. All the slurs and epithets directed at, say, racial groups ought to be disused, no doubt, but we overcompensate by renaming everyone (“____-American”) to avoid terms that carry little or no derogation. Even more ridiculous, at least one egregiously insulting term has been reclaimed as a badge of honor, an unbanned banned word, by the very group it oppresses. It takes Orwellian doublethink to hear that term — you all know what it is — used legitimately yet exclusively by those allowed to use it. (I find it wholly bizarre yet fear to wade in with my own prescriptions.) Self-disparaging language, typically in a comedic context, gets an unwholesome pass, but only if one is within the identity group. (Women disparage women, gays trade on gay stereotypes, Jews indulge in jokey anti-Semitism, etc.) We all laugh and accept it as safe, harmless, and normal. President Obama is continually tangled up in questions of appearances (“optics”) and of what to call things — or not call them, as the case may be. For instance, his apparent refusal to call terrorism originating in the Middle East “Muslim terrorism” has been met with controversy.

I’m all for calling a thing what it is, but the term terrorism is too loosely applied to any violent act committed against (gasp!) innocent Americans. Recent events in Charleston, SC, garnered the terrorism label, though other terms would be more apt. Further, there is nothing intrinsically Muslim about violence and terrorism. Yeah, sure, Muslims have a word or doctrine — jihad — but it doesn’t mean what most think or are led to believe it means. Every religion across human history has some convenient justification for the use of force, mayhem, and nastiness to promulgate its agenda. Sometimes it’s softer and inviting, other times harder and more militant. Unlike Bill Maher, however, circumspect thinkers recognize that violence used to advance an agenda, like words used to shape narratives, is not the province of any particular hateful or hate-filled group. Literally everyone does it to some extent. Indeed, the passion with which anyone pursues an agenda is paradoxically celebrated and reviled depending on content and context, and it’s a long, slow, ugly process of sorting to arrive at some sort of Rightthink®, which then becomes conventional wisdom before crossing over into political correctness.

Intellectual history is sometimes studied through themes and symbols found in novels, with the writers of those novels being manifest about their intent. This is the first of two blog posts exploring truth-telling in fictional narrative; it is cross-posted at The Collapse of Industrial Civilization.

One of the many recurring themes and ideas that appear on this blog is that the essential form taken by consciousness is story or narrative. Story enables us to orient ourselves in the world and make it somewhat intelligible. It should not be overlooked that it is we who tell ourselves stories, narrating life as we go via the inner voice no less than attending to the great stories that inform culture. The Bible is one such story (or collection of stories), though its message is interpreted with a scandalously high degree of controversy. (I’m especially intrigued by Paula Hay’s thesis over at Mythodrome that the story of The Fall is really about the loss of animism, not a literal expulsion from the Garden of Eden. The Tao te Ching and the Qur’an are similar, one might even say, competing stories from other world cultures.) Story has taken on many forms throughout history, beginning with oral tradition. Setting epics in song and/or verse made them memorable, since fixed written forms came rather late in history (conceived in terms of tens of thousands of years). The appearance of books eroded oral tradition gradually, and the transition of the book into an everyday object after the invention of the printing press eventually helped undermine the authority of the Medieval Church, which housed libraries and trained clerics in the philosophical, ecclesiastical, and scientific (as it was then understood) interpretation of texts. Story continued its development in the Romantic novel and serial fiction, which attracted a mass audience. Today, however, with literacy in decline, cinema and television are the dominant forms of story.

Many categories, types, and genres of story have evolved in fiction. Considering that story arcs typically progress from calm to conflict to resolution, the nature of conflict and the roles we are asked to assume through identification with characters (often archetypal) are a subtly effective vehicle for learning and mind control. Those whose minds have been most deeply and successfully infiltrated are often the same who argue vociferously in defense of a given story, no matter the evidence, with arguments playing out in political spheres and mass media alike. In addition to lighter fare such as RomComs and coming-of-age stories, both of which define not-yet-fully-formed characters through their solidifying relationships, we get hero/antihero/superhero, war, and dystopian tales, where characters tend to be chiseled in place, mostly unchanging as action and events around them take center stage. It is significant that in such tales of conflict, antagonists typically appear from outside: political opponents, foreigners and terrorists, aliens (from space), and faceless, nameless threats such as infectious disease that one might poetically regard as destiny or fate. They threaten to invade, transform, and destroy existing society, which must be defended at all cost even though, ironically, no one believes on a moment’s contemplation it’s really worth saving. Exceptionally, the antagonist is one of us, but an aberrant, outlying example of us, such as a domestic terrorist or serial killer. And while plenty of jokes and memes float around in public that we are often our own worst enemies, becoming the monsters we aim to defeat, stories that identify our full, true threat to ourselves and the rest of creation precisely because of who we are and how we now live are relatively few.

In light of the story of industrial collapse, probably the biggest, baddest story of all time but one that is only told and understood in fleeting glimpses, it occurred to me that at least two stories found in cinema and TV have gotten their basics mostly correct: The Matrix (predominantly the first film) and The Terminator (the TV show to a greater degree than the movie franchise). In both, a very few possess the truth: knowledge of our enslavement (actual or prospective) to machines of our own invention. Characters in the matrix may feel a sense of unease, of the projected reality being somehow off, but only a few take the notorious red pill and face reality in all its abject despair while most prefer the blue pill (or more accurately, no pill) and the blissful ignorance of illusion. For those traveling back and forth between realities (one known to be quite false), the ultrachic glamor and superhero antics of the false reality are far, far more appealing than the dull, cold, grey reality without makeup, costumes, and enhanced fighting skills. Everyone behaves in the false reality with cool, almost emotionless confidence, whereas in the other reality everyone is strained to the breaking point by continuous stress at the threat of annihilation. In Terminator world, time travel enables a few to come back from the future, in the process spilling the beans about what happens after the Singularity, namely, that machines go on a rampage to kill humanity. The dominant emotion of the few initiates is again stress, which manifests as bunker mentality and constant battle readiness. Casualties are not limited to frayed nerves and strained civility, though; plenty of innocent bystanders die alongside those fighting to survive or forestall the future.

Those are only stories, reflections of our preoccupations and diversions from the truth available to witness without needing a red pill. But reality is nonetheless a bitter pill to swallow, so few who become aware of the option to square up to it vs. ignore it really want the truth. I judge that most are still blissfully unaware an option exists, though evidence and supporting stories are everywhere to be found. For those of us unable to pretend or unknow what we now know, the appearance of stress, paranoia, self-abnegation, infighting, gallows humor, and nihilism runs parallel to character traits in the Matrix and Terminator worlds. Through story, reconfigured as entertainment, we may indeed be working through some of our psychological issues. And we experience some of the same coming together and tearing apart that inevitably accompany the great events of history. But unlike the childish teaser in this CBS News story that the apocalypse has a date, the machinations of history, like death and extinction, are not strictly events but processes. The process we initiated unwittingly but then ignored is beginning its final crescendo. Stories we tell ourselves conventionally end with triumphal resolution, flatly ignoring the destruction left in their wake. I warn: do not look for triumph in the story of industrial collapse except in those tiny, anonymous moments of grace where suffering ends.

I don’t watch TV, but I sometimes see DVDs of something made for and initially broadcast on TV. A PBS series called This Emotional Life caught my eye at the library, so I decided to check it out. Based on the title, I expected (and hoped) to learn something about the cognitive basis of emotions. What I saw instead was a heuristic about happiness.

The three-part series is hosted by Daniel Gilbert, who authored the best-selling Stumbling on Happiness and is a Professor of Psychology at Harvard University. (The PBS series is perhaps just a longer version of his TED Talk.) His taxonomy of things that do and don’t make us happy, despite what we may believe and the shallowness of our aspirations, comes as no surprise. It’s been clear to me for some time that our media environment distracts people almost totally from what makes a person truly happy, and very few Westerners know anymore how to live meaningfully. The two main distractions are wealth and fame/esteem, neither of which is an especially good indicator of happiness. A third, having children, is surprisingly neutral. Again and again throughout the series, Dr. Gilbert turns to scientists who insist that truth is ascertained through measurement and that various indices of happiness are better guides to understanding than actual experience and/or conventional wisdom. In a certain respect, this is true: we have lost touch with and are estranged from both our bodies and our emotions (those being inextricably linked) and so easily deceive ourselves with pursuit of the wrong goals. But measurement and the scientific method are also subtle deceptions, primarily the domain of experts and beyond the ken of the lay public.

I’ve complained before about psychometrics. No doubt such an approach, using mounds of data, does permit some interesting observations not otherwise available. For instance, Dr. Gilbert reports on findings that came out of a gigantic data set from the Framingham Heart Study, namely, that happiness (and presumably other emotions) operates like an infectious disease and makes emotional life into a shared cognitive network extending three to four degrees of separation. Only big data allows us to measure this kind of effect, but my seat-of-the-pants intuition is that no one with any experience in a meatworld social network, such as a nuclear or extended family, needs science to confirm that others’ emotions impact our own. The nursery and classroom function this way, as do cliques, gangs, and other membership organizations. Indeed, poets, artists, and philosophers have long known that no man is an island and that the betweenness of things, their intersubjectivity (to use Husserl’s term), is where the action is. Science may now be demonstrating this intuition numerically, but I’m irritated that eggheads insist their measurements finally allow us to know what’s really truly true, as though no poet ever expressed it before or better.
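As an aside, the “three to four degrees of separation” claim is easy to picture as a simple graph exercise. Here is a minimal, hypothetical sketch in Python (invented names and happiness labels, nothing like the actual Framingham methodology) that walks a toy friendship network out to three hops and compares each person’s reported happiness with the share of happy people in that circle:

# Toy sketch only: made-up friendship graph and happiness labels.
from collections import deque

friends = {                      # hypothetical undirected friendship graph
    "ann": ["bob", "cat"],
    "bob": ["ann", "dan"],
    "cat": ["ann", "eve"],
    "dan": ["bob", "fay"],
    "eve": ["cat"],
    "fay": ["dan"],
}
happy = {"ann", "bob", "cat"}    # invented labels for who reports being happy

def within_degrees(graph, start, max_depth=3):
    """Return everyone reachable from `start` in at most `max_depth` hops."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        person, depth = queue.popleft()
        if depth == max_depth:
            continue
        for other in graph[person]:
            if other not in seen:
                seen.add(other)
                queue.append((other, depth + 1))
    seen.discard(start)          # don't count the person as their own contact
    return seen

# Compare each person's happiness with the share of happy people
# within three degrees -- the sort of correlation the study popularized.
for person in friends:
    circle = within_degrees(friends, person)
    share = len(circle & happy) / len(circle)
    print(f"{person}: happy={person in happy}, happy share within 3 degrees={share:.2f}")

None of this captures actual causal contagion, of course; it only shows that “three to four degrees” reduces to a breadth-first walk over a friendship graph, which is why the claim becomes measurable once the data set is as large as Framingham’s.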

Curiously, this article in The Atlantic argues that the pursuit of happiness is not itself sufficient to provide meaning, the search for meaning in life being one of the great philosophical conundrums long before happiness as an objective emerged to replace it. (We can probably lay some blame on Thomas Jefferson’s famous phrase.) Yet a larger, unacknowledged postmodern project to demythologize the present, and for that matter the past, renders its human subjects as kinetic and inert, mere objects of study. Similar misguided approaches, using measurement in pursuit of truth that is already obvious and thus needs no rigorous explanation, include an article in Discover called “The Math Behind Beauty” and the entire field of aesthetics (sometimes called hermeneutics when applied specifically to textual interpretation). Further corollaries include the monetization of everything (as when a famous painting is described to schoolchildren not in terms of its beauty or its context within an artistic genre but by how much money it’s reputed to be worth) and the reduction of our entire social and political realms to financial and polling terms.


I caught an episode of Real Time where Bill Maher interviews Charles Murray (see embedded video at bottom), the latter of whom became notorious with his book The Bell Curve and is now hawking a new book called Coming Apart. Murray is a psychometrician, a type of scientist Wikipedia describes as follows:

Psychometrics is the field of study concerned with the theory and technique of psychological measurement, which includes the measurement of knowledge, abilities, attitudes, personality traits, and educational measurement. The field is primarily concerned with the construction and validation of measurement instruments such as questionnaires, tests, and personality assessments.

Using psychometrics, a researcher can take the pulse of society and presumably make observations and spot trends virtually impossible to obtain through other means. But as Murray discovered, it can also lead to killing the messenger if those observations run counter to conventional wisdom or cherished fables we tell ourselves, such as gender and racial equality, which we have not yet achieved or even really approached very closely in spite of earnest protestations pointing to the great strides we have made. Psychometrics gives the lie to those self-congratulations.


Like others deeply enmeshed in the Information Age and its myriad technologies, I spend a lot of time at the computer. I use it for work and for information gathering. I don’t use it much for games, entertainment, or viewing video. So I found myself wondering recently what numskull conceived of the computer screen in landscape orientation when almost everything I read is better in portrait orientation. All the documents I create are in portrait, and the inability to see more than a portion of the page is irritating. Almost immediately, the answer occurred to me that, despite the computer display’s meager origins in command-line user interfaces and vertically scrolling text, the screen itself had more in common with the landscape orientation of the TV than with the portrait orientation of the printed page.

Although we live in three-dimensional space, the height dimension is so thin or flat compared to length and width that height is poorly perceived by the human visual sense. (Side note: height used to be spelled heighth, with the same final th as length and width, but that concordance was dropped at some point.) Prior to the last century, we had no flight, no buildings taller than a few stories, and relatively few needs to process visual stimuli in terms of height. Accordingly, we believed for millennia that the world is flat, and our mental maps were organized primarily in two dimensions: along and across the horizon. We took a landscape view of most visual stimuli: having depth and width but little meaningful height beyond the scale of the human figure.

In fact, I suspect the height of the portrait orientation derives from the obvious need for vertical space in portraiture. Why the printed page also settled into portrait orientation isn’t so obvious to me. On occasion, one finds a book created in landscape orientation, but that’s usually when publishing a picture book, typically a children’s book. Printed music varies more widely than text, but it also uses the portrait orientation as the standard. A perfectly square area might be the obvious compromise, but it appears only rarely — mostly in charts, graphs, and maps that are more constrained by the information they present than are text or imagery. Photography may be the sole medium where changing orientation has clear utility and is accomplished so simply by rotating the camera 90 degrees.

Back to the computer screen. The origins of the screen and display technology were in pictorial display, which is to say, visual processing of moving images on TV rather than reading text. If the CRT (cathode ray tube) constrained early computer screens to landscape orientation, that limitation was overcome with the rotatable screen, which, though still available, has not been widely adopted. As computer usage has matured, it’s become clear that the medium is better suited, like the TV, to video than to text. Although reading from the screen isn’t foreclosed, the nature of the medium inevitably transforms reading into something else, something akin to reading but not quite the same, really. Christine Rosen develops this idea in a fascinating article in The New Atlantis titled “People of the Screen.”

The introduction of the widescreen display for computers clearly moves the computer away from being a work machine towards being an entertainment device. Any argument that it can be both simultaneously strikes me as hollow, along the lines of the TV being an educational device. If the computer does eventually become the complete home media center and replaces the TV and stand-alone stereo system as hoped by many technophiles, perhaps it will be fulfilling its destiny, with obvious implications for further debasing the literacy and erudition of the general public.

The New Yorker has a rather long but interesting article called “Twilight of the Books” about the decline of reading and literacy in the modern world. The article is far reaching in its attempt to summarize information from a number of sources, notably a book by Maryanne Wolf, a professor at Tufts University and director of its Center for Reading and Language Research, titled Proust and the Squid. The article begins with a litany of statistics demonstrating that reading is in decline.

I have to pause here to chide The New Yorker about its own writing, which is the flip side of reading on the literacy coin. Don’t all articles pass over at least two desks: the writer’s and the editor’s?

In January 1994, forty-nine per cent of respondents told the Pew Research Center for the People and the Press that they had read a newspaper the day before. In 2006, only forty-three per cent said so, including those who read online. Book sales, meanwhile, have stagnated. The Book Industry Study Group estimates that sales fell from 8.27 books per person in 2001 to 7.93 in 2006. According to the Department of Labor, American households spent an average of a hundred and sixty-three dollars on reading in 1995 and a hundred and twenty-six dollars in 2005. [emphasis added]

Isn’t per cent better as one word: percent? Similarly, shouldn’t a hundred and sixty-three be one hundred sixty-three? Any experienced copy editor should know that we don’t write numbers (or numerals) the way we speak them. We may say one-oh-six, but we don’t write 1o6 (as opposed to 106 — the typographical difference is difficult to see with some fonts, but it’s there). There are lots of other style errors, contractions, and generalized clumsiness, but I’ll move on.

As I read the article, I was struck by the number of times I said to myself, Duh, that’s so obvious it doesn’t bear stating! But I realized that most of the Duh! moments aren’t in fact so obvious to anyone ignorant of even entry-level media theory, which is really what I have. So I’ll reproduce a few noteworthy items with comments.

The Grinning Face of Propaganda

Posted: September 12, 2006 in Politics, Television

Judging from the activity of the mainstream media, the 5-year anniversary of the 9/11 attacks is so much more significant than the 4-year anniversary. I’ve seen a few blog entries that rehash the details, go on record against terrorism, or ask the “where were you?” question. That’s well-trodden ground, and frankly, no worthwhile result obtains (beyond the trite “never forget” idea — as if we could).

The one piece that makes a great deal of sense to me and has a worthwhile reminder of where we ought to be after five years is this one by Keith Olbermann. Among his insightful remarks is that now five years later, we still have a 16-acre hole in lower Manhattan. No memorial, no building project. So if we all came together in support of victims, civil servants, and Pres. Bush, well, we haven’t yet been able to similarly put aside our differences and get something done on the site of ground zero.

Olbermann also makes a rather spooky reference to a Twilight Zone episode where conquering aliens remark that, once they set us against each other, they don’t really need to do much to wipe us out, since our own paranoia and mob response are pretty potent weapons. Leading that charge, by Olbermann’s assessment, is the Bush administration, for which he has nothing but contempt, apparently.

Probably the worst aspect Olbermann notes is that the 9/11 attacks have been used by political opportunists (again, the Bush administration figures strongly) in shameless propaganda campaigns to advance partisan agendas, notably the war with Iraq and the rollback of civil liberties. And it’s not over. ABC’s docudrama The Path to 9/11 has been roundly denounced as “[f]actually shaky, politically inflammatory and photographically a mess” by the Washington Post. I’m the wrong person to comment on this, as I didn’t watch any of the docudrama, nor did I watch either of the Hollywood movies treating the same subject. As with the JFK assassination, so much disinformation, outright fiction, and conspiracy theory is floating around that I rather doubt the truth behind either event is truly knowable with any confidence after it has been spun and massaged and coopted as party propaganda.

So although politics is not really my focus, I’ll offer a brief five-years-later assessment. It makes me infinitely sad that the opportunity to learn whatever lessons the aftermath of the attacks might have offered, including some serious self-assessment about the things we did to make ourselves a target, has been mostly squandered in unthinking American jingoism and flailing retribution taken against the wrong parties.

TV on DVD

Posted: May 15, 2006 in Television

I haven't watched TV in about five years. That includes everything on TV: shows, ads, news, everything. Because it's so ubiquitous, I have actually seen bits and pieces of a few things. And I purposely tuned into the pilot of Commander in Chief because I wanted to see how the writers got a woman into office. But I've never seen a single episode of The Sopranos, Sex and the City, House, American Idol, or any of the various shows discussed around the water cooler. The amount of wasted time I recovered was pretty impressive, though it filled up with other things pretty quickly.

From time to time, folks discover that I don't watch TV. It's usually in the context of "did you see this commercial" or "can you believe what happened on such-and-such show?" I usually respond that I don't watch TV and the next question is "at all?" I say "yes" and the jaws drop. It's as though I just said I don't breathe anymore. The idea of not going home and giving over several hours of veg time is frankly beyond some people's comprehension. Parents with children regard the TV as a lifesaver at the same time they acknowledge it's probably unwise to park kids there for extended periods of time (but still do it).

So a friend of mine recently forced on me (well, loaned me) a copy of the first season of Lost on DVD. I've tried to be open-minded but can't escape the sense that I'm still watching TV. The eight-minute segmentation to accommodate commercials is grating even without the commercials (but thank goodness for no commercials), and the 45-minute story arcs are a formal frame that really confines sensible storytelling. The way each episode plays like a parable or morality tale is also shockingly facile. I knew all this before I stopped watching TV, but it is especially glaring to me now. The other monstrously irritating thing about this particular show is the endless parade of dramatic pauses and knowing looks in response to simple questions any normal person would answer unhesitatingly.

Q: "Do you have matches?"
A: pause — look — "Why do you want to know?"

ANSWER THE STUPID QUESTION!!

Watching is mostly a take-it-or-leave-it proposition for me at this point. Of seven DVDs, I watched four. I'll have a nagging sense of incompletion if I don't finish, I suspect, the same as with books. The fact that I'll never see season two or three doesn't bother me a bit.