Archive for the ‘Television’ Category

The purpose behind consuming different genres of fiction varies. For most of us, it’s about responding to stimuli and experiencing emotions vicariously, which is to say, safely. For instance, tragedy and horror can be enjoyed, if that’s the right word, in a fictional context to tweak one’s sensibilities without significant effect outside the story frame. Similarly, fighting crime, prosecuting war, or repelling an alien invasion in a video game can be fun but is far removed from actually doing those things in real life (not fun). For less explicitly narrative forms, such as music, the feelings evoked are aesthetic and artistic in nature, which makes a sad song or tragic symphony enjoyable on its own merits without bleeding far into real sadness or tragedy. Cinema (now blurred with broadcast TV and streaming services) is the preeminent storytelling medium, one that provokes all manner of emotional response. After reaching a certain age (middle to late teens), emotional detachment from depictions of sexuality and violent mayhem makes it possible to digest such stimulation for the purpose of entertainment — except in cases where prior personal trauma is triggered. Before that age, nightmare-prone children are prohibited from watching.

Dramatic conflict is central to driving plot and story forward, and naturally, folks are drawn to some stories while avoiding others. Although I’m detached enough not to be upset by, say, zombie films where people and zombies alike are dispatched horrifically, I wouldn’t say I enjoy gore or splatter. Similarly, realistic portrayals of war (e.g., Saving Private Ryan) are not especially enjoyable for me despite the larger story, whether based on true events or entirely made up. The primary reason I leave behind a movie or TV show partway through is that I simply don’t enjoy watching suffering.

Another category bugs me even more: when fiction intrudes on reality to remind me too clearly of actual horrors (or is it the reverse: reality intruding on fiction?). It doesn’t happen often. One of the first instances I recall was in Star Trek: The Next Generation when the story observed that (fictional) warp travel produced some sort of residue akin to pollution. The reminder that we humans are destroying the actual environment weighed heavily on me and ruined my enjoyment of the fictional story. (I also much prefer the exploration and discovery aspects of Star Trek that hew closer to Gene Roddenberry’s original vision over the militaristic approach now central to the franchise.) A much more recent intrusion occurs in the rather adolescent TV show The 100, where a global nuclear exchange launched by an artificial intelligence has the follow-on effect a century later of remaining nuclear sites going critical, melting down, and irradiating the Earth, making it uninhabitable. This bothers me because that’s what I expect to happen in reality, probably not too long (decades) after industrial civilization collapses and most or all of us are dead. This prospect served up as fiction is simply too close to reality for me to enjoy vicariously.

Another example of fiction intruding too heavily on my doomer appreciation of reality occurred retroactively. As high-concept science fiction, I especially enjoyed the first Matrix movie. Like Star Trek, the sequels degraded into run-of-the-mill war stories. But what was provocative about the original was the matrix itself: a computer-generated fiction situated within a larger reality. Inside the matrix was pleasant enough (though not without conflict), but reality outside the matrix was truly awful. It was a supremely interesting narrative and thought experiment when it came out in 1999. Now, twenty-one years later, it’s increasingly clear that we are living in a matrix-like, narrative-driven hyperreality intent on deluding us into believing in a pleasant equilibrium that simply isn’t in evidence. In fact, as societies and as a civilization, we’re careening out of control, no brakes, no steering. Caitlin Johnstone explores this startling after-the-fact realization in an article I found only a couple days ago. Reality is in fact far worse than the constructed hyperreality. No wonder no one wants to look at it.

Fully a decade ago, I analyzed with more length than I usually allow myself an article from The New Yorker that examined how media trends were pushing away from literacy (the typographic mind) toward listening and viewing (orality) as primary modes of information gathering and entertainment. The trend was already underway with the advent of radio, cinema, and television, which moved the relatively private experience of silent reading to a public or communal realm as people shared experiences around emerging media. The article took particular aim at TV. In the intervening decade, media continue to contrive new paths of distribution, moving activity back to private information environments via the smart phone and earbuds. The rise of the webcast (still called podcast by some, though that’s an anachronism), which may include a video feed or display a static image over discussion and/or lecture, and streaming services are good examples. Neither has fully displaced traditional media just yet, but the ongoing shift in financial models is a definite harbinger of relentless change.

This comes up again because, interestingly, The New Yorker included with an article I popped open on the Web an audio file of the very same article read by someone not the author. The audio was 40 minutes, whereas the article may have taken me 15 to 20 minutes had I read it. For undisclosed reasons, I listened to the audio. Not at all surprisingly, I found it odd and troublesome. Firstly, though the content was nominally investigative journalism (buttressed by commentary), hearing it read to me made it feel like, well, storytime, meaning it was fiction. Secondly, since my eyes weren’t occupied with reading, they sought other things to do and thus fragmented my attention.

No doubt The New Yorker is pandering to folks who would probably not be readers but might well become listeners. In doing so, it’s essentially conceding the fight, admitting that the effort to read is easily eclipsed by the effortlessness of listening. As alternative and unequal modes of transmitting the content of the article, however, it strikes me as an initiative hatched not by writers and editors capable of critical thought and addressing a similarly enabled readership but by a combination of sales and marketing personnel attempting to capture a widening demographic of listeners (read: nonreaders). Navigating to the article might be a modest extra complication, but if a link to the audio file can be tweeted out (I don’t actually know if that’s possible), then I guess the text isn’t truly necessary.

Here is part of what I wrote a decade ago:

If the waning of the typographic mind proceeds, I anticipate that the abstract reasoning and critical thinking skills that are the legacy of Enlightenment Man will be lost except to a few initiates who protect the flame. And with so many other threats cropping up before us, the prospect of a roiling mass of all-but-in-name barbarians ruled by a narrow class of oligarchs does indeed spell the total loss of democracy.

Are we getting perilously close to this dystopia? Maybe not, since it appears that many of those in high office and leadership positions labor under their own failures/inabilities to read at all critically and so execute their responsibilities with about the same credibility as hearsay. Even The New Yorker is no longer protecting the flame.

I recall Nathaniel Hawthorne’s short story The Celestial Railroad railing against the steam engine, an infernal machine, that disrupts society (agrarian at that time). It’s a metaphor for industrialization. The newest infernal machine (many candidates have appeared since Hawthorne’s time only to be supplanted by the next) is undoubtedly the smart phone. Its disruption of healthy formation of identity among teenagers has already been well researched and documented. Is it ironic that as an object of our own creation, it’s coming after our minds?

I remarked in an earlier blog that artists, being hypersensitive to emergent patterns and cultural vibes, often get to ideas sooner than the masses and express their sensibilities through creative endeavor. Those expressions in turn give watchers, viewers, listeners, readers, etc. a way of understanding the world through the artist’s interpretive lens. Interpretations may be completely fictitious, based on real-life events, or merely figurative as the medium allows. They are nonetheless an inevitable reflection of ourselves. Philistines who fail to appreciate that the arts function by absorbing and processing human experience at a deep, intuitive level may insist that the arts are optional or unworthy of attention or financial support. That’s an opinion not at all borne out in the culture, however, and though support may be vulnerable to shifts in valuation (e.g., withdrawal of federal funding for the NEA and PBS), the creative class will always seek avenues of expression, even at personal cost. Democratization has made the means of production and distribution for some media quite cheap compared to a couple decades ago. Others remain undeniably labor intensive.

What sparked my thinking are several TV series that have caught my attention despite my generally low level of attention to such media. I haven’t watched broadcast television in over a decade, but the ability to stream TV programming has made shows I ignored for years far easier to tune in on my own terms and schedule. “Tune in” is of course the wrong metaphor, but suffice it to say I’ve awarded some of my attention to shows that have until now fallen out of scope for me, cinema being more to my liking. The three shows I’ve been watching (only partway through each) are The Americans, Homeland, and Shameless. The first two are political thrillers (spy stuff) whereas the last is a slice-of-life family drama, which often veers toward comedy but keeps delivering tragedy instead. Not quite the same thing as dark comedy. Conflict is necessary for dramatic purposes, but the ongoing conflict in each of these shows flirts with the worst sorts of disaster, e.g., the spies being discovered and unmasked and the family being thrown out of its home and broken up. The episodic scenarios the writers concoct to threaten catastrophe at every step or at any moment get tiresome after a while. Multiple seasons ensure that dramatic tension is largely dispelled, since the main characters are present year over year. (The trend toward killing off major characters in other popular TV dramas is not yet widespread.) But still, it’s no way to live, constantly in disaster mode. No doubt I’ve cherry-picked three shows from a huge array of entertainments on offer.

Where art reflects reality is that we all now live in the early 21st century under multiple, constantly disquieting threats, large and small, including sudden climate change and ecological disaster, nuclear annihilation, meteor impacts, eruption of the shield volcano under Yellowstone, the Ring of Fire becoming active again (leading to more volcanic and earthquake activity), geopolitical dysfunction on a grand scale, and of course, global financial collapse. This, too, is no way to live. Admittedly, no one was ever promised a carefree life. Yet our inability to manage our own social institutions or shepherd the earth (as though that were our mandate) promises catastrophes in the fullness of time that have no parallels in human history. We’re not flirting with disaster so much as courting it.

Sociologists and historians prepare scholarly works that attempt to provide a grand narrative of the times. Cinema seems to be preoccupied with planetary threats requiring superhero interventions. Television, on the other hand, with its serial form, plumbs the daily angst of its characters to drive suspense, keeping viewers on pins and needles while avoiding final resolution. That final resolution is inevitably disaster, but it won’t appear for a few seasons at least — after the dramatic potential is wrung out of the scenario. I can’t quite understand why these shows are consumed for entertainment (by me no less than anyone else) except perhaps to distract from the clear and present dangers we all face every day.

Nick Carr has an interesting blog post (late getting to it as usual) highlighting a problem with our current information environment. In short, the constant information feed to which many of us subscribe and read on smartphones, which I’ve frequently called a fire hose pointed indiscriminately at everyone, has become the new normal. And when it’s absent, people feel anxiety:

The near-universal compulsion of the present day is, as we all know and as behavioral studies prove, the incessant checking of the smartphone. As Begley notes, with a little poetic hyperbole, we all “feel compelled to check our phones before we get out of bed in the morning and constantly throughout the day, because FOMO — the fear of missing out — fills us with so much anxiety that it feels like fire ants swarming every neuron in our brain.” With its perpetually updating, tightly personalized messaging, networking, searching, and shopping apps, the smartphone creates the anxiety that it salves. It’s a machine almost perfectly designed to turn its owner into a compulsive … from a commercial standpoint, the smartphone is to compulsion what the cigarette pack was to addiction

I’ve written about this phenomenon plenty of times (see here for instance) and recommended that wise folks might adopt a practiced media ecology by regularly turning their attention away from the feed (e.g., no mobile media). Obviously, that’s easier for some of us than others. Although my innate curiosity (shared by almost everyone, I might add) prompts me to gather quite a lot of information in the course of the day/week, I’ve learned to be restrictive and highly judgmental about what sources I read, printed text being far superior in most respects to audio or video. No social media at all, very little mainstream media, and very limited “fast media” of the type that rushes to publication before enough is known. Rather, periodicals (monthly or quarterly) and books, which have longer paths to publication, tend to be more thoughtful and reliable. If I could never again be exposed to noisy newsbits with, say, the word “Kardashian,” that would be an improvement.

Also, I’m aware that the basic economic structure underlying media since the advent of radio and television is to provide content for free (interesting, entertaining, and hyperpalatable perhaps, but simultaneously pointless ephemera) in order to capture the attention of a large audience and then load up the channel with advertisements at regular intervals. So I now use ad blockers and streaming media to avoid being swayed by the manufactured desire that flows from advertising. If a site won’t display its content without disabling the ad blocker, which is becoming more commonplace, then I don’t give it my attention. I can’t avoid all advertising, much like I can’t avoid my consumer behaviors being tracked and aggregated by retailers (and others), but I do better than most. For instance, I never saw any Super Bowl commercials this year, which have become a major part of the spectacle. Sure, I’m missing out, but I have no anxiety about it. I prefer to avoid colonization of my mind by advertisers in exchange for cheap titillation.

In the political news media, Rachel Maddow has caught on that it’s advantageous to ignore a good portion of the messages flung at the masses like so much monkey shit. A further suggestion is that because of the pathological narcissism of the new U.S. president, denial of the rapt attention he craves by reinforcing only the most reasonable conduct of the office might be worth a try. Such an experiment would be like the apocryphal story of students conditioning their professor to lecture with his/her back to the class by using positive/negative reinforcement, paying attention and being quiet only when his/her back was to them. Considering how much attention is trained on the Oval Office and its utterances, I doubt such an approach would be feasible even if it were only journalists attempting to channel behavior, but it’s a curious thought experiment.

All of this is to say that there are alternatives to being harried and harassed by insatiable desire for more information at all times. There is no actual peril to boredom, though we behave as though an idle mind is either wasteful or fearsome. Perhaps we aren’t well adapted — cognitively or culturally — to the deluge of information pressing on us in modern life, which could explain (partially) this age of anxiety when our safety, security, and material comforts are as good as they’ve ever been. I have other thoughts about what’s really missing in modern life, which I’ll save for another post.

I don’t have the patience or expertise to prepare and offer a detailed political analysis such as those I sometimes (not very often) read on other blogs. Besides, once the comments start filling up at those sites, every possible permutation is trotted out, muddying the initial or preferred interpretation with alternatives that make at least as much sense. They’re interesting brainstorming sessions, but I have to wonder what is accomplished.

My own back-of-the-envelope analysis is much simpler and probably no closer to (or farther from) being correct, what with everything being open to dispute. So the new POTUS was born in 1946, which puts the bulk of his boyhood in the 1950s, overlapping with the Eisenhower Administration. That period has lots of attributes, but the most significant (IMO), which would impact an adolescent, was the U.S. economy launching into the stratosphere, largely on the back of the manufacturing sector (e.g., automobiles, airplanes, TVs, etc.), and creating the American middle class. The interstate highway system also dates from that decade. Secondarily, there was a strong but misplaced sense of American moral leadership (one might also say authority or superiority), since we took (too much) credit for winning WWII.

However, it wasn’t great for everyone. Racism, misogyny, and other forms of bigotry were open and virulent. Still, if one was lucky enough to be a white, middle-class male, things were arguably about as good as they would get, which many remember rather fondly, either through rose-colored glasses or otherwise. POTUS as a boy wasn’t middle class, but the culture around him supported a worldview that he embodies even now. He’s also never been an industrialist, but he is a real estate developer (some would say slumlord) and media figure, and his models are taken from the 1950s.

The decade of my boyhood was the 1970s, which were the Nixon, Ford, and Carter Administrations. Everyone could sense the wheels were already coming off the bus, and white male entitlement was far diminished from previous decades. The Rust Belt was already a thing. Like children from the 1950s forward, however, I spent a lot of time in front of the TV. Much of it was goofy fun such as Gilligan’s Island, The Brady Bunch, and interestingly enough, Happy Days. It was innocent stuff. What are the chances that, as a boy plopped in front of the TV, POTUS would have seen the show below (excerpted) and taken special notice considering that the character shares his surname?

Snopes confirms that this is a real episode from the TV show Trackdown. Not nearly as innocent as the shows I watched. The coincidences that the character is a con man, promises to build a wall, and claims to be the only person who can save the town are eerie, to say the least. Could that TV show be lodged in the back of POTUS’ brain, along with so many other boyhood memories, misremembered and revised the way memory tends to do?

Some have said that the great economic expansion of the 1950s and 60s was an anomaly. A constellation of conditions configured to produce an historical effect, a Golden Era by some reckonings, that cannot be repeated. We simply cannot return to an industrial or manufacturing economy that had once (arguably) made America great. And besides, the attempt would accelerate the collapse of the ecosystem, which is already in free fall. Yet that appears to be the intention of POTUS, whose early regression to childhood is a threat to us all.

The English language has words for everything, and whenever something new comes along, we coin a new word. The latest neologism I heard is bolthole, which refers to the location one bolts to when collapse and civil unrest reach intolerable proportions. At present, New Zealand, which has the advantage of being in the Southern Hemisphere (remote from the hoi polloi yet reachable by private plane or oceangoing yacht), is reputed to be the location of boltholes purchased and kept by the ultrarich. Actually, bolthole is an older term now being repurposed, but it seems hip and current enough to pass as new coinage.

Banned words are the inverse of neologisms, not in the normal sense that they simply fall out of use but in their use being actively discouraged. Every kid learns this early on when a parent or older sibling slips and lets an “adult” word pass his or her lips that the kid isn’t (yet) allowed to use. (“Mom, you said fuck!”) George Carlin made a whole routine out of dirty words (formerly) banned from TV. Standards have been liberalized since the 1970s, and now people routinely swear or refer to genitalia on TV and in public. Sit in a restaurant or ride public transportation (as I do), eavesdrop a little speech within easy earshot (especially private cellphone conversations), and just count the casual F-bombs.

The worst field of banned-words nonsense is political correctness, which is intertwined with identity politics. All the slurs and epithets directed at, say, racial groups ought to be disused, no doubt, but we overcompensate by renaming everyone (“____-American”) to avoid terms that carry little or no derogation. Even more ridiculous, at least one egregiously insulting term has been reclaimed as a badge of honor, an unbanned banned word, by the very group it oppresses. It takes Orwellian doublethink to hear that term — you all know what it is — used legitimately and exclusively by those allowed to use it. (I find it wholly bizarre yet fear to wade in with my own prescriptions.) Self-disparaging language, typically in a comedic context, gets an unwholesome pass, but only if one is within the identity group. (Women disparage women, gays trade on gay stereotypes, Jews indulge in jokey anti-Semitism, etc.) We all laugh and accept it as safe, harmless, and normal. President Obama is continuously mixed up in disputes over appearances (“optics”), or over what to call things — or not call them, as the case may be. For instance, his apparent refusal to call terrorism originating in the Middle East “Muslim terrorism” has been met with controversy.

I’m all for calling a thing what it is, but the term terrorism is too loosely applied to any violent act committed against (gasp!) innocent Americans. Recent events in Charleston, SC, garnered the terrorism label, though other terms would be more apt. Further, there is nothing intrinsically Muslim about violence and terrorism. Yeah, sure, Muslims have a word or doctrine — jihad — but it doesn’t mean what most think or are led to believe it means. Every religion across human history has some convenient justification for the use of force, mayhem, and nastiness to promulgate its agenda. Sometimes it’s softer and inviting, other times harder and more militant. Unlike Bill Maher, however, circumspect thinkers recognize that violence used to advance an agenda, like words used to shape narratives, is not the province of any particular hateful or hate-filled group. Literally everyone does it to some extent. Indeed, the passion with which anyone pursues an agenda is paradoxically celebrated and reviled depending on content and context, and it’s a long, slow, ugly process of sorting to arrive at some sort of Rightthink®, which then becomes conventional wisdom before crossing over into political correctness.

Intellectual history is sometimes studied through themes and symbols found in novels, with the writers of those novels being explicit about their intent. This is the first of two blog posts exploring truth-telling in fictional narrative. This is also cross-posted at The Collapse of Industrial Civilization.

One of the many recurring themes and ideas that appear on this blog is that the essential form taken by consciousness is story or narrative. Story enables us to orient ourselves in the world and make it somewhat intelligible. It should not be overlooked that it is we who tell ourselves stories, narrating life as we go via the inner voice no less than attending to the great stories that inform culture. The Bible is one such story (or collection of stories), though its message is interpreted with a scandalously high degree of controversy. (I’m especially intrigued by Paula Hay’s thesis over at Mythodrome that the story of The Fall is really about the loss of animism, not a literal expulsion from the Garden of Eden. The Tao te Ching and the Qur’an are similar, one might even say, competing stories from other world cultures.) Story has taken on many forms throughout history, beginning with oral tradition. Setting epics in song and/or verse made them memorable, since fixed written forms came rather late in history (conceived in terms of tens of thousands of years). The appearance of books eroded oral tradition gradually, and the transition of the book into an everyday object after the invention of the printing press eventually helped undermine the authority of the Medieval Church, which housed libraries and trained clerics in the philosophical, ecclesiastical, and scientific (as it was then understood) interpretation of texts. Story continued its development in the Romantic novel and serial fiction, which attracted a mass audience. Today, however, with literacy in decline, cinema and television are the dominant forms of story.

Many categories, types, and genres of story have evolved in fiction. Considering that story arcs typically progress from calm to conflict to resolution, the nature of conflict and the roles we are asked to assume through identification with characters (often archetypal) are a subtly effective vehicle for learning and mind control. Those whose minds have been most deeply and successfully infiltrated are often the same who argue vociferously in defense of a given story, no matter the evidence, with arguments playing out in political spheres and mass media alike. In addition to lighter fare such as RomComs and coming-of-age stories, both of which define not-yet-fully-formed characters through their solidifying relationships, we get hero/antihero/superhero, war, and dystopian tales, where characters tend to be chiseled in place, mostly unchanging as action and events around them take center stage. It is significant that in such tales of conflict, antagonists typically appear from outside: political opponents, foreigners and terrorists, aliens (from space), and faceless, nameless threats such as infectious disease that one might poetically regard as destiny or fate. They threaten to invade, transform, and destroy existing society, which must be defended at all cost even though, ironically, no one believes on a moment’s contemplation it’s really worth saving. Exceptionally, the antagonist is one of us, but an aberrant, outlying example of us, such as a domestic terrorist or serial killer. And while plenty of jokes and memes float around in public that we are often our own worst enemies, becoming the monsters we aim to defeat, stories that identify our full, true threat to ourselves and the rest of creation precisely because of who we are and how we now live are relatively few.

In light of the story of industrial collapse, probably the biggest, baddest story of all time but which is only told and understood in fleeting glimpses, it occurred to me that at least two shows found in cinema and TV have gotten their basic stories mostly correct: The Matrix (predominantly the first film) and The Terminator (the TV show to a greater degree than the movie franchise). In both, a very few possess the truth: knowledge of our enslavement (actual or prospective) to machines of our own invention. Characters in the matrix may feel a sense of unease, of the projected reality being somehow off, but only a few take the notorious red pill and face reality in all its abject despair while most prefer the blue pill (or more accurately, no pill) and the blissful ignorance of illusion. Traveling back and forth between realities (one known to be quite false), the ultrachic glamor and superhero antics of the false reality are far, far more appealing than the dull, cold, grey reality without makeup, costumes, and enhanced fighting skills. Everyone behaves in the false reality with cool, almost emotionless confidence, whereas in the other reality everyone is strained to the breaking point by continuous stress at the threat of annihilation. In Terminator world, time travel enables a few to come back from the future, in the process spilling the beans about what happens after the Singularity, namely, that machines go on a rampage to kill humanity. The dominant emotion of the few initiates is again stress, which manifests as bunker mentality and constant battle readiness. Casualties are not limited to frayed nerves and strained civility, though; plenty of innocent bystanders die alongside those fighting to survive or forestall the future.

Those are only stories, reflections of our preoccupations and diversions from the truth available to witness without needing a red pill. But reality is nonetheless a bitter pill to swallow, so few who become aware of the option to square up to it vs. ignore it really want the truth. I judge that most are still blissfully unaware an option exists, though evidence and supporting stories are everywhere to be found. For those of us unable to pretend or unknow what we now know, the appearance of stress, paranoia, self-abnegation, infighting, gallows humor, and nihilism run parallel to character traits in the Matrix and Terminator worlds. Through story, reconfigured as entertainment, we may indeed be working through some of our psychological issues. And we experience some of the same coming together and tearing apart that inevitably accompany the great events of history. But unlike the childish teaser in this CBS News story that the apocalypse has a date, the machinations of history, like death and extinction, are not strictly events but processes. The process we initiated unwittingly but then ignored is beginning its final crescendo. Stories we tell ourselves conventionally end with triumphal resolution, flatly ignoring the destruction left in their wake. I warn: do not look for triumph in the story of industrial collapse except in those tiny, anonymous moments of grace where suffering ends.

I don’t watch TV, but I sometimes see DVDs of something made for and initially broadcast on TV. A PBS series called This Emotional Life caught my eye at the library, so I decided to check it out. Based on the title, I expected (and hoped) to learn something about the cognitive basis of emotions. What I saw instead was a heuristic about happiness.

The three-part series is hosted by Daniel Gilbert, who authored the best-selling Stumbling on Happiness and is a Professor of Psychology at Harvard University. (The PBS series is perhaps just a longer version of his TED Talk.) His taxonomy of things that do and don’t make us happy, despite what we may believe and the shallowness of our aspirations, comes as no surprise. It’s been clear to me for some time that our media environment distracts people almost totally from what makes a person truly happy, and very few Westerners know anymore how to live meaningfully. The two main distractions are wealth and fame/esteem, neither of which are especially good indicators of happiness. A third, having children, is surprisingly neutral. Again and again throughout the series, Dr. Gilbert turns to scientists who insist that truth is ascertained through measurement and that various indices of happiness are better guides to understanding than actual experience and/or conventional wisdom. In a certain respect, this is true: we have lost touch with and are estranged from both our bodies and our emotions (those being inextricably linked) and so easily deceive ourselves with pursuit of the wrong goals. But measurement and the scientific method are also subtle deceptions, primarily the domain of experts and beyond the ken of the lay public.

I’ve complained before about psychometrics. No doubt such an approach, using mounds of data, does permit some interesting observations not otherwise available. For instance, Dr. Gilbert reports on findings that came out of a gigantic data set from the Framingham Heart Study, namely, that happiness (and presumably other emotions) operates like an infectious disease and makes emotional life into a shared cognitive network extending three to four degrees of separation. Only big data allows us to measure this kind of effect, but my seat-of-the-pants intuition is that no one with any experience in a meatworld social network, such as a nuclear or extended family, needs science to tell them that others’ emotions impact our own. The nursery and classroom function this way, as do cliques, gangs, and other membership organizations. Indeed, poets, artists, and philosophers have long known that no man is an island and that the betweenness of things, their intersubjectivity (to use Husserl’s term), is where the action is. Science may now be demonstrating this intuition numerically, but I’m irritated that eggheads insist their measurements finally allow us to know what’s really truly true, as though no poet ever expressed it before or better.

Curiously, this article in The Atlantic argues that the pursuit of happiness is not itself sufficient to provide meaning, the search for meaning in life being one of the great philosophical conundrums long before happiness as an objective emerged to replace it. (We can probably lay some blame on Thomas Jefferson’s famous phrase.) Yet a larger, unacknowledged postmodern project to demythologize the present, and for that matter, the past, renders its human subjects kinetic yet inert, mere objects of study. Similar misguided approaches, using measurement in the pursuit of truth that is already obvious and thus needs no rigorous explanation, include an article in Discover called “The Math Behind Beauty” and the entire field of aesthetics (sometimes called hermeneutics when applied specifically to textual interpretation). Further corollaries include the monetization of everything (as when a famous painting is described to schoolchildren not in terms of its beauty or context within an artistic genre but by how much money it’s reputed to be worth) and the reduction of the entirety of our social and political realms to financial and polling terms.


I caught an episode of Real Time where Bill Maher interviews Charles Murray (see embedded video at bottom), the latter of whom became notorious with his book The Bell Curve and is now hawking a new book called Coming Apart. Murray is a psychometrician, a type of scientist Wikipedia describes as follows:

Psychometrics is the field of study concerned with the theory and technique of psychological measurement, which includes the measurement of knowledge, abilities, attitudes, personality traits, and educational measurement. The field is primarily concerned with the construction and validation of measurement instruments such as questionnaires, tests, and personality assessments.

Using psychometrics, a researcher can take the pulse of society and presumably make observations and spot trends virtually impossible to obtain through other means. But as Murray discovered, it can also lead to killing the messenger if those observations run counter to conventional wisdom or cherished fables we tell ourselves, such as gender and racial equality, which we have not yet achieved or even really approached very closely in spite of earnest protestations pointing to great strides we have made. Psychometrics gives the lie to those self-congratulations.


Like others deeply enmeshed in the Information Age and its myriad technologies, I spend a lot of time at the computer. I use it for work and for information gathering. I don’t use it much for games, entertainment, or viewing video. So I found myself wondering recently what numskull conceived of the computer screen in landscape orientation when almost everything I read is better in portrait orientation. All the documents I create are in portrait, and the inability to see more than a portion of the page is irritating. Almost immediately, the answer occurred to me that, despite the computer display’s meager origins in command-line user interfaces and vertically scrolling text, the screen itself had more in common with the landscape orientation of the TV than with the portrait orientation of the printed page.

Although we live in three-dimensional space, the height dimension is so thin or flat compared to length and width that it is poorly perceived by the human visual sense. (Side note: height used to be spelled heighth, with the same final th as length and width, but that concordance was dropped at some point.) Prior to the last century, we had no flight, no buildings taller than a few stories, and relatively few needs to process visual stimuli in terms of height. Accordingly, we believed for millennia that the world is flat, and our mental maps were organized primarily in two dimensions: along and across the horizon. We took a landscape view of most visual stimuli: having depth and width but little meaningful height beyond the scale of the human figure.

In fact, I suspect the height of the portrait orientation derives from the obvious need for vertical space in portraiture. Why the printed page also settled into portrait orientation isn’t so obvious to me. On occasion, one finds a book created in landscape orientation, but that’s usually when publishing a picture book, typically a children’s book. Printed music varies more widely than text, but it also uses the portrait orientation as the standard. A perfectly square area might be the obvious compromise, but it appears only rarely — mostly in charts, graphs, and maps that are more constrained by the information they present than are text or imagery. Photography may be the sole medium where changing orientation has clear utility and is accomplished so simply by rotating the camera 90 degrees.

Back to the computer screen. The origins of the screen and display technology were in pictorial display, which is to say, visual processing of moving images on TV rather than reading text. If the CRT (cathode ray tube) constrained early computer screens to landscape orientation, that limitation was overcome with the rotatable screen, which though still available has not been widely adopted. As computer usage has matured, it’s become clear that the medium is better suited, like the TV, to video than to text. Although reading from the screen isn’t foreclosed, the nature of the medium inevitably transforms reading into something else, something akin to reading but not quite the same, really. Christine Rosen develops this idea in a fascinating article in The New Atlantis titled “People of the Screen.”

The introduction of the widescreen display for computers clearly moves the computer away from being a work machine towards being an entertainment device. Any argument that it can be both simultaneously strikes me as hollow, along the lines of the TV being an educational device. If the computer does eventually become the complete home media center and replaces the TV and stand-alone stereo system as hoped by many technophiles, perhaps it will be fulfilling its destiny, with obvious implications for further debasing the literacy and erudition of the general public.