Archive for the ‘Media’ Category

Nicholas Carr has a pair of thoughtful new posts at his blog Rough Type (see blogroll) under the tag “infinite media.” The second of the two is about context collapse, restoration, and content collapse. I won’t review that particular post; I’m merely pointing to it for you to read. Carr is a journalist and media theorist whose work is especially interesting to me as a partial antidote to what I’ve been calling our epistemological crisis. In short, he offers primers on how to think about stuff, that stuff being the primary medium through which most people now gather information: screens.

Relatedly, the other media theorist to whom I pay attention is Alan Jacobs, who has a recent book (which I read but didn’t review or blog about) called more simply How to Think. It’s about recognizing and avoiding cognitive biases on the way to more disciplined, clear thinking. I mention these two fellows together because I’ve been reading their blogs and books for over a decade now and have been curious to observe how their public interactions have changed over time. They have each embraced and abandoned various new media (particularly social media) and adopted a more stringent media ecology. Carr posts occasionally now and has closed comments at his blog (a shame, since his commentariat was valuable, quite unlike the troll mob at most sites). Jacobs is even more aggressive, starting and abandoning one blog after another (he was active at multiple URLs, one formerly on my blogroll) and deleting his Twitter account entirely. Whatever goings-on occur at Facebook I can’t say; I never go there. These aren’t criticisms. We all evolve our associations and activities. But these two are unusual, perhaps, in that they evaluate and recommend with varying vehemence how to interact with electronic media tools.

The wide-open Web available to Americans (but restricted in some countries) used to be valorized as a wholly democratic, organic, grass-roots, decentralized force for good where information yearned to breathe free. Though pioneered by academic institutions, it wasn’t long before the porn industry became the first to monetize it effectively (cuz duh! that’s where the money was — at least initially) and then the whole thing was eventually overwhelmed by others with unique agendas and mechanisms, including commerce, surveillance, and propaganda. The surfeit of information demanded curation, and social media with algorithmic feeds became the default for folks either too lazy or just untrained (or uninterested) in how to think for themselves. Along the way, since a surprisingly large portion of human activity diverted to online media, that activity turned into a resource mined, harvested, and in turn monetized, much like the voting public has become a resource tracked, polled, channeled, activated, disenfranchised, corrupted, and analyzed to death.

An earlier media theorist I read with enthusiasm, Neil Postman, recommended that curricula include the study of semantics as applied to media. (Use of a word like semantics sends nonacademics running for the hills, but the recommendation is basically about thinking critically, even skeptically, regarding information, its sources, and its means of distribution.) The rise of handheld omnimedia postdates Postman, so I can only surmise that the bewildering array of information we confront and absorb every day, which I liken to drinking from a fire hose, only compounds Postman’s concern that students are severely overmatched by media (especially advertising) intent on colonizing and controlling their minds. Thus, today’s information environment is a far cry from the stately slowness of earlier eras when teaching and learning (to say nothing of entertainment) were conducted primarily through reading, lecture, and discussion.

A comment came in on this blog chiding me for still blogging after 14 years. I admit hardly anyone reads anymore; they watch (or listen, as with audio-only podcasts). Preferred forms of media consumption have moved on from printed text, something USA Today recognized decades ago when it designed its print publication and sidewalk distribution boxes to look more like TVs. Nonetheless, the modest reproach reminded me of a cry in the wilderness by Timothy Burke: why he still blogs, though quite infrequently. (There’s a brokeback can’t-quit-you joke in there somewhere I’ll leave unformulated.) So this blog may indeed be past its proper expiration date, yet it remains for me one of the best means for organizing how I think about stuff. Without it, I’m afraid thoughts would be rattling loose inside my head, disorganized, only to be displaced by the next slurp from the fire hose.

Cenk Uygur is running for U.S. Congress in California. Good for him … I guess. Horse-race politics don’t actually interest me, at least as a topic for a blog post, but his decision to enter the electoral fray poses some curious issues. What follows is some context and unsolicited advice, the latter exceptional for me since I’m not a political advocate and don’t reside in Cenk’s district (or even state).

Unlike many who heap unwarranted praise on our interrelated systems of government and economics, or who subscribe to some version of Churchill’s quip that democracy is the worst form of government yet preferred over all the others, I regard representative democracy and capitalism both as dumpster fires in the process of burning out. Good ideas while they lasted, perhaps, but they consumed nearly all their available fuel and are now sputtering, leaving behind useless ash and detritus. As a journalist and political junkie commentator, Cenk Uygur may be sensing his Hindenburg moment has arrived to jump onto the sinking RMS Titanic (mixing metaphors of doomed ships), meaning that a serendipitous right-time-right-place opportunity presented itself. Omigawd, the humanity! Others who had their unique Hindenburg moments and made good include Rudy Giuliani in the immediate aftermath of 9/11 (only to spiral down ignominiously) and Alexandria Ocasio-Cortez (AOC, elected to the U.S. Congress in 2018). Dunno about Cenk Uygur. His campaign website linked above rather conspicuously omits his surname (couldn’t find it anywhere). Maybe, as with AOC and Pete Buttigieg, the surname is just too challenging for folks. Curious choice.

I have mostly disregarded Cenk Uygur and The Young Turks (TYT) for some time now. They are part of the new media formed and distributed (primarily?) on the Web, though I’m doubtful they (and others) have yet established a viable replacement for traditional broadcast journalism (TV and radio), which has become sclerotic. How their business models (and the inevitable distortions those models introduce) differ is unclear. The main reason I ignore him/them is that TYT adopted a breezy, chatty, unscripted style that is less about reporting than interpreting mostly political news on the fly. They are essentially programming their viewers/subscribers with progressive talking points and orthodoxy, a form of narrowcasting. Onscreen “reporters” have come and gone, but none are as boorish as Cenk Uygur, who labors under the impression that he can outwit others with logic traps but really comes across as incoherent, unfocused, and ideological. TYT has also aired their dirty laundry in the form of beefs with former “correspondents.” None of this serves my political and/or intellectual interests.

The tone of TYT puzzles me, too, considering the utter seriousness of political dysfunction. Commentators appear to enjoy being in front of the camera for verbal jousting matches with each other and guests or simply to riff on the news. Another journalist clearly in love with being on-camera is Rachel Maddow, who has been pilloried for promulgating the Russiagate story relentlessly. Maybe anchors who relish (a little too much) being in the public eye are a collateral effect of news bureaus having been folded into the entertainment divisions of media conglomerates and told to put forward a smiling face no matter what horrors are reported. If I want to see politics served up as jokes, I watch Jimmy Dore (who provides an alarming level of insight). If I want to watch people having entertaining fun, I watch movies or stream TV. I do not watch ideological news shows or political debates (if I watch at all) to be entertained but rather to be informed. While TYT commentators endeavor to be scrupulously factually correct in their opinions, they offer too little signal alongside the noise.

So here are a few recommendations for Cenk’s campaign, worth a couple cents at most:

  • Recognize that political decisions voters now face are no longer merely left/right, progressive/conservative, who-gets-to-hold-office binaries. Rather, it’s now whether we should track further down the path of authoritarian rule (e.g., a fascist national security state) masking itself as populism (but instead serving the plutocracy) under any political banner or instead serve the interests of the American people (best as able) as empire and industrial civilization sputter out.
  • Recognize that logic and reason are poor substitutes for good character and clarity of vision when the public (i.e., the great unwashed masses) responds more readily to jingoism, emotionalism, and empty rhetoric.
  • Zingers, gotchas, and takedowns are gladiatorial exploits that require more than mere accuracy to hit their marks and inflict damage. Take care not to indulge without considerable preparation and nuance. Some are obviously better at this than others.
  • When answering questions and/or giving interviews, do not mistake the exchange as a speech opportunity and dominate from one side (boorishness). Riffing, having fun, and sucking all the air out of the room are the attributes of TYT but wear thin in campaigning. Listening is just as important, maybe more.
  • Align your tone with the gravity of others’ suffering rather than your enjoyment of the applause and limelight. Your personal circumstances are not the proper locus of emotion.
  • Politics is deeply intertwined with wealth, power, and corruption and accordingly creates distortion fields that threaten to undo even the purest of hearts when compromise and/or betrayal are offered as lures. It’s an inevitability, a basic feature rather than a bug. Know that it’s coming. No one is incorruptible.

Admittedly, I’m not a campaign strategist and have no access to polling data. Accordingly, this post will likely be neither read nor its recommendations heeded; I’m not a political playah. Think of this as the undesired Christmas gift so valueless it can’t even be returned for store credit.

Delving slightly deeper after the previous post into someone-is-wrong-on-the-Internet territory (worry not: I won’t track far down this path), I was dispirited after reading some economist dude with the hubris to characterize climate change as fraud. At issue is the misframing of proper time periods in graphical data for the purpose of overthrowing government and altering the American way of life. (Um, that’s the motivation? Makes no sense.) Perhaps this fellow’s intrepid foray into the most significant issue of our time (only to dismiss it) is an aftereffect of Freakonomics emboldening economists to offer explanations and opinions on matters well outside their field of expertise. After all, truly accurate, relevant information is only ever all about numbers (read: the Benjamins), shaped and delivered by economists, physical sciences be damned.

The author of the article has nothing original to say. Rather, he repackages information from the first of two embedded videos (or elsewhere?), which examines time frames of several trends purportedly demonstrating global warming (a term most scientists and activists have abandoned in favor of climate change, partly to distinguish climate from weather). Those trends are heat waves, extent of Arctic ice, incidence of wildfires, atmospheric carbon, sea level, and global average temperature. Presenters of weather/climate information (such as the IPCC) are accused of cherry-picking dates (statistical data arranged graphically) to present a false picture, but then similar data with other dates are used to depict another picture supposedly invalidating the first set of graphs. It’s a case of lying with numbers and then lying some more with other numbers.

Despite the claim that “reports are easily debunked as fraud,” I can’t agree that this example of climate change denial overcomes overwhelming scientific consensus on the subject. It’s not so much that the data are wrong (I acknowledge they can be misleading) but that the interpretation of effects of industrial activity since 1750 (a more reasonable comparative baseline) isn’t so obvious as simply following shortened or lengthened trend lines and demographics up or down. That’s typically zooming in or out to render the picture most amenable to a preferred narrative, precisely what the embedded video does and in turn accuses climate scientists and activists of doing. The comments under the article indicate a chorus of agreement with the premise that climate change is a hoax or fraud. Guess those commentators haven’t caught up yet with rising public sentiment, especially among the young.

Having studied news and evidence of climate change as a layperson for roughly a dozen years now, the conclusions drawn by experts (ignoring economists) convince me that we’re pretty irredeemably screwed. The collapse of industrial civilization and accompanying death pulse are the predicted outcomes, but a precise date is impossible to provide because it’s a protracted process. An even worse possibility is near-term human extinction (NTHE), part of the larger sixth mass extinction. Absorbing this information has been an arduous, ongoing, soul-destroying undertaking for me, and evidence keeps being supplemented and revised, usually with ever-worsening prognoses. However, I’m not the right person to argue the evidence. Instead, see this lengthy article (with profuse links) by Dr. Guy McPherson, which is among the best resources outside of the IPCC.

In fairness, except for the dozen years I’ve spent studying the subject, I’m in no better position to offer inexpert opinion than some economist acting the fool. But regular folks are implored to inform and educate themselves on a variety of topics if nothing else than so that they can vote responsibly. My apprehension of reality and human dynamics may be no better than the next, but as history proceeds, attempting to make sense of the deluge of information confronting everyone is something I take seriously. Accordingly, I’m irked when contentious issues are warped and distorted, whether earnestly or malignantly. Maybe economists, like journalists, suffer from a professional deformation that confers supposed explanatory superpowers. However, in the context of our current epistemological crisis, I approach their utterances and certainty with great skepticism.

There is something ironic and vaguely tragic about how various Internet platforms — mostly search engines and social media networks — have unwittingly been thrust into roles their creators never envisioned for themselves. Unless I’m mistaken, they launched under the same business model as broadcast media: create content, or better yet, crowd-source content, to draw in viewers and subscribers whose attention is then delivered to advertisers. Revenue is derived from advertisers while the basic services — i.e., search, job networking, encyclopedias and dictionaries, or social connection — are given away gratis. The modest inconveniences and irritations of having the screen littered and interrupted with ads are a trade-off most end users are happy to accept for free content.

Along the way, some platform operators discovered that user data itself could be both aggregated and individualized and subsequently monetized. This second step unwittingly created the so-called surveillance capitalism that Shoshana Zuboff writes about in her recently published book (previously blogged about here). Essentially, an Orwellian Big Brother (several of them, in fact) tracks one’s activity through smartphone apps and Web browsers, including GPS data revealing movement through real space, not just virtual spaces. This is also the domain of the national security state from local law enforcement to the various security branches of the Federal government: dragnet surveillance where everyone is watched continuously. Again, end users shrug off surveillance as either no big deal or too late to resist.

The most recent step is that, like the Internet itself, various platforms have been functioning for some time already as public utilities and accordingly fallen under demand for regulation with regard to authenticity, truth, and community standards of allowable speech. Thus, private corporations have been thrust unexpectedly into the role of regulating content. Problem is, unlike broadcast networks that create their own content and can easily enforce restrictive standards, crowd-sourced platforms enable the general population to upload its own content, often mere commentary in text form but increasingly as video content, without any editorial review. These platforms have parried by deploying and/or modifying their preexisting surveillance algorithms in search of objectionable content normally protected as free speech and taken steps to remove content, demonetize channels, and ban offending users indefinitely, typically without warning and without appeal.

If Internet entrepreneurs initially got into the biz to make a few (or a lot of) quick billions, which some few of them have, they have by virtue of the global reach of their platforms been transformed into censors. It’s also curious that by enabling end users to publish to their platforms, they’ve given voice to the masses in all their unwashed glory. Now, everyone’s crazy, radicalized uncle (or sibling or parent or BFF) formerly banished to obscurity railing against one thing or another at the local tavern, where he was tolerated as harmless so long as he kept his bar tab current, is proud to fly his freak flag anywhere and everywhere. Further, the anonymous coward who might issue death or bomb threats to denounce others has been given means to distribute hate across platforms and into the public sphere, where it gets picked up and maybe censored. Worst of all, the folks who monitor and decide what is allowed, functioning as modern-day thought police, are private citizens and corporations with no oversight or legal basis to act except for the fact that everything occurs on their respective platforms. This is a new aspect to the corporatocracy but not one anyone planned.

This is about to get weird.

I caught a good portion of a recent Joe Rogan podcast (sorry, no link or embedded video) with Alex Jones and Eddie Bravo (nearly 5 hours long instead of the usual 2 to 3) where the trio indulged themselves in a purported grand conspiracy to destroy civilization and establish a new post-human one. The more Jones rants (which is quite a lot), the more he sounds like a madman. But he insists he does so to serve the public. He sincerely wants people to know things he’s figured out about an evil cabal of New World Order types. So let me say at least this: “Alex Jones, I hear you.” But I’m unconvinced. Apologies to Alex Jones et al. if I got any details wrong. For instance, it’s not clear to me whether Jones believes this stuff himself or he’s merely reporting what others may believe.

The grand conspiracy supposedly involves interdimensional beings operating at a subliminal range below or beyond normal human perception. Perhaps they revealed themselves to a few individuals (to the cognoscenti, ya know, or is that shared revelation how one is inducted into the cognoscenti?). Rogan believes that ecstatic states induced by drugs provide access to revelation, like tuning a radio to the correct (but secret) frequency. Whatever exists in that altered cognitive state appears like a dream and is difficult to understand or remember. The overwhelming impression Rogan reports as lasting is of a distinct nonhuman presence.

Maybe I’m not quite as barking mad as Jones or as credulous as Rogan and Bravo, but I have to point out that humans are interdimensional beings. We move through three dimensions of space and one unidirectional dimension of time. If that doesn’t quite make sense, then I refer readers to Edwin Abbott’s well-known book Flatland. Abbott describes what it might be like for conscious beings in only two dimensions of space (or one). Similarly, for most of nature outside of vertebrates, it’s understood that consciousness, if it exists at all (e.g., not in plants), is so rudimentary that there is no durable sense of time. Beings exist in an eternal now (could be several seconds long/wide/tall — enough to function) without memory or anticipation. With that in mind, the possibility of multidimensional beings in 5+ dimensions completely imperceptible to us doesn’t bother me in the least. The same is true of the multiverse or many-worlds interpretation. What bothers me is that such beings would bother with us, especially with a conspiracy to crash civilization.

The other possibility at which I roll my eyes is a post-human future: specifically, a future when one’s consciousness escapes its biological boundaries. The common trope is that one’s mind is uploaded to a computer to exist in the ether. Another is that one transcends death somehow with intention and purpose instead of simply ceasing to be (as atheists believe) or some variation of the far more common religious heaven/hell/purgatory myth. This relates as well to the supposition of strong AI about to spark (the Singularity): self-awareness and intelligent thought that can exist on some substrate other than human biology (the nervous system, really, including the brain). Sure, cognition can be simulated for some specific tasks like playing chess or go, and we humans can be fooled easily into believing we are communicating with a thought machine à la the Turing Test. But the rather shocking sophistication, range, utility, and adaptability of even routine human consciousness is so far beyond any current simulation that the usual solution to get engineers from where they are now to real, true, strong AI is always “and then a miracle happened.” The easy, obvious route/accident is typically a power surge (e.g., a lightning strike).

Why bother with mere humans is a good question if one is post-human or an interdimensional being. It could well be that existence in such a realm would make watching human interactions either impenetrable (news flash, they are already) or akin to watching through a dim screen. That familiar trope is the lost soul imprisoned in the spirit world, a parallel dimension that permits viewing from one side only but prohibits contact except perhaps through psychic mediums (if you believe in such folks — Rogan for one doesn’t).

The one idea worth repeating from the podcast is the warning not to discount all conspiracy theories out of hand as bunk. At least a few have been demonstrated to be true. Whether any of the sites behind that link are to be believed I leave you readers to judge.

Addendum: Although a couple comments came in, no one puzzled over the primary piece I had to add, namely, that we humans are interdimensional beings. The YouTube video below depicts a portion of the math/science behind my statement, showing how at least two topological surfaces behave paradoxically when limited to 2 or 3 dimensions but theoretically cohere in 4+ dimensions imperceptible to us.

Everyone is familiar with the convention in entertainment media where characters speak without the use of recognizable language. (Not really related to the convention of talking animals.) The first instance I can recall (someone correct me if earlier examples are to be found) is the happy-go-lucky bird Woodstock from the old Peanuts cartoons (do kids still recognize that cast of characters?), whose dialog was shown graphically as a series of vertical lines.

When the cartoon made its way onto TV for holiday specials, its creator Charles Schulz used the same convention to depict adults, never shown onscreen but with dialogue voiced by a Harmon-muted trombone. Roughly a decade later, two characters from the Star Wars franchise “spoke” in languages only other Star Wars characters could understand, namely, Chewbacca (Chewie) and R2-D2. More recently, the character Groot from Guardians of the Galaxy (known to me only through the Marvel movie franchise, not through comic books) speaks only one line of dialogue, “I am Groot,” which is understood as full speech by other Guardians characters. When behemoths larger than a school bus (King Kong, Godzilla, Jurassic dinosaurs, Cloverfield, Kaiju, etc.) appear, the characters are typically denied the power of speech beyond the equivalent of a lion’s roar. (True villains talk little or not at all as they go about their machinations — no monologuing! unless it’s a James Bond film. An exception notable for its failure to charm audiences is Ultron, who wouldn’t STFU. You can decide for yourself which is the worse kind of villainy.)

This convention works well enough for storytelling and has the advantage of allowing the reader/viewer to project onto otherwise blank speech. However, when imported into the real world, especially in politics, the convention founders. There is no Babelfish universal translator inserted in the ear to transform nonsense into coherence. The obvious example of babblespeech is 45, whose speech when off the teleprompter is a series of rambling non sequiturs, free associations, slogans, and sales pitches. Transcripts of anyone’s extemporaneous speech reveal lots of restarts and blind alleys; we all interrupt ourselves to redirect. However, word salad that substitutes for meaningful content in 45’s case is tragicomic: alternately entirely frustrating or comically entertaining depending on one’s objective. Satirical news shows fall into the second category.

45 is certainly not the first. Sarah Palin in her time as a media darling (driver of ratings and butt of jokes — sound familiar?) had a knack for crazy speech combinations that were utter horseshit yet oddly effective for some credulous voters. She was even a hero to some (nearly a heartbeat away from being the very first PILF). We’ve also now been treated to a series of public interrogations where a candidate for a cabinet post or an accused criminal offers testimony before a congressional panel. Secretary of Education Betsy DeVos famously evaded simple yes/no questions during her confirmation hearing, and Supreme Court Justice Brett Kavanaugh similarly refused to provide direct answers to direct questions. Unexpectedly, sacrificial lamb Michael Cohen did give direct answers to many questions, but his interlocutors then didn’t quite know how to respond considering their experience and expectation that no one answers appropriately.

What all this demonstrates is that there is often a wide gulf between what is said and what is heard. In the absence of what might be understood as effective communication (honest, truthful, and forthright), audiences and voters fill in the blanks. Ironically, we also can’t handle too much truth when confronted by its awfulness. None of this is a problem in storytelling, but when found in political narratives, it’s emblematic of how dysfunctional our communications have become, and with them, the clear thought and principled activity of governance.

In the lost decades of my youth (actually, early adulthood, but to an aging fellow like me, that era now seems like youth), I began to acquire audio equipment and recordings (LPs, actually) to explore classical music as an alternative to frequent concert attendance. My budget allowed only consumer-grade equipment, but I did my best to choose wisely rather than guess and end up with flashy front-plates that distract from inferior sound (still a thing, as a visit to Best Buy demonstrates). In the decades since, I’ve indulged a modest fetish for high-end electronics that fits neither my budget nor lifestyle but nonetheless results in my simple two-channel stereo (not the surround sound set-ups many favor) of individual components providing fairly astounding sonics. When a piece exhibits problems or a connection gets interrupted, I often resort to older, inferior, back-up equipment before troubleshooting and identifying the problem. Once the correction is made, return to premium sound is an unmistakable improvement. When forced to resort to less-than-stellar components, I’m sometimes reminded of a remark a friend once made, namely, that when listening, he tries to hear the quality in the performance despite degraded reproduced sound (e.g., surface noise on the LP).

Though others may argue, I insist that popular music does not require high fidelity to enjoy. The truth in that statement is evidenced by how multifunction devices such as phones and computers are used by most people to listen to music. Many influencers laugh and scoff at the idea that anyone would buy physical media or quality equipment anymore; everything now is streamed to their devices using services such as Spotify, Apple Music, or Amazon Prime. From my perspective, they’re fundamentally insensitive to subtle gradations of sound. Thumping volume (a good beat) is all that’s needed or understood.

However, multifunction devices do not aim at high fidelity. Moreover, clubs and outdoor festivals typically use equipment designed for sheer volume rather than quality. Loud jazz clubs might be the worst offenders, especially because intimate, acoustic performance (now mostly abandoned) set an admirable artistic standard only a few decades ago. High volume creates the illusion of high energy, but diminishing returns set in quickly as the human auditory system reacts to extreme volume by blocking as much sound as possible to protect itself from damage, or more simply, by going deaf slowly or quickly. Reports of performers whose hearing is wrecked from short- or long-term overexposure to high volume are legion. Profound hearing loss is already appearing throughout the general public the same way enthusiastic sunbathers are developing melanoma.

As a result of technological change, notions of how music is meant to sound are shifting. Furthermore, the expectation that musical experiences are to be shared by audiences of more than, say, a few people at a time is giving way to the singular, private listening environment enabled by headphones and earbuds. (Same thing happened with reading.) Differences between music heard communally in a purposed performance space (whether live or reproduced) and music reproduced in the ear canal (earbuds) or over the ear (headphones) — now portable and ubiquitous — lead to audio engineers shifting musical perspective yet again (just as they did at the onset of the radio and television eras) to accommodate listeners with distorted expectations of how music should sound.

No doubt, legitimate musical experiences can be had through reproduced sound, though degraded means produce lesser approximations of natural sound and authenticity as equipment descends in price and quality or the main purpose is simply volume. Additionally, most mainstream popular musics require amplification, as opposed to traditional acoustic forms of musicmaking. Can audiences/listeners actually get beyond degradation and experience artistry and beauty? Or must we be content with facsimiles that no longer possess the intent of the performers or a robust aesthetic experience? These may well be questions for the ages for which no solid answers obtain.

Caveat: Rather uncharacteristically long for me. Kudos if you have the patience for all of this.

Caught the first season of HBO’s series Westworld on DVD. I have a boyhood memory of the original film (1973) with Yul Brynner and a dim memory of its sequel Futureworld (1976). The sheer charisma of Yul Brynner in the role of the gunslinger casts a long shadow over the new production, not that most of today’s audiences have seen the original. No doubt, 45 years of technological development in film production lends the new version some distinct advantages. Visual effects are quite stunning and Utah landscapes have never been used more appealingly in terms of cinematography. Moreover, storytelling styles have changed, though it’s difficult to argue convincingly that they’re necessarily better now than then. Competing styles only appear dated. For instance, the new series has immensely more time to develop its themes; but the ancient parables of hubris and loss of control over our own creations run amok (e.g., Shelley’s Frankenstein, or more contemporaneously, the surprisingly good new movie Upgrade) have compact, appealing narrative arcs quite different from constant teasing and foreshadowing of plot developments while actual plotting proceeds glacially. Viewers wait an awful lot longer in the HBO series for resolution of tensions and emotional payoffs, by which time investment in the story lines has been dispelled. There is also no terrifying crescendo of violence and chaos demanding rescue or resolution. HBO’s Westworld often simply plods on. To wit, a not insignificant portion of the story (um, side story) is devoted to boardroom politics (yawn) regarding who actually controls the Westworld theme park. Plot twists and reveals, while mildly interesting (typically guessed by today’s cynical audiences), do not tie the narrative together successfully.

Still, Westworld provokes considerable interest from me due to my fascination with human consciousness. The initial episode builds out the fictional future world with characters speaking exposition clearly owing its inspiration to Julian Jaynes's book The Origin of Consciousness in the Breakdown of the Bicameral Mind (another reference audiences are quite unlikely to know or recognize). I've had the Julian Jaynes Society's website bookmarked for years and read the book some while back; I never imagined it would be captured in modern fiction. Jaynes's thesis (if I may be so bold as to summarize radically) is that modern consciousness coalesced around the collapse of multiple voices in the head — ideas, impulses, choices, decisions — into a single stream of consciousness perhaps better understood (probably not) as the narrative self. (Aside: the multiple voices of antiquity correspond to polytheism, whereas the modern singular voice corresponds to monotheism.) Thus, modern human consciousness arose over several millennia as the bicameral mind (the divided brain having two camerae, or chambers) functionally collapsed. The underlying story of the new Westworld is the emergence of machine consciousness, a/k/a strong AI, a/k/a the Singularity, whereas the old Westworld was about a mere software glitch. Exploration of machine consciousness modeling (e.g., improvisation builds on memory to create awareness) as a proxy for better understanding human consciousness might not be the purpose of the show, but it's clearly implied. And although conjectural, the slow emergence of human consciousness contrasts sharply with the abrupt ON switch of theorized machine consciousness. Westworld treats them as roughly equivalent, though in fairness, 35 years or so in Westworld is indeed abrupt compared to several millennia.
(Indeed, the story asserts that machine consciousness sparked alive repeatedly (which I suggested here) over those 35 years but was dialed back each time. Never mind all the unexplored implications.) Additionally, the fashion in which Westworld uses the term bicameral ranges from sloppy to meaningless, like the infamous technobabble of Star Trek.


Political discussion usually falls out of scope on this blog, though I use the politics category and tag often enough. Instead, I write about collapse, consciousness, and culture (and to a lesser extent, music). However, politics is front and center with most media, everyone taking whacks at everyone else. Indeed, the various political identifiers are characterized these days by their most extreme adherents. The radicalized elements of any political persuasion are the noisiest and thus the most emblematic of a worldview if one judges solely by the most attention-grabbing factions, which is regrettably the case for a lot of us. (Squeaky-wheel syndrome.) Similarly, in the U.S. at least, the spectrum is typically expressed as a continuum from left to right (or right to left) with camps divided nearly in half based on voting. Opinion polls reveal a more lopsided division (toward Leftism/Progressivism as I understand it) but still reinforce the false binary.

More nuanced political thinkers allow for at least two axes of political thought and opinion, usually plotted on an x-y coordinate plane (again, left to right and down to up). Some look more like the one below (a quick image search will reveal dozens of variations), with outlooks divided into regions of a Venn diagram suspiciously devoid of overlap. The x-y coordinate plane still underlies the divisions.


If you don’t know where your political compass points, you can take this test, though I’m not especially convinced that the result is useful. Does it merely apply more labels? If I had to plot myself according to the traditional divisions above, I’d probably be a centrist, which is to say, nothing. My positions on political issues are not driven by party affiliation, motivated by fear or grievance, subject to a cult of personality, or informed by ideological possession. Perhaps I’m unusual in that I can hold competing ideas in my head (e.g., individualism vs. collectivism) and make pragmatic decisions. Maybe not.

If worthwhile discussion is sought among principled opponents (a big assumption, that), it is necessary to diminish or ignore the more radical voices screaming insults at others. However, multiple perverse incentives reward the most heinous adherents with the greatest attention and control of the narrative(s). In light of the news out just this week, call it Body Slam Politics. It's a theatrical style born of the fake drama of the professional wrestling ring (not an original observation on my part), and we know who the king of that style is. Watching it unfold too closely is a guaranteed way to destroy one's political sensibility, to say nothing of wrecked brain cells. The spectacle depicted in Idiocracy has arrived early.

I’m on the sidelines with the issue of free speech, an observer with some skin in the game but not really much at risk. I’m not the sort to beat my breast and seek attention over what seems to me a fairly straightforward value, though one with lots of competing interpretations. It helps that I have no particularly radical or extreme views to express (e.g., you won’t find me burning the flag), though I am an iconoclast in many respects. The basic value is that folks get to say (and by extension think) whatever they want short of inciting violence. The gambit of the radicalized left has been to equate speech with violence. With hate speech, that may actually be the case. What is recognized as hate speech may be changing, but liberal inclusion strays too far into mere hurt feelings or discomfort, thus the risible demand for safe spaces and trigger warnings suitable for children. If that standard were applied rigorously, free speech as we know it in the U.S. would come to an abrupt end. Whatever SJWs may say they want, I doubt they really want that and suggest they haven’t thought it through well enough yet.

An obvious functional limitation is that one doesn’t get to say whatever one wishes whenever and wherever one wants. I can’t simply breach security and go onto The Tonight Show, a political rally, or a corporate boardroom to tell my jokes, voice my dissent, or vent my dissatisfaction. In that sense, deplatforming may not be an infringement of free speech but a pragmatic decision regarding whom it may be worthwhile to host and promote. Protest speech is a complicated area, as free speech areas designated blocks away from an event are clearly set up to nullify dissent. No attempt is made here to sort out all the dynamics and establish rules of conduct for dissent or the handling of dissent by civil authorities. Someone else can attempt that.

My point with this blog post is to observe that for almost all of us in the U.S., free speech is widely available and practiced openly. That speech has conceptual and functional limitations, such as the ability to attract attention (“move the needle”) or convince (“win hearts and minds”), but short of gag orders, we get to say/think what we want and then deal with the consequences (often irrelevance), if any. Adding terms to the taboo list is a waste of time and does no more to guide people away from thinking or expressing awful things than does the adoption of euphemism or generics. (The terms moron, idiot, and imbecile used to be acceptable psychological classifications, but usage shifted. So many euphemisms and alternatives to calling someone stupid exist that avoiding the now-taboo word retard accomplishes nothing. Relates to my earlier post about epithets.)

Those who complain their free speech has been infringed, and those who support free speech vociferously as the primary means of resolving conflict, seem not to realize that their objections are less to free speech being imperiled than to its unpredictable results. For instance, the Black Lives Matter movement successfully drew attention to a real problem: police using unnecessary lethal force against black people with alarming regularity. Good so far. The response was Blue Lives Matter, then All Lives Matter, then accusations of separatism and hate speech. That’s the discussion happening — free speech in action. Similarly, when Colin Kaepernick famously took a knee rather than stand and sing the national anthem (hand over heart, uncovered head), a rather modest protest as protests go, he drew attention to racial injustice that then morphed into further, ongoing discussion of who, when, how, and why anyone gets to protest — a metaprotest. Nike’s commercial featuring Kaepernick and the decline of attendance at NFL games are part of that discussion, with the public participating or refusing to participate as the case may be. Discomforts and sacrifices are experienced all around. This is not Pollyannaish assurance that all is well and good in free speech land. Whistleblowers and Me Too accusers know only too well that reprisals ruin lives. Rather, it’s an ongoing battle for control of the narrative(s). Fighting that battle inevitably means casualties. Some engage from positions of considerable power and influence, others as underdogs. The discussion is ongoing.