Archive for the ‘Media’ Category

Ours is an era when individuals are encouraged to explore, amplify, and parade various attributes of their identities out in public, typically via social media. For those just coming of age or recently arrived at adulthood, whose identities are not yet fully formed, defining oneself is more nearly a demand. When identity is further complicated by unusual levels of celebrity, wealth, beauty, and athleticism (lots of overlap there), defining oneself is often an act of rebellion against the perceived demands of an insatiable public. Accordingly, it was unsurprising, to me at least, to learn of several well-known people unhappy with their lives and the burdens upon them.

Regular folks can’t truly relate to the glitterati, who are often held up as aspirational models. For example, many of us look upon the discomforts of Prince Harry and Meghan Markle with a combination of perverse fascination and crocodile tears. They were undoubtedly trapped in a strange, gilded prison before repudiating the duties expected of them as “senior royals,” attempting an impossible retreat to normalcy outside of England. It should be obvious that they will continue to be hounded so long as public interest in them persists. Similarly, Presley Gerber made news, fell out of the news, and then got back into the news as a result of his growing collection of tattoos. Were he simply some anonymous fellow, few would care. However, he has famous parents and had already launched a modeling career before his face tattoo announced his sense of being “misunderstood.” Pretty bold move. With all the presumed resources and opportunities at his disposal, many have wondered, in comments and elsewhere, whether another, better declaration of self might have been preferred.

Let me give these three the benefit of the doubt. Although they all have numerous enviable attributes, the accident of birth (or in Markle’s case, the decision to marry) landed them in exceptional circumstances. The percentage of celebrities who crack under the pressure of unrelenting attention and proceed to run off the rails is significant. Remaining grounded is no doubt easier if one attains celebrity (or absurdly immense wealth) after, say, the age of 25 or even later. (On some level, we’ve all lost essential groundedness in reality, but that’s another blog post.) Those who are children of celebrities or who become child prodigies may not all be consigned to character distortion or a life irrevocably out of balance, but such outcomes are at least so commonplace that the dangerous potential should be recognized and approached only with wariness. I’ve heard of programs designed to help professional athletes who become sudden multimillionaires (and thus targets of golddiggers and scammers) make the transition. Good for them that structured support is available. Yet another way average folks can’t relate: we have to work things out for ourselves.

Here’s the example I don’t get: Taylor Swift. She was the subject of a Netflix biography called Miss Americana (2020) that paints her as, well, misunderstood. Thing is, Swift is a runaway success story, raking in money, fans, awards, attention, and, inevitably, detractors. That success is something she earnestly desired and struggled to achieve, only to learn that the glossy, popstar image sold especially but not exclusively to 14-year-old girls comes with a lot of heavy baggage. How can the tragic lives of so many musicians launched into superstardom from the late 1950s onward have escaped Swift’s awareness in our media-saturated world? Naming names is sorta tacky, so I demur, but there are lots of them. Swift obtained her heart’s desire, found her songwriting and political voice, maintains a high public profile, and shows no lack of productivity. Sure, it’s a life out of balance, not remotely normal the way most noncelebrities would understand it. However, she signed up for it willingly (if naïvely) and by all accounts perpetuates it. She created her own distinctive gilded prison. I don’t envy her, nor do I particularly feel sorry for her, as the Netflix show appears to instruct.

Caveat: rather overlong for me, but I got rolling …

One of the better articles I’ve read about the pandemic is this one by Robert Skidelsky at Project Syndicate (a publication I’d never heard of before). It reads as only slightly conspiratorial, purporting to reveal the true motivation for lockdowns and social distancing, namely, so-called herd immunity. If that’s the case, it’s basically a silent admission that no cure, vaccine, or inoculation is forthcoming and the spread of the virus can only be managed modestly until it has essentially raced through the population. Of course, the virus cannot be allowed to simply run its course unimpeded, but available impediments are limited. “Flattening the curve,” or distributing the infection and death rates over time, is the only attainable strategy and objective.

Wedding mathematical and biological insights, as well as the law of mass action in chemistry, into an epidemic model may seem obvious now, but it was novel roughly a century ago. We’re also now inclined, if scientifically oriented and informed, to understand the problem and its potential management in terms of engineering rather than medicine (or maybe in terms of triage and palliation). Global response has also made the pandemic into a political issue as governments obfuscate and conceal true motivations behind their handling (bumbling in the U.S.) of the pandemic. Curiously, the article also mentions financial contagion, which is shaping up to be worse in both severity and duration than the viral pandemic itself.
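For readers who want to see the mechanics, the century-old synthesis alluded to above is the Kermack-McKendrick SIR model, simple enough to sketch in a few lines of Python. The parameters below are purely illustrative (not fitted to any real virus), but the comparison shows concretely what “flattening the curve” means: halving the transmission rate yields a far lower infection peak spread over a longer period.

```python
# Minimal SIR epidemic model (Kermack-McKendrick, 1927).
# The law of mass action appears in the beta * s * i / n term:
# new infections are proportional to encounters between the
# susceptible (s) and infected (i) populations.
# All numbers below are made up for illustration.

def sir_peak(beta, gamma=0.1, days=300, n=1_000_000, i0=100):
    """Run a daily-step SIR simulation and return the peak infected count."""
    s, i, r = n - i0, float(i0), 0.0
    peak = i
    for _ in range(days):
        new_infections = beta * s * i / n   # mass-action contact term
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        peak = max(peak, i)
    return peak

print(f"peak infected, unmitigated (beta=0.30): {sir_peak(0.30):>10,.0f}")
print(f"peak infected, mitigated   (beta=0.15): {sir_peak(0.15):>10,.0f}")
```

The mitigated curve runs lower and longer, the point being to keep the peak under whatever line one imagines for hospital capacity. That, and only that, is the attainable objective.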


I had at least two further ideas for this third part of a series, but frankly, given the precipitous turn of events over the past month or so, nothing feels appropriate to write about just yet other than the global pandemic that has staggered society, which now reels from our being forced apart from each other and from having the way of life to which we are adapted suddenly ripped out from beneath us. As the voiceover at the beginning of one of the Lord of the Rings movies intones rather soberly, “The world … has changed ….” That was my assessment here, though I was really thinking of the post-truth public sphere.

Many are already admitting that we will never be able to go back to what once was, that what broke will stay forever broken. And while the eventual response may be interpreted in sweet-lemon style as a reform opportunity or a clarion call to greatness, I daresay a far more likely result is that mass death, sickness, and ruin will create a critical mass of desperate people not so willing to stay hunkered down waiting for the extended crisis to pass. Indeed, the bunker mentality already imprinted on our minds as we cringe before the next in a series of injurious blows can’t be expected to endure. Folks will take to the streets with all their stockpiled guns and ammo, seeking something, anything to do, rather than dying quietly, meekly, alone, at home. The metaphor of being pummeled into submission or to death is probably incorrect. Right now, we’re still only partway up one of those parabolic curves that ultimately point skyward. Alternatively, it’s a crescendo of pain that overwhelms until nothing functions anymore.

If surviving historians are able to piece together the story some time hence, one possibility will be to observe that the abundance we sorta enjoyed during two centuries of cheap energy did not develop into anything resembling an enlightened style of social organization that could be sustained, or indeed even prepare us adequately for inevitable black swan events. Such discontinuities are entirely predictable by virtue of their inevitability, though pinpointing their timing is a fool’s errand. Natural disasters are the obvious example: despite organizations and agencies scattered throughout all levels of government, we’re caught flat-footed nearly every time disaster strikes. This global pandemic is no different, nor is the collapse of industrial civilization or runaway climate change. The current crisis is the first major kick in the teeth that may well cascade domino-style into full-on collapse.

As the crisis deepens, top leaders are often found to be worthless. Where is Pence, appointed more than a month ago to coordinate a coronavirus task force? It’s quite unlike a major political figure to do his or her work quietly and competently without media in tow. Even incompetence gets coverage, but Pence is nowhere to be seen. Must be self-quarantining. Some leaders are even worse than worthless; they actively add to the misery. Mainstream media may also have finally gotten hip to the idea that hanging on every insipid word uttered by that gaping chasm of stupidity that is our president is no longer a ratings bonanza to be tolerated in exchange for fruitless fact-checking missions. I fantasize about press events where correspondents heckle and laugh the fatuous gasbag (or his apologists) off the podium. Regrettably, there seems to be no bottom to the humiliation he can withstand so long as attention stays riveted on him. Perhaps the better response to his noisome nonsense would be stone silence — crickets.

In educational philosophy, learning is often categorized in three domains: the cognitive, the affective, and the psychomotor (called Bloom’s Taxonomy). Although formal education admittedly concentrates primarily on the cognitive domain, a well-rounded person gives attention to all three. The psychomotor domain typically relates to tool use and manipulation, but if one considers the body itself a tool, then athletics and physical workouts are part of a balanced approach. The affective domain is addressed through a variety of mechanisms, not least of which is narrative, much of it entirely fictional. We learn how to process emotions through vicarious experience as a safe way to prepare for the real thing. Indeed, dream life is described as the unconscious mind’s mechanism for consolidating memory and experience as well as rehearsing prospective events (strategizing) in advance. Nightmares are, in effect, worst-case scenarios dreamt up for the purpose of avoiding the real thing (e.g., falling from a great height or venturing too far into the dark — a proxy for the unknown). Intellectual workouts address the cognitive domain. While some are happy to remain unbalanced, focusing on strengths found exclusively in a single domain (gym rats, eggheads, actors) and thus remaining physically, emotionally, or intellectually stunted or immature, most understand that workouts in all domains are worth seeking out as elements of healthy development.

One form of intellectual workout is debate, now offered by various media and educational institutions. Debate is quite old but has been embraced with renewed gusto in a quest to develop content (using new media) capable of drawing in viewers, which mixes educational objectives with commercial interests. The time-honored political debate used to be good for determining where to cast one’s vote but has become nearly useless in the last few decades, as neither the sponsoring organizations, the moderators, nor the candidates seem to understand anymore how to run a debate or behave properly. Instead, candidates use the opportunity to attack each other, ignore questions and glaring issues at hand, and generally refuse to offer meaningful responses to the needs of voters. Indeed, this last was among the principal innovations of Bill Clinton: roll out some appealing bit of vacuous rhetoric yet offer little to no guidance as to what policies will actually be pursued once in office. Two presidential administrations later, Barack Obama did much the same, which I consider a most egregious betrayal or bait-and-switch. Opinions vary.

In a recent Munk Debate, the proposition under consideration was whether humankind’s best days lie ahead or behind. Optimists won the debate by a narrow margin (determined by audience vote); however, debate on the issue is not binding truth, nor does debate really resolve the question satisfactorily. The humor and personalities of the debaters probably had more influence than their arguments. Admitting that I possess biases, I found myself inclined favorably toward the most entertaining character, though what I find entertaining is itself a further bias not necessarily shared by many others. In addition, I suspect the audience did not include many working-class folks or others who see their prospects for better lives diminishing rapidly, which skews the resulting vote. The age-old parental desire to leave one’s children a better future than one’s own is imperiled according to this poll (polls may vary considerably — do your own search). How one understands “better off” is highly variable, but the usual way it’s understood is in terms of material wellbeing.

Folks on my radar (names withheld) range widely in their enthusiasm or disdain for debate. The poles appear to be, at one end, default refusal to accept invitations to debate (often couched as open challenges to professed opinions) as a complete waste of time and, at the other, earnest desire to participate in, host, and/or moderate debates as a means of informing the public by providing the benefit of expert argumentation. As an intellectual workout, I appreciate the opportunity to hear debates (at least when I’m not exasperated by a speaker’s lack of discipline or end-around arguments), but readers can guess from the title of this post that I expect nothing to be resolved by debate. Were I ever offered an opportunity to participate, I can well imagine accepting the invitation and having some fun flexing my intellectual muscles, but I would enter into the event with utterly no expectation of being able to convince anyone of anything. Minds are already too well made up on most issues. If I were offered a spot on some bogus news-and-opinion show to be a talking head, shot from the shoulders up and forced to shout and interrupt to get a brief comment or soundbite in edgewise, that I would decline handily as a total waste of time.

Nicholas Carr has a pair of thoughtful new posts at his blog Rough Type (see blogroll) under the tag “infinite media.” The second of the two is about context collapse, restoration, and content collapse. I won’t review that particular post; I’m merely pointing to it for you to read. Carr is a journalist and media theorist whose work is especially interesting to me as a partial antidote to what I’ve been calling our epistemological crisis. In short, he offers primers on how to think about the primary medium through which most people now gather information: screens.

Relatedly, the other media theorist to whom I pay attention is Alan Jacobs, who has a recent book (which I read but didn’t review or blog about) called, more simply, How to Think. It’s about recognizing and avoiding cognitive biases on the way to more disciplined, clear thinking. I mention these two fellows together because I’ve been reading their blogs and books for over a decade now and have been curious to observe how their public interactions have changed over time. They have each embraced and abandoned various new media (particularly social media) and adopted more stringent media ecologies. Carr posts occasionally now and has closed comments at his blog (a shame, since his commentariat was valuable, quite unlike the troll mob at most sites). Jacobs is even more aggressive, starting and abandoning one blog after another (he was active at multiple URLs, one formerly on my blogroll) and deleting his Twitter account entirely. Whatever goings-on occur at Facebook I can’t say; I never go there. These aren’t criticisms. We all evolve our associations and activities. But these two are unusual, perhaps, in that they evaluate and recommend with varying vehemence how to interact with electronic media tools.

The wide-open Web available to Americans (but restricted in some countries) used to be valorized as a wholly democratic, organic, grass-roots, decentralized force for good where information yearned to breathe free. Though pioneered by academic institutions, it wasn’t long before the porn industry became the first to monetize it effectively (cuz duh! that’s where the money was — at least initially) and then the whole thing was eventually overwhelmed by others with unique agendas and mechanisms, including commerce, surveillance, and propaganda. The surfeit of information demanded curation, and social media with algorithmic feeds became the default for folks either too lazy, untrained, or simply uninterested in thinking for themselves. Along the way, as a surprisingly large portion of human activity diverted to online media, that activity turned into a resource mined, harvested, and in turn monetized, much like the voting public has become a resource tracked, polled, channeled, activated, disenfranchised, corrupted, and analyzed to death.

An earlier media theorist I read with enthusiasm, Neil Postman, recommended that curricula include the study of semantics as applied to media. (Use of a word like semantics sends nonacademics running for the hills, but the recommendation is basically about thinking critically, even skeptically, regarding information, its sources, and its means of distribution.) The rise of handheld omnimedia postdates Postman, so I can only surmise that the bewildering array of information we confront and absorb every day, which I liken to drinking from a fire hose, only compounds Postman’s concern that students are severely overmatched by media (especially advertising) intent on colonizing and controlling their minds. Thus, today’s information environment is a far cry from the stately slowness of earlier eras when teaching and learning (to say nothing of entertainment) were conducted primarily through reading, lecture, and discussion.

A comment came in on this blog chiding me for still blogging after 14 years. I admit hardly anyone reads anymore; they watch (or listen, as with audio-only podcasts). Preferred forms of media consumption have moved on from printed text, something USA Today recognized decades ago when it designed its print publication and sidewalk distribution boxes to look more like TVs. Nonetheless, the modest reproach reminded me of a cry in the wilderness by Timothy Burke: why he still blogs, though quite infrequently. (There’s a brokeback can’t-quit-you joke in there somewhere I’ll leave unformulated.) So this blog may indeed be past its proper expiration date, yet it remains for me one of the best means for organizing how I think about stuff. Without it, I’m afraid thoughts would be rattling loose inside my head, disorganized, only to be displaced by the next slurp from the fire hose.

Cenk Uygur is running for U.S. Congress in California. Good for him … I guess. Horse-race politics don’t actually interest me, at least as a topic for a blog post, but his decision to enter the electoral fray poses some curious issues. What follows is some context and unsolicited advice, the latter exceptional for me since I’m not a political advocate and don’t reside in Cenk’s district (or even his state).

Unlike many who heap unwarranted praise on our interrelated systems of government and economics, or who subscribe to some version of Churchill’s quip that democracy is the worst form of government yet preferred over all the others, I regard representative democracy and capitalism both as dumpster fires in the process of burning out. Good ideas while they lasted, perhaps, but they consumed nearly all their available fuel and are now sputtering, leaving behind useless ash and detritus. As a journalist and political commentator, Cenk Uygur may be sensing his Hindenburg moment has arrived to jump onto the sinking RMS Titanic (mixing metaphors of doomed ships), meaning that a serendipitous right-time-right-place opportunity presented itself. Omigawd, the humanity! Others who had their unique Hindenburg moments and made good include Rudy Giuliani in the immediate aftermath of 9/11 (only to spiral down ignominiously) and Alexandria Ocasio-Cortez (AOC, elected to the U.S. Congress in 2018). Dunno about Cenk Uygur. His campaign website linked above rather conspicuously omits his surname (couldn’t find it anywhere). Maybe, as with AOC and Pete Buttigieg, it’s just too challenging for folks. Curious choice.

I have mostly disregarded Cenk Uygur and The Young Turks (TYT) for some time now. They are part of the new media formed and distributed (primarily?) on the Web, though I’m doubtful they (and others) have yet established a useful rival to supplant traditional broadcast journalism (TV and radio), which has become sclerotic. How their business models (and the inevitable distortions those models introduce) differ is unclear. The main reason I ignore him/them is that TYT adopted a breezy, chatty, unscripted style that is less about reporting than interpreting mostly political news on the fly. They are essentially programming their viewers/subscribers with progressive talking points and orthodoxy, a form of narrowcasting. Onscreen “reporters” have come and gone, but none are as boorish as Cenk Uygur, who labors under the impression that he can outwit others with logic traps but really comes across as incoherent, unfocused, and ideological. TYT has also aired its dirty laundry in the form of beefs with former “correspondents.” None of this serves my political and/or intellectual interests.

The tone of TYT puzzles me, too, considering the utter seriousness of political dysfunction. Commentators appear to enjoy being in front of the camera for verbal jousting matches with each other and guests or simply to riff on the news. Another journalist clearly in love with being on-camera is Rachel Maddow, who has been pilloried for promulgating the Russiagate story relentlessly. Maybe anchors who relish (a little too much) being in the public eye are a collateral effect of news bureaus having been folded into the entertainment divisions of media conglomerates and told to put forward a smiling face no matter what horrors are reported. If I want to see politics served up as jokes, I watch Jimmy Dore (who provides an alarming level of insight). If I want to watch people having entertaining fun, I watch movies or stream TV. I do not watch ideological news shows or political debates (if I watch at all) to be entertained but rather to be informed. While TYT commentators endeavor to be scrupulously factually correct in their opinions, they offer too little signal alongside the noise.

So here are a few recommendations for Cenk’s campaign, worth a couple cents at most:

  • Recognize that the political decisions voters now face are no longer merely left/right, progressive/conservative, who-gets-to-hold-office binaries. Rather, the question is now whether we should track further down the path of authoritarian rule (e.g., a fascist national security state) masking itself as populism (but instead serving the plutocracy) under any political banner or instead serve the interests of the American people (best as able) as empire and industrial civilization sputter out.
  • Recognize that logic and reason are poor substitutes for good character and clarity of vision when the public (i.e., the great unwashed masses) responds more readily to jingoism, emotionalism, and empty rhetoric.
  • Zingers, gotchas, and takedowns are gladiatorial exploits that require more than mere accuracy to hit their marks and inflict damage. Take care not to indulge without considerable preparation and nuance. Some are obviously better at this than others.
  • When answering questions and/or giving interviews, do not mistake the exchange for a speech opportunity to dominate from one side (boorishness). Riffing, having fun, and sucking all the air out of the room are the attributes of TYT but wear thin in campaigning. Listening is just as important, maybe more.
  • Align your tone with the gravity of others’ suffering rather than your enjoyment of the applause and limelight. Your personal circumstances are not the proper locus of emotion.
  • Politics is deeply intertwined with wealth, power, and corruption and accordingly creates distortion fields that threaten to undo even the purest of hearts when compromise and/or betrayal are offered as lures. It’s an inevitability, a basic feature rather than a bug. Know that it’s coming. No one is incorruptible.

Admittedly, I’m not a campaign strategist and have no access to polling data. Accordingly, this post will likely be neither read nor its recommendations heeded; I’m not a political playah. Think of this as the undesired Christmas gift so valueless it can’t even be returned for store credit.

Delving slightly deeper after the previous post into someone-is-wrong-on-the-Internet territory (worry not: I won’t track far down this path), I was dispirited after reading some economist dude with the hubris to characterize climate change as fraud. At issue, supposedly, is the misframing of time periods in graphical data for the purpose of overthrowing government and altering the American way of life. (Um, that’s the motivation? Makes no sense.) Perhaps this fellow’s intrepid foray into the most significant issue of our time (only to dismiss it) is an aftereffect of Freakonomics emboldening economists to offer explanations and opinions on matters well outside their field of expertise. After all, truly accurate, relevant information is only ever all about numbers (read: the Benjamins), shaped and delivered by economists, physical sciences be damned.

The author of the article has nothing original to say. Rather, he repackages information from the first of two embedded videos (or elsewhere?), which examines the time frames of several trends purportedly demonstrating global warming (a term most scientists and activists have abandoned in favor of climate change, partly to distinguish climate from weather). Those trends are heat waves, extent of Arctic ice, incidence of wildfires, atmospheric carbon, sea level, and global average temperature. Presenters of weather/climate information (such as the IPCC) are accused of cherry-picking date ranges for statistical data arranged graphically to present a false picture, but then similar data with other dates are used to depict another picture supposedly invalidating the first set of graphs. It’s a case of lying with numbers and then lying some more with other numbers.
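To make the mechanism concrete, here is a toy demonstration in Python using entirely synthetic data (no real climate records involved): a series with a steady underlying trend plus year-to-year noise yields very different fitted slopes depending on the window chosen, which is all a motivated grapher needs.

```python
# Cherry-picking demo on synthetic data: the fitted trend depends
# heavily on the chosen window. The "temperature" series below is
# invented (steady warming plus noise), not a real climate record.
import random

random.seed(42)
years = list(range(1880, 2020))
temps = [0.01 * (y - 1880) + random.gauss(0, 0.15) for y in years]

def trend_per_decade(y0, y1):
    """Least-squares slope over the window [y0, y1], in degrees per decade."""
    xs = [y for y in years if y0 <= y <= y1]
    ys = [temps[years.index(y)] for y in xs]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (v - my) for x, v in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return 10 * slope

print(f"full record,  1880-2019: {trend_per_decade(1880, 2019):+.3f} deg/decade")
print(f"short window, 1998-2012: {trend_per_decade(1998, 2012):+.3f} deg/decade")
print(f"short window, 1970-1998: {trend_per_decade(1970, 1998):+.3f} deg/decade")
```

The full-record slope recovers the trend that was built in; the short windows can wander widely, even dipping negative for some seeds and windows, despite the underlying warming never changing. Both sides of the video’s graph-versus-graph duel are playing this same game.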

Despite the claim that “reports are easily debunked as fraud,” I can’t agree that this example of climate change denial overcomes the overwhelming scientific consensus on the subject. It’s not so much that the data are wrong (I acknowledge they can be misleading) but that interpreting the effects of industrial activity since 1750 (a more reasonable comparative baseline) isn’t so simple as following shortened or lengthened trend lines and demographics up or down. That’s typically zooming in or out to render the picture most amenable to a preferred narrative, precisely what the embedded video does and in turn accuses climate scientists and activists of doing. The comments under the article indicate a chorus of agreement with the premise that climate change is a hoax or fraud. Guess those commentators haven’t yet caught up with rising public sentiment, especially among the young.

Having studied news and evidence of climate change as a layperson for roughly a dozen years now, the conclusions drawn by experts (ignoring economists) convince me that we’re pretty irredeemably screwed. The collapse of industrial civilization and an accompanying death pulse are the predicted outcomes, but a precise date is impossible to provide because it’s a protracted process. An even worse possibility is near-term human extinction (NTHE), part of the larger sixth mass extinction. Absorbing this information has been an arduous, ongoing, soul-destroying undertaking for me, and the evidence keeps being supplemented and revised, usually with ever-worsening prognoses. However, I’m not the right person to argue the evidence. Instead, see this lengthy article (with profuse links) by Dr. Guy McPherson, which is among the best resources outside of the IPCC.

In fairness, except for the dozen years I’ve spent studying the subject, I’m in no better position to offer inexpert opinion than some economist acting the fool. But regular folks are implored to inform and educate themselves on a variety of topics, if for no other reason than to vote responsibly. My apprehension of reality and human dynamics may be no better than the next fellow’s, but as history proceeds, attempting to make sense of the deluge of information confronting everyone is something I take seriously. Accordingly, I’m irked when contentious issues are warped and distorted, whether earnestly or malignantly. Maybe economists, like journalists, suffer from a professional deformation that confers supposed explanatory superpowers. However, in the context of our current epistemological crisis, I approach their utterances and certainty with great skepticism.

There is something ironic and vaguely tragic about how various Internet platforms — mostly search engines and social media networks — have unwittingly been thrust into roles their creators never envisioned for themselves. Unless I’m mistaken, they launched under the same business model as broadcast media: create content, or better yet, crowd-source content, to draw in viewers and subscribers whose attention is then delivered to advertisers. Revenue is derived from advertisers while the basic services — i.e., search, job networking, encyclopedias and dictionaries, or social connection — are given away gratis. The modest inconveniences and irritations of having the screen littered and interrupted with ads are a trade-off most end users are happy to accept for free content.

Along the way, some platform operators discovered that user data itself could be both aggregated and individualized and subsequently monetized. This second step created the so-called surveillance capitalism that Shoshana Zuboff writes about in her recently published book (previously blogged about here). Essentially, an Orwellian Big Brother (several of them, in fact) tracks one’s activity through smartphone apps and Web browsers, including GPS data revealing movement through real space, not just virtual spaces. This is also the domain of the national security state, from local law enforcement to the various security branches of the Federal government: dragnet surveillance where everyone is watched continuously. Again, end users shrug off surveillance as either no big deal or too late to resist.

The most recent step is that, like the Internet itself, various platforms have been functioning for some time already as public utilities and accordingly have fallen under demands for regulation with regard to authenticity, truth, and community standards of allowable speech. Thus, private corporations have been thrust unexpectedly into the role of regulating content. Problem is, unlike broadcast networks that create their own content and can easily enforce restrictive standards, crowd-sourced platforms enable the general population to upload its own content, often mere commentary in text form but increasingly video content, without any editorial review. These platforms have parried by deploying and/or modifying their preexisting surveillance algorithms in search of objectionable content normally protected as free speech, taking steps to remove content, demonetize channels, and ban offending users indefinitely, typically without warning and without appeal.

If Internet entrepreneurs initially got into the biz to make a few (or a lot of) quick billions, which some few of them have, they have by virtue of the global reach of their platforms been transformed into censors. It’s also curious that by enabling end users to publish to their platforms, they’ve given voice to the masses in all their unwashed glory. Now, everyone’s crazy, radicalized uncle (or sibling or parent or BFF), formerly banished to obscurity railing against one thing or another at the local tavern, where he was tolerated as harmless so long as he kept his bar tab current, is proud to fly his freak flag anywhere and everywhere. Further, the anonymous coward who might once have issued death or bomb threats to denounce others has been given means to distribute hate across platforms and into the public sphere, where it gets picked up and maybe censored. Worst of all, the folks who monitor and decide what is allowed, functioning as modern-day thought police, are private citizens and corporations with no oversight or legal basis to act except for the fact that everything occurs on their respective platforms. This is a new aspect of the corporatocracy, but not one anyone planned.

This is about to get weird.

I caught a good portion of a recent Joe Rogan podcast (sorry, no link or embedded video) with Alex Jones and Eddie Bravo (nearly 5 hours long instead of the usual 2 to 3) where the trio indulged themselves in a purported grand conspiracy to destroy civilization and establish a new post-human one. The more Jones speaks (which is quite a lot), the more he sounds like a madman. But he insists he does so to serve the public. He sincerely wants people to know things he’s figured out about an evil cabal of New World Order types. So let me say at least this: “Alex Jones, I hear you.” But I’m unconvinced. Apologies to Alex Jones et al. if I got any details wrong. For instance, it’s not clear to me whether Jones believes this stuff himself or is merely reporting what others may believe.

The grand conspiracy supposedly involves interdimensional beings operating at a subliminal range below or beyond normal human perception. Perhaps they revealed themselves to a few individuals (to the cognoscenti, ya know, or is that shared revelation how one is inducted into the cognoscenti?). Rogan believes that ecstatic states induced by drugs provide access to revelation, like tuning a radio to the correct (but secret) frequency. Whatever exists in that altered cognitive state appears like a dream and is difficult to understand or remember. The overwhelming impression Rogan reports as lasting is of a distinct nonhuman presence.

Maybe I’m not quite as barking mad as Jones or as credulous as Rogan and Bravo, but I have to point out that humans are interdimensional beings. We move through three dimensions of space and one unidirectional dimension of time. If that doesn’t quite make sense, then I refer readers to Edwin Abbott’s well-known book Flatland. Abbott describes what it might be like for conscious beings in only two dimensions of space (or one). Similarly, for most of nature outside of vertebrates, it’s understood that consciousness, if it exists at all (e.g., not in plants), is so rudimentary that there is no durable sense of time. Beings exist in an eternal now (could be several seconds long/wide/tall — enough to function) without memory or anticipation. With that in mind, the possibility of multidimensional beings in 5+ dimensions completely imperceptible to us doesn’t bother me in the least. The same is true of the multiverse or many-worlds interpretation. What bothers me is that such beings would bother with us, especially with a conspiracy to crash civilization.

The other possibility at which I roll my eyes is a post-human future: specifically, a future when one’s consciousness escapes its biological boundaries. The common trope is that one’s mind is uploaded to a computer to exist in the ether. Another is that one transcends death somehow with intention and purpose instead of simply ceasing to be (as atheists believe) or some variation of the far more common religious heaven/hell/purgatory myth. This relates as well to the supposition of strong AI about to spark (the Singularity): self-awareness and intelligent thought that can exist on some substrate other than human biology (the nervous system, really, including the brain). Sure, cognition can be simulated for some specific tasks like playing chess or go, and we humans can be fooled easily into believing we are communicating with a thought machine à la the Turing Test. But the rather shocking sophistication, range, utility, and adaptability of even routine human consciousness is so far beyond any current simulation that the usual solution to get engineers from where they are now to real, true, strong AI is always “and then a miracle happened.” The easy, obvious route/accident is typically a power surge (e.g., a lightning strike).

Why bother with mere humans is a good question if one is post-human or an interdimensional being. It could well be that existence in such a realm would make watching human interactions either impenetrable (news flash, they are already) or akin to watching through a dim screen. That familiar trope is the lost soul imprisoned in the spirit world, a parallel dimension that permits viewing from one side only but prohibits contact except perhaps through psychic mediums (if you believe in such folks — Rogan for one doesn’t).

The one idea worth repeating from the podcast is the warning not to discount all conspiracy theories out of hand as bunk. At least a few have been demonstrated to be true. Whether any of the sites behind that link are to be believed I leave you readers to judge.

Addendum: Although a couple comments came in, no one puzzled over the primary piece I had to add, namely, that we humans are interdimensional beings. The YouTube video below depicts a portion of the math/science behind my statement, showing how at least two topological surfaces behave paradoxically when limited to 2 or 3 dimensions but theoretically cohere in 4+ dimensions imperceptible to us.
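For the mathematically inclined, the usual example of such a surface (and my assumption about what the video depicts) is the Klein bottle: it cannot be embedded in three dimensions without passing through itself, yet it sits comfortably in four. One standard embedding into R^4, for u, v in [0, 2π) and radii R > r > 0:

```latex
\[
f(u,v) = \Big( (R + r\cos v)\cos u,\; (R + r\cos v)\sin u,\;
               r\sin v\,\cos\tfrac{u}{2},\; r\sin v\,\sin\tfrac{u}{2} \Big)
\]
% Under u -> u + 2*pi the last two coordinates flip sign, realizing the
% Klein bottle's gluing (u, v) ~ (u + 2*pi, -v) with no self-intersection;
% the fourth coordinate supplies the "room" the surface lacks in R^3,
% which is why familiar glass models must pass through themselves.
```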

Everyone is familiar with the convention in entertainment media where characters speak without the use of recognizable language. (Not really related to the convention of talking animals.) The first instance I can recall (someone correct me if earlier examples are to be found) is the happy-go-lucky bird Woodstock from the old Peanuts cartoons (do kids still recognize that cast of characters?), whose dialogue was shown graphically as a series of vertical lines.

When the cartoon made its way onto TV for holiday specials, its creator Charles Schulz used the same convention to depict adults, never shown onscreen but with dialogue voiced by a Harmon-muted trombone. Roughly a decade later, two characters from the Star Wars franchise “spoke” in languages only other Star Wars characters could understand, namely, Chewbacca (Chewie) and R2-D2. More recently, the character Groot from Guardians of the Galaxy (known to me only through the Marvel movie franchise, not through comic books) speaks only one line of dialogue, “I am Groot,” which is understood as full speech by other Guardians characters. When behemoths larger than a school bus (King Kong, Godzilla, Jurassic dinosaurs, Cloverfield, kaiju, etc.) appear, the characters are typically denied the power of speech beyond the equivalent of a lion’s roar. (True villains talk little or not at all as they go about their machinations — no monologuing! unless it’s a James Bond film. An exception notable for its failure to charm audiences is Ultron, who wouldn’t STFU. You can decide for yourself which is the worse kind of villainy.)

This convention works well enough for storytelling and has the advantage of allowing the reader/viewer to project onto otherwise blank speech. However, when imported into the real world, especially in politics, the convention founders. There is no Babel fish universal translator inserted in the ear to transform nonsense into coherence. The obvious example of babblespeech is 45, whose speech when off the teleprompter is a series of rambling non sequiturs, free associations, slogans, and sales pitches. Transcripts of anyone’s extemporaneous speech reveal lots of restarts and blind alleys; we all interrupt ourselves to redirect. However, the word salad that substitutes for meaningful content in 45’s case is tragicomic: alternately entirely frustrating or comically entertaining depending on one’s objective. Satirical news shows fall into the second category.

45 is certainly not the first. Sarah Palin in her time as a media darling (driver of ratings and butt of jokes — sound familiar?) had a knack for crazy speech combinations that were utter horseshit yet oddly effective for some credulous voters. She was even a hero to some (nearly a heartbeat away from being the very first PILF). We’ve also now been treated to a series of public interrogations where a candidate for a cabinet post or an accused criminal offers testimony before a congressional panel. Secretary of Education Betsy DeVos famously evaded simple yes/no questions during her confirmation hearing, and Supreme Court Justice Brett Kavanaugh similarly refused to provide direct answers to direct questions. Unexpectedly, sacrificial lamb Michael Cohen did give direct answers to many questions, but his interlocutors then didn’t quite know how to respond, conditioned by their experience and expectation that no one answers appropriately.

What all this demonstrates is that there is often a wide gulf between what is said and what is heard. In the absence of what might be understood as effective communication (honest, truthful, and forthright), audiences and voters fill in the blanks. Ironically, we also can’t handle too much truth when confronted by its awfulness. None of this is a problem in storytelling, but when found in political narratives, it’s emblematic of how dysfunctional our communications have become, and with them, the clear thought and principled activity of governance.