Archive for the ‘Consciousness’ Category

A complex of interrelated findings about how consciousness handles the focus of perception has been making the rounds. Folks are recognizing the limited time each of us has to deal with everything pressing upon us for attention and are adopting the notion of the bandwidth of consciousness: the limited amount of perception / memory / thought one can access or hold at the forefront of attention compared to the much larger amount occurring continuously outside of awareness (or figuratively, under the hood). Similarly, the myriad ways attention is diverted by advertisers and social media (to name just two examples) to channel consumer behaviors or increase time-on-device metrics have become commonplace topics of discussion. I’ve used the terms information environment, media ecology, and attention economy in past posts on this broad topic.

Among the most important observations is how the modern infosphere has become saturated with content, much of it entirely pointless (when not actively disorienting or destructive), and how many of us willingly tune into it without interruption via handheld screens and earbuds. It’s a steady flow of stimulation (overstimulation, frankly) that is the new normal for those born and/or bred to the screen (media addicts). Its absence or interruption is discomfiting (like a toddler’s separation anxiety). However, mental processing of information overflow is tantamount to drinking from a fire hose: only a modest fraction of the volume rushing nonstop can be swallowed. Promoters of meditation and presencing, whether implied or manifest, also recognize that human cognition requires time and repose to process and consolidate experience, transforming it into useful knowledge and long-term memory. More and more stimulation added on top simply spills over, like a faucet filling the bathtub faster than the drain can empty it, sloshing onto the floor as digital exhaust. Too bad that the sales point of these promoters is typically getting more done, because dontcha know, more is better even when recommending less.

Quanta Magazine has a pair of articles (first and second) by the same author (Jordana Cepelewicz) describing how the spotlight metaphor for attention captures only part of how cognition works. Many presume that the mind normally directs awareness or attention to whatever the self prioritizes — a top-down executive function. However, as any loud noise, erratic movement, or sharp pain demonstrates, some stimuli are promoted to awareness by virtue of their individual character — a bottom-up reflex. The fuller explanation is that neuroscientists are busy researching brain circuits and structures that prune, filter, or gate the bulk of incoming stimuli so that attention can be focused on the most important bits. For instance, the article mentions how visual perception circuits process categories of small and large differently, partly to separate figure from ground. Indeed, for cognition to work at all, a plethora of inhibitory functions enables focus on a relatively narrow subset of stimuli selected from the larger set of available stimuli.

These discussions about cognition (including philosophical arguments about (1) human agency vs. no free will or (2) whether humans exist within reality or are merely simulations running inside some computer or inscrutable artificial intelligence) so often get lost in the weeds. They read like distinctions without differences. No doubt these are interesting subjects to contemplate, but at the same time, they’re sorta banal — fodder for scientists and eggheads that most average folks dismiss out of hand. In fact, selective and inhibitory mechanisms are found elsewhere in human physiology, such as pairs of muscles to move to and fro or appetite stimulants / depressants (alternatively, activators and deactivators) operating in tandem. Moreover, interactions are often not binary (on or off) but continuously variable. For my earlier post on this subject, see this.

For want of a useful way to describe multiple, intersecting problems plaguing the modern world — a nest of problems, if you will — let me adopt matryoshkas (a/k/a Russian nesting dolls). The metaphor is admittedly imperfect because problems are not discrete, resized replicas of each other that nest snugly, one inside the next. Rather, a better depiction would look more like some crazy mash-up of a Venn diagram and a Rorschach test but without the clean dividing lines or symmetry.

I use matryoshkas because they bear a close relationship to each other. Also, the matryoshka is a maternal figure, much like Mother Earth. Matryoshkas are interlocking, each affecting others, though their relationships beyond the metaphor are far too complex to manage or manipulate effectively. For instance, the expansionary (growth) economy matryoshka (the paradigmatic problem of our time), nested two or three levels inside the Mother Earth matryoshka, bursts the outer dolls from within, whereas the collapsing Mother Earth matryoshka crushes the inner dolls. Similarly, if the economy matryoshka contracts (as it should and must), other inner dolls (e.g., nation-states) will not survive. Which matryoshka fits inside another is a matter of interpretation. The one representing human consciousness is especially hard to position because it’s both cause and effect.

The Global Climate Strike underway this week reminds us of the outermost matryoshka, the largest one that contains or encapsulates all the others. Dealing with this biggest problem (since it’s truly an extinction level event, though slow-acting due to its global scale) has been delayed so long that (to mix my metaphors) the patient has become terminal. The diagnosis came long ago (i.e., quit smoking, or more accurately, quit burning fossil fuels and heating the planet), but treatment (cessation, really) never happened. We just kept puffing away with our transportation infrastructure (cars, boats, trains, and planes) and industrial machinery (including weaponry) because to do otherwise would — gasp — imperil the economy or negatively impact what’s become a nonnegotiable lifestyle, at least in the First World and only for a diminishing portion. The implicit decision, I suppose, is to live large now but condemn those unfortunate enough to follow in the wake of global ecological destruction.

Unless I misjudge the mood and consensus, climate change is (finally!) no longer the subject of controversy or denial except by a few intransigent fools (including political leaders and news groups that have inexplicably instituted gag orders to conceal the staggering immensity of the problem). Enough nasty events (storms, species die-offs, and epidemics — though no pandemic just yet) have piled up, including by way of example “unprecedented” flooding in Houston (never mind that flooding is a regular occurrence now, establishing a new precedent from which we steadfastly refuse to learn), that it’s impossible to dispute that we’ve entered an era of rather extraordinary instability. (That last sentence has problems with nesting, too, which I could fix by rewriting the sentence, but perhaps it’s fitting to just let the problems fester.) Indeed, as I have indicated before, we’re transitioning out of the Garden Earth (having left behind Ice Age Earth some 12,000 years ago) to Hothouse Earth. The rate of change is quite unlike similar transitions in the geological past, and we’re quite unlikely to survive.

Continuing (after some delay) from part 1, Pankaj Mishra concludes chapter 4 of The Age of Anger with an overview of Iranian governments that shifted from U.S./British client state (headed by the Shah of Iran, reigned 1941–1979) to its populist replacement (headed by Ayatollah Khomeini, ruled 1979–1989), both leaders having been authoritarians. During the period discussed, Iran underwent the same modernization and infiltration by liberal, Western values and economics, which produced a backlash familiar from Mishra’s descriptions of other nations and regions that had experienced the same severed roots of place since the onset of the Enlightenment. Vacillation among two or more styles of government might be understood as a thermostatic response: too hot or too cold in one direction leads to correction in the other. It’s not a binary relationship, however, between monarchy and democracy (to use just one example). Nor are the options limited to a security state headed by an installed military leader on one hand and a popularly elected leader on the other. Rather, it’s a question of national identity being alternately fractured and unified (though difficult to analyze and articulate) in the wake of multiple intellectual influences.

According to Lewis and Huntington, modernity has failed to take root in intransigently traditional and backward Muslim countries despite various attempts to impose it by secular leaders such as Turkey’s Atatürk, the Shah of Iran, Algeria’s Ben Bella, Egypt’s Nasser and Sadat, and Pakistan’s Ayub Khan.

Since 9/11 there have been many versions, crassly populist as well as solemnly intellectual, of the claims by Lewis and Huntington that the crisis in Muslim countries is purely self-induced, and [that] the West is resented for the magnitude of its extraordinary success as a beacon of freedom, and embodiment of the Enlightenment’s achievements … They have mutated into the apparently more sophisticated claim that the clash of civilizations occurs [primarily] within Islam, and that Western interventions are required on behalf of the ‘good Muslim’, who is rational, moderate and liberal. [p. 127]

This is history told by the putative winners. Mishra goes on:

Much of the postcolonial world … became a laboratory for Western-style social engineering, a fresh testing site for the Enlightenment ideas of secular progress. The philosophes had aimed at rationalization, or ‘uniformization’, of a range of institutions inherited from an intensely religious era. Likewise, postcolonial leaders planned to turn illiterate peasants into educated citizens, to industrialize the economy, move the rural population to cities, alchemize local communities into a singular national identity, replace the social hierarchies of the past with an egalitarian order, and promote the cults of science and technology among a pious and often superstitious population. [p. 133]

Readers may recognize this project and/or process by its more contemporary name: globalization. It’s not merely a war of competing ideas, however, because those ideas manifest in various styles of social and political organization. Moreover, the significance of migration from rural agrarian settings to primarily urban and suburban ones can scarcely be overstated. This transformation (referring to the U.S. in the course of the 20th century) is something James Howard Kunstler repeatedly characterizes rather emphatically as the greatest misallocation of resources in the history of the world. Mishra summarizes the effects of Westernization handily:

In every human case, identity turns out to be porous and inconsistent rather than fixed and discrete; and prone to get confused and lost in the play of mirrors. The cross-currents of ideas and inspirations — the Nazi reverence for Atatürk, a gay French philosopher’s denunciation of the modern West and sympathy for the Iranian Revolution, or the various ideological inspirations for Iran’s Islamic Revolution (Zionism, Existentialism, Bolshevism and revolutionary Shiism) — reveal that the picture of a planet defined by civilizations closed off from one another and defined by religion (or lack thereof) is a puerile cartoon. They break the simple axis — religious-secular, modern-medieval, spiritual-materialist — on which the contemporary world is still measured, revealing that its populations, however different their pasts, have been on converging and overlapping paths. [p. 158]

These descriptions and analyses put me in mind of a fascinating book I read some years ago and reviewed on Amazon (one of only a handful of Amazon reviews): John Reader’s Man on Earth (1988). Reader describes and indeed celebrates incredibly diverse ways of inhabiting the Earth specially adapted to the landscape and based on evolving local practices. Thus, the notion of “place” is paramount. Comparison occurs only by virtue of juxtaposition. Mishra does something quite different, drawing out the connective ideas that account for “converging and overlapping paths.” Perhaps inevitably, disturbances to collective and individual identities that flow from unique styles of social organization, especially those now operating at industrial scale (i.e., industrial civilization), appear to be picking up. For instance, in the U.S., even as mass shootings (a preferred form of attack but not the only one) appear to be on the rise while violent crime overall is at an all-time low, perpetrators of violence are not limited to a few lone wolves, as the common trope goes. According to journalist Matt Agorist,

mass shootings — in which murdering psychopaths go on rampages in public spaces — have claimed the lives of 339 people since 2015 [up to mid-July 2019]. While this number is certainly shocking and far too high, during this same time frame, police in America have claimed the lives of 4,355 citizens.

And according to this article in Vox, this crazy disproportion (police violence to mass shootings) is predominantly an American thing at least partly because of our high rate of fetishized civilian gun ownership. Thus, the self-described “land of the free, home of the brave” has transformed itself into a paranoid garrison state, a mindset afflicting civil authority even more egregiously than the disenfranchised (mostly young men). Something similar occurred during the Cold War, when leaders became hypervigilant for attacks and invasions that never came. Whether a few close calls during the height of the Cold War were the result of escalating paranoia, brinkmanship, or true, maniacal, existential threats from a mustache-twirling, hand-rolling despot hellbent on the destruction of the West is a good question, probably impossible to answer convincingly. However, the result today of this mindset couldn’t be more disastrous:

It is now clear that the post-9/11 policies of pre-emptive war, massive retaliation, regime change, nation-building and reforming Islam have failed — catastrophically failed — while the dirty war against the West’s own Enlightenment [the West secretly at war with itself] — inadvertently pursued through extrajudicial murder, torture, rendition, indefinite detention and massive surveillance — has been a wild success. The uncodified and unbridled violence of the ‘war on terror’ ushered in the present era of absolute enmity in which the adversaries, scornful of all compromise, seek to annihilate each other. Malignant zealots have emerged at the very heart of the democratic West after a decade of political and economic tumult; the simple explanatory paradigm set in stone soon after the attacks of 9/11 — Islam-inspired terrorism versus modernity — lies in ruins. [pp.124–125]

Third version of this topic. Whereas the previous two were about competing contemporary North American ways of knowing, this one is broader in both time and space.

The May 2019 issue of Harper’s Magazine has a fascinating review of Christina Thompson’s book Sea People: The Puzzle of Polynesia (2019). Beyond the puzzle itself — how did Polynesian people migrate to, settle, and populate the far-flung islands of the Central and South Pacific? — the review hits upon one of my recurring themes on this blog, namely, that human cognition is plastic enough to permit highly divergent ways of knowing.

The review (and book?) is laden with Eurocentric detail about the “discovery” of closely related Polynesian cultures dispersed more widely (geographically) than any other culture prior to the era of mass migration. Indeed, the reviewer chides the author at one point for transforming Polynesia from a subject in its own right into an exotic object of (Western) fascination. This distorted perspective is commonplace and follows from the earlier “discovery” and colonization of North America as though it were not already populated. Cartographers even today are guilty of this Eurocentrism, relegating “empty” expanses of the Pacific Ocean to irrelevance in maps when in fact the Pacific is “the dominant feature of the planet” and contains roughly twenty-five thousand islands (at current sea level? — noting that sea level was substantially lower during the last ice age some 13,000 years ago but is due to rise substantially by the end of this century and beyond, engulfing many of the islands now lying dangerously close to sea level). Similar distortions are needed to squash the spherical (3D) surface of the globe onto planar (2D) maps (e.g., the Mercator projection, which largely ignores the Pacific Ocean in favor of continents; other projections shown here) more easily conceptualized (for Westerners) in terms of coordinate geometry using latitude and longitude (i.e., the Cartesian plane).
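As a footnote for the mathematically inclined, the distortion is easy to quantify. These are the textbook Mercator formulas (a sketch on my part, not anything drawn from the review): a point at longitude \lambda and latitude \varphi on a globe of radius R lands on the map at

\[ x = R(\lambda - \lambda_0), \qquad y = R \ln \tan\!\left(\frac{\pi}{4} + \frac{\varphi}{2}\right), \]

so local scale is stretched by a factor of \sec\varphi and areas are inflated by \sec^2\varphi. High-latitude land masses balloon while the equatorial Pacific, which really is the dominant feature of the planet’s surface, is made to look comparatively modest.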

The review mentions the familiar dichotomy of grouping a hammer, saw, hatchet, and log in terms of abstract categories (Western thought) vs. utility or practicality (non-Western). Exploration of how different ways of knowing manifest is, according to the review, among the more intellectually exciting parts of the book. That’s the part I’m latching onto. For instance, the review offers this:

Near the middle of Sea People, Thompson explores the ramification of Polynesia as, until contact, an oral culture with “an oral way of seeing.” While writing enables abstraction, distancing, and what we generally call objectivity, the truth of oral cultures is thoroughly subjective. Islands aren’t dots on a map seen from the sky but destinations one travels to in the water.

This is the crux of the puzzle of Polynesians fanning out across the Pacific approximately one thousand years ago. They had developed means of wayfinding in canoes and outriggers without instruments or maps roughly 500 years prior to Europeans crossing the oceans in sailing ships. Perhaps I’m reading too much into the evidence, but abstraction and objectivity as a particular way of knowing, bequeathed to Western Europe via the Enlightenment and development of the scientific method, stunted or delayed exploration of the globe precisely because explorers began with a god’s eye view of the Earth from above rather than from the surface (object vs. subject). In contrast, quoting here from the book rather than the review, Polynesians used

a system known as etak, in which they visualize a “reference island,” — which is usually a real island but may also be imaginary — off to one side of the path they are following, about midway between their starting point and their destination. As the journey progresses, this island “moves” under each of the stars in the star path [situated near the horizon rather than overhead], while the canoe in which the voyagers are traveling stays still. Of course, the navigators know that it is the canoe and not the islands that are moving, but this is the way they conceptualize the voyage.
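To make that inversion of frames concrete, here is a toy geometric sketch of my own (not Thompson’s; the coordinates, units, and the very act of calculating are invented purely for illustration, since wayfinders used no instruments at all). As the canoe objectively advances along its track, the bearing to a fixed island off to one side sweeps from ahead to abeam to astern, which is exactly the “movement” of the reference island beneath successive horizon stars when the canoe is taken as stationary.

import math

# Toy sketch of the etak frame shift; positions and units are invented for
# illustration only (real wayfinders used no instruments or coordinates).
start = (0.0, 0.0)          # departure island
destination = (100.0, 0.0)  # target island in this made-up coordinate frame
reference = (50.0, 30.0)    # etak reference island, off to one side of the track

def bearing_to_reference(canoe):
    # Angle (degrees) between the canoe's forward track and the reference island.
    dx, dy = reference[0] - canoe[0], reference[1] - canoe[1]
    return math.degrees(math.atan2(dy, dx))

for frac in (0.0, 0.25, 0.5, 0.75, 1.0):
    canoe = (frac * destination[0], 0.0)
    print(f"{frac:4.0%} of voyage: reference island bears {bearing_to_reference(canoe):6.1f} degrees")

# The bearing sweeps from about 31 degrees (ahead) to 90 (abeam) to about 149
# (astern): seen from a "stationary" canoe, the island has drifted backward
# under successive star points near the horizon.

The point is only the symmetry: the same geometry can be narrated with the canoe moving and the island fixed, or with the canoe fixed and the island moving, which is the conceptual pose the navigators adopted.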

Placing oneself at the center of the world or universe — at least for the purpose of navigation — is a conceptual pose Westerners discarded when heliocentrism gradually replaced geocentrism. (Traveling using GPS devices ironically places the traveler back at the center of the map with terrain shifting around the vehicle, but it’s a poor example of wayfinding precisely because the traveler fobs the real work onto the device and likely possesses no real understanding or skill traversing the terrain besides following mechanical instructions.) While we Westerners might congratulate ourselves for a more accurate, objective orientation to the stars, its unwitting limitations are worth noting. Recent discoveries regarding human prehistory, especially megalithic stone construction accomplished with techniques still unknown and flatly impossible with modern technology, point to the existence of other ways of knowing lost to contemporary human cultures steadily triangulating on and conforming to Western thought (through the process of globalization). Loss of diversity of ways of knowing creates yet another sort of impoverishment that can only be barely glimpsed since most of us are squarely inside the bubble. Accordingly, it’s not for nothing that some unusually sensitive critics of modernity suggest we’re entering a new Dark Age.

 

This is about to get weird.

I caught a good portion of a recent Joe Rogan podcast (sorry, no link or embedded video) with Alex Jones and Eddie Bravo (nearly 5 hours long instead of the usual 2 to 3) where the trio indulged themselves in speculation about a purported grand conspiracy to destroy civilization and establish a new post-human one. The more Jones rants (or rather, speaks, which is quite a lot), the more he sounds like a madman. But he insists he does so to serve the public. He sincerely wants people to know things he’s figured out about an evil cabal of New World Order types. So let me say at least this: “Alex Jones, I hear you.” But I’m unconvinced. Apologies to Alex Jones et al. if I got any details wrong. For instance, it’s not clear to me whether Jones believes this stuff himself or is merely reporting what others may believe.

The grand conspiracy supposedly involves interdimensional beings operating at a subliminal range below or beyond normal human perception. Perhaps they revealed themselves to a few individuals (to the cognoscenti, ya know, or is that shared revelation how one is inducted into the cognoscenti?). Rogan believes that ecstatic states induced by drugs provide access to revelation, like tuning a radio to the correct (but secret) frequency. Whatever exists in that altered cognitive state appears like a dream and is difficult to understand or remember. The overwhelming and lasting impression Rogan reports is of a distinct nonhuman presence.

Maybe I’m not quite as barking mad as Jones or as credulous as Rogan and Bravo, but I have to point out that humans are interdimensional beings. We move through three dimensions of space and one unidirectional dimension of time. If that doesn’t quite make sense, then I refer readers to Edwin Abbott’s well-known book Flatland. Abbott describes what it might be like for conscious beings in only two dimensions of space (or one). Similarly, for most of nature outside of vertebrates, it’s understood that consciousness, if it exists at all (e.g., not in plants), is so rudimentary that there is no durable sense of time. Beings exist in an eternal now (could be several seconds long/wide/tall — enough to function) without memory or anticipation. With that in mind, the possibility of multidimensional beings in 5+ dimensions completely imperceptible to us doesn’t bother me in the least. The same is true of the multiverse or many-worlds interpretation. What bothers me is that such beings would bother with us, especially with a conspiracy to crash civilization.

The other possibility at which I roll my eyes is a post-human future: specifically, a future when one’s consciousness escapes its biological boundaries. The common trope is that one’s mind is uploaded to a computer to exist in the ether. Another is that one transcends death somehow with intention and purpose instead of simply ceasing to be (as atheists believe) or some variation of the far more common religious heaven/hell/purgatory myth. This relates as well to the supposition that strong AI is about to spark into being (the Singularity): self-awareness and intelligent thought existing on some substrate other than human biology (the nervous system, really, including the brain). Sure, cognition can be simulated for some specific tasks like playing chess or go, and we humans can be fooled easily into believing we are communicating with a thought machine à la the Turing Test. But the rather shocking sophistication, range, utility, and adaptability of even routine human consciousness is so far beyond any current simulation that the usual solution to get engineers from where they are now to real, true, strong AI is always “and then a miracle happened.” The easy, obvious route/accident is typically a power surge (e.g., a lightning strike).

Why bother with mere humans is a good question if one is post-human or an interdimensional being. It could well be that existence in such a realm would make watching human interactions either impenetrable (news flash, they are already) or akin to watching through a dim screen. That familiar trope is the lost soul imprisoned in the spirit world, a parallel dimension that permits viewing from one side only but prohibits contact except perhaps through psychic mediums (if you believe in such folks — Rogan for one doesn’t).

The one idea worth repeating from the podcast is the warning not to discount all conspiracy theories out of hand as bunk. At least a few have been demonstrated to be true. Whether any of the sites behind that link are to be believed I leave you readers to judge.

Addendum: Although a couple comments came in, no one puzzled over the primary piece I had to add, namely, that we humans are interdimensional beings. The YouTube video below depicts a portion of the math/science behind my statement, showing how at least two topological surfaces behave paradoxically when limited to 2 or 3 dimensions but theoretically cohere in 4+ dimensions imperceptible to us.
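A quick gloss for anyone who wants the math without the video (and a guess on my part about which surfaces are meant, not a claim about the video’s actual content): the textbook example is the Klein bottle, built by gluing the edges of a square with one reversed identification,

\[ K \;=\; [0,1] \times [0,1] \,\big/\, \bigl( (x,0) \sim (x,1), \;\; (0,y) \sim (1,\, 1-y) \bigr). \]

Any attempt to realize K in ordinary three-dimensional space forces the surface to pass through itself (the familiar “impossible bottle”), yet in four dimensions it closes up smoothly with no self-intersection, because the neck can route around the body through the extra direction we cannot perceive.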

Some while back, Scott Adams (my general disdain for him noted but unexpanded, since I’m not in the habit of shitting on people), using his knowledge of hypnosis, began pushing the string (or rather, selling the narrative) that our Commander-in-Chief is cannily adept at the art of persuasion. I, for one, am persuaded by neither Adams nor 45 but must admit that many others are. Constant shilling for control of narratives by agents of all sorts could not be more transparent (for me at least), rendering the whole enterprise null. Similarly, when I see an advertisement (infrequently, I might add, since I use ad blockers and don’t watch broadcast TV or news programs), I’m rarely inclined to seek more information or make a purchase. Once in a long while, an ad creeps through my defenses and hits one of my interests, and even then, I rarely respond because, duh, it’s an ad.

In the embedded video below, Stuart Ewen describes how some learned to exploit a feature (not a bug) in human cognition, namely, appeals to emotion that overwhelm rational response. The most obvious, well-worn example is striking fear into people’s hearts and minds to sell an illusion of safety that requires relinquishing civil liberties and/or fighting foreign wars.

The way Ewen uses the term consciousness differs from the way I use it. He refers specifically to opinion- and decision-making (the very things vulnerable to manipulation) rather than the more generalized and puzzling property of having an individual identity or mind and with it self-awareness. In fact, Ewen uses the terms consciousness industry and persuasion industry instead of public relations and marketing to name those who spin information and thus public discourse. At some level, absolutely everyone is guilty of seeking to persuade others, which again is a basic feature of communication. (Anyone negotiating the purchase of, say, a new or used car faces the persuasion of the sales agent with some skepticism.) What turns it into something maniacal is using lies and fabrication to advance agendas against the public interest, especially where public opinion is already clear.

Ewen also points to early 20th-century American history, where political leaders and marketers were successful in manipulating mass psychology in at least three ways: (1) drawing the pacifist U.S. public into two world wars of European origin, (2) transforming citizens into consumers, thereby saving capitalism from its inherently self-destructive endgame (creeping up on us yet again), and (3) suppressing emergent collectivism, namely, socialism. Of course, unionism as a collectivist institution still gained considerable strength but only within the larger context of capitalism, e.g., achieving the American Dream in purely financial terms.

So getting back to Scott Adams’ argument, the notion that the American public is under some form of mass hypnosis (persuasion) and that 45 is the master puppeteer is perhaps half true. Societies do sometimes go mad and fall under the spell of a mania or cult leader. But 45 is not the driver of the current episode, merely the embodiment. I wouldn’t say that 45 figured out anything because that awards too much credit to presumed understanding and planning. Rather, he worked out (accidentally and intuitively — really by default considering his job in 2016) that his peculiar self-as-brand could be applied to politics by treating it all as reality TV, which by now everyone knows is its own weird unreality the same way professional wrestling is fundamentally unreal. (The term political theater applies here.) He demonstrated a knack (at best) for keeping the focus firmly on himself and driving ratings (abetted by the mainstream media that had long regarded him as a clown or joke), but those objectives were never really in service of a larger political vision. In effect, the circus brought to town offers its own bizarre constructed narrative, but its principal characteristic is gawking, slack-jawed, made-you-look narcissism, not any sort of proper guidance or governance.

Caveat: Rather uncharacteristically long for me. Kudos if you have the patience for all of this.

Caught the first season of HBO’s series Westworld on DVD. I have a boyhood memory of the original film (1973) with Yul Brynner and a dim memory of its sequel Futureworld (1976). The sheer charisma of Yul Brynner in the role of the gunslinger casts a long shadow over the new production, not that most of today’s audiences have seen the original. No doubt, 45 years of technological development in film production lends the new version some distinct advantages. Visual effects are quite stunning and Utah landscapes have never been used more appealingly in terms of cinematography. Moreover, storytelling styles have changed, though it’s difficult to argue convincingly that they’re necessarily better now than then. Competing styles only appear dated. For instance, the new series has immensely more time to develop its themes; but the ancient parables of hubris and loss of control over our own creations run amok (e.g., Shelley’s Frankenstein, or more contemporaneously, the surprisingly good new movie Upgrade) have compact, appealing narrative arcs quite different from constant teasing and foreshadowing of plot developments while actual plotting proceeds glacially. Viewers wait an awful lot longer in the HBO series for resolution of tensions and emotional payoffs, by which time investment in the story lines has been dispelled. There is also no terrifying crescendo of violence and chaos demanding rescue or resolution. HBO’s Westworld often simply plods on. To wit, a not insignificant portion of the story (um, side story) is devoted to boardroom politics (yawn) regarding who actually controls the Westworld theme park. Plot twists and reveals, while mildly interesting (typically guessed by today’s cynical audiences), do not tie the narrative together successfully.

Still, Westworld provokes considerable interest from me due to my fascination with human consciousness. The initial episode builds out the fictional future world with characters speaking exposition clearly owing its inspiration to Julian Jaynes’ book The Origin of Consciousness in the Breakdown of the Bicameral Mind (another reference audiences are quite unlikely to know or recognize). I’ve had the Julian Jaynes Society’s website bookmarked for years and read the book some while back; never imagined it would be captured in modern fiction. Jaynes’ thesis (if I may be so bold as to summarize radically) is that modern consciousness coalesced around the collapse of multiple voices in the head — ideas, impulses, choices, decisions — into a single stream of consciousness perhaps better understood (probably not) as the narrative self. (Aside: the multiple voices of antiquity correspond to polytheism, whereas the modern singular voice corresponds to monotheism.) Thus, modern human consciousness arose over several millennia as the bicameral mind (the divided brain having two camerae, i.e., chambers or halves) functionally collapsed. The underlying story of the new Westworld is the emergence of machine consciousness, a/k/a strong AI, a/k/a The Singularity, while the old Westworld was about a mere software glitch. Exploration of machine consciousness modeling (e.g., improvisation builds on memory to create awareness) as a proxy for better understanding human consciousness might not be the purpose of the show, but it’s clearly implied. And although conjectural, the speed of emergence of human consciousness contrasts sharply with the abrupt ON switch regarding theorized machine consciousness. Westworld treats them as roughly equivalent, though in fairness, 35 years or so in Westworld is in fact abrupt compared to several millennia. (Indeed, the story asserts that machine consciousness sparked alive repeatedly (which I suggested here) over those 35 years but was dialed back each time. Never mind all the unexplored implications.) Additionally, the fashion in which Westworld uses the term bicameral ranges from sloppy to meaningless, like the infamous technobabble of Star Trek.


This Savage Love column got my attention. As with Dear Abby, Ask Marilyn, or indeed any advice column, I surmise that questions are edited for publication. Still, a couple minor usage errors attracted my eye, which I can let go without further chastising comment. More importantly, question and answer both employ a type of Newspeak commonplace among those attuned to identity politics. Those of us not struggling with identity issues may be less conversant with this specialized language, or it could be a generational thing. Coded speech is not unusual within specialized fields of endeavor. My fascination with nomenclature and neologisms makes me pay attention, though I’m not typically an adopter of hip new coin.

The Q part of Q&A never actually asks a question but provides context to suggest or extrapolate one, namely, “please advise me on my neuro-atypicality.” (I made up that word.) While the Q acknowledges that folks on the autism spectrum are not neurotypical, the word disability is put in quotes (variously, scare quotes, air quotes, or irony quotes), meaning that it is not or should not be considered a real or true disability. Yet the woman acknowledges her own difficulty with social signaling. The A part of Q&A notes a marked sensitivity to social justice among those on the spectrum, acknowledges a correlation with nonstandard gender identity (or is it sexual orientation?), and includes a jibe that standard advice is to mimic neurotypical behaviors, which “tend to be tediously heteronormative and drearily vanilla-centric.” The terms tediously, drearily, and vanilla push unsubtly toward normalization and acceptance of kink and aberrance, as does Savage Love in general. I wrote about this general phenomenon in a post called “Trans is the New Chic.”

Whereas I have no hesitation to express disapproval of shitty people, shitty things, and shitty ideas, I am happy to accept many mere differences, not caring two shits either way. This question asks about a fundamental human behavior: sexual expression. Everyone needs an outlet, and outliers (atypicals, nonnormatives, kinksters, transgressors, etc.) undoubtedly have more trouble than normal folks. Unless living under a rock, you’ve no doubt heard and/or read theories from various quarters that character distortion often stems from sexual repression or lack of sexual access, which describes a large number of societies historical and contemporary. Some would include the 21st-century U.S. in that category, but I disagree. Sure, we have puritanical roots, recent moral panic over sexual buffoonery and crimes, and a less healthy sexual outlook than, say, European cultures, but we’re also suffused with licentiousness, Internet pornography, and everyday seductions served up in the media via advertising, R-rated cinema, and TV-MA content. It’s a decidedly mixed bag.

Armed with a lay appreciation of sociology, I can’t help but observe that humans are a social species with hierarchies and norms, not as rigid or prescribed perhaps as with insect species, but nonetheless possessing powerful drives toward consensus, cooperation, and categorization. Throwing open the floodgates to wide acceptance of aberrant, niche behaviors strikes me as swimming decidedly upstream in a society populated by a sizable minority of conservatives mightily offended by anything falling outside the heteronormative mainstream. I’m not advocating either way but merely observing the central conflict.

All this said, the thing that has me wondering is whether autism isn’t itself an adaptation to information overload commencing roughly with the rise of mass media in the early 20th century. If one expects that the human mind is primarily an information processor and the only direction is to process ever more information faster and more accurately than in the past, well, I have some bad news: we’re getting worse at it, not better. So while autism might appear to be maladaptive, filtering out useless excess information might unintuitively prove to be adaptive, especially considering the disposition toward analytical, instrumental thinking exhibited by those on the spectrum. How much this style of mind is valued in today’s world is an open question. I also don’t have an answer to the nature/nurture aspect of the issue, which is whether the adaptation/maladaptation is more cultural or biological. I can only observe that it’s on the rise, or at least being recognized and diagnosed more frequently.

I watched a documentary on Netflix called Jim & Andy (2017) that provides a glimpse behind the scenes of the making of Man on the Moon (1999) where Jim Carrey portrays Andy Kaufman. It’s a familiar story of art imitating life (or is it life imitating art?) as Carrey goes method and essentially channels Kaufman and Kaufman’s alter ego Tony Clifton. A whole gaggle of actors played earlier incarnations of themselves in Man on the Moon and appeared as themselves (without artifice) in Jim & Andy, adding another weird dimension to the goings on. Actors losing themselves in roles and undermining their sense of self is hardly novel. Regular people lose themselves in their jobs, hobbies, media hype, glare of celebrity, etc. all the time. From an only slightly broader perspective, we’re all merely actors playing roles, shifting subtly or dramatically based on context. Shakespeare observed it centuries ago. However, the documentary points to a deeper sense of unreality precisely because Kaufman’s principal shtick was to push discomfiting jokes/performances beyond the breaking point, never dropping the act to let his audience in on the joke or provide closure. It’s a manifestation of what I call the Disorientation Protocol.


Reading further into Anthony Giddens’ book The Consequences of Modernity, I got a fuller (though still incomplete) sense of what is meant by his terms disembedding mechanisms, expert systems, and symbolic tokens, all of which disrupt time and space as formerly understood in traditional societies that enjoyed the benefit of centuries of continuity. I’ve been aware of analyses regarding, for instance, the sociology of money and the widespread effects of the introduction and adoption of mechanical clocks and timepieces. While most understand these developments superficially as unalloyed progress, Giddens argues that they do in fact reorder our experience in the world away from an organic, immediate orientation toward an intellectualized adherence to distant, abstract, self-reinforcing (reflexive) mechanisms.

But those matters are not really what this blog post is about. Rather, this passage sparked my interest:

… when the claims of reason replaced those of tradition, they appeared to offer a sense of certitude greater than that provided by preexisting dogma. But this idea only appears persuasive so long as we do not see that the reflexivity of modernity actually subverts reason, at any rate where reason is understood as the gaining of certain knowledge … We are abroad in a world which is thoroughly constituted through reflexively applied knowledge, but where at the same time we can never be sure that any given element of that knowledge will not be revised. [p. 39]

Put another way, science and reason are axiomatically open to examination, challenge, and revision and often undergo disruptive change. That’s what is meant by Karl Popper’s phrase “all science rests upon shifting sand” and informs the central thesis of Thomas Kuhn’s well-known book The Structure of Scientific Revolutions. It’s not the narrow details that shift so much (hard sciences lead pretty reliably to applied engineering) as the overarching narrative, e.g., the story of the Earth, the cosmos, and ourselves as revealed through scientific inquiry and close examination. Historically, the absolute certainty of the medieval church, while not especially accurate in either details or narrative, yielded considerable authority to post-Enlightenment science and reason, which themselves continue to shift periodically.

Some of those paradigm shifts are so boggling and beyond the ken of the average thinker (including many college-educated folks) that our epistemology is now in crisis. Even the hard facts — like the age and shape of the Earth or its orbital relationship to other solar bodies — are hotly contested by some and blithely misunderstood by others. One doesn’t have to get bogged down in the vagaries of relativity, nuclear power and weapons, or quantum theory to lose the thread of what it means to live in the 21st century. Softer sciences such as psychology, anthropology, economics, and even history now deliver new discoveries and (re-)interpretations of facts so rapidly, like the dizzying pace of technological change, that philosophical systems are unmoored and struggling for legitimacy. For instance, earlier this year, a human fossil was found in Morocco that upended our previous knowledge of human evolution (redating the first appearance of biologically modern humans about 100,000 years earlier). More popularly, dieticians still disagree on what sorts of foods are healthy for most of us (though we can probably all agree that excess sugar is bad). Other recent developments include the misguided insistence among some neurobiologists and theorists that consciousness, free will, and the self do not exist (I’ll have a new post regarding that topic as time allows) and outright attacks on religion not just for being in error but for being the source of evil.

I have a hard time imagining other developments in 21st-century intellectual thought that would shake the foundations of our cosmology any more furiously than what we’re now experiencing. Even the dawning realization that we’ve essentially killed ourselves (with delayed effect) by gradually though consistently laying waste to our own habitat is more of an “oops” than the mind-blowing moment of waking up from The Matrix to discover the unreality of everything once believed. Of course, for fervent believers especially, the true facts (best as we can know them, since knowledge is forever provisional) are largely irrelevant in light of desire (what one wants to believe), and that’s true for people on both sides of the schism between church and science/reason.

As Shakespeare wrote in Hamlet, “There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy.” So it’s probably wrong to introduce a false dualism, though it has plenty of historical precedent. I’ll suggest instead that there are more facets and worldviews at play in the world than the two that have been warring in the West for the last 600 years.