Posts Tagged ‘Iain McGilchrist’

Color me surprised to learn that 45 is considering a new executive order mandating that the “classical architectural style shall be the preferred and default style” for new and upgraded federal buildings, revising the Guiding Principles for Federal Architecture issued in 1962. Assuredly, 45 is hardly expected to weigh in on respectable aesthetic choices considering his taste runs toward gaudy, glitzy, ostentatious surface display (more Baroque) rather than restraint, dignity, poise, and balance (more Classical or Neoclassical).

Since I pay little attention to mainstream news propaganda organs, I learned of this from James Howard Kunstler’s blog Clusterfuck Nation (see blogroll) as though the order had already been issued, but it’s apparently still in draft. Twas nice to read Kunstler returning to his roots in architectural criticism. He’s never left it behind entirely; his website has a regular feature called Eyesore of the Month, which I rather enjoy reading. He provides a brief primer on how architectural styles in the 20th century (all lumped together as Modernism) embody the Zeitgeist, namely, techno-narcissism. (I’m unconvinced that Modernism is a direct rebuke of 20th-century fascists who favored Classicism.) Frankly, with considerably more space at his disposal, Iain McGilchrist explores Modernist architecture better and with far greater erudition in The Master and His Emissary (2010), which I blogged through some while ago. Nonetheless, this statement by Kunstler deserves attention:

The main feature of this particular moment is that techno-industrial society has entered an epochal contraction presaging collapse due to over-investments in hyper-complexity. That hyper-complexity has come to be perfectly expressed in architecture lately in the torqued and tortured surfaces of gigantic buildings designed by computers, with very poor prospects for being maintained, or even being useful, as we reel into a new age of material scarcity and diminished expectations …

This is the life-out-of-balance statement in a nutshell. We are over-extended and wedded to an aesthetic of power that requires preposterous feats of engineering to build and continuous resource inputs to operate and maintain. (Kunstler himself avers elsewhere that an abundance of cheap, easily harvested energy enabled the Modern Era, so chalking up imminent collapse primarily to over-investment in hyper-complexity seems like substituting a secondary or follow-on effect for the main one.) My blogging preoccupation with skyscrapers demonstrates my judgment that the vertical dimension of the human-built world in particular is totally out of whack, an instantiation of now-commonplace stunt architecture. Should power ever fail for any sustained duration, reaching floors above, say, the 10th and delivering basic services to them, such as water for sinks and toilets, quickly become daunting tasks.

However, that’s a technical hurdle, not an aesthetic consideration. The Modernist government buildings in question tend to be Brutalist designs, which often look like high-walled concrete fortresses or squat, impenetrable bunkers. (Do your own image search.) They project bureaucratic officiousness and unconcern if not open hostility toward the people they purport to serve. Basically, enter at your own risk. They share with the International Style a formal adherence to chunky geometric forms, often presented impassively (as pure abstraction) or in an exploded view (analogous to a cubist painting showing multiple perspectives simultaneously). Curiously, commentary at the links above is mostly aligned with perpetuating the Modernist project and aesthetic as described by Kunstler and McGilchrist. No interruptions, difficulties, or vulnerabilities are contemplated. Commentators must not be reading the same analyses I am, or they’re blithely supportive of progress in some vague sense, itself a myth we tell ourselves.

I revisit my old blog posts when I see some reader activity in the WordPress backstage and was curious to recall a long quote of Iain McGilchrist summarizing arguments put forth by Anthony Giddens in his book Modernity and Self-identity (1991). Giddens had presaged recent cultural developments, namely, the radicalization of nativists, supremacists, Social Justice Warriors (SJWs), and others distorted by absorption in identity politics. So I traipsed off to the Chicago Public Library (CPL) and sought out the book to read. Regrettably, CPL didn’t have a copy, so I settled on a slightly earlier book, The Consequences of Modernity (1990), which is based on a series of lectures delivered at Stanford University in 1988.

Straight away, the introduction provides a passage that goes to the heart of matters with which I’ve been preoccupied:

Today, in the late twentieth century, it is argued by many, we stand at the opening of a new era … which is taking us beyond modernity itself. A dazzling variety of terms has been suggested to refer to this transition, a few of which refer positively to the emergence of a new type of social system (such as the “information society” or the “consumer society”) but most of which suggest rather that a preceding state of affairs is drawing to a close … Some of the debates about these matters concentrate mainly upon institutional transformations, particularly those which propose that we are moving from a system based upon the manufacture of material goods to one concerned more centrally with information. More commonly, however, those controversies are focused largely upon issues of philosophy and epistemology. This is the characteristic outlook, for example, of the author who has been primarily responsible for popularising the notion of post-modernity, Jean-François Lyotard. As he represents it, post-modernity refers to a shift away from attempts to ground epistemology and from faith in humanly engineered progress. The condition of post-modernity is distinguished by an evaporating of the “grand narrative” — the overarching “story line” by means of which we are placed in history as beings having a definite past and a predictable future. The post-modern outlook sees a plurality of heterogeneous claims to knowledge, in which science does not have a privileged place. [pp. 1–2, emphasis added]

That’s a lot to unpack all at once, but the fascinating thing is that notions now manifesting darkly in the marketplace of ideas were already in the air in the late 1980s. Significantly, this was still several years before the Internet brought the so-called Information Highway to computer users, before the cell phone and smart phone were developed, and before social media displaced traditional media (TV was only 30–40 years old but had previously transformed our information environment) as the principal way people gather news. I suspect that Giddens has more recent work that accounts for the catalyzing effect of the digital era (including mobile media) on culture, but for the moment, I’m interested in the book in hand.

Regular readers of this blog (I know of one or two) already know my armchair social criticism directed at our developing epistemological crisis (challenges to authority and expertise, psychotic knowledge, fake news, alternative facts, dissolving reality, and science denial) as well as the Transhumanist fantasy of becoming pure thought (once we evolve beyond our bodies). Until that’s accomplished with imagined technology, we increasingly live in our heads, in the abstract, disoriented and adrift on a bewildering sea of competing narratives. Moreover, I’ve stated repeatedly that highly mutable story (or narrative) underlies human cognition and consciousness, making most of us easy marks for charismatic storytellers. Giddens was there nearly 30 years ago with these same ideas, though his terms differ.

Giddens dispels the idea of post-modernity and insists that, from a sociological perspective, the current period is better described as high modernism. This reminds me of Oswald Spengler and my abandoned book blogging of The Decline of the West. It’s unimportant to me who got it more correct, but note that the term Postmodernism has been adopted widely despite its inaccuracy (at least according to Giddens). As I get further into the book, I’ll have plenty more to say.

Continuing from part 1 and part 2, let me add one further example of how meaning is reversed under the Ironic perspective. At my abandoned group blog, Creative Destruction, which garners more traffic than The Spiral Staircase despite being woefully out of date, the post that gets the most hits argues (without irony) that, in the Star Wars universe, the Empire represents the good guys and the Jedi are the terrorists despite the good vs. evil archetypes being almost cartoonishly drawn, with the principal villain having succumbed to the dark side only to be redeemed by his innate goodness at the 11th hour. The reverse argument undoubtedly has some merit, but it requires overthinking and outsmarting oneself to arrive at the backwards conclusion. A similar dilemma of competing perspectives is present in The Avengers, where Captain America is unconflicted in his all-American goodness and straightforward identification of villainy but is surrounded by other far-too-clever superheroes who overanalyze (snarkily so), cannot agree on strategy, and/or question motivations and each others’ double or triple agency. If I understand correctly, this plot hook is the basis for the civil war among allies in the next Avengers movie.

The Post-Ironic takes the reversal of meaning and paradoxical retention of opposites that characterizes the Ironic and expands issues from false dualisms (e.g., either you’re with us or against us) to multifaceted free-for-alls where anyone’s wild interpretation of facts, events, policy, and strategy has roughly equal footing with another’s precisely because no authority exists to satisfy everyone as to the truth of matters. The cacophony of competing viewpoints — the multiplicity of possible meanings conjured from any collection of evidence — virtually guarantees that someone out there (often someone loony) will speak as though reading your mind. Don’t trust politicians, scientists, news anchors, pundits, teachers, academics, your parents, or even the pope? No problem. Just belly up to the ideological buffet and choose from any of a multitude of viewpoints, few of which have much plausibility. But no matter: it’s a smorgasbord of options, and almost none of them can be discarded out of hand for being too far beyond the pale. All must be tried and entertained.

One of the themes of this blog is imminent (i.e., occurring within the lifetimes of most readers) industrial collapse resulting from either financial collapse or loss of habitat for humans (or a combination of factors). Either could happen first, but my suspicion is that financial collapse will be the lit fuse leading to explosion of the population bomb. Collapse is quite literally the biggest story of our time despite its being prospective. However, opinion on the matter is loose, undisciplined, and ranges all over the map. Consensus within expert bodies such as the Intergovernmental Panel on Climate Change, assembled specifically to study climate change and report its findings, ought to put an end to controversy, yet waters have been so muddied by competing narratives that credulous folks, if they bother paying attention at all, can’t really tell whom to believe. It doesn’t help that even well-educated folks, including many professionals, often lack the critical thinking skills with which to evaluate evidence. So instead, wishy-washy emotionalism and psychological vulnerability award hearts and minds to the most charismatic storyteller, not the truth-teller.

Perhaps the best instance of multiple meanings being simultaneously present and demanding consideration is found in the game of poker, which has become enormously popular in the past decade. To play the game effectively, one must weigh the likelihood and potential of any one of several competing narratives based on opponents’ actions. Mathematical analysis and intuition combine to recommend which scenario is most likely true and whether the risk is worth it (pot odds). If, for just one example, an opponent bets big at any point in the poker hand, several scenarios must be considered:

  • the opponent has made his hand and cannot be beaten (e.g., nut flush, full house)
  • the opponent has a dominating hand and can be beaten only if one draws to make a better hand (e.g., top pair with high kicker or two pair)
  • the opponent has not yet fully made his hand and is on a draw (open-ended straight or four cards to a flush)
  • the opponent has a partial or weak hand and is bluffing at the pot

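The pot-odds arithmetic behind that weighing of scenarios is simple enough to sketch in a few lines of Python (a toy illustration with made-up numbers, not a poker strategy guide):

```python
def pot_odds(pot, call):
    """Fraction of the final pot you must contribute to call: call / (pot + call)."""
    return call / (pot + call)

def draw_equity(outs, unseen):
    """Chance of hitting one of `outs` helpful cards on the next card dealt."""
    return outs / unseen

# Hypothetical spot: flush draw on the turn (9 outs among 46 unseen cards),
# facing a $20 bet into a $100 pot.
equity = draw_equity(9, 46)   # roughly 0.196
price = pot_odds(100, 20)     # roughly 0.167
# A call is mathematically justified when equity exceeds the price of calling.
profitable_call = equity > price
```

Here the draw arrives often enough (about 19.6% of the time) to justify paying about 16.7% of the final pot, though bluff frequencies, implied odds, and the competing scenarios listed above complicate the real judgment considerably.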
Take note that, as with climate change, evaluation in poker is prospective. Sometimes an opponent’s betting strategy is discovered in a showdown where players must reveal their cards; but often, one player or another mucks or folds and the actual scenario is undisclosed. The truth of climate change, until the future manifests, is, to some, tantalizingly unknown and contingent, as though it could be influenced by belief, hope, and/or faith. To rigorous thinkers, however, the future is charted for us with about the same inevitability as the sun rising in the morning — the biggest remaining unknown being timing.

Habitual awareness of multiple, competing scenarios extends well beyond table games and climate change. In geopolitics, the refusal to rule out the nuclear option, even when it would be completely disproportionate to a given provocation, is reckless brinkmanship. The typical rhetoric is that, as with fighting dogs, any gesture of backing down would be interpreted as a display of submission or weakness and thus invite attack. So is the provocation or the response a bluff, a strong hand, or both? Although it is difficult to judge how U.S. leadership is perceived abroad (since I’m inside the bubble), the historical record demonstrates that the U.S. never hesitates to get mixed up in military action and adopts overweening strategies to defeat essentially feudal societies (e.g., Korea and Vietnam). Never mind that those strategies have been shown to fail or that those countries represented no credible threat to the U.S. Our military escapades in the 21st century are not so divergent, with the perception of threats being raised well beyond their true proportions relative to any number of health and social scourges that routinely kill many more Americans than terrorism ever did.

Because this post is already running long, conclusions will be in an addendum. Apologies for the drawn-out posts.

Continuing from part 1, the Ironic is characterized by (among other things) reversal of meaning, sometimes understood as the unexpected made manifest but more commonly as sarcasm. The old joke goes that in pompous, authoritarian fashion, the language/semiotics professor says to his class of neophytes, “In many languages, a double negative equals a positive, but in no language does a double positive make a negative.” In response, a student mutters under his breath, “yeah, right ….” Up to a certain age and level of cognitive development, children don’t process sarcasm; they are literal-minded and don’t understand subtext. Transcripts and text (e.g., blog posts and comments) also typically fail to transmit nonverbal cues that one may be less than earnest in making certain statements. Significantly, no one is allowed to make offhand jokes in line at security checkpoints because, in that context, remarks such as “yeah, like my shoes are full of C4” are treated quite literally.

I have a vague memory of the period in my adolescence when I discovered sarcasm, at which time it was deployed almost continuously, saying the opposite of what I meant with the expectation that others (older than me) would understand the implied or latent meaning. I also adopted the same mock abuse being used elsewhere, which regrettably lasted into my late 20s. Maybe it’s a phase everyone must go through, part of growing up, and as a society, our cultural development must also pass through that phase, though I contend we remain mired in irony or ironic posturing.

The model for me was insult comedy, still in style now but more familiar from my childhood. Like most during this developmental phase, I accepted the TV as social tutor for how people communicate and what’s acceptable to say. So who can blame me or other children, fed a diet of snark and attitude (adult writers of TV shows being a lot more clever than the adolescent actors who voice the lines), for speaking the same way? But to appreciate irony more directly, consider the comedian (then and now) who levies criticism using clichés drawn from his or her own gender, race, religion, social class, etc. In comedy, sexism, racism, and class conflict are not just joke fodder but stereotyped bigotry that reinforces the very scourges it ostensibly criticizes. Oh, sure, the jokes are often funny. We all know to laugh at the black comedian who trades nonstop in nigger jokes or the female comedian who complains of being nothing more than an object for male titillation. Comedians (and special interest groups — minority or not — that lay claim to victimhood) may coopt the language of their oppressors (some actual, some imagined — see for instance those complaining about the War on Christmas), but the language and attitudes are broken down and reinforced at the same time.

This isn’t solely the domain of comedy, either. Whereas TV sitcoms are ruled by hip, ironic posturing — the show about nothing that plumbs the surprising depths inside everything trivial, banal, and inane, the show full of nerd archetypes who rise above their inherent nerdiness to be real people worthy of respect (or not surprisingly, not so worthy after all), or the endless parade of sitcom families with unrealistically precocious, smart aleck kids who take aim at everyone with a continuous stream of baleful insults, take-downs, and mockery but are, despite truly cretinous behavior, always forgiven (or passed over because another joke is imminent) and still lovable — in the virtual world (the Internet, where you are reading this), sarcasm, snark, irony, abuse, and corrosive jokiness are legion. Take, for instance, this video at Military.com and tell me there isn’t something deeply wrong with it:

One might wonder whether the intent is interdiction or recruitment (or both at once), especially if one acknowledges that most of the awful things depicted in the video are precisely what the U.S. military has been doing in the Middle East for well over a decade. The Fox News blurb linked below the video says, “The State Department is launching a tough and graphic propaganda counteroffensive against the Islamic State, using some of the group’s own images of barbaric acts against fellow Muslims to undercut its message.” Maybe the word propaganda is a mistake and publicity was intended, but I suspect that propaganda is the right word precisely because it’s understood as both pejorative and superlative. As with everything else, meaning has become polysemous.

Iain McGilchrist illustrates this with special emphasis on the arts and how substitution of symbolic tokens normalizes distortion. For instance, art theory of the Aesthetes contains a fundamental paradox:

The Aesthetes’ creed of ‘art for art’s sake’, while it sounds like an elevation of the value of art, in that it denies that it should have an ulterior purpose beyond itself — so far so good — is also a devaluation of art, in that it marginalizes its relationship with life. In other words it sacrifices the betweenness of art with life, instead allowing art to become self-reflexively fulfilled. There is a difference between the forlorn business of creating ‘art for art’s sake’, and art nonetheless being judged solely ‘as art’, not as for another purpose. [p. 409]

Isolating artistic creation in a mental or virtual transactional space ought to be quite familiar (or perhaps more accurately, assumed and thus invisible) to 21st-century people, but it was a new concept at the outset of the 20th century. The paradox is that the doctrine is both a reversal of meaning and retention of opposites together. Over the course of the 20th century, we became habituated to such thinking, namely, that a thing automatically engenders its opposite and is both things at once. For instance, what used to be called the War on Poverty, meant to help those suffering deprivation, is now also its reverse: literally a war on the poverty-stricken. Similarly, the War on Drugs, meant to eradicate drug use as a social ill, is also quite literally a war against drug users, who are a large and improper part of the bloated U.S. prison population. Reduction of government services to the poor and rampant victim-blaming demonstrate that programs once meant to assist those in need now often instead leave them to fend for themselves, or worse, pile on with criminal charges. Disinformation campaigns about welfare cheats and the minimum wage are further examples of information being distorted and used to serve an unwholesome agenda.

My conclusion is not yet ready to be drawn; it’s far too subtle to fit in a Tweet or even a series of blog posts. However, consider what it means when the language we use is laden with ironic twists that force recipients of any message to hold simultaneously forward/backward, up/down, left/right, and true/false meanings. Little can be established beyond reasonable doubt not just because so many of us have been so poorly served by educational institutions (or is it the students themselves — sort of a chicken-and-egg question) more interested in business and credentialing than teaching and learning that few possess the ability to assess and evaluate information (ironically, from a variety of perspectives) being spun and spoon-fed to us by omnimedia, but because the essential underlying structure of language and communications has been corrupted by disembedding, decontextualization, and deconstruction that relegate reality to something to be dreamt up and then used to convince others. In the end, of course, we’re only fooling ourselves.

At last, getting to my much, much delayed final book blogs (three parts) on Iain McGilchrist’s The Master and His Emissary. The book came out in 2010, I picked it up in 2012 (as memory serves), and it took me nearly two years to read its entirety, during which time I blogged my observations. I knew at the time of my previous post on the book that there would be more to say, and it’s taken considerable time to get back to it.

McGilchrist ends with a withering criticism of the Modern and Postmodern (PoMo) Eras, which I characterized as an account of how the world went mad. That still seems accurate to me: the madness that overtook us in the Modern Era led to world wars, genocides, and systematic reduction of humanity to mere material and mechanism, what Ortega y Gasset called Mass Man. Reduction of the rest of the living world to resources to be harvested and exploited by us is a worldview often called instrumental rationality. From my armchair, I sense that our societal madness has shape-shifted a few times since the fin de siècle 1880s and 90s. Let’s start with quotes from McGilchrist before I extend into my own analysis. Here is one of his many descriptions of the left-hemisphere paradigm under which we now operate:

In his book on the subject, Modernity and Self-identity, Anthony Giddens describes the characteristic disruption of space and time required by globalisation, itself the necessary consequence of industrial capitalism, which destroys the sense of belonging, and ultimately of individual identity. He refers to what he calls ‘disembedding mechanisms’, the effect of which is to separate things from their context, and ourselves from the uniqueness of place, what he calls ‘locale’. Real things and experiences are replaced by symbolic tokens; ‘expert’ systems replace local know-how and skill with a centralised process dependent on rules. He sees a dangerous form of positive feedback, whereby theoretical positions, once promulgated, dictate the reality that comes about, since they are then fed back to us through the media, which form, as much as reflect, reality. The media also promote fragmentation by a random juxtaposition of items of information, as well as permitting the ‘intrusion of distant events into everyday consciousness’, another aspect of decontextualisation in modern life adding to loss of meaning in the experienced world. [p. 390]

Reliance on abstract, decontextualized tokens having only figurative, nonintrinsic power and meaning is a specific sort of distancing, isolation, and reduction that describes much of modern life and shares many characteristics with schizophrenia, as McGilchrist points out throughout the chapter. That was the first shape-shift of our madness: full-blown mechanization born of reductionism and materialism, perspectives bequeathed to us by science. The slow process had been underway since the invention of the mechanical clock and discovery of heliocentrism, but it gained steam (pun intended) as the Industrial Revolution matured in the late 19th century.

The PoMo Era is recognized as having begun just after the middle of the 20th century, though its attributes are questionably defined or understood. That said, the most damning criticism leveled at PoMo is its hall-of-mirrors effect that renders objects in the mirrors meaningless because the original reference point is obscured or lost. McGilchrist also refers repeatedly to loss of meaning resulting from the ironizing effect of left-brain dominance. The corresponding academic fad was PoMo literary criticism (deconstruction) in the 1970s, but it had antecedents in quantum theory. Here is McGilchrist on PoMo:

With post-modernism, meaning drains away. Art becomes a game in which the emptiness of a wholly insubstantial world, in which there is nothing beyond the set of terms we have in vain used to ‘construct’ meaning, is allowed to speak for its own vacuity. The set of terms are now seen simply to refer to themselves. They have lost transparency; and all conditions that would yield meaning have been ironized out of existence. [pp. 422–423]

This was the second shape-shift: loss of meaning in the middle of the 20th century as purely theoretical formulations, which is to say, abstraction, gained adherents. He goes on:

Over-awareness … alienates us from the world and leads to a belief that only we, or our thought processes, are real … The detached, unmoving, unmoved observer feels that the world loses reality, becomes merely ‘things seen’. Attention is focussed on the field of consciousness itself, not on the world beyond, and we seem to experience experience … [In hyperconsciousness, elements] of the self and of experience which normally remain, and need to remain, intuitive, unconscious, become the objects of a detached, alienating attention, the levels of consciousness multiply, so that there is an awareness of one’s own awareness, and so on. The result of this is a sort of paralysis, in which even everyday ‘automatic’ actions such as moving one leg in front of another in order to walk can become problematic … The effect of hyperconsciousness is to produce a flight from the body and from its attendant emotions. [pp. 394–396]

Having devoted a fair amount of my intellectual life to trying to understand consciousness, I immediately recognized the discussion of hyperconsciousness (derived from Louis Sass) as what I often call recursion error, where consciousness becomes the object of its own contemplation, with obvious consequences. Modern, first-world people all suffer from this effect to varying degrees because that is how modern consciousness is shaped.

I believe we can observe now two more characteristic extensions or variations of our madness, probably overlapping, not discrete, following closely on each other: the Ironic and Post-Ironic. The characteristics are these:

  • Modern — reductive, mechanistic, instrumental interpretation of reality
  • Postmodern — self-referential (recursive) and meaningless reality
  • Ironic — reversed reality
  • Post-Ironic — multiplicity of competing meanings/narratives, multiple realities

All this is quite enough to chew on for a start. I plan to continue in pts. 2 and 3 with descriptions of the Ironic and Post-Ironic.

Returning to The Master and His Emissary by Iain McGilchrist, I finished reading the book some while back (took more than two years, I think) but have two or three more blog posts to finish off my book blogging project. Warning: this is another long post.

Part One of the book is about the divided brain: its structural and functional attributes that make us who we are. This presumes identity resides mostly in the brain/mind rather than the body (probably my presumption, not McGilchrist’s). Part Two is how the brain shaped our world, referring more to human history, institutions, and values than physical setting, though that, too, is an outgrowth of our brain structure in light of how thoroughly mankind has shaped and engineered his own environment. Part Two traces through history from the ancient world to the Renaissance, the Reformation (and Counter-Reformation), the Enlightenment, Romanticism, the Industrial Revolution, and finally to the linked Modern and Postmodern worlds. (McGilchrist uses the form Post-Modern, which I will shorten to PoMo).

I had expected to be far more comfortable with Part Two than Part One owing to my greater familiarity with themes of human history and Western culture than with brain structure and function. It surprised me how much Part Two also discusses brain structure, but in hindsight, that makes good sense because the book’s thesis is that brain structure has had substantial influence on all of Western culture. What really surprised me, however, is that the section on Modernism and PoMo affords McGilchrist the opportunity to launch into a sustained harangue. Indeed, given the virulence of his attack, it felt like the book up to that point was merely a set-up to lay the foundation for a rant that had been fulminating in McGilchrist’s mind all along.

We have learned plenty in the last eighty or more years about the brain on drugs, both recreational (caffeine, alcohol, opium, cannabis, cocaine, crack, ecstasy, etc.) and psychiatric (Quaaludes, Valium, Zoloft, Prozac, Lithium, etc.). McGilchrist provides a withering account of what the brain is like on PoMo (a late-stage intensification of Modernism, really), which now constitutes our basic operating instructions or deep culture. The account includes a comparison of the PoMo mind with the aberrant psyche of schizophrenia, drawing heavily on the work of Louis Sass. Here is McGilchrist’s not-so-brief recap of individuals with right-brain (the Master) damage, which defaults excessive processing control to the left brain (the Emissary):


Backtracking to something in The Master and His Emissary I read more than two months ago, McGilchrist has a fairly involved discussion of Julian Jaynes’ book The Origin of Consciousness in the Breakdown of the Bicameral Mind. I read Jaynes more than a decade ago and was pretty excited by his thesis, which I couldn’t then evaluate or assess very well. (I’m probably not much better equipped now.) Amazon.com reveals that there are other reviews and updates of Jaynes’ work since its publication in 1976, but I was unaware of them until just now. I was pleased to find McGilchrist giving so much attention to Jaynes — a discussion spanning 4 pp. with the benefit of several decades of further research. I will provide McGilchrist’s summary of Jaynes’ highly original and creative thesis rather than rely on memory more than a decade old:

… [C]onsciousness, in the sense of introspective self-awareness, first arose in Homeric Greece. He [Jaynes] posits that, when the heroes of the Iliad (and the Old Testament) are reported as having heard the voices of the gods (or God) giving them commands or advice, this is not a figurative expression: they literally heard voices. The voices were speaking their own intuitive thoughts, and arose from their own minds, but were perceived as external, because at this time man was becoming newly aware of his own (hitherto unconscious) intuitive thought processes.

If one accepts (as I believe one should) that the ancient mind was fundamentally different from the modern mind, the latter of which was just beginning to coalesce at the time of the ancient Greeks (ca. 8th century BCE), this explains why all the sword-and-sandal movie epics get characters fundamentally wrong by depicting heroes especially, but others as well, with the purposefulness and self-possession of modern thinkers well before such qualities were established in antiquity. Antiquity is not prehistory, however, so there’s no danger of ancients being depicted as cavemen grunting and gesticulating without the benefit of language (except perhaps when they’re presented in stylized fashion as voiceless barbarians). But in the typical modern gloss on centuries long past, there is little consideration of a middle ground or extended transition between modern consciousness and protoconsciousness (not unlike the transition from protolanguage to myriad languages of amazing sophistication). This is why Jaynes was so exciting when I first read him: he mapped, provisionally perhaps, how we got here from there.

McGilchrist believes that while the description above is accurate, Jaynes’ supporting details stem from a faulty premise, born of an unfortunate mischaracterization of schizophrenia that was current in the 1970s in psychology and psychiatry. Never mind that schizophrenia is an affliction only a couple centuries old; the misunderstanding is that schizophrenics suffer from accentuated emotionalism and withdrawal into the body or the sensorium when in fact they are hyperrational and alienated from the body. The principal point of comparison between ancients and modern schizophrenics is that they both hear voices, but that fact arises from substantially different contexts and conditions. For Jaynes, hearing voices in antiquity came about because the unified brain/mind broke down into hemispheric competition where failure to cooperate resulted in a sort of split mind. According to McGilchrist, there was indeed a split mind at work, but not the one Jaynes believed. Rather, the split mind is the subject/object or self/other distinction, something readers of this blog may remember I have cited repeatedly as having initially developed in the ancient world. (Whether this is my own intuition or a synthesis of lots of reading and inquiry into historical consciousness is impossible for me to know anymore and unimportant anyway.) McGilchrist describes the subject/object distinction as the ability to objectify and to hold an object or idea at a “necessary distance” in the mind to better apprehend it, which was then generalized to the self. Here is how McGilchrist describes Jaynes’ error:

Putting it at its simplest, where Jaynes interprets the voices of the gods as being due to the disconcerting effects of the opening of a door between the hemispheres, so that the voices could for the first time be heard, I see them as being due to the closing of the door, so that the voices of intuition now appear distant, ‘other’; familiar but alien, wise but uncanny — in a word, divine.

What’s missing from McGilchrist’s reevaluation of Jaynes is how hearing voices in the ancient world may also account for the rise of polytheism and how the gradual disappearance of those same voices as modern consciousness solidified led to monotheism, an artifact of the transitional mind of antiquity that survived into modernity. I lack the anthropological wherewithal to survey ancient civilizations elsewhere in the Middle East (such as Egypt) or in Asia (such as China), but it seems significant to me that spiritual alternatives beyond the three Abrahamic religions are rooted in animism (e.g., sun, moon, other animals, Nature) or what could be called lifeways (e.g., Taoism and Buddhism) and lack father and mother figureheads. (Mother Nature doesn’t really compare to the traditional personification of sky gods.) This omission is understandably outside the scope of The Master and His Emissary, but it would have been interesting to read that discussion had it been included. Another interesting omission is how habituation with these inner voices eventually became the ongoing self-narrative we all know: talking to ourselves inside our heads. Modern thinkers readily recognize the self talking to itself, which is the recursive nature of self-awareness, while loss of proper orientation and self-possession is considered aberrant — crazy unless one claims to hear the voice of god (which strangely no one believes even if they believe in god). In short, god (or the gods) once spoke directly to us, but no longer.

For me, these observations are among the pillars of modern consciousness, an ever-moving puzzle picture I’ve been trying to piece together for years. I don’t mean to suggest that there are three large bands of historical consciousness, but it should be clear that we were once, in our evolutionary history, nonconscious (not unconscious — that’s something else) but developed minds/selves over the eons. As with biology and language, there is no point of arrival where one could say we are now fully developed. We continue to change constantly, far more quickly with language and consciousness than with biology, but there are nonetheless several observable developmental thresholds. The subject/object distinction from antiquity is one that profoundly informs modern consciousness today. Indeed, the scientific method is based on objectification. This intellectual pose is so powerful and commonplace (but not ubiquitous) that immersion, union, and loss of self are scarcely conceivable outside of a few special circumstances that render us mostly nonthinking, such as being in the zone, flow, sexual congress, religious ecstasy, etc., where the self is obliterated and we become “mindless.”