Archive for the ‘Religion’ Category

The last traffic report observed the 10-year anniversary of this blog. For this traffic report, I am on the cusp of achieving another significant threshold: 1,000 subscribers (just five more to go). A while back, I tried (without success) to discourage others from subscribing to this blog in hopes of attracting only responsive traffic. Since then, more than 700 new subscribers have appeared, many of them commercial blogs hawking things like photography, technology services (especially SEO), fashion, and celebrity gossip. I used to give each at least one look, but I no longer do. The most incongruous (to those who are familiar with the themes of this blog) are the testimonial blogs in praise of (someone’s) god. If I could unsubscribe others on my end, I probably would; but alas, my basic WordPress blog does not have that feature.

So what besides the almost 1,000 subscribers has occurred here since the last report? Not a whole lot beyond my regular handwringing about things still wrong in the world. There was that small matter of the U.S. presidential election, which garnered some of my attention, but that really falls within the wider context of the U.S. destroying itself in fits and starts, or even more generally, the world destroying itself in fits and starts. More than usual, I’ve reblogged and updated several old posts, usually with the suffix redux. I haven’t had any multipart posts exploring ideas at length.

The Numbers

Total posts (not counting this one) are 474. Unique visitors are 22,017. Daily hits (views) range from 10 to 60 or so. Total hits are 95,081. Annual hits had climbed to about 12,500 in 2013 but have since declined steadily. The most-viewed post by far continues to be Scheler’s Hierarchy, with most of the traffic coming from the Philippines.

Doom Never Dies

Whereas the so-called greatest story ever told refers to Jesus for most people, I think the most important story ever told (and ignored) is how we humans drove the planet into the Sixth Extinction and in the process killed ourselves. I find more and more people simply acknowledging the truth of climate change (though not yet NTE, or near-term extinction) even as Republicans continue to deny it aggressively. Now that Republicans will control both houses of Congress and the White House (debatable whether Trump is truly a Republican), those already convinced expect not just an acceleration of weather-related calamity but accelerated stoking of the engine powering it. I leave you with this relevant quote from an article in Harper’s called “The Priest in the Trees”:

What must die is the materialist worldview in which physical reality is viewed as just stuff: “The world is not merely physical matter we can manipulate any damn way we please.” The result of that outlook is not just a spiritual death but a real, grisly, on-the-cross kind of death. “We are erecting that cross even now,” he said.

Addendum

A meaningless milestone (for me at least), but a milestone nonetheless:

[Image: 1,000-followers milestone notification]

Caveat: Apologies for this overlong post, which random visitors (nearly the only kind I have besides the spambots) may find rather challenging.

The puzzle of consciousness, mind, identity, self, psyche, soul, etc. is an extraordinarily fascinating subject. We use various terms, but they all revolve around a unitary property and yet come from different approaches, methodologies, and philosophies. The term mind is probably the most generic; I tend to use consciousness interchangeably and more often. Scientific American has an entire section of its website devoted to the mind, with subsections on Behavior & Society, Cognition, Mental Health, Neurological Health, and Neuroscience. (Top-level navigation offers links to these sections: The Sciences, Mind, Health, Tech, Sustainability, Education, Video, Podcasts, Blogs, and Store.) I doubt I will explore it very deeply because science favors the materialist approach, which I believe misses the forest for the trees. However, the presence of this area of inquiry right at the top of the page indicates how much attention and research the mind/consciousness is currently receiving.

A guest blog at Scientific American by Adam Bear entitled “What Neuroscience Says about Free Will” makes the fashionable argument (these days) that free will doesn’t exist. The blog/article carries a disclaimer: “The views expressed are those of the author(s) and are not necessarily those of Scientific American.” I find that a little weaselly. Because the subject is still wide open to interpretation and debate, Scientific American should simply offer conflicting points of view without worry. Bear’s arguments rest on the mind’s ability to revise and redate experience occurring within the frame of a few milliseconds to allow for processing time, also known as the postdictive illusion (the opposite of a predictive one). I wrote about this topic more than four years ago here. Yet another discussion is found here. I admit to being irritated that the questions and conclusions stem from a series of assumptions, primarily that whatever free will is must occur solely in consciousness (whatever that is) as opposed to originating in the subconscious and subsequently transferring into consciousness. Admittedly, we use these two categories — consciousness and the subconscious — to account for the rather limited amount of processing that makes it all the way into awareness vs. the significant amount that remains hidden or submerged. A secondary assumption, the broader project of neuroscience in fact, is that, like free will, consciousness is housed somewhere in the brain or its categorical functions. Thus, fruitful inquiry results from seeking its root, seed, or seat as though the narrative constructed by the mind, the stream of consciousness, were on display to an inner observer or imp in what Daniel Dennett years ago called the Cartesian Theater. That time-worn conceit is the so-called ghost in the machine.

Events of the past few days have been awful: two further shootings of black men by police under questionable circumstances (Louisiana and Minnesota), and in response, a sniper killing five police officers (Texas) and injuring more. Everything is tragic and inexcusable; I offer no refuge for armed men on either side of the law using lethal force against others. But I will attempt to contextualize. Yes, issues of race, guns, and public safety are present. The first two are intractable debates I won’t wade into. However, the issue of public safety seems to me central to what’s going on, namely, the constant beat of threatening drums and related inflammatory speech that together have the effect of putting everyone on edge and turning some into hair-triggers.

I’ve read news reports and opinion columns that subject these events to the usual journalistic scrutiny: factual information strung together with calm, measured assurance that what occurred was the result of intemperate individuals not representative of the public at large. So go ahead and worry, but not too much: those guys are all outliers — a few bad apples. As I take the temperature of the room (the country, actually), however, my sense is that we are approaching our boiling point and are frankly likely to boil over soon, perhaps in concert with party nominating conventions expected to break with convention and further reveal already disastrous operations of the federal government. The day-to-day smooth surface of American life — what we prefer in times of relative peace and prosperity — has also been punctuated for decades now with pops and crackles in the form of mass shootings (schools, theaters, churches, clubs, etc.) and a broad pattern of civil authorities surveilling and bringing force to bear against the public they’re meant to protect and serve. How long before it all becomes a roiling, uncontrollable mess, with mobs and riots being put down (or worse) by National Guardsmen just as in the 1960s? Crowd control and management techniques have been refined considerably since that last period of civil unrest (I wrote about it briefly here), which is to say, they’re a lot creepier than water cannons, tear gas, and pepper spray (nothing to laugh about if one has been on the receiving end of any of those).

My question, to anyone with the equanimity to think twice about it, is this: aren’t these outcomes a rather predictable result of the bunker mentality we’ve adopted since being instructed by the media and politicians alike that everyone the world over is coming to take away our guns (read: freedom)? Further, aren’t the vague, unfocused calls to action spouted constantly by arch-conservative demagogues precisely the thing that leads some unhinged folks to actually take action because, well, no one else is? Donald Trump has raised diffuse threats and calls to action to an art form at his rallies, with supporters obliging by taking pot shots at others at the mere whiff of dissent from his out-of-tune-with-reality message. (Don’t even think about being nonwhite in one of those crowds.) He’s only one of many stirring the pot into a froth. Moreover, weak minds, responding in their lizard brains to perceived threat, have accepted with gusto the unfounded contention that ISIS in particular, and terrorism in general, represent an existential threat to the U.S., and thus, generalizing the threat, are now calling for curtailing the practice of Islam (one of three Abrahamic religions arising in the ancient world, with nearly 2 billion adherents worldwide) in the U.S. Apparently, the absolutism of freedom of religion (which can also be interpreted as freedom from the establishment of a state religion) enshrined in the 1st Amendment to the U.S. Constitution is lost on those whose xenophobia erases all reasoned thought.

The mood is turning quite ugly. A quick survey of history probably reveals that it’s always been that way. Many of us (by no means all of us) understand calls to “make America great again” as coded speech advocating a return to a white, male, Christian-dominated culture. So much for our vaunted freedom.

I get exasperated when I read someone insisting dogmatically upon ideological purity. No such purity exists, as we are all participants, in varying degrees, in the characteristics of global civilization. One of those characteristics is the thermodynamic cycle of energy use and consumption that gradually depletes available energy. The Second Law guarantees depletion, typically over cosmological time frames, but we are seeing it manifest over human history as EROI (energy returned on energy invested) has decreased dramatically since the start of the fossil fuel era. So playing gotcha by arguing, for instance, “You use electricity, too, right? Therefore, you have no right to tell me what I can and can’t do with electricity!” is laughably childish. Or put another way, if even an inkling of agreement exists that maybe we should conserve, forgo needless waste, and accept some discomfort and hardship, then it’s typically “you first” whenever the issue is raised in the public sphere.
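To make the EROI point concrete, here is a minimal sketch in Python. The year/ratio pairs are rough, commonly cited ballpark figures for U.S. oil from the net-energy literature (assumptions for illustration, not data from this blog); the arithmetic is the point: net energy available to society is 1 − 1/EROI, so the squeeze stays mild until EROI falls toward the single digits, after which it worsens quickly.

```python
# Illustrative sketch of declining EROI (energy returned on energy invested).
# The year/EROI pairs are rough approximations often cited for U.S. oil:
# assumptions for illustration, not measured data.

eroi_estimates = {
    1930: 100.0,  # early gusher era: ~100 units returned per unit invested
    1970: 30.0,   # mature conventional production
    2005: 15.0,   # mix including deepwater and enhanced recovery
    2015: 10.0,   # tight oil and tar sands drag the average down
}

for year, eroi in sorted(eroi_estimates.items()):
    # Fraction of gross energy left for society after paying the
    # energy cost of extraction: 1 - 1/EROI.
    net_fraction = 1.0 - 1.0 / eroi
    print(f"{year}: EROI ~{eroi:5.1f}:1 -> ~{net_fraction:.0%} net energy")
```

Note what the numbers show: the fall from 100:1 to 10:1 only reduces net energy from 99% to 90%, but each further step down (5:1 yields 80%, 2:1 yields 50%) bites much harder, which is why the decline registers late and then all at once.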

In a risible article published at Townhall.com, Michelle Malkin calls the Pope a hypocrite for having added his authority to what scientists and environmentalists have been saying: we face civilization-ending dangers from having fouled our own nest, or “our common home” as the Pope calls it. As though that disrespect were not enough, Malkin also tells the Pope essentially to shut it:

If the pontiff truly believes “excessive consumption” of modern conveniences is causing evil “climate change,” will he be shutting down and returning the multi-million-dollar system Carrier generously gifted to the Vatican Museums?

If not, I suggest, with all due respect, that Pope Francis do humanity a favor and refrain from blowing any more hot air unless he’s willing to stew in his own.

The disclaimer “with all due respect” does nothing to ease the audacity of a notorious ideologue columnist picking a fight over bogus principles with the leader of the world’s largest church, who (I might add) is slowly regaining some of the respect the Catholic Church lost over the past few scandalous decades. I suspect Malkin is guilelessly earnest in the things she writes and found a handy opportunity to promote the techno-triumphalist book she researched and wrote for Mercury Ink (owned by Glenn Beck). However, I have no trouble ignoring her completely, since she clearly can’t think straight.

Plenty of other controversy followed in the wake of the latest papal encyclical, Laudato Si’. That’s to be expected, I suppose, but religious considerations and gotcha arguments aside, the Pope is well within the scope of his official concern to sound the alarm alongside the scientific community, which was once synonymous with the Church before the two separated. If indeed Pope Francis has concluded that we really are in the midst of both an environmental disaster and a mass extinction (again, more process than event), it’s a good thing that he’s bearing witness. Doomers like me believe it’s too little, too late, and that our fate is already sealed, but there will be lots of ministry needed when human die-offs get rolling. Don’t bother seeking any sort of grace from Michelle Malkin.

Since the eruption of bigotry against Islam on Bill Maher’s show Real Time last October, I have been bugged by the ongoing tide of vitriol and fear-mongering as radical Islam becomes this century’s equivalent of 20th-century Nazis. There is no doubt that the Middle East is a troubled region of the world and that many of its issues are wrapped up with Islamic dogma (e.g., jihad) that has been hijacked by extremists. Oppression, misogyny, violence, and terrorism will get no apologetics from me. However, the fact that deplorable behaviors often have an Islamic flavor does not, to my mind, excuse bigotry aimed at Islam as a whole. Yet that is precisely the argument offered by many pundits and trolls.

Bill Maher did not get the ball rolling, exactly, but he gave it a good shove, increasing its momentum and seeming righteousness (read: rightness) among weak thinkers who take their cues and opinions from television personalities. Maher wasn’t alone, however, as Sam Harris was among his guests and argued that Islam is “the mother lode of bad ideas.” The notable exception on the panel that episode was Ben Affleck (Nicholas Kristof also made good points, though far more diplomatically), who called bullshit on Islam-baiting but failed to convince Maher or Harris, whose minds were already made up. Maher’s appeals to authoritative “facts” and “reality” (a sad bit of failed rhetoric he trots out repeatedly) failed to convince in the other direction.


Backtracking to something in The Master and His Emissary I read more than two months ago, McGilchrist has a fairly involved discussion of Julian Jaynes’ book The Origin of Consciousness in the Breakdown of the Bicameral Mind. I read Jaynes more than a decade ago and was pretty excited by his thesis, which I couldn’t then evaluate or assess very well. (I’m probably not much better equipped now.) Amazon.com reveals that there are other reviews and updates of Jaynes’ work since its publication in 1976, but I was unaware of them until just now. I was pleased to find McGilchrist give so much attention to Jaynes — a discussion spanning four pages with the benefit of several decades of further research. I will provide McGilchrist’s summary of Jaynes’ highly original and creative thesis rather than rely on memory more than a decade old:

… [C]onsciousness, in the sense of introspective self-awareness, first arose in Homeric Greece. He [Jaynes] posits that, when the heroes of the Iliad (and the Old Testament) are reported as having heard the voices of the gods (or God) giving them commands or advice, this is not a figurative expression: they literally heard voices. The voices were speaking their own intuitive thoughts, and arose from their own minds, but were perceived as external, because at this time man was becoming newly aware of his own (hitherto unconscious) intuitive thought processes.

If one accepts (as I believe one should) that the ancient mind was fundamentally different from the modern mind, the latter of which was just beginning to coalesce at the time of the ancient Greeks (ca. 8th century BCE), this explains why all the sword-and-sandal movie epics get characters fundamentally wrong: they depict heroes especially, but others as well, with the purposefulness and self-possession of modern thinkers well before such qualities were established in antiquity. Antiquity is not prehistory, however, so there’s no danger of ancients being depicted as cavemen grunting and gesticulating without the benefit of language (except perhaps when they’re presented in stylized fashion as voiceless barbarians). But in the typical modern gloss on centuries long past, there is little consideration of a middle ground or extended transition between modern consciousness and protoconsciousness (not unlike the transition from protolanguage to myriad languages of amazing sophistication). This is why Jaynes was so exciting when I first read him: he mapped, provisionally perhaps, how we got here from there.

McGilchrist believes that while the description above is accurate, Jaynes’ supporting details stem from a faulty premise, born of an unfortunate mischaracterization of schizophrenia that was current in the 1970s in psychology and psychiatry. Never mind that schizophrenia is an affliction only a couple centuries old; the misunderstanding is that schizophrenics suffer from accentuated emotionalism and withdrawal into the body or the sensorium when in fact they are hyperrational and alienated from the body. The principal point of comparison between ancients and modern schizophrenics is that they both hear voices, but that fact arises from substantially different contexts and conditions. For Jaynes, hearing voices in antiquity came about because the unified brain/mind broke down into hemispheric competition where failure to cooperate resulted in a sort of split mind. According to McGilchrist, there was indeed a split mind at work, but not the one Jaynes believed. Rather, the split mind is the subject/object or self/other distinction, something readers of this blog may remember I have cited repeatedly as having initially developed in the ancient world. (Whether this is my own intuition or a synthesis of lots of reading and inquiry into historical consciousness is impossible for me to know anymore and unimportant anyway.) McGilchrist describes the subject/object distinction as the ability to objectify and to hold an object or idea at a “necessary distance” in the mind to better apprehend it, which was then generalized to the self. Here is how McGilchrist describes Jaynes’ error:

Putting it at its simplest, where Jaynes interprets the voices of the gods as being due to the disconcerting effects of the opening of a door between the hemispheres, so that the voices could for the first time be heard, I see them as being due to the closing of the door, so that the voices of intuition now appear distant, ‘other’; familiar but alien, wise but uncanny — in a word, divine.

What’s missing from McGilchrist’s reevaluation of Jaynes is how hearing voices in the ancient world may also account for the rise of polytheism and how the gradual disappearance of those same voices as modern consciousness solidified led to monotheism, an artifact of the transitional mind of antiquity that survived into modernity. I lack the anthropological wherewithal to survey ancient civilizations elsewhere in the Middle East (such as Egypt) or in Asia (such as China), but it seems significant to me that spiritual alternatives beyond the three Abrahamic religions are rooted in animism (e.g., sun, moon, other animals, Nature) or what could be called lifeways (e.g., Taoism and Buddhism) and lack father and mother figureheads. (Mother Nature doesn’t really compare to traditional personification of sky gods.) This omission is understandably outside the scope of The Master and His Emissary, but it would have been interesting to read that discussion had it been included. Another interesting omission is how habituation with these inner voices eventually became the ongoing self-narrative we all know: talking to ourselves inside our heads. Modern thinkers readily recognize the self talking to itself, which is the recursive nature of self-awareness, and loss of proper orientation and self-possession is considered aberrant — crazy unless one claims to hear the voice of god (which strangely no one believes even if they believe in god). In short, god (or the gods) once spoke directly to us, but no longer.

For me, these observations are among the pillars of modern consciousness, an ever-moving puzzle picture I’ve been trying to piece together for years. I don’t mean to suggest that there are three large bands of historical consciousness, but it should be clear that we were once in our evolutionary history nonconscious (not unconscious — that’s something else) but developed minds/selves over the eons. As with biology and language, there is no point of arrival where one could say we are now fully developed. We continue to change constantly, far more quickly with language and consciousness than with biology, but there are nonetheless several observable developmental thresholds. The subject/object distinction from antiquity is one that profoundly informs modern consciousness today. Indeed, the scientific method is based on objectification. This intellectual pose is so powerful and commonplace (but not ubiquitous) that immersion, union, and loss of self are scarcely conceivable outside of a few special circumstances that render us mostly nonthinking, such as being in the zone, flow, sexual congress, religious ecstasy, etc., where the self is obliterated and we become “mindless.”

In the Toy Story movie franchise, the character Buzz Lightyear often voices the phrase “To infinity … and beyond!” One could argue that this Disney creation is as much a vehicle to market action figures as it is storytelling. Either way, the characters are stand-ins for easily recognizable archetypes, which when deployed against children’s unformed minds prove to be pretty effective brainwashing. Buzz Lightyear’s story isn’t the main focus of the Toy Story franchise, though he has full treatments elsewhere. He’s clearly a militaristic, play-by-the-rules (until they become inconvenient) type cut from the explorer/conqueror cloth that has been a human preoccupation and folly from Alexander the Great to the Spanish conquistadors to Capt. James T. Kirk of the Star Trek franchise. They all seek to expand their dominion into unknown but not necessarily unoccupied territory — a continental or interstellar land grab, if you will. For those of us in the early 21st century, an age of fully enveloping media, the fictional characters probably have as much influence as real, historical figures, even if the former’s impact is reduced to catchphrases that work like political soundbites or talking points, gaining power through heavy repetition. A character cannot be iconic without such shorthand as “Beam me up, Scotty,” “Make it so,” “Today is a good day to die,” “I’ll be back,” “Use the Force, Luke,” etc. Buzz Lightyear’s rhetoric is spatial, but humanity is also heavily interested in different aspects of time or the conflation of the two: space-time.

To say that time telescopes in human conception is both obvious and strangely hidden from view. We operate continuously according to different time horizons, from the immediate to the near-term to the long-term, and how we strategize changes completely to accommodate each. Whereas we occupy what some call an eternal present, like all other creatures in fact, where immediate sensation is ever at the forefront of cognition, we may be the only species able to project ourselves backwards and forwards in time beyond a few moments to contemplate history and the future. This isn’t to say we alone among species possess memory; that’s clearly not true. But our symbolic and conceptual thinking is unique, and it gives rise to varied and sophisticated ways of relating to space-time.


In my ongoing reading of The Master and His Emissary, I came across something very interesting on p. 321:

[Max] Weber held that the cognitive structure of Protestantism was closely associated with capitalism: both involve an exaggerated emphasis on individual agency, and a discounting of what might be called ‘communion’. An emphasis on individual agency inevitably manifests itself, as David Bakan has suggested, in self-protection, self-assertion, and self-expansion, whereas communion manifests itself in the sense of being at one with others. ‘Agency,’ he writes, ‘manifests itself in isolation, alienation, and aloneness; communion in contact, openness and union. Agency manifests itself in the urge to master: communion in non-contractual co-operation’. Success in material terms became, under Protestantism, a sign of spiritual prowess, the reward of God to his faithful.

David Bakan was writing in his 1966 book The Duality of Human Existence: Isolation and Communion in Western Man. The degree to which his paradigm of agency and communion fits the thesis of Iain McGilchrist is uncanny, especially considering how Bakan’s book predates McGilchrist’s by more than four decades.

Perhaps I’m reading too much into the issue, but the underlying concern here appears to be salvation, which has both earthly manifestations (e.g., happiness, mostly understood in terms of financial success and its concomitant material rewards) and heavenly (e.g., validation of individual righteousness and entry into heaven). But that point is buried under layers of obfuscation in the form of categorizing and describing. Indeed, this is how we respond to issues of ultimate human concern these days: we analyze. What we don’t do is sense and feel and intuit. Those basic human behaviors are overwhelmed by cognitive overactivity, whether thinking about agency and self or for that matter communion (which is self again, reconstituted as selflessness as one enters into flow, context, and intersubjectivity). This blog is no exception. Funny thing, though: social and cultural histories tell about human self-organization and mentalité, as opposed to a history of events, and how intuitive responses — expressions of the Zeitgeist, if you will — force their way through all the obfuscation with glaring clarity. Considerable hindsight is required to understand it, which is why people cannot tell their own histories well.


Truth Abdicated

Harriet Baber has a rather surprising opinion column at The Guardian in response to the question “can we choose what we believe?” I don’t normally care about professions of faith because they can’t be held to a shared standard of logic, evidence, or reason. But when an academic — a professor of philosophy in this case — says that truth doesn’t matter, I take notice:

Truth is overrated. And it’s remarkable that the very individuals who are most vocal in their opposition to religiously motivated puritanism are the most fiercely puritanical when it comes to truth. They condemn Christians for imposing constraints on sensual pleasure but are outraged that we should take pleasure in the consolations of religion instead of squarely facing what they believe to be the hard truths about the human condition.

People in any case overestimate the value of truth and underestimate the difficulty of arriving at it. There are a great many truths in which I have absolutely no interest — truths about the lifecycle of Ctenocephalides felis (the common cat flea) or the extensive body of truths about the condition of my teeth that my dentist imposes on me. I see no reason why I should bother with these truths or make a point of believing them.

She uses the form “I see no reason why …” repeatedly and goes on to say that truth should be no impediment to believing whatever a person wants, especially if believing gives comfort. The idea of customizable belief reminded me of Sheilaism as described by Dick Meyer in his book Why We Hate Us. (The idea of Sheilaism may originate in Robert Bellah’s Habits of the Heart.) Sheilaism is a sort of raw individualism that rejects authority (and truth) and creates a hodgepodge belief system not unlike assembling a meal haphazardly from a buffet. Sheilaism also demonstrates how ideas are fragmented, torn from their contexts, converted into free-floating consumables, and reconstituted as expressions of one’s personal tastes and predilections. While I have considerable patience for regular folks in this regard, such muddy, relativist thinking from an academic is a serious abdication of academic integrity. One can only wonder what goes on in Prof. Baber’s classroom.

Plagiarism Denied

The New York Times has an article discussing plagiarism among students in the digital age. According to the article,

Digital technology makes copying and pasting easy, of course. But that is the least of it. The Internet may also be redefining how students — who came of age with music file-sharing, Wikipedia and Web-linking — understand the concept of authorship and the singularity of any text or image.

What this means is that students increasingly don’t get that using sources without attribution is wrong; plagiarism doesn’t even register as academic misconduct. Who teaches students academic values and upholds their worth? Teachers and professors, who are apparently failing miserably under the pressure of the copy/paste functions of word processors. The article also provides numerous examples of brazen plagiarism committed by students (and parents!) who do in fact know better but do it anyway. Similarly, outside of academe, books, news articles, blog posts, etc. use and recycle large tracts of text without proper attribution and without being called to task. Some are even award winners.

Aside: the notion that creative works embodied in a digital format suitable for easy reproduction are available for use and reuse has swept away the entire concept of copyright. File sharing via Napster or YouTube raised the legal issues, but for all intents and purposes, the horse has already left the barn since so few respect copyright anymore. Although not true in the legal sense, in practice, the public sphere has become the public domain.

Evidence Irrelevant

Finally, one academic blogger extends the NYT article linked above to the principled use of evidence in academic work and beyond:

… I’m enough of a devotee of our recent view of authorship and creativity (and property) to think that the norms established around plagiarism during the 20th Century need some kind of continuing defense, just with sufficient awareness of the changes in textual production and circulation.

What really worries me is what’s happening to the larger purpose of the analytical writing which tempts some to plagiarism. The thing I’m honestly afraid of is that we’ve come to a point where the professional value of learning to build strong arguments based on and determined by a command over solid evidence is in rapid decline.

These are good statements, but the blogger goes on to ask whether teaching sound academic standards is now a disservice to students in the professional world beyond academe where misshaped evidence, outright errors, omissions, and lies go unchecked and unpunished.

So maybe that’s the kind of writing and speaking we need to train our students to do: rhetorically effective and infinitely mutable on substance, entirely about rather than just sensibly attentive to affect and audience. At what point is it perverse to continue making buggy whips while the Ford plant churns away right next door?

As I said in my comment at the blog, I find it astonishing that an academic could even voice the question. Although I’m certain to be in the minority on this point, the answer is to me as duh! obvious as the answer to the question “should we torture?” All sorts of justifications and rationalizations exist for wriggling out from under the obvious answers, but no person of integrity entertains such a debate for longer than it takes to dismiss the question.

A friend gave me the first book of The Chronicles of Thomas Covenant by Stephen Donaldson to read. I don’t read a lot of fiction, but this is a nice diversion from my usual fare. The novel is so thoroughly derivative of Tolkien that I find myself irritated frequently, but it has its own ideas and devices, too. Though I’m only halfway into the book, one idea has caught my attention distinctly.

Among the numerous races of people and characters are the Haruchai, a warrior class that serves the Lords of the Land. The Haruchai are reminiscent of the Samurai. What struck me, however, is the manner in which the Haruchai came into the service of the Lords. Some 2000 years before the time of the first novel, the Haruchai prepared to wage war against the Land, but the Lords refused to go to war lest the Haruchai be destroyed utterly. Instead, the Lords gave to the Haruchai precious gifts. Oddly, the Haruchai responded by taking a vow of service to the Lords for a debt that could never be fully repaid. Although never quite stated so baldly, the Haruchai basically enslaved themselves to the Lords, presumably out of gratitude.

These two acts — refusal to destroy one’s enemy and self-enslavement — are pretty remarkable. If applied to our current geopolitics, it would suggest that the U.S. might think twice about its preemptive wars against minor powers, and those minor powers might consider some form of tribute for the greater power’s refusal to invade or otherwise engage. Of course, that’s idealistic. What we have instead are the lone world superpower beating up on everyone else, like the tantrums of a schoolyard bully, and the irrational promises of at least one victim of our aggression to deliver the mother of all battles, only to fail in less than a month yet subsequently mount a surprisingly effective insurgency. If the situation in the Covenant novel is slightly comical, it’s certainly matched by the real-world situation in which we find ourselves.

Like Tolkien’s novels, Donaldson’s work appears to be the subject of considerable analysis. I haven’t read any of it, since I don’t want to spoil my reading pleasure. So I don’t know if this observation has been made, but it wouldn’t surprise me if Donaldson has conceived of his characters and their world as being profoundly stupid, as in cognitively challenged. Sure, they adhere to strict codes of honor and integrity (an almost child-like allegiance), and their florid, Tolkienesque language is sophisticated, but from what I’ve read so far, they’re also bumbling fools in their absolutism and inability to regain lost lore and knowledge. What else but sheer stupidity would compel a people to enslave itself out of gratitude or a generous people to accept such an arrangement?