Posts Tagged ‘Human Nature’

I revisit my old blog posts when I see reader activity in the WordPress backstage, and I was curious to recall a long quote from Iain McGilchrist summarizing arguments put forth by Anthony Giddens in his book Modernity and Self-identity (1991). Giddens had presaged recent cultural developments, namely, the radicalization of nativists, supremacists, Social Justice Warriors (SJWs), and others distorted by absorption in identity politics. So I traipsed off to the Chicago Public Library (CPL) and sought out the book to read. Regrettably, CPL didn’t have a copy, so I settled on a slightly earlier book, The Consequences of Modernity (1990), which is based on a series of lectures delivered at Stanford University in 1988.

Straight away, the introduction provides a passage that goes to the heart of matters with which I’ve been preoccupied:

Today, in the late twentieth century, it is argued by many, we stand at the opening of a new era … which is taking us beyond modernity itself. A dazzling variety of terms has been suggested to refer to this transition, a few of which refer positively to the emergence of a new type of social system (such as the “information society” or the “consumer society”) but most of which suggest rather that a preceding state of affairs is drawing to a close … Some of the debates about these matters concentrate mainly upon institutional transformations, particularly those which propose that we are moving from a system based upon the manufacture of material goods to one concerned more centrally with information. More commonly, however, those controversies are focused largely upon issues of philosophy and epistemology. This is the characteristic outlook, for example, of the author who has been primarily responsible for popularising the notion of post-modernity, Jean-François Lyotard. As he represents it, post-modernity refers to a shift away from attempts to ground epistemology and from faith in humanly engineered progress. The condition of post-modernity is distinguished by an evaporating of the “grand narrative” — the overarching “story line” by means of which we are placed in history as beings having a definite past and a predictable future. The post-modern outlook sees a plurality of heterogeneous claims to knowledge, in which science does not have a privileged place. [pp. 1–2, emphasis added]

That’s a lot to unpack all at once, but the fascinating thing is that notions now manifesting darkly in the marketplace of ideas were already in the air in the late 1980s. Significantly, this was still several years before the Internet brought the so-called Information Highway to computer users, before the cell phone and smartphone were developed, and before social media displaced traditional media (TV was only 30–40 years old but had already transformed our information environment) as the principal way people gather news. I suspect that Giddens has more recent work that accounts for the catalyzing effect of the digital era (including mobile media) on culture, but for the moment, I’m interested in the book in hand.

Regular readers of this blog (I know of one or two) already know my armchair social criticism directed at our developing epistemological crisis (challenges to authority and expertise, psychotic knowledge, fake news, alternative facts, dissolving reality, and science denial) as well as the Transhumanist fantasy of becoming pure thought (once we evolve beyond our bodies). Until that’s accomplished with imagined technology, we increasingly live in our heads, in the abstract, disoriented and adrift on a bewildering sea of competing narratives. Moreover, I’ve stated repeatedly that highly mutable stories (or narratives) underlie human cognition and consciousness, making most of us easy marks for charismatic thought leaders and storytellers. Giddens was there nearly 30 years ago with these same ideas, though his terms differ.

Giddens dispels the idea of post-modernity and insists that, from a sociological perspective, the current period is better described as high modernism. This reminds me of Oswald Spengler and my abandoned book blogging of The Decline of the West. It’s unimportant to me who got it more correct, but I note that the term Postmodernism has been adopted widely despite its inaccuracy (at least according to Giddens). As I get further into the book, I’ll have plenty more to say.


Here’s a familiar inspirational phrase from The Bible: the truth shall set you free (John 8:32). Indeed, most of us take it as, um, well, gospel that knowledge and understanding are unqualified goods. However, the information age has turned out to be a mixed blessing. Any clear-eyed view of the way the world works and its long, tawdry history carries with it an inevitable awareness of injustice, inequity, suffering, and at the extreme end, some truly horrific episodes of groups victimizing each other. Some of the earliest bits of recorded history, as distinguished from oral history, are financial — keeping count (or keeping accounts). Today differs not so much in character as in the variety of counts being kept and the sophistication of information gathering.

The Bureau of Labor Statistics, a part of the U.S. Department of Labor, is one information clearinghouse that slices and dices available data according to a variety of demographic characteristics. The fundamental truth behind such assessments, regardless of the politics involved, is that when comparisons are made between unlike groups, say, between men and women or young and old, one should expect to find differences and indeed be rather surprised if comparisons revealed none. So the question of gender equality in the workplace, or its implied inverse, gender inequality in the workplace, is a form of begging the question, meaning that if one seeks differences, one shall most certainly find them. But those differences are not prima facie evidence of injustice in the sense of the popular meme that women are disadvantaged or otherwise discriminated against in the workplace. Indeed, the raw data can be interpreted according to any number of agendas, thus the phrase “lying with statistics,” and most of us lack the sophistication to contextualize statistics properly, which is to say, free of the emotional bias that plagues modern politics, and more specifically, identity politics.

The fellow who probably ran up hardest against this difficulty is Charles Murray in the aftermath of publication of his book The Bell Curve (1994), which deals with how intelligence manifests differently across demographic groups yet functions as the primary predictor of social outcomes. Murray is particularly well qualified to interpret data and statistics dispassionately, and in true seek-and-find fashion, differences between groups did appear. It is unclear how much his resulting prescriptions for social programs are born of data vs. ideology, but most of us are completely at sea wading through the issues without specialized academic training to make sense of the evidence.

More recently, another fellow caught in the crosshairs on issues of difference is James Damore, who was fired from his job at Google after writing what is being called an anti-diversity manifesto (but might be better termed an internal memo) that was leaked and then went viral. The document can be found here. I have not dug deeply into the details, but my impression is that Damore attempted a fairly academic unpacking of the issue of gender differences in the workplace as they conflicted with institutional policy, only to face a hard-set ideology that is more RightThink than truth. In Damore’s case, the truth did set him free — free from employment. Even the NY Times recognizes that the Thought Police sprang into action yet again to demand that their pet illusions about society be supported rather than dispelled. These witch hunts and shaming rituals (vigilante justice carried out in the court of public opinion) are occurring with remarkable regularity.

In a day and age when so much information (too much information, as it turns out) is available to guide our thinking, one might hope for careful, rational analysis and critical thinking. However, trends point to the reverse: a return to tribalism, xenophobia, scapegoating, and victimization. There is also a victimization Olympics at work, with identity groups vying for imaginary medals awarded to whoever’s got it worst. I’m no Pollyanna when it comes to the notion that all men are brothers and, shucks, can’t we all just get along? That’s not our nature. But the marked indifference of the natural world to our suffering as it besets us with drought, fire, floods, earthquakes, tsunamis, hurricanes, tornadoes, and the like (and this was just the last week!) might seem like the perfect opportunity to find within ourselves a little grace and recognize our common struggles in the world rather than add to them.

Violent events of the past week (Charlottesville, VA; Barcelona, Spain) and political responses to them have dominated the news cycle, pushing other newsworthy items (e.g., U.S.–South Korean war games and a looming debt ceiling crisis) off the front page and into the darker recesses of everyone’s minds (those paying attention, anyway). We’re absorbed instead with culture wars run amok. I’m loath to apply the term terrorism to regular periodic eruptions of violence, both domestic and foreign. That term carries with it intent, namely, the objective to create day-to-day terror in the minds of a population so as to interfere with proper functions of society. It’s unclear to me whether recent perpetrators of violence are coherent enough to formulate sophisticated motivations or plans. The dumb, obvious way of doing things — driving into crowds of people — takes little or no planning and may just as well be the result of inchoate rage boiling over in a moment of high stress and opportunity. Of course, it needn’t be all or nothing, and considering our reflexively disproportionate responses, the term terrorism and attendant destabilization is arguably accurate even without specified intent. That’s why in the wake of 9/11 some 16 years ago, the U.S. has become a security state.

It’s beyond evident that hostilities have been simmering below the not-so-calm surface. Many of those hostilities, typically born of economic woes but also part of a larger clash of civilizations, take the form of identifying an “other” presumably responsible for one’s difficulties and then victimizing the “other” in order to elevate oneself. Of course, the “other” isn’t truly responsible for one’s struggles, so the violent dance doesn’t actually elevate anyone, as in “supremacy”; it just wrecks both sides (though unevenly). Such warped thinking seems to be a permanent feature of human psychology and enjoys popular acceptance when the right “other” is selected and universal condemnation when the wrong one is chosen. Those doing the choosing and those being chosen haven’t changed much over the centuries. Historically, Anglo-Saxons and Teutons choose, and people of color (of all types) get chosen. Jews are also chosen with dispiriting regularity, which is an ironic inversion of being the Chosen People (if you believe in such things — I don’t). However, any group can succumb to this distorted power move, which is why so much ongoing, regional, internecine conflict exists.

As I’ve been saying for years, a combination of condemnation and RightThink has freed some people from this cycle of violence while merely driving the holdouts underground. Supremacy in its various forms (nationalism, racism, antisemitism, etc.) has never truly been expunged. RightThink itself has morphed (predictably) into intolerance, which is now veering toward radicalism. Perhaps a positive outcome of this latest resurgence of supremacist ideology is that those infected with the character distortion have been emboldened to identify themselves publicly and thus can be dealt with somehow. Civil authorities and thought leaders are not very good at dealing with hate, often shutting people out of the necessary public conversation and/or seeking to legislate hate out of existence with restrictions on free speech. But it is precisely through free expression and diplomacy that we address conflict. Violence is a failure to remain civil (duh!), and war (especially the genocidal sort) is the extreme instance. It remains to be seen if the lid can be kept on this boiling pot, but considering cascade failures lined up to occur within the foreseeable future, I’m pessimistic that we can see our way past the destructive habit of shifting blame onto others who often suffer even worse than those holding the reins of power.

I have just one previous blog post referencing Daniel Siegel’s book Mind, in which I threatened to put the book aside owing to how badly it’s written. I haven’t yet turned in my library copy and have made only modest additional progress reading the book. However, Siegel came up over at How to Save the World, where at least one commenter was quite enthusiastic about Siegel’s work. In my comment there, I mentioned the book only to suggest that his appreciation of the relational nature of the mind (and cognition) reinforces my long-held intuition that the self doesn’t exist in an idealized vacuum, capable of modeling and eventually downloading to a computer or some other Transhumanist nonsense, but is instead situated as much between us as within us. So despite Siegel’s clumsy writing, this worthwhile concept deserves support.

Siegel goes on to wonder (without saying he believes it to be true — a disingenuous gambit) whether perhaps there exists an information field, not unlike the magnetic field or portions of the light spectrum, that affects us yet falls outside the scope of our direct perception or awareness. Credulous readers might leap to the conclusion that the storied collective consciousness is real. Some fairly trippy theories of consciousness propose that the mind is actually more like an antenna receiving signals from some noncorporeal realm (e.g., a quantum dimension) we cannot identify yet tap into constantly, measuring against and aligning with the wider milieu in which we function. Even without expertise in zoology, one must admit that humans are social creatures operating at various levels of hierarchy including individual, family, clan, pack, tribe, nation-state, etc. We’re less like mindless drones in a hive (well, some of us) and more like voluntary and involuntary members of gangs or communities formed along various familial, ethnic, regional, national, language group, and ideological lines. Unlike Siegel, I’m perfectly content with existing terminology and feel no compulsion to coin new lingo or adopt unwieldy acronyms to mark my territory.

What Siegel hasn’t offered is an observation on how our reliance on and indebtedness to the public sphere (via socialization) have changed with time as our mode of social organization has morphed from a predominantly localized, agrarian existence prior to the 20th century to a networked, high-density, information-saturated urban and suburban existence in the 21st century. The public sphere was always out there, of course, especially as embodied in books, periodicals, pamphlets, and broadsides (if one was literate and had reliable access to them), but the unparalleled access we now enjoy through various electronic devices has not only reoriented but disoriented us. Formerly slow, isolated information flow has become a veritable torrent or deluge. It’s not called the Information Age fer nuthin’. Furthermore, the bar to publication — or insertion into the public sphere — has been lowered to practical nonexistence as the democratization of production has placed the tools of widely distributed exposure into the hands of everyone with a blog (like mine) or Facebook/Instagram/Twitter/Pinterest/LinkedIn account. As a result, a deep erosion of authority has occurred, since any yahoo can promulgate the most reckless, uninformed (and disinformed) opinions. The public’s attention, riveted on celebrity gossip and House of Cards-style political wrangling, false narratives, fake news, alternative facts, and disinformation, also makes navigating the public sphere with much integrity impossible for most. For instance, the MSM and alternative media alike are busy selling a bizarre pageant of Russian collusion and interference with recent U.S. elections as though the U.S. were somehow innocent of even worse meddling abroad. Moreover, it’s naïve to think that the public sphere in the U.S. isn’t already completely contaminated from within by hucksters, corporations (including news media), and government entities with agendas ranging from mere profit seeking to nefarious deployment and consolidation of state power. For example, the oil and tobacco industries and the Bush Administration all succeeded in suppressing truth and selling rank lies that have landed us in various morasses from which there appears to be no escape.

If one recognizes his or her vulnerability to the depredations of info scammers of all types and wishes to protect oneself, there are two competing strategies: insulation and inoculation. Insulation means avoiding exposure, typically by virtue of mind-cleansing behaviors, whereas inoculation means seeking exposure in small, harmless doses so that one can handle a larger infectious attack. It’s a medical metaphor that springs from meme theory, where ideas propagate like viruses, hence, the notion of a meme “going viral.” Neither approach is foolproof. Insulation means plugging one’s ears or burying one’s head in the sand at some level. Inoculation risks spreading the infection. If one regards education as an inoculation of sorts, seeking more information of the right types from authoritative sources should provide means to combat the noise in the information signals received. However, as much as I love the idea of an educated, informed public, I’ve never regarded education as a panacea. It’s probably a precondition for sound thinking, but higher education in particular has sent an entire generation scrambling down the path of identity politics, which sounds like a good idea but leads inevitably to corruption via abstraction. That’s all wishful thinking, though; the public sphere we actually witness has gone haywire, a condition of late modernism and late-stage capitalism that has no known antidote. Enjoy the ride!

The Internet is now a little more than two decades old (far more actually, but I’m thinking of its widespread adoption). Of late, it’s abundantly clear that, in addition to being a wholesale change in the way we disseminate and gather information and conduct business, we’re running live social experiments bearing psychological influence, some subtle, some invasive, much like the introduction of other media such as radio, cinema, and TV back in the day. About six years ago, psychologists coined the term digital crowding, which I just discovered, referring to an oppressive sense of knowing too much about people, which in turn provokes antisocial reactions. In effect, it’s part of the Dark Side of social media (trolling and comments sections being other examples), one of numerous live social experiments.

I’ve given voice to this oppressive knowing-too-much on occasion by wondering why, for instance, I know anything — largely against my will, mind you — about the Kardashians and Jenners. This is not the sole domain of celebrities and reality TV folks but indeed anyone who tends to overshare online, typically via social media such as Facebook, less typically in the celebrity news media. Think of digital crowding as the equivalent of seeing something you would really prefer not to have seen, something no amount of figurative eye bleach can erase, something that now simply resides in your mind forever. It’s the bell that can’t be unrung. The crowding aspect is that now everyone’s dirty laundry is getting aired simultaneously, creating pushback and defensive postures.

One might recognize in this the familiar complaint of Too Much Information (TMI), except that the information in question is not the discomfiting stuff such as personal hygiene, medical conditions, or sexual behaviors. Rather, it’s an unexpected over-awareness of everyone’s daily minutiae as news of it presses for attention and penetrates our defenses. Add it to the deluge that is causing some of us to adopt information avoidance.

I often review my past posts when one receives a reader’s attention, sometimes adding tags and fixing typos, grammar, and broken links. One of my greatest hits (based on voting, not traffic) is Low Points in Education. It was among the first to tackle what I have since called our epistemological crisis, though I didn’t begin to use the epistemology tag until later. The crisis has since caught up with us with a vengeance, though I can’t claim I’m the first to observe the problem. That dubious honor probably goes to Stephen Colbert, who coined the word truthiness in 2005. Now that alternative facts and fake news have entered the lingo as well (gaslighting has been revived), everyone has jumped on the bandwagon questioning the truthfulness or falsity behind anything coughed up in our media-saturated information environment. But as suggested in the first item discussed in Low Points in Education, what’s so important about truth?

It would be obvious and easy yet futile to argue in favor of high-fidelity appreciation of the world, even if only within the surprisingly narrow limits of human perception, cognition, and memory (all interrelated). Numerous fields of endeavor rely upon consensus reality derived from objectivity, measurement, reason, logic, and, dare I say it, facticity. Regrettably, human cognition doesn’t adhere any too closely to those ideals except when trained to value them. Well-educated folks have better acquaintance with such habits of mind; folks with formidable native intelligence can develop true authority, too. For the masses, however, those attributes are elusive, even for those who have partied through earned college degrees. Ironically worse, perhaps, are specialists, experts, and overly analytical intellectuals who exhibit what the French call a déformation professionnelle. Politicians, pundits, and journalists are chief among the deformed and distorted. Mounting challenges to establishing truth now destabilize even mundane matters of fact, and it doesn’t help that myriad high-profile provocateurs (including the Commander in Chief, to whom I will henceforth refer only as “45”) are constantly throwing out bones for journalists to chase like so many unnourishing rubber chew toys.

Let me suggest, then, that human cognition, or more generally the mind, is an ongoing balancing act, making adjustments to stay upright and sane. Like the routine balance one keeps during locomotion, shifting weight from side to side continuously, falling a bit only to catch oneself, the difficulty is not especially high. But with the foundation below one’s feet shaking furiously, so to speak, legs get wobbly and many end up (figuratively at least) ass over teakettle. Further, the mind is highly situational, contingent, and improvisational and is prone to notoriously faulty perception even before one gets to marketing, spin, and arrant lies promulgated by those intent on coopting or directing one’s thinking. Simply put, we’re not particularly inclined toward accuracy but instead operate within a wide margin of error. Accordingly, we’re quite strong at adapting to ever-changing circumstance.

That strength turns out to be our downfall. Indeed, rootless adjustment to changing narrative is now so grave that basic errors of attribution — which entities said and did what — make it impossible to distinguish allies from adversaries reliably. (Orwell captured this with his line from the novel 1984, “Oceania was at war with Eurasia; therefore Oceania had always been at war with Eurasia.”) Thus, on the back of a brazen propaganda campaign following 9/11, Iraq morphed from U.S. client state to rogue state demanding preemptive war. (Admittedly, the U.S. State Department had already lost control of its puppet despot, who in a foolish act of naked aggression tried to annex Kuwait, but that was a brief, earlier war quite unlike the undeclared one in which the U.S. has been mired for 16 years.) Even though Bush Administration lies have been unmasked and dispelled, many Americans continue to believe (incorrectly) that Iraq possessed WMDs and posed an existential threat to the U.S. The same type of confusion is arguably at work with respect to China, Russia, and Israel, which are mixed up in longstanding conflicts having significant U.S. involvement and provocation. Naturally, the default villain is always Them, never Us.

So we totter from moment to moment, reeling drunkenly from one breathtaking disclosure to the next, and are forced to reorient continuously in response to whatever the latest spin and spew happen to be. Some institutions retain the false sheen of respectability and authority, but for the most part, individuals are free to cherry-pick information and assemble their own truths, indulging along the way in conspiracy and muddle-headedness until at last almost no one can be reached anymore by logic and reason. This is our post-Postmodern world.



Early in the process of socialization, one learns that the schoolyard cry “Fight!” is half an alert (if one is a bystander) to come see and half an incitement to violence (if one is just entering into conflict). Fascination with seeing people duke it out, ostensibly to settle conflicts, never seems to grow old, though the mixed message about violence never solving anything sometimes slows things down. (Violence does in fact at least put an end to things. But the cycle of violence continues.) Fights have also lost the respectability of yore, where the victor (as with a duel or a Game of Thrones fight by proxy) was presumed to be vindicated. Now we mostly know better than to believe that might makes right. Successful aggressors can still be villains. Still, while the primal instinct to fight can be muted, it’s more typically channeled into entertainment and sport, where it’s less destructive than, say, warrior culture extending all the way from clans and gangs up to professional militaries.

Fighting in entertainment, especially in cinema, often depicts invulnerability that renders fighting pointless and inert. Why bother hitting Superman, the Incredible Hulk, Wolverine, or indeed any number of Stallone, Schwarzenegger, Seagal, or Statham characters when there is no honest expectation of doing damage? They never get hurt, just irritated. Easy answer: because the voyeurism inherent in fighting endures. Even when the punchfest is augmented by guns, we watch, transfixed by conflict, even though outcomes are either predictable (heroes and good guys almost always win), moot, or an obvious set-up for the next big, stupid, pointless battle.

Fighting in sport is perhaps most classical in boxing, with weight classes evening out the competition to a certain degree. Boxing’s popularity has waxed and waned over time as charismatic fighters come and go, but like track and field, it’s arguably one of the purest expressions of sport, being about raw dominance. One could also argue that some team sports, such as hockey and American-style football, are as much about the collateral violence as about scoring goals. Professional wrestling, revealed to be essentially athletic acting, blends entertainment and sport, though without appreciable loss of audience appeal. As with cinema, fans seem not to care that the action is scripted. Rising in popularity these days is mixed martial arts (MMA), which ups the ante over boxing by allowing all manner of techniques into the ring, including traditional boxing, judo, jiu-jitsu, wrestling, and straight-up brawling. While brawling may work in the schoolyard or street against unwilling or inexperienced fighters, it rarely succeeds in the MMA ring. Skill and conditioning matter most, plus the lucky punch.

Every kid, boy or girl, is at different points bigger, smaller, or matched with someone else when things start to get ugly, so one’s willingness to engage and one’s strategy are situational. In childhood, conflict usually ends quickly with the first tears or bloodied nose. I’ve fought on rare occasion, but I’ve never ever actually wanted to hurt someone. Truly wanting to hurt someone seems to be one attribute of a good fighter; another is the lack of fear of getting hit or hurt. Because I was always smaller than my peers growing up, I evaded fights when I could (most of the time, in fact); when I couldn’t, I would defend myself, but I wasn’t good at it. Reluctant willingness to fight was usually enough to keep aggressors at bay. Kids who grow up in difficult circumstances, fighting with siblings and bullies, and/or abused by a parent or other adult, have a different relationship with fighting. For them, it’s unavoidable. Adults who relish being bullies join the military and/or police or maybe become professional fighters.

One would have to be a Pollyanna to believe that we will eventually rise above violence and use of force. Perhaps it’s a good thing that in a period of relative peace (in the affluent West), we have alternatives to being forced to defend ourselves on an everyday basis, and those who want to can indulge their basic instinct to fight and establish dominance. Notions of masculinity and femininity are still wrapped up in how one expresses these urges, though in characteristic PoMo fashion, traditional boundaries are being erased. Now, everyone can be a warrior.

I pause periodically to contemplate deep time, ancient history, and other subjects that lie beyond most human conceptual abilities. Sure, we sorta get the idea of a very long ago past out there in the recesses or on the margins, just like we get the idea of U.S. sovereign debt now approaching $20 trillion. Problem is, numbers lose coherence when they mount up too high. Scales differ widely with respect to time and currency. Thus, we can still think reasonably about human history back to roughly 6,000 years ago, but 20,000 years ago or more draws a blank. We can also think about how $1 million might have utility, but $1 billion and $1 trillion are phantoms that appear only on ledgers and contracts and in the news (typically mergers and acquisitions). If deep time or deep debt feel like they don’t exist except as conceptual categories, try wrapping your head around the deep state, which in the U.S. is understood to be a surprisingly large rogue’s gallery of plutocrats, kleptocrats, and oligarchs drawn from the military-industrial-corporate complex, the intelligence community, and Wall Street. It exists but does so far enough outside the frame of reference most of us share that it effectively functions in the shadow of daylight where it can’t be seen for all the glare. Players are plain enough to the eye as they board their private jets to attend annual meetings of the World Economic Forum in Davos-Klosters, Switzerland, or two years ago the Jackson Hole [Economic] Summit in Jackson Hole, WY, in connection with the American Principles Project, whatever that is. They also enjoy plausible deniability precisely because most of us don’t really believe self-appointed masters of the universe can or should exist.

Another example of a really bad trip down the rabbit hole, what I might call deep cynicism (and a place I rarely allow myself to go), appeared earlier this month at Gin and Tacos (on my blogroll):

The way they [conservatives] see it, half the kids coming out of public schools today are basically illiterate. To them, this is fine. We have enough competition for the kinds of jobs a college degree is supposed to qualify one for as it is. Our options are to pump a ton of money into public schools and maybe see some incremental improvement in outcomes, or we can just create a system that selects out the half-decent students for a real education and future and then warehouse the rest until they’re no longer minors and they’re ready for the prison-poverty-violence cycle [add military] to Hoover them up. Vouchers and Charter Schools are not, to the conservative mind, a better way to educate kids well. They are a cheaper way to educate them poorly. What matters is that it costs less to people like six-figure income earners and home owners. Those people can afford to send their kids to a decent school anyway. Public education, to their way of thinking, used to be about educating people just enough that they could provide blue collar or service industry labor. Now that we have too much of that, a public high school is just a waiting room for prison. So why throw money into it? They don’t think education “works” anyway; people are born Good or Bad, Talented or Useless. So it only makes sense to find the cheapest possible way to process the students who were written off before they reached middle school. If charter schools manage to save 1% of them, great. If not, well, then they’re no worse than public schools. And they’re cheaper! Did I mention that they’re cheaper?

There’s more. I provided only the main paragraph. I wish I could reveal that the author is being arch or ironic, but there is no evidence of that. I also wish I could refute him, but there is similarly no useful evidence for that. Rather, the explanation he provides is a reality check that fits the experience of wide swaths of the American public, namely, that “public high school is just a waiting room for prison” (soon and again, debtor’s prison) and that it’s designed to be just that because it’s cheaper than actually educating people. Those truly interested in being educated will take care of it themselves. Plus, there’s additional money to be made operating prisons.

Deep cynicism is a sort of radical awareness that stares balefully at the truth and refuses to blink or pretend. A psychologist might call it the reality principle; a scientist might aver that it relies unflinchingly on objective evidence; a philosopher might call it strict epistemology. To get through life, however, most of us deny abundant evidence presented to us daily in favor of dreams and fantasies that assemble into the dominant paradigm. That paradigm includes the notions that evil doesn’t really exist, that we’re basically good people who care about each other, and that our opportunities and fates are not, on the whole, established long before we begin the journey.

Anthropologists, pundits, armchair cultural critics (like me), and others sometimes offer an aspect or characteristic, usually singular, that separates the human species from other animals. (Note: humans are animals, not the crowning creation of god in his own image, the dogma of major religions.) Typical singular aspects include tool use (very early on, fire), language, agriculture, self-awareness (consciousness), and intelligence, the last including especially the ability to conceptualize time and thus remember and plan ahead. The most interesting candidate suggested to me is our ability to kill from a distance. Without going into a list of things we don’t think we share with other species but surprisingly do, it interests me that no other species possesses the ability to kill at a distance (someone will undoubtedly prove me wrong on this).

Two phrases spring to mind: nature is red in tooth and claw (Tennyson) and human life is nasty, brutish, and short (Hobbes). Both encapsulate what it means to have to kill to eat, which is hardly unique to animals. All sorts of plants, insects, and microorganisms embed themselves in hosts, sometimes killing the host and themselves. Symbiotic relationships also exist. The instance that interests me, though, is the act of killing in the animal kingdom that requires putting one’s own body at risk in life-or-death attack. Examples falling short of killing abound, such as intimidation to establish hierarchy, but to eat, an animal must kill its prey.

Having watched my share of historical fiction (pre-1800, say, but especially sword-and-sandal and medieval epics) on the TeeVee and at the cinema, I find that the dramatic appeal of warring armies slamming into each other never gets old. Fighting is hand-to-hand or sword-to-sword, which are tantamount to the same thing. Archers’ arrows, projectiles launched from catapults and trebuchets, thrown knives, spears, and axes, and pouring boiling oil over parapets are killing from a relatively short distance, but the action eventually ends up being very close. The warrior code in fighting cultures honors the willingness to put oneself in harm’s way, to risk one’s own body. Leaders often exhibit mutual respect and may even share some intimacy. War may not be directly about eating, since humans are not cannibals under most circumstances; rather, it’s usually about control of resources, so secondarily about eating by amassing power. Those historical dramas often depict victors celebrating by enjoying lavish feasts.

Modern examples of warfare and killing from a distance make raining down death from above a bureaucratic action undertaken with little or no personal risk. Artillery, carpet bombing from 20,000 feet, drone strikes (controlled from the comfort of some computer lab in the Utah desert), and nuclear bombs are the obvious examples. No honorable warrior code attaches to such killing. Indeed, the chain of command separates the execution of kill orders from moral responsibility — probably a necessary disconnect when large numbers of casualties (collateral damage, if one prefers the euphemism) can be expected. Only war criminals, either high on killing or banally impervious to empathy and compassion, would dispatch hundreds of thousands at a time.

If killing from a distance is in most cases about proximity or lack thereof, one further example is worth mentioning: killing across time. While most don’t really conceptualize the space-time continuum as interconnected, the prospect of choices made today manifesting in megadeath in the foreseeable future is precisely the sort of bureaucratized killing from a distance that should be recognized and forestalled. Yet despite our supposed intellectual superiority over other species, we cannot avoid waging war, real and rhetorical, to control resources and narratives that enable us to eat. Eating the future would be akin to consuming seed corn, but that metaphor is not apt. Better perhaps to say that we’re killing the host. We’re embedded in the world, as indeed is everything we know to be alive, and rely upon the profundity of the biosphere for survival. Although the frequent charge is that humanity is a parasite or has become a cancer on the world, that tired assessment, while more accurate than not, is a little on the nose. A more charitable view is that humanity, as the apex predator, has expanded its habitat to include the entire biosphere, killing to eat, and is slowly consuming and transforming it into a place uninhabitable by us, just as a yeast culture consumes its medium and grows to fill the space before dying all at once. So the irony or Pyrrhic victory is that while we may fatten ourselves (well, some of us) in the short term, we have also created conditions leading to our own doom. Compared to other species whose time on Earth lasted tens of millions of years, human life on Earth turns out to be exactly what Hobbes said: nasty, brutish, and short.

I discovered “The Joe Rogan Experience” on YouTube recently and have been sampling from among the nearly 900 pod- or webcasts posted there. I’m hooked. Rogan is an impressive fellow. He clearly enjoys the life of the mind but, unlike many who are absorbed solely in ideas, has not ignored the life of the body. Over time, he’s also developed expertise in multiple endeavors and can participate knowledgeably in discussion on many topics. Webcasts are basically long, free-form, one-on-one conversations. This lack of structure gives the webcast ample time to explore topics in depth or simply meander. Guests are accomplished or distinguished in some way and usually have fame and wealth to match, which often affects content (i.e., Fitzgerald’s observation: “The rich are different from you and me”). One notable bar to clear for entry is having a strong media presence.

Among the recurring themes, Rogan trots out his techno-optimism, which is only a step short of techno-utopianism. His optimism is based on two interrelated developments in recent history: widespread diffusion of information over networks and rapid advances in medical devices that can be expected to accelerate, to enhance human capabilities, and soon to transform us into supermen, bypassing evolutionary biology. He expounds these views somewhat regularly to his guests, but alas, none of the guests I’ve watched seems able to fathom the ideas well enough to take up the discussion. (The same is true of Rogan’s assertion that money is just information, which is reductive and inaccurate.) They comment or joke briefly and move on to something more comfortable or accessible. Although I don’t share Rogan’s optimism, I would totally engage in discussion of his flirtation with Transhumanism (a term he doesn’t use). That’s why I’m blogging about Rogan here; that, and I lack enough conventional distinction and fame to score an invite to be a guest on his webcast. Plus, he openly disdains bloggers, many of whom moderate comments (I don’t) or otherwise channel discussion to control content. Oh, well.